Measuring me: is your body the future of security?
Apple has reignited the biometric-security debate by including a fingerprint scanner on the iPhone 5s. However, the possibilities for authenticating our devices using our bodies go much further than our fingers.
Forget fingerprints; passwords are passé: what about smartphones that measure how hands “shake” when clicking icons; keyboards that analyse the speed and style of your typing; and wearable computers that track the way you walk and the pattern of your heartbeat?
All of the above are being researched and developed right now – which isn’t as surprising as it may seem. The top-end smartphones in our pockets are already highly sophisticated sensor clusters containing accelerometers, gyroscopes, compasses, thermometers, GPS units and biometric readers.
In this feature, we reveal how these “measurement of me” authentication systems work, when we’re likely to be able to use them and whether they really do offer greater security than your existing password.
Unique human behaviours
Think biometrics, and chances are you’ll picture fingerprints. If you’re more imaginative – or up on your security reading – you may also think of facial or iris recognition.
However, biometrics has moved on from these relatively simplistic measures of an individual: far subtler behavioural traits can now be captured, and used as methods of authentication in their own right.
There are many unique human behaviours that can be monitored and together build up a biosignature that's extremely difficult to forge – think in terms of your gait as you walk, the pressure you exert when you tap or swipe a touchscreen, or even the routines you follow in a typical day. While such authentication alternatives sound advanced, they must prove their mettle against the oft-abused but ubiquitous password if they're ever to become widely used.
To start, let’s consider SilentSense, an authentication framework being developed by researchers at the Illinois Institute of Technology, which uses data mined from “touch behaviour”, including user biometrics and micro-movements of a device as it’s used. Researchers gathered data in the background using the sensors already integrated in a smartphone and, with their own software, were able to monitor and measure operating dynamics, such as touch, that are unique to the user.
In their research paper, authors Cheng Bo, Lan Zhang and Xiang-Yang Li explain how the three principal gestures of tapping (such as clicking icons), scrolling (reading emails, browsing or tweeting) and flinging (turning pages in an ebook reader) are used to identify an individual.
The features analysed include the co-ordinates where the screen was touched, the pressure exerted and the duration of the contact, all of which can be extracted from Android’s application programming interface (API).
Since people touch their smartphones differently depending on the app they're using, both touch and "reaction" are measured. To measure the "reaction", the researchers looked at the position of the device and the different amplitudes of vibration caused by each touch. These patterns can be readily observed by accessing the accelerometer and gyroscope built into a smartphone.
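To make the idea concrete, here's a minimal sketch – not the SilentSense code itself – of how a touch event and the accelerometer readings captured during it might be combined into a single feature vector. The field names and sample values are hypothetical, standing in for what an app would pull from the touchscreen and motion-sensor APIs.

```python
from statistics import mean

def touch_features(event, accel_samples):
    """Build one feature vector from a touch event plus the
    accelerometer readings recorded while the finger was down."""
    x, y = event["x"], event["y"]                 # screen co-ordinates
    pressure = event["pressure"]                  # normalised 0..1
    duration = event["up_ms"] - event["down_ms"]  # contact time in ms
    # "Reaction": how much the device shook in response to the tap,
    # summarised here as the mean absolute accelerometer amplitude.
    vibration = mean(abs(a) for a in accel_samples)
    return [x, y, pressure, duration, vibration]

# A single hypothetical tap near the centre of a 1080x1920 screen
tap = {"x": 540, "y": 960, "pressure": 0.42, "down_ms": 0, "up_ms": 85}
print(touch_features(tap, [0.02, -0.05, 0.04, -0.01]))
```

In a real system, many such vectors would be collected silently in the background and averaged into a behavioural profile of the owner.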
The researchers also had to consider the context in which a phone is being used: if the user is walking, on a train or in a car, for example. They countered this problem by combining movement-based biometrics – which can identify your speed by the changes in movement of your device – with historical, touch-based biometrics to provide an authentication template that proved to be more than 99% accurate during trials on Android-based HTC Evo 3D and Samsung Galaxy S III devices.
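The decision itself can be pictured as a simple template comparison. The toy example below is an illustration under our own assumptions, not the researchers' algorithm: a fresh feature vector (pressure, duration, vibration) is accepted only if it lands close enough to the owner's stored historical average.

```python
import math

def authenticate(sample, template, threshold):
    """Accept the user if the Euclidean distance between the new
    feature vector and the stored template is below the threshold."""
    return math.dist(sample, template) < threshold

# Hypothetical stored profile: typical pressure, duration and vibration
owner_template = [0.42, 85.0, 0.03]

print(authenticate([0.40, 83.0, 0.04], owner_template, 5.0))   # similar touch
print(authenticate([0.90, 310.0, 0.30], owner_template, 5.0))  # very different
```

A production system would use a trained classifier over many features rather than a single distance threshold, but the principle is the same: the current behaviour either matches the historical record or it doesn't.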
The end result is a method of verifying whether the current user is the authorised owner of the device based upon historical behavioural biometrics – the “measurement of me”, in other words.