Neural networks just hacked your fingerprints

Fingerprints are supposed to be unique markers of a person’s identity. Detectives look for fingerprints at crime scenes. Your phone’s fingerprint sensor means only you can unlock the screen. The truth, however, is that fingerprints might not be as secure as you think – at least not in an age of machine learning.


A team of researchers has demonstrated that, with the help of neural networks, a “masterprint” can be used to fool verification systems. A masterprint, like a master key, is a fingerprint that can be used to open many different doors. In the case of fingerprint identification, it works by tricking a computer into thinking the print could belong to a number of different people.

“Our method is able to design a MasterPrint that a commercial fingerprint system matches to 22% of all users in a strict security setting, and 75% of all users at a looser security setting,” the researchers – Philip Bontrager, Julian Togelius and Nasir Memon – claim in a paper.

The team trialled two methods for generating masterprints using generative adversarial networks (GANs), a type of algorithm used in machine learning. In both methods, the researchers trained a GAN to create partial fingerprint images based on a series of actual fingerprint images. The differences between the two methods then get quite technical: one is “based on evolutionary optimisation of latent variables”, while the other “uses gradient descent to find latent variables that maximise the number of activated outputs”.
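The evolutionary route can be sketched in a few lines: keep a population of latent vectors, score each by how well the generated image fools a matcher, and mutate the best ones. This is a minimal illustration only – the `generator` and `match_score` functions below are hypothetical stand-ins for a trained GAN and a fingerprint matcher, and the loop is a simple elitist scheme rather than the authors' exact optimiser.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8  # real GAN latent spaces are larger; kept small for illustration
W = rng.normal(size=(LATENT_DIM, 16))

def generator(z):
    # Stand-in for a trained GAN generator: maps a latent vector to a
    # synthetic "image" (here just a fixed nonlinear transform).
    return np.tanh(z @ W)

def match_score(image):
    # Stand-in for the objective: in the real attack this would be the
    # fraction of enrolled users the matcher accepts the image for.
    target = np.full(16, 0.5)
    return -np.mean((image - target) ** 2)  # higher (closer to 0) is better

def evolve(pop_size=32, elite=8, generations=50, sigma=0.3):
    """Elitist latent-variable evolution: keep the best latent vectors,
    refill the population with mutated copies, repeat."""
    pop = rng.normal(size=(pop_size, LATENT_DIM))
    for _ in range(generations):
        scores = np.array([match_score(generator(z)) for z in pop])
        elites = pop[np.argsort(scores)[-elite:]]
        parents = elites[rng.integers(0, elite, size=pop_size)]
        pop = parents + sigma * rng.normal(size=(pop_size, LATENT_DIM))
        pop[:elite] = elites  # carry elites over unchanged
    scores = np.array([match_score(generator(z)) for z in pop])
    return pop[np.argmax(scores)], scores.max()

best_z, best_score = evolve()
```

The key idea is that the search never touches pixels directly: it only moves through the GAN's latent space, so every candidate remains a plausible-looking fingerprint while the score climbs.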

The outputs were then tested on three proxy recognition systems (convolutional neural networks) and two external fingerprint recognition systems. The masterprints fooled all of these – to varying degrees – into accepting the fingerprints as belonging to many individuals. The implication is that biometric security systems could be breached with these masterprints, which could be printed onto – say – custom gloves. While this will no doubt be music to the ears of spies and international assassins, it also has the potential to undermine day-to-day security in a huge number of situations, from unlocking smartphones to passing through US border security.

The researchers say they want to continue testing “in the wild”, against smartphone fingerprint recognition and other types of real-world authentication systems. You can read their full account of the research online.

Images: Kevin Dooley. Source: Prosthetic Knowledge
