Is YOUR safety at risk? AI creates fake fingerprints so realistic they can trick biometric scanners
- Researchers were able to imitate more than one in five fingerprints
- They did this using a type of neural network called a generative adversarial network (GAN)
- Researchers suggest fingerprint identification may become less secure
From unlocking smartphones to authorizing payments, fingerprints are widely used to identify people.
However, a team of researchers has now succeeded in accurately copying real fingerprints and creating fake ones called “DeepMasterPrints”.
The researchers – who created the fake prints using a neural network – were able to imitate more than one in five fingerprints.
These new technological developments suggest that fingerprint identification may become less and less secure.
“MasterPrints are real or synthetic fingerprints that can coincidentally match a large number of fingerprints,” wrote researchers, led by Philip Bontrager of New York University, in the paper presented at a security conference in Los Angeles.
“In this work, we generate full image-level MasterPrints known as DeepMasterPrints, which have far superior attack accuracy than previous methods.”
The method, called Latent Variable Evolution, works by training a generative adversarial network (GAN) on real fingerprint images.
GANs “teach” an algorithm about a particular subject – in this case fingerprints – by providing it with massive amounts of information.
GANs are made up of two neural networks that learn by examining raw data.
One examines the raw data (fingerprints) while the other generates fake images based on the dataset.
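The two-network setup described above can be sketched in code. This is a minimal toy illustration of a GAN's structure, not the researchers' actual model: the generator and discriminator here are single linear layers trained on one-dimensional stand-in data rather than fingerprint images, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def real_batch(n):
    # "Real" data standing in for fingerprint images: 1-D samples ~ N(4, 1).
    return rng.normal(4.0, 1.0, size=(n, 1))

# Generator: one linear layer mapping a random latent scalar z to a sample.
G_w, G_b = rng.normal(size=(1, 1)), np.zeros(1)
# Discriminator: logistic regression estimating P(sample is real).
D_w, D_b = rng.normal(size=(1, 1)), np.zeros(1)

def generate(z):
    return z @ G_w + G_b

def discriminate(x):
    logits = np.clip(x @ D_w + D_b, -30, 30)  # clip to avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-logits))

lr, n = 0.05, 64
for step in range(500):
    z = rng.normal(size=(n, 1))
    x_fake, x_real = generate(z), real_batch(n)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    p_real, p_fake = discriminate(x_real), discriminate(x_fake)
    D_w += lr * (x_real.T @ (1 - p_real) + x_fake.T @ (-p_fake)) / n
    D_b += lr * ((1 - p_real) + (-p_fake)).mean(axis=0)

    # Generator update: push D(fake) toward 1, i.e. fool the discriminator.
    p_fake = discriminate(x_fake)
    g = (1 - p_fake) * D_w.T  # gradient of log D(G(z)) w.r.t. the fake samples
    G_w += lr * (z.T @ g) / n
    G_b += lr * g.mean(axis=0)

fakes = generate(rng.normal(size=(1000, 1)))
```

The key point is the adversarial loop: the discriminator learns to tell real data from generated data, while the generator is updated to make its output harder to reject.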
Fingerprint systems typically don’t read the entire fingerprint, but only record the part of the finger that touches the scanner, the Guardian reports.
This means that they are easier to forge than full prints.
The GAN created several fake fingerprints that matched the real ones enough to fool the scanner as well as the human eye.
The researchers found that it was able to mimic more than one in five fingerprints, while a biometric system should only have a false match rate of one in a thousand.
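The “Latent Variable Evolution” idea behind those match rates can be sketched as follows: keep the trained generator fixed and run an evolutionary search over its latent inputs, scoring each candidate by how many enrolled prints it matches. Everything here is a hypothetical stand-in: the “generator” is a fixed random map, the “enrolled fingerprints” are random vectors, and the matcher is a simple cosine-similarity threshold, not a real fingerprint matcher.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 8  # latent dimensionality (illustrative)

# Stand-in for a trained GAN generator: a fixed random linear map + tanh.
G = rng.normal(size=(DIM, DIM))
def generate(z):
    return np.tanh(z @ G)

# Toy "enrolled fingerprints": random unit vectors.
enrolled = rng.normal(size=(200, DIM))
enrolled /= np.linalg.norm(enrolled, axis=1, keepdims=True)

def match_count(print_vec, threshold=0.5):
    # Toy matcher: a print "matches" an enrolled one if cosine similarity > threshold.
    v = print_vec / np.linalg.norm(print_vec)
    return int((enrolled @ v > threshold).sum())

# Simple (mu + lambda)-style evolution over latent vectors.
pop = rng.normal(size=(30, DIM))
for _ in range(100):
    scores = np.array([match_count(generate(z)) for z in pop])
    parents = pop[np.argsort(scores)[-10:]]  # keep the 10 best latents
    # Children: mutated copies of randomly chosen parents.
    children = parents[rng.integers(0, 10, size=20)] + 0.3 * rng.normal(size=(20, DIM))
    pop = np.vstack([parents, children])

best = max(pop, key=lambda z: match_count(generate(z)))
```

The search never touches the generator's weights; it only looks for latent inputs whose output coincidentally matches as many enrolled prints as possible, which is what makes a single synthetic print a “MasterPrint”.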
“The underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis,” the researchers wrote.
They hope their research will help develop more secure authentication systems in the future.
“Experiments with three different fingerprint matchers and two different data sets show that the method is robust and does not depend on the artifacts of a particular fingerprint matching device or data set.”
“This idea is surprisingly under-explored and could be useful in research into computational creativity as well as in other areas of security,” the researchers found.
WHAT ARE BEHAVIORAL BIOMETRICS?
Physical biometrics, such as fingerprints, facial recognition, and retinal scans, are currently more commonly used for security purposes.
However, behavioral biometrics – which include things like the way you walk – can capture unique aspects of a person’s behavior and movements.
They also include things like voice recognition and signature analysis.
Researchers at the University of Manchester have developed an AI biometric verification system that measures an individual’s gait, or walking pattern.
This non-intrusive technique can successfully verify people with 99.3 per cent accuracy as they step on a pressure pad on the floor – and they don’t even have to take their shoes off.
Behavioral biometrics are already being used for authentication in financial institutions and businesses.
Once people provide their biometric data, the AI selects specific data points and processes them using an algorithm.