AI-generated fingerprints fool scanners – Naked Security

We’ve had fake videos, fake faces, and now researchers have developed a method for AI systems to create their own fingerprints.

Not only that, but the machines have figured out how to create impressions that trick fingerprint readers more than one time in five. The research could present problems for fingerprint-based biometric systems that rely on unique patterns to grant users access.

The research team, working at New York University Tandon and Michigan State University, exploited the fact that fingerprint readers don’t scan an entire finger at once. Instead, they scan parts of fingerprints and compare them with what’s in the database. Previous research has shown that some of these partial impressions contain characteristics common to many other partial impressions, giving them the potential to act as a kind of skeleton key for fingerprint readers. These are known as MasterPrints.
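To see why partial matching helps an attacker, here’s a toy sketch of a reader that accepts a probe if it matches any of several stored partial templates per user. Nothing here resembles a real matcher – the similarity function, threshold and "templates" are all made up – but it shows how a common-looking partial print gets many bites at the cherry:

```python
# Toy illustration only: a reader that enrolls several partial templates per
# finger and accepts a probe if it matches ANY of them. The similarity
# function and threshold are invented stand-ins for a real matcher.
import random

random.seed(0)

def similarity(a, b):
    # Invented similarity: fraction of positions where two "feature vectors" agree.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def enroll_finger(n_partials=8, length=32):
    # Each enrolled finger is stored as several partial templates.
    return [[random.randint(0, 3) for _ in range(length)] for _ in range(n_partials)]

def accepts(probe, enrolled_partials, threshold=0.4):
    # A probe is accepted if it matches ANY stored partial template, so every
    # extra template multiplies a common-looking print's chances of getting in.
    return any(similarity(probe, t) >= threshold for t in enrolled_partials)

database = [enroll_finger() for _ in range(100)]   # 100 enrolled users

probe = [random.randint(0, 3) for _ in range(32)]  # one candidate partial print
fooled = sum(accepts(probe, user) for user in database)
print(f"toy probe accepted for {fooled} of {len(database)} users")
```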

The researchers set out to train a neural network to create its own MasterPrints that could be used to trick fingerprint readers into granting access. They succeeded, with a system they call Latent Variable Evolution (LVE), and published the results in a paper.

They used a common AI tool for creating realistic data called a Generative Adversarial Network (GAN). They trained one network, the discriminator, to recognize real images by feeding it large numbers of them, and did the same with artificially generated images so that it learned to tell the two apart. The feedback from that discriminator is then fed to a second network, the generator, which uses it to produce more realistic images, and the process repeats so that both networks keep improving.
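For readers curious what that loop looks like in practice, here is a bare-bones sketch in PyTorch. The network sizes, the random stand-in "images" and the hyperparameters are purely illustrative and aren’t taken from the paper:

```python
# Minimal GAN training loop sketch (PyTorch). The "real" data here is random
# noise standing in for fingerprint images; architectures and hyperparameters
# are illustrative, not those used in the research.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 16 * 16   # tiny 16x16 "images" for illustration

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real_images = torch.rand(512, image_dim) * 2 - 1   # stand-in training set

for step in range(200):
    batch = real_images[torch.randint(0, 512, (32,))]
    z = torch.randn(32, latent_dim)
    fake = generator(z)

    # Discriminator: label real images 1, generated images 0.
    opt_d.zero_grad()
    d_loss = loss_fn(discriminator(batch), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator call its output "real".
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()
```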

The researchers took these generated images and tested them against fingerprint matching algorithms to see which performed best. They then used an evolutionary algorithm to refine the generated fingerprints, further improving those results.

The upshot is an AI system that uses mathematical algorithms to evolve human-looking fingerprints capable of bypassing biometric scanners.
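The evolutionary part can be sketched in a few lines too. The paper uses a more sophisticated optimizer than the simple mutate-and-keep-the-best loop below, and the matcher_score() placeholder stands in for real fingerprint matchers, so treat this as an outline of the idea rather than the researchers’ actual method:

```python
# Simplified Latent Variable Evolution sketch: evolve the GAN's latent vector
# so the decoded image scores highly against a fingerprint matcher. This
# (1+lambda) strategy and the placeholder matcher_score() are illustrative only.
import torch

def matcher_score(image: torch.Tensor) -> float:
    # Placeholder: a real attack would call one or more fingerprint matchers
    # here and return how many enrolled templates the image matches.
    return -float(image.abs().mean())   # dummy objective for demonstration

def evolve_masterprint(generator, latent_dim=64, generations=100, offspring=16):
    parent = torch.randn(latent_dim)
    best = matcher_score(generator(parent.unsqueeze(0)))
    for _ in range(generations):
        # Mutate the parent latent vector and keep the best-scoring child.
        children = parent + 0.3 * torch.randn(offspring, latent_dim)
        with torch.no_grad():
            scores = [matcher_score(generator(c.unsqueeze(0))) for c in children]
        top = max(range(offspring), key=lambda i: scores[i])
        if scores[top] > best:
            parent, best = children[top], scores[top]
    return generator(parent.unsqueeze(0)), best

# Usage with the toy GAN above: image, score = evolve_masterprint(generator)
```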

The team used two sets of data to train their fingerprint generator: a set of traditional rolled ink fingerprints and a set of fingerprints captured by capacitive readers like those found in smartphones. Capacitive fingerprints produced better results.

Biometric systems such as fingerprint readers can be set to different levels of security by adjusting their false match rate: the percentage of non-matching fingerprints that they will incorrectly approve. The research team tested the fingerprint matching algorithms at a false match rate of 0.1%, at which a system is expected to mistakenly approve a wrong fingerprint once in a thousand attempts. At that setting, the matchers accepted the generated MasterPrints, which the researchers call DeepMasterPrints, 22.5% of the time.
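The arithmetic behind those two percentages is easy to sketch. The scores below are randomly generated stand-ins rather than real matcher output, but they show how a threshold is chosen to hit a given false match rate and how an attack’s acceptance rate is then measured against it:

```python
# Sketch of measuring acceptance rates against a matcher threshold.
# All scores here are randomly generated stand-ins, not real matcher output.
import random

random.seed(1)

def acceptance_rate(scores, threshold):
    """Fraction of attempts whose matcher score clears the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

# Ordinary impostor attempts: low scores, so very few clear the bar.
impostor_scores = [random.gauss(0.30, 0.15) for _ in range(100_000)]
# Evolved MasterPrint attempts: scores pushed up by the latent-space search.
masterprint_scores = [random.gauss(0.65, 0.15) for _ in range(1_000)]

# Pick the threshold that keeps the false match rate at roughly 0.1%...
threshold = sorted(impostor_scores)[int(0.999 * len(impostor_scores))]
print(f"false match rate:       {acceptance_rate(impostor_scores, threshold):.3%}")
# ...then see how often the attack prints still get through at that setting.
print(f"MasterPrint acceptance: {acceptance_rate(masterprint_scores, threshold):.1%}")
```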

The researchers said the LVE method appeared to produce partial fingerprint images containing enough common characteristics to trick fingerprint readers at rates far above the expected false match rate. They added that these artificial fingerprints could be used to mount a practical attack on fingerprint readers.

Experiments with three different fingerprint matchers and two different data sets show that the method is robust and does not depend on the artifacts of a particular fingerprint matching device or data set.

This is all a little worrying: if someone can spoof your fingerprints, they don’t have to steal them (and if they do, you can’t upgrade or change your fingerprints). If someone were to develop this into a working exploit, perhaps by printing the images with conductive ink, it could cause problems for many fingerprint recognition systems.

