Fingerprint and face scanners aren’t as secure as we think


Despite what every spy movie from the past 30 years would have you believe, the fingerprint and face scanners used to unlock your smartphone or other devices are not as secure as advertised.

While it’s not great if your password is made public during a data breach, you can at least change it easily. If the scan of your fingerprint or face – known as “biometric template data” – is revealed in the same way, you could be in real trouble. After all, you can’t get a new fingerprint or a new face.

Your biometric template data is permanently and uniquely linked to you. Exposing this data to hackers could seriously compromise user privacy and the security of a biometric system.

Current techniques provide effective protection against breaches, but advances in artificial intelligence (AI) risk making these protections obsolete.

How biometric data could be hacked

If hackers want to gain access to a system protected by a fingerprint or facial scanner, they can do so in several ways:

  1. A hacker could replace your fingerprint or facial scan (template data) stored in the database to gain unauthorized access to a system.
  2. A physical copy or spoof of your fingerprint or face could be created from the stolen template data (with Play-Doh, for example) to gain unauthorized access to a system.
  3. Stolen template data could be replayed directly to gain unauthorized access to a system.
  4. Stolen template data could be used by a hacker to illegally track an individual from one system to another.

Biometric data needs urgent protection

Today, biometric systems are increasingly used in our civil, commercial and national defense applications.

Consumer devices equipped with biometric systems are found in everyday electronics like smartphones. MasterCard and Visa both offer credit cards with an integrated fingerprint reader. And wearable fitness devices are increasingly using biometrics to unlock smart cars and homes.

So how can we protect raw template data? A range of encryption techniques has been proposed. These fall into two categories: cancellable biometrics and biometric cryptosystems.

In cancellable biometrics, complex mathematical functions are used to transform the original template data when your fingerprint or face is scanned. This transformation is irreversible, meaning there is no risk that the transformed template data can be converted back into your original fingerprint or facial scan.

If the database holding the transformed template data is breached, the stored records can simply be deleted. And when you re-scan your fingerprint or face, a different transformation can be applied to produce a brand-new, unique template, even though you are using the same finger or face.
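One common way to build such a transform is a seeded random projection: a many-to-one mapping that cannot be exactly inverted, and that produces a completely different template when the seed is changed. This is a minimal illustrative sketch, not the scheme used by any particular vendor; the feature vector, dimensions, and seed values are all assumptions.

```python
import numpy as np

def enroll_template(features: np.ndarray, seed: int) -> np.ndarray:
    """Toy cancellable transform: a seeded random projection.

    The seed acts as a user- or application-specific parameter.
    Projecting 16 dimensions down to 8 is many-to-one, so the
    original features cannot be exactly recovered from the output.
    """
    rng = np.random.default_rng(seed)
    projection = rng.standard_normal((8, features.size))
    return projection @ features

# Hypothetical 16-dimensional feature vector extracted from a scan.
features = np.arange(16, dtype=float)

t1 = enroll_template(features, seed=42)   # template issued to system A
t2 = enroll_template(features, seed=99)   # re-issued after a breach

# Same finger, different seeds -> unlinkable templates.
assert not np.allclose(t1, t2)
```

Revoking a compromised template then amounts to discarding the old seed and re-enrolling with a new one; the same finger yields an entirely different stored record.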

In biometric cryptosystems, the original template data is combined with a cryptographic key to generate a “black box”. The cryptographic key is the “secret”, and the query data is the “key” that unlocks the black box so the secret can be retrieved. The cryptographic key is released only upon successful authentication.

AI makes security more difficult

In recent years, new biometric systems that incorporate AI have really come to the forefront of consumer electronics. Think: smart cameras with built-in AI capability to recognize and track specific faces.

But AI is a double-edged sword. While new developments, such as deep artificial neural networks, have improved the performance of biometric systems, the same techniques also expose those systems to new threats.

For example, researchers at New York University created a tool called DeepMasterPrints. It uses deep learning techniques to generate fake fingerprints which can unlock a large number of mobile devices. It’s similar to how a master key can unlock every door.

The researchers also demonstrated that deep artificial neural networks can be trained to reconstruct original biometric inputs (such as an image of a person’s face) from the stored template data.

New data protection techniques are needed

Thwarting these types of threats is one of the most pressing issues facing designers of AI-based secure biometric recognition systems.

Existing encryption techniques designed for non-AI-based biometric systems are incompatible with AI-based biometric systems. New protection techniques are therefore necessary.

University researchers and biometric scanner manufacturers should work together to secure users’ sensitive biometric template data, thereby minimizing the risk to user privacy and identity.

In academic research, special attention should be paid to the two most important aspects: recognition accuracy and security. As this research falls within Australia’s science and research priority for cybersecurity, the public and private sectors are expected to provide more resources for the development of this emerging technology.

This article first appeared on The Conversation.
