Synthetic Fingerprints Make Biometric/Fingerprint Recognition Systems Vulnerable

From smartphone lock screens to identity verification, fingerprint scans are widely considered a reliable security method. However, researchers have demonstrated ways to fool fingerprint recognition and biometric systems with synthetic fingerprints. This means that systems and devices relying on fingerprints could be vulnerable to spoofing.

DeepMasterPrints – Novel Development Of Synthetic Fingerprints

A team of researchers from the New York University Tandon School of Engineering has shared a study on synthetic fingerprints. As demonstrated, it is now possible to synthesize human fingerprints using neural networks. The researchers even succeeded in developing fake fingerprints that could potentially spoof a wide range of fingerprint recognition systems. They have published their findings as a research paper.

As explained, they took inspiration for DeepMasterPrints from the MasterPrints developed earlier by Roy et al. Regarding MasterPrints, they explain,

“MasterPrints are a set of real or synthetic fingerprints that can fortuitously match with a large number of other fingerprints. Therefore, they can be used by an adversary to launch a dictionary attack against a specific subject that can compromise the security of a fingerprint-based recognition system.”

With this inspiration, the team developed images that look like natural fingerprints. For this, they employed a method called Latent Variable Evolution (LVE). The researchers claim this to be the first work to create synthetic MasterPrints at the image level. Summarizing their work, they state,

“The proposed method, referred to as Latent Variable Evolution, is based on training a Generative Adversarial Network on a set of real fingerprint images. Stochastic search in the form of the Covariance Matrix Adaptation Evolution Strategy is then used to search for latent input variables to the generator network that can maximize the number of impostor matches as assessed by a fingerprint recognizer.”
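To make the quoted pipeline concrete, here is a minimal sketch of the Latent Variable Evolution idea: a GAN generator's latent input is evolved with CMA-ES so that the generated image matches as many enrolled templates as possible. The `generator`, `match_score`, gallery, latent size, and threshold below are illustrative stand-ins, not the researchers' actual code or parameters; only the `cma` library calls reflect a real API.

```python
import numpy as np
import cma  # third-party CMA-ES library: pip install cma

LATENT_DIM = 100   # assumed GAN latent size (illustrative)
THRESHOLD = 0.5    # assumed matcher decision threshold (illustrative)

def generator(z):
    """Stand-in for a trained GAN generator mapping a latent vector to an image."""
    return np.tanh(np.asarray(z)).reshape(10, 10)  # dummy "image" for illustration

def match_score(image, template):
    """Stand-in for a fingerprint matcher's similarity score."""
    return float(np.mean(image * template))

# Dummy enrolled gallery standing in for the impostor templates to be matched.
gallery = [np.random.rand(10, 10) for _ in range(50)]

def fitness(z):
    """Negative number of impostor matches; CMA-ES minimizes this value."""
    image = generator(z)
    matches = sum(match_score(image, t) >= THRESHOLD for t in gallery)
    return -float(matches)

# Evolve the latent input to maximize impostor matches (capped for the sketch).
es = cma.CMAEvolutionStrategy(LATENT_DIM * [0.0], 0.5, {'maxiter': 50})
while not es.stop():
    candidates = es.ask()                                # sample latent vectors
    es.tell(candidates, [fitness(c) for c in candidates])

master_print = generator(es.result.xbest)                # best "DeepMasterPrint"
```

In the actual study, the generator is trained on real fingerprint images and the fitness is computed by a commercial-grade fingerprint recognizer rather than the toy functions used here.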

Security Risks To Biometric Systems

Exploiting synthetic fingerprints could enable dictionary attacks. Such attacks do not require the attacker to know an individual’s real fingerprint. Small fingerprint sensors are particularly vulnerable because they work only with partial fingerprints: these devices accept an input fingerprint if it matches any one of the several partial fingerprints of an individual stored in the database, as illustrated in the sketch below.
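The following sketch illustrates why that accept-if-any-template-matches logic widens the attack surface. The matcher, threshold, and false-match-rate figures are hypothetical placeholders, and the calculation assumes independent comparisons, which is a simplification.

```python
def accepts(probe, enrolled_partials, match_score, threshold):
    """A probe is accepted if it matches ANY of the user's partial templates."""
    return any(match_score(probe, template) >= threshold
               for template in enrolled_partials)

def fortuitous_accept_rate(false_match_rate, num_partials):
    """Chance a random (synthetic) print matches at least one stored partial,
    assuming independent comparisons (a simplifying assumption)."""
    return 1.0 - (1.0 - false_match_rate) ** num_partials

# Example: a 0.1% per-comparison false match rate against 12 stored partials.
print(fortuitous_accept_rate(0.001, 1))    # ~0.001
print(fortuitous_accept_rate(0.001, 12))   # ~0.012, roughly an order of magnitude higher
```

The more partial templates a device stores per user, the more chances a single synthetic print has to fortuitously match one of them.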

While the previous study on MasterPrints could only create partial fingerprints, the technique demonstrated here produces images of complete fingerprints. Such images could spoof systems that rely on human fingerprints, since a single DeepMasterPrint can stand in for multiple fingerprints. Thus, the present study emerges as a security risk to fingerprint authentication procedures. As stated in the paper,

“…it is theoretically possible to design DeepMasterPrints for any fingerprint system that accepts images. Further, the attack can potentially be launched at the sensor level by transferring the images to a spoof artifact.”

Consequently, present biometric and fingerprint authentication procedures need major improvements to mitigate the risk of such attacks.

Let us know your thoughts in the comments.
