Saturday, June 3, 2023

The research shows that synthetic fingerprints can fool fingerprint readers.


Master fingerprints are real or synthetic fingerprints that happen to match the fingerprints of many different people at once. In this work, a team of researchers at New York University (NYU) and Michigan State University (MSU) generated master fingerprint images using a method known as latent variable evolution. These fingerprints, called "DeepMasterPrints", can spoof fingerprint recognition systems and can be exploited in attacks similar to dictionary attacks against passwords.

In a paper presented at the IEEE International Conference on Biometrics Theory, Applications and Systems (BTAS 2018), the researchers explain that creating a DeepMasterPrint rests on two observations. First, for ergonomic reasons, the fingerprint sensor on a device such as a smartphone is very small and captures only partial images of your fingerprint. Identifying a person from such a small portion is much harder than matching a complete fingerprint, so a partial print from one finger has a non-trivial chance of falsely matching a partial print from a different finger. Researcher Aditi Roy built on this fact and introduced the concept of the MasterPrint: a real or synthetic fingerprint that coincidentally matches many other fingerprints.
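The intuition behind this first observation can be illustrated with a toy calculation. The numbers below are invented for illustration, not taken from the paper: if a sensor compares a probe against several enrolled partial templates and treats each comparison independently, the chance that a wrong finger matches at least one template grows with the number of templates.

```python
def attack_success_prob(fmr: float, n_templates: int) -> float:
    """Probability that a wrong fingerprint matches at least one of
    n_templates enrolled partial templates, assuming each comparison
    independently has false-match rate fmr (toy model)."""
    return 1.0 - (1.0 - fmr) ** n_templates

# One full-print comparison vs. many partial-print comparisons
# (a phone typically enrolls several partial views per finger).
single = attack_success_prob(0.001, 1)
partial = attack_success_prob(0.001, 12)
print(f"one template: {single:.4f}, twelve templates: {partial:.4f}")
```

Under these assumed numbers, splitting enrollment into a dozen partial templates raises the false-match exposure by roughly an order of magnitude, which is the weakness a MasterPrint dictionary attack exploits.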

The second observation is that some fingerprint features are far more common than others. A fake fingerprint that contains many of these common features is therefore more likely to match other fingerprints by chance.

Here, the researchers used a class of artificial intelligence algorithms called generative adversarial networks (GANs) to synthesize new fingerprint images, then searched the generator's latent space for images that match as many partial fingerprints as possible. In this way they built a library of synthetic fingerprints that serves as a kind of master key for a given biometric system. The attack does not require a fingerprint sample from any particular person; it can be run against anonymous targets and still has some probability of success.

It would still be very difficult for an attacker to use something like DeepMasterPrints in practice, because a great deal of work is needed to tune the artificial intelligence to a particular system, and every system is different. But it is an example of what may become possible over time. We saw something similar at this year's Black Hat security conference, where IBM researchers demonstrated a proof of concept for malicious code that uses artificial intelligence to trigger an attack based on face recognition.
