In a paper presented at a biometric security conference (BTAS 2018), the researchers explain that DeepMasterPrints builds on two observations. First, for ergonomic reasons, fingerprint sensors are often very small (as in smartphones), so they capture and compare only part of the image of a user's fingerprint. Because identifying a person from a small portion of a fingerprint is harder than matching a complete print, the chance that one partial print coincides with a portion of a print from a different finger is relatively high. Building on this, researcher Aditi Roy introduced the concept of MasterPrints: real or synthetic fingerprints that happen to match a large number of other fingerprints.
The second observation is that some fingerprints share common features. A fake print that contains many of these common features therefore has a realistic chance of matching many other fingerprints.
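To see why partial matching inflates the false-match rate, consider a sensor that stores several partial templates per enrolled finger and accepts if any one of them matches. The back-of-the-envelope sketch below is illustrative only: the per-comparison false-match rate and template count are assumptions, not figures from the paper.

```python
# Illustrative only: the numbers below are assumptions, not values from the paper.
def effective_fmr(per_comparison_fmr: float, num_partial_templates: int) -> float:
    """Probability that an impostor partial print matches at least one of the
    stored partial templates, assuming independent comparisons."""
    return 1.0 - (1.0 - per_comparison_fmr) ** num_partial_templates

# A sensor that stores many small partial templates multiplies the
# attacker's chances compared with a single full-print comparison.
print(effective_fmr(0.0001, 1))   # one full-print comparison
print(effective_fmr(0.0001, 30))  # thirty partial templates
```

With these assumed numbers, thirty partial-template comparisons push the effective false-match rate roughly thirtyfold higher than a single comparison, which is the attack surface MasterPrints exploit.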
From here, the researchers used a type of artificial intelligence model called a generative adversarial network (GAN), combined with an evolutionary search over its inputs, to artificially create new fingerprints that match as many partial fingerprints as possible. In this way, they were able to develop a set of artificial fingerprints that act as master keys for a particular biometric identification system. Moreover, the attack does not require a fingerprint belonging to a specific individual: it can be run against anonymous subjects and still achieve a good success rate.
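At a high level, this search amounts to evolving inputs to a trained generator so that its output matches as many stored partial prints as possible. The toy sketch below substitutes trivial stand-ins (an identity "generator", a dummy distance-based matcher, and a simple mutation-based evolutionary loop instead of the optimizer used in the actual work); all names and parameters are illustrative assumptions, not details from the paper.

```python
import random

LATENT_DIM = 8     # size of the generator's input vector (assumed)
POP_SIZE = 20
GENERATIONS = 50

def generate(latent):
    # Stand-in for the trained GAN generator: maps a latent vector to a
    # synthetic "fingerprint" (here, trivially, the vector itself).
    return latent

def match_score(fp, database):
    # Stand-in for the fingerprint matcher: count how many stored partial
    # prints this synthetic print would "match" under a distance threshold.
    return sum(1 for partial in database
               if sum((a - b) ** 2 for a, b in zip(fp, partial)) < 4.0)

def evolve(database):
    # Simple mutation-based evolutionary search over the latent space:
    # keep the best quarter of the population, then mutate them.
    population = [[random.gauss(0, 1) for _ in range(LATENT_DIM)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        ranked = sorted(population,
                        key=lambda z: match_score(generate(z), database),
                        reverse=True)
        parents = ranked[:POP_SIZE // 4]
        population = [[g + random.gauss(0, 0.2) for g in random.choice(parents)]
                      for _ in range(POP_SIZE)]
    return max(population, key=lambda z: match_score(generate(z), database))

random.seed(0)
db = [[random.gauss(0, 1) for _ in range(LATENT_DIM)] for _ in range(100)]
best = evolve(db)
print("partial prints matched:", match_score(generate(best), db))
```

The key design point the sketch illustrates is that the fitness function rewards matching *many* enrolled prints at once, rather than impersonating any one person, which is what distinguishes a master-print attack from ordinary spoofing.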
Although it would be very difficult for an attacker to use something like DeepMasterPrints in practice, since the artificial intelligence must be optimized for each particular system and every system is different, it is an example of what could happen over time and something worth taking into account. Something similar was seen this year at the Black Hat security conference, when IBM researchers demonstrated a proof of concept showing that it was possible to develop malware that uses artificial intelligence, including facial recognition, to target its attacks.