Working with biometrics is always a balancing act. With passwords, authentication is simple — either it matches or it doesn't. But when that "password" is part of a user's body, whether a face scan, iris match, or just a regular old fingerprint, systems have to anticipate and account for a little bit of wiggle room. After all, you don't want that face scan failing because you got a pimple, or your fingerprint rejected because you touch the sensor at a slightly different angle each time. But now a new attack takes advantage of the flexibility programmed into these systems, generating fake "universal" fingerprints.

These synthetic fingerprints, which the researchers behind them call DeepMasterPrints, were created by feeding a neural network images of real fingerprints until it could generate its own. These prints were then analyzed using the same sort of verification algorithms employed by the scanners on our phones, and modified over and over in subtle ways until they passed — even though they didn't actually match.

By repeating this with a large data set, the team was able to come up with fingerprint images that have enough elements in common with the average person's prints that scanners can readily be tricked into giving a false positive. This isn't a matter of matching just one person, either — the DeepMasterPrints are designed to work equally well against any user.
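The core loop the researchers describe — generate a candidate print, score it against a matcher, keep the changes that fool more enrolled users — can be sketched in miniature. Everything below is a toy: the "prints" are short feature vectors, the matcher is a made-up distance threshold, and the hill-climbing search stands in for the paper's evolutionary search over a GAN's latent space.

```python
import random

random.seed(0)

# Toy stand-in for an enrolled user database: each "print" is a vector
# of 8 features. Real matchers compare minutiae; this is illustrative only.
ENROLLED = [[random.gauss(0.5, 0.15) for _ in range(8)] for _ in range(50)]

def match_score(candidate, enrolled_print):
    # Higher is better: negative mean squared distance to one enrolled print.
    return -sum((c - e) ** 2 for c, e in zip(candidate, enrolled_print)) / len(candidate)

def fraction_matched(candidate, threshold=-0.02):
    # Fraction of enrolled users this single candidate would "unlock".
    # The threshold is an arbitrary toy value, not a real scanner setting.
    return sum(match_score(candidate, e) > threshold for e in ENROLLED) / len(ENROLLED)

# Hill-climbing search: mutate the candidate and keep any change that
# fools at least as many enrolled users as before. The real attack
# evolves a GAN latent vector instead of raw features.
candidate = [random.random() for _ in range(8)]
start = fraction_matched(candidate)
best = start
for _ in range(2000):
    mutant = [x + random.gauss(0, 0.05) for x in candidate]
    score = fraction_matched(mutant)
    if score >= best:
        candidate, best = mutant, score

print(f"fraction of enrolled prints fooled: {best:.2f} (started at {start:.2f})")
```

The point of the sketch is the objective being optimized: not "match user X" but "match as many users as possible," which is what makes the resulting print "universal."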

Real scans of fingerprints (left) vs. DeepMasterPrints fakes (right)

How well, exactly? It depends on how exacting the scanner you're trying to fool is. Every fingerprint scanner has to accept some rate of false positives — situations where an unauthorized print is mistakenly interpreted as authorized. A very tolerant scanner might operate at a 1.0% false-positive rate with real fingerprints, but DeepMasterPrints are able to fool that kind of scanner a shocking 77% of the time.

Stricter scanners with only 0.1% false positives are still tricked by DeepMasterPrints over 22% of the time, and even ones rejecting all but 0.01% of unauthorized prints will fall for DeepMasterPrints at a rate north of 1% — two whole orders of magnitude above their intended threshold.
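The "orders of magnitude" claim is just arithmetic on the figures above. This snippet pairs each scanner's intended false-positive rate with the (approximate) DeepMasterPrints success rate quoted in the article and computes the gap:

```python
import math

# (intended false-positive rate, approximate DeepMasterPrints success rate),
# using the figures quoted above.
rates = [(0.01, 0.77), (0.001, 0.22), (0.0001, 0.01)]

for fmr, spoof in rates:
    # How many powers of ten above the intended rate the attack lands.
    gap = math.log10(spoof / fmr)
    print(f"intended rate {fmr:.2%}: spoof success {spoof:.0%}, "
          f"~{gap:.1f} orders of magnitude worse")
```

For the strictest scanner, log10(0.01 / 0.0001) = 2 — exactly the "two whole orders of magnitude" gap.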

None of this is quite enough to have us full-on reject the idea of biometrics and go back to just using PINs and passwords, but it's certainly an eye-opening look into just how much security we're giving up for the sake of convenience. Here's hoping that future devices can be built with attacks like this in mind, and offer more robust false-positive rejection.