Researchers have generated artificial fingerprints that function as "master keys" for fingerprint biometric systems.
I mean, this shouldn't happen, right? Biometrics are supposed to be something you are, and unique to you.
OK, we all know that, in implementation, you can have some level of false positives. But in a system whose false-match rate is supposed to be only one in a thousand, they were able to fake a match one time in five. (That's 200 in a thousand.)
As usual in these cases, the researchers were taking advantage of implementation errors, and those probably can be tightened up, but lots of fingerprint scanners won't be ...
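To put that gap in perspective, here is a rough back-of-the-envelope sketch in Python. The two rates come from the figures above; the population of 10,000 enrolled users is purely an illustrative assumption, not part of the research.

# Back-of-the-envelope comparison: how many enrolled users a single fake
# fingerprint would be expected to match at each rate. The rates are the
# figures quoted above; the population size is an illustrative assumption.

FMR_SPEC = 1 / 1000    # false-match rate the system is supposed to have
FMR_MASTER = 1 / 5     # fraction of users the generated "master print" matched

population = 10_000    # hypothetical number of enrolled users

print(f"Random impostor print:  ~{population * FMR_SPEC:.0f} users matched")
print(f"Generated master print: ~{population * FMR_MASTER:.0f} users matched")

That's roughly 10 users versus 2,000 in this hypothetical deployment, which is why a 200x jump in the match rate matters so much in practice.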
The moral issue arises when this leaves the lab and is used to commit crime - that's when people sit up and take notice. Anything is possible these days, and once again: when these technologies are created, do the researchers think through the implications and who might end up using them?
I very much doubt it.
Then it becomes another problem we have to go solve.
Regards
Caute_cautim