The software returned incorrect matches 98% of the time.
Facial recognition has become an increasingly popular tool for law enforcement agencies, but what happens when the software behind it is faulty?
A new report from The Independent released Sunday revealed that the U.K.'s Metropolitan Police facial recognition software returns incorrect matches 98 percent of the time. The technology works by scanning faces in a video feed and comparing them to images stored in reference libraries. Out of 104 alerts the system generated, only two were correct matches.
The same system is used by the South Wales Police, with similarly poor results: since June 2017, it has produced more than 2,400 false positives.
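The headline figure follows directly from the numbers the report cites. As a quick sketch (the function name and structure here are illustrative, not from the report), the 98 percent rate is simply the share of alerts that were not correct matches:

```python
# Illustrative sketch: deriving the reported error rate from the cited figures.
def false_match_rate(total_alerts: int, correct_matches: int) -> float:
    """Fraction of alerts that were incorrect matches."""
    return (total_alerts - correct_matches) / total_alerts

# Metropolitan Police figures cited in the report: 104 alerts, 2 correct matches.
rate = false_match_rate(104, 2)
print(f"{rate:.0%}")  # → 98%
```

Note that this is a precision-style measure over alerts the system raised; it says nothing about faces the system scanned but never flagged.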
The police forces are testing the software on a trial basis.
"I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use," the U.K.'s biometrics commissioner, Professor Paul Wiles, told The Independent.
The U.K. police forces aren't the only ones experimenting with facial recognition. U.S. Customs and Border Protection is testing facial recognition at airports around the country, while the U.S. Army is developing technology that allows facial recognition to work in the dark.
Of course, there isn't just one facial recognition system in use around the world, but as more and more organizations adopt biometric technology, accuracy rates like these are cause for concern.