U.K. Facial Recognition Software Used By Police Highly Inaccurate, Report Finds

The software returned incorrect matches 98% of the time.

Facial recognition has become an increasingly popular tool for law enforcement agencies, but what happens when the software behind it is faulty?

A new report from The Independent released Sunday revealed that the facial recognition software used by the U.K.'s Metropolitan Police returns incorrect matches 98 percent of the time. The technology works by scanning faces in a video feed and comparing them to images stored in reference libraries. Of the 104 alerts the system generated, only two were accurate matches; the other 102, roughly 98 percent, were false positives.
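As a rough illustration of the pipeline described above, the sketch below, in Python with synthetic data, shows one common approach: compare a face embedding extracted from a video frame against a gallery of enrolled embeddings and raise an alert whenever the similarity clears a threshold. The gallery, the threshold value, and the embeddings here are all hypothetical; The Independent's report does not describe the Met's actual algorithm.

```python
import numpy as np

# Hypothetical reference library: one embedding vector per enrolled face.
# Real systems derive these vectors from a trained face-recognition model;
# random unit vectors stand in for them here.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))          # 1,000 enrolled faces
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def match(probe: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    """Return indices of gallery faces whose cosine similarity to the
    probe exceeds the threshold. Every index returned is an 'alert'."""
    probe = probe / np.linalg.norm(probe)
    similarities = gallery @ probe              # cosine similarity per face
    return np.flatnonzero(similarities > threshold)

# A probe face scanned from a video frame (also synthetic here).
probe = rng.normal(size=128)
alerts = match(probe)
print(f"{len(alerts)} alert(s) raised")

# The Met's reported numbers: 104 alerts, of which 2 were correct.
alerts_total, true_matches = 104, 2
print(f"False-positive rate: {(alerts_total - true_matches) / alerts_total:.0%}")
```

In a setup like this, the threshold drives the trade-off: lowering it catches more genuine matches but, as the Met's numbers illustrate, can bury them under false positives.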

The same system is used by South Wales Police, which has seen similarly poor results: since June 2017, it has produced more than 2,400 false positives.

Both forces are deploying the software on a trial basis.

"I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use," the U.K.’s biometrics commissioner, Professor Paul Wiles told The Independent.

The U.K. police forces aren't the only ones experimenting with facial recognition. U.S. Customs and Border Protection is testing facial recognition at airports around the country, while the U.S. Army is developing technology that allows facial recognition to work in the dark.

Private companies are also embracing biometric technologies. Multiple airlines are testing facial recognition on passengers, and the technology has been a feature of social media platforms for several years.

Of course, there is no single version of facial recognition software in use around the world, but as more organizations adopt biometric technology, accuracy rates like these are cause for concern.