Experts Tell Congress Facial Recognition’s Bias Problem May Be Here to Stay


Despite significant improvements in the tech’s overall performance, one expert told lawmakers, “it’s unlikely” researchers will ever make systems equally accurate across racial and other demographic lines.

The accuracy of facial recognition tools has improved by leaps and bounds in recent years, but according to experts, it may be impossible to fully remedy the technology’s demographic bias.

Facial recognition tools tend to work less effectively for women, people of color and the elderly, but those demographic differences are shrinking as the technology improves, according to Charles Romine, director of the Information Technology Lab at the National Institute of Standards and Technology. 

In 2017, NIST expanded a program to help government vendors test and improve their facial recognition systems, and algorithms’ overall performance has made “significant progress” since then, Romine said. Though accuracy still varies widely from system to system, he said, some of today’s best algorithms correctly identify subjects some 99.7% of the time.

But while a rising tide lifts all boats, Romine said, technologists may never be able to build a system that identifies every type of person with the same level of accuracy.

“It’s unlikely that we will ever achieve a point where every single demographic is identical in performance across the board, whether that’s age, race or sex,” he told the House Homeland Security Committee on Tuesday. “We want to know just exactly how much the difference is.”

Today, NIST is finalizing a report on demographic differences in facial recognition based on data collected during its vendor testing program. Romine told the committee the results will be released sometime this fall.

Romine joined a trio of Homeland Security Department officials to discuss the scope and scale of the agency’s efforts to use facial recognition and other biometric technology. While some lawmakers were interested in the technical limitations of the tools, much of the hearing revolved around the integrity and legality of the department’s various pilot programs.

John Wagner, deputy executive assistant commissioner of field operations at Customs and Border Protection, told lawmakers the agency hasn’t seen any “significant error rates” related to demographics in its biometric entry and exit program. When pressed by lawmakers about the initiative, which uses facial recognition to monitor people entering and exiting the country through airports and border checkpoints, he highlighted the convenience the technology offers travelers and stressed that the program fits squarely within CBP’s existing authorities.

He also told lawmakers CBP is strengthening its oversight of industry pilot programs in response to the recent data breach at one of its surveillance system contractors.

During the hearing, Transportation Security Administration Assistant Administrator Austin Gould also discussed the agency’s plans to expand biometric identification of domestic travelers, and Secret Service Chief Technology Officer Joseph DiPietro described a small-scale pilot program using facial recognition around the White House grounds.

“I am not opposed to biometric technology and recognize it can be valuable to homeland security and facilitation,” Chairman Bennie Thompson, D-Miss., said during the hearing. “However, its proliferation across [Homeland Security] raises serious questions about privacy, data security, transparency and accuracy.”

Today, there are virtually no laws on the books governing the use of facial recognition systems by federal law enforcement agencies like Homeland Security and the FBI, and lawmakers from both sides of the aisle have started calling for more congressional oversight of the programs.

The hearing comes days after the Washington Post reported Immigration and Customs Enforcement and the FBI are tapping into state driver’s license databases to collect data on millions of Americans without their consent.