Watchdog: Privacy Laws Aren't Keeping Up with Facial Recognition Technology


In a new report, GAO warns consumers that existing federal privacy laws do not expressly limit how businesses can use images of their faces.

A future is fast approaching in which billboards call out pedestrians by name, displaying ads they're likely to be interested in -- and in the process eroding individuals' ability to remain anonymous in public, a federal watchdog says.

In a new report, the Government Accountability Office warns consumers, tech companies and privacy groups that existing federal privacy laws do not expressly limit how businesses can use facial recognition technology. 

Some government agencies and privacy advocacy groups believe facial recognition technology, if unregulated, could "give businesses or individuals the ability to identify almost anyone in public without their knowledge or consent and to track people’s locations, movements and companions," the report said.

In its report -- which examined only the commercial, nongovernmental uses of the technology in the United States -- GAO noted that some existing federal laws could be stretched to regulate where businesses can capture facial images, what they can share with retailers and practices that are deceptive to consumers.

But it concluded, “federal law does not expressly regulate whether and when firms should notify consumers of their use of facial recognition technology, or seek an individual’s consent prior to identifying them, which has been an area of concern to privacy advocacy organizations and others.”

Facial recognition technology is already widespread, GAO found. For instance, on popular social networks such as Facebook, users can quickly "tag" people in photos; security cameras can identify shoplifters in retail stores; and building security systems can restrict entry, among other applications.

More commonly, the technology is used "for detecting characteristics (such as age or gender) to tailor digital advertising, rather than identifying unique individuals," GAO found. For instance, televisions or kiosks showing ads in retail stores might have cameras that can recognize a viewer's age or gender, and "target advertisements accordingly."

Facial recognition systems typically use a camera to capture a person's image and then analyze facial characteristics to create a "face print," the report said. By comparing that print against a database of other images, current technology can identify the face in an image, estimate age, race or gender, or attempt to verify that a person's face matches their identification photo.
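To make the report's description of a "face print" concrete, here is a minimal illustrative sketch, not drawn from the GAO report, of how such a comparison typically works: the print is represented as a numeric feature vector, and identification amounts to finding the enrolled vector most similar to the captured one, above some threshold. The names, vectors and threshold below are hypothetical placeholders; a real system would derive the vectors from a trained face-embedding model rather than random numbers.

```python
# Illustrative sketch of face-print matching (hypothetical data; not the GAO report's method).
# A "face print" is modeled here as a numeric feature vector. In a real system these vectors
# would come from a trained face-embedding model, not random numbers.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face prints; values near 1.0 indicate a close match."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Return the best-matching enrolled identity, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, enrolled_print in database.items():
        score = cosine_similarity(probe, enrolled_print)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical enrolled face prints (placeholders for real embeddings).
    database = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
    # Simulate a new capture of "alice": close to her enrolled print, with small noise.
    probe = database["alice"] + rng.normal(scale=0.05, size=128)
    print(identify(probe, database))  # expected output: alice
```

In a sketch like this, the threshold choice trades false matches against missed ones, which is one source of the incorrect matches discussed below.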

In the future, digital signs "may be used to identify customers by name and target advertising to them based on past purchases or other personal information available about them," GAO warned in the report, citing interviews with Federal Trade Commission staff. "Facial recognition systems can also be designed to alert staff when known customers enter the store, according to a software vendor with whom we spoke."

The recent surge in information re-sellers, who might share with other companies what they gather about individuals from facial recognition systems, "raises questions about whether face prints and associated personal data may one day be sold or shared," especially for marketing purposes, GAO found.

And the technology isn't infallible -- it could generate an incorrect match in a security system, sending authorities after the wrong person.

Despite the lack of explicit federal protection against the misuse of facial recognition technology, some groups, such as the National Telecommunications and Information Administration, the International Biometrics and Identification Association and the American Civil Liberties Union, are examining the technology's implications for privacy. The ACLU has issued guidelines for companies using the technology, urging that they "allow individuals to access, correct and delete their face print information," among other recommendations intended to improve consumer privacy.

Some existing laws, such as the Video Voyeurism Prevention Act of 2004, could be applied to facial recognition. That law deems it illegal to intentionally capture images of individuals without their consent in areas where they have a reasonable expectation of privacy.

“While the definition of a private area does not include a person’s face, the act could affect the specific placement of cameras in certain parts of commercial spaces, such as retail stores’ dressing rooms,” GAO found.

The states of Texas and Illinois have adopted their own privacy laws addressing the commercial use of biometric data. These laws require private entities to obtain individuals' consent before collecting biometric identifiers and generally prohibit them from sharing such identifiers with third parties, except for certain purposes.

(Image via ra2studio/Shutterstock.com)