In 9 U.S. Cities, Airport Security Is Now Scanning Your Face


“DHS should not be scanning the faces of Americans as they depart on international flights—but DHS is doing it anyway,” warns a new report that finds facial recognition practices may be violating the law.

On Wednesday morning, I queued up at the assigned gate at Dulles Airport to board a flight to India. I was distractedly checking work emails in the line, so it took me a while to realize that the routine boarding pass scan at the gate was actually something different.

“What is this?” I asked, when I got to the front and saw what looked like a newfangled camera.

The Transportation Security Administration agent did not reply to my question. Instead, he just told me to stand and look ahead, shepherding me through just as he had everyone before me.

I did what I was told—I am not an American citizen, and I didn’t know what the risks of declining the scan would be. But later on I repeated my question to the airline official who checked my boarding pass at the gate. “Oh, that’s a facial recognition scan,” he replied.

The flight attendants inside the Emirates-JetBlue plane did not know, or say, much more. As far as I could tell, no one on my plane—which certainly had some U.S. citizens—was given an explanation or an option to opt out.

Washington, D.C., is one of nine cities—the others being Atlanta, Chicago, Las Vegas, Miami, New York City, and Houston—where facial recognition software is being used for international travelers on certain flights with certain airlines—JetBlue is among them. The software basically scans a person’s face and matches it to a repository of images to verify that the person boarding is really who they say they are.
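In verification terms, this is a one-to-one check: the image captured at the gate is compared against a photo already on file, and the traveler is cleared if the match score clears some threshold. The sketch below is only a generic illustration of that idea, not CBP's actual system; the embedding model, the 128-dimensional vectors, and the 0.6 threshold are all hypothetical stand-ins.

```python
# Simplified illustration of 1:1 face verification (not CBP's actual pipeline).
# Assumes face embeddings have already been produced by some face-recognition
# model; here they are stand-in random vectors.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live_embedding: np.ndarray,
           gallery_embedding: np.ndarray,
           threshold: float = 0.6) -> bool:
    """Return True if the gate capture matches the photo on file closely
    enough to clear the (hypothetical) acceptance threshold."""
    return cosine_similarity(live_embedding, gallery_embedding) >= threshold

# Stand-in data: in a real system, one vector would come from the gate camera
# and the other from an image repository the government already holds.
rng = np.random.default_rng(0)
gallery = rng.normal(size=128)                      # photo on file
live = gallery + rng.normal(scale=0.1, size=128)    # new capture, same person
print(verify(live, gallery))                        # True: treated as a match
```

The threshold is where the error rates discussed below come in: set it too strict and legitimate travelers get rejected; set it too loose and impostors slip through.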

I’m not the only one concerned about what the use of this technology might mean: A scathing new report, released Thursday by the Georgetown Law Center on Privacy and Technology, finds that these scans—which are dubious in their efficacy and accuracy—are routinely being conducted without heed to federal law or privacy rights.

“Without explicit authorization, [Department of Homeland Security] should not be scanning the faces of Americans as they depart on international flights—but DHS is doing it anyway,” write authors Harrison Rudolph, Laura M. Moy, and Alvaro Bedoya. “As currently envisioned, the program represents a serious escalation of biometric scanning of Americans, and there are no codified rules that constrain it.”

The push to use facial recognition technology at U.S. borders and ports of entry has been around for a while—but the Trump administration has made the “Biometric Entry-Exit Tracking System” a priority. The stated purpose is to verify the identities of non-citizens who are leaving the country, to help contain visa fraud and overstays. But it’s never been clearly established whether these scans will actually do the trick; even DHS itself isn’t convinced that the biometric exit system is effective. In other words, it’s not clear why this strategy is needed to solve the problem of visa overstays.

If DHS is going to implement facial recognition, there is a procedure it has to follow. According to the Georgetown report, the agencies running this program have not followed the required federal rulemaking process through which they notify the public and ask for their input. As a result, the program is on “shaky legal grounds,” the report finds.

So what does DHS allow and what can you do? According to the report, face scans appear mandatory for foreign nationals under the current rule, but they may be optional for U.S. citizens. In fact, DHS and CBP have laid out some alternatives for travelers who don’t want their faces scanned. A traveler who asks to sit out the scan can have their documents checked manually by either airline staff or a CBP officer, depending on the airline, the agency involved, and the airport.

But it’s unclear whether agents are informing passengers about this option—the one in my case did not—and whether they are even required to under the current policy. Passengers may therefore have to know that their face is being scanned and then affirmatively seek to opt out. This is among the primary concerns raised by the report, and one that would have been addressed had the policy gone through the proper rulemaking process.

On the heels of this report, Sens. Mike Lee, a Republican from Utah, and Ed Markey, a Democrat from Massachusetts, have sent the DHS secretary a letter requesting that the agency hold off on the program until these legal and privacy concerns are addressed.

Perhaps the biggest question critics have about this and other uses of facial recognition technology is: How accurate is the algorithm at correctly identifying faces and matching them with useful information? So far, the answer is not satisfactory. Via the report:

According to DHS’ own data, DHS’ face recognition systems erroneously reject as many as 1 in 25 travelers using valid credentials. At this high rate, DHS’ error-prone face scanning system could cause 1,632 passengers to be wrongfully delayed or denied boarding every day at New York’s John F. Kennedy (JFK) International Airport alone. What’s more, DHS does not appear to have any sense of how effective its system will be at actually catching impostors—the system’s primary goal.
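The arithmetic behind that JFK figure is simple to reproduce. Here is a back-of-envelope sketch, assuming roughly 40,800 international passengers departing JFK daily; that volume is implied by the report's own numbers (about 1,632 × 25) rather than stated in the excerpt above.

```python
# Back-of-envelope arithmetic behind the report's JFK estimate.
# Assumption: ~40,800 international passengers depart JFK per day, the volume
# implied by the report's own figures (1,632 * 25), not a number quoted above.
false_reject_rate = 1 / 25            # "as many as 1 in 25 travelers" rejected in error
daily_departing_passengers = 40_800   # assumed daily international departures at JFK

wrongly_flagged = daily_departing_passengers * false_reject_rate
print(f"{wrongly_flagged:.0f} travelers wrongly flagged per day")  # ~1,632
```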

The folks more likely to bear the brunt of these potential inaccuracies are people of color, particularly women, if previous research is to be believed. Facial recognition algorithms tend to perform worse on some genders and skin tones than on others. “It’s as if DHS has hired a billion-dollar bouncer to check IDs but never checked how good he is at spotting a fake,” said Moy, who co-authored the report, in a statement. “They also don’t know if he’s biased against certain groups of people. These questions are mission-critical, and they’re unanswered.”

A CBP spokesperson responded to the report via email, saying that the agency “published frequently asked questions and produced signage used at the technical demonstration locations that explains alternative procedures are available.” She added that CBP analyzes its metrics, and that “demonstrations have a matching rate in the high 90 percentile.”

What privacy and civil rights advocates most fear is that surveillance technology like this, if not checked and clearly defined, will ultimately lead to ever-larger privacy intrusions. What could be next? Scanning the faces of protesters and journalists? Running these images through criminal databases without consent or probable cause? Targeting consumers for profit?

Right now, there are virtually no limits on how this information could be shared between government agencies and private companies, the report’s authors say. And the potential implications for the freedom of speech, movement, and assembly are massive.