Did Body Cameras Backfire?



Body cameras were supposed to fix a broken system. What happened?

In 2014, when Police Officer Darren Wilson fatally shot an unarmed black teenager named Michael Brown on a street in Ferguson, Missouri, police brutality rocketed to the center of the national discourse on race. Law enforcement needed more accountability, activists argued, and body cameras became the state’s preferred corrective. The Obama administration’s Department of Justice offered more than $23 million in grants for new cameras in 2015, the year after Brown’s death, and another $20 million in 2016. Then-candidate Hillary Clinton called for mandatory body cameras nationwide. In 2018, a New York judge mandated that all NYPD officers wear them, as part of efforts to end “stop and frisk” tactics. The future of policing, it seemed, had arrived.

Less than five years later, though, that momentum has slowed. Cameras have allegedly fallen off during encounters; footage has been deleted or mislabeled. A 2018 report found that most body-camera footage from fatal shootings never sees the light of day. As of June, the NYPD had a backlog of nearly 800 footage requests.

And now some police-reform advocates argue that recent technological advances mean these cameras are increasingly used not to scrutinize police, but to surveil the public. Recorded footage uploads to the cloud, allowing police to hold more images and videos, and to hold them longer. Object recognition lets officers quickly search through hours of footage to find items of interest (a red backpack, for example). With live-streaming, officers can send everything they see back to department headquarters nearly instantaneously.

And facial recognition—once clunky, slow, and error-prone—can now theoretically turn any police officer’s chest-mounted camera into a sophisticated surveillance apparatus, identifying people as they walk by. California banned the technology in body cameras earlier this year, after civil-liberties groups rallied to stop law enforcement from using facial recognition in cities such as San Francisco and Oakland. The ACLU applauded the law for stopping body cameras from “being transformed into roving surveillance devices that track our faces, voices, and even the unique way we walk.”

Now police-accountability activists are at odds with privacy activists, who worry about an inescapable dragnet that leaves anyone at risk of being scanned, identified, and matched against criminal or immigration databases, all without a warrant or even suspicion of committing a crime.

“For us, body-worn cameras are an unalloyed, positive good. From the point of view of advocates, it’s a much more complicated situation,” said Jonathan Darche, executive director of the Civilian Complaint Review Board, which investigates misconduct allegations against NYPD officers. “[The cameras] enable us to find out what’s going on with any particular incident far more often than when we don’t have the footage. By having [it], we can make the determination that a … discourtesy was used or an improper threat was used, and that’s been a real game changer for our agency.”

The CCRB’s foremost goal is police accountability. When a person makes an allegation of officer misconduct, the board investigates it by reviewing medical records, acquiring video evidence, and speaking to the complainant, the responding officer, and potential witnesses. Public CCTV cameras can be low-quality and rarely have audio, which is crucial if a person alleges verbal harassment or threats. Bystanders usually record cell-phone video only once they’ve noticed something’s wrong, potentially cutting out useful context. Body cameras, on the other hand, automatically save footage of the 30 seconds prior to the officer pressing “record,” an advance that Darche said has made all the difference in investigations. “We’re able to see what happened before people realized that the incident required recording,” he said.

But some privacy and immigrants-rights advocates argue that sometimes, body cameras bring awareness without bringing justice.

“You take the case around Eric Garner,” said Kesi Foster, an organizer with Make the Road New York, which focuses on immigrants’ rights and police accountability. “Everyone saw what happened in that video,” he said, referring to a viral video of NYPD Officer Daniel Pantaleo putting Garner in a fatal choke hold. “And it still took over five years of organizing and community pressure for one officer … to be held accountable.”

What’s more, some worry that body cameras are being used to surveil citizens. Earlier this month, the Electronic Frontier Foundation (EFF) released a request for information (RFI) from Customs and Border Protection revealing its interest in body cameras equipped with facial recognition. The cameras could match a person’s face against a database of preexisting images (mug shots or driver’s-license photos, for example) in real time, giving a CBP officer the near-instantaneous ability to confirm a person’s identity just by scanning his face as he walks by. (Utah, Washington State, and Vermont have already reportedly granted ICE access to state DMV records and photos.) The widespread adoption of facial-recognition technology means that simply existing in public puts one at risk of being scanned, which could have an enormous chilling effect on undocumented immigrants living in border communities routinely patrolled by the CBP.

Biometric databases such as the CBP’s also attract hackers who can sell the valuable information on the dark web. In March, the CBP announced that hackers had stolen thousands of travelers’ ID photos and license-plate images from a subcontractor. If the CBP’s database scales to include images generated from its proposed body-camera project, more people will be vulnerable to a security breach.

The RFI the EFF uncovered is the first indication of a federal agency seeking to put facial-recognition technology in its body cameras. There is no timetable for CBP adoption of facial-recognition-equipped body cameras, and no indication of whether any company has responded to the RFI.