Artificial Intelligence for Policing Stirs Ethics Concerns

San Francisco Police officer Joe Juarez wears a body camera while patrolling outside of AT&T Park before a baseball game. Jeff Chiu/AP

A top body camera company has launched a panel to explore the issue. Civil rights and privacy groups aren't totally satisfied.

A leading vendor of police body cameras and other law enforcement technology announced Thursday it would convene a panel of experts to serve as an “artificial intelligence ethics board.”

The company, Axon, formerly known as Taser, says its ultimate goal in developing artificial intelligence technology is to remove the need for police officers to do manual paperwork.

But a coalition of 42 organizations involved in civil rights and privacy issues responded swiftly to the board’s formation, raising red flags about emerging facial recognition technology and other issues.

Axon founder and CEO Rick Smith said in a statement that the company believes the advancement of AI technology will "empower police officers to connect with their communities versus being stuck in front of a computer screen doing data entry."

“We also believe," he added, "AI research and technology for use in law enforcement must be done ethically and with the public in mind."

The company says that 37 major U.S. cities have adopted its body-worn camera technology.

Members of the eight-person board include academics, civil liberties and legal experts, and police professionals. The company is offering a $5,000 annual stipend, plus $5,000 per meeting, to members who choose to accept the money, an Axon spokesperson said by email.

The group met for the first time Thursday in Scottsdale, Arizona.

Barry Friedman, a professor and director of the Policing Project at New York University's School of Law, is one of the panel members.

"I think something like this is desperately needed," he said of the board during a phone interview. "In fact, I think it would be great if it existed on an industry-wide basis."

"Very often in policing the technology is driving what's happening more than any set of considered choices by the public," he added.

In a letter to the board, the coalition of civil rights groups and other organizations said they are interested in the panel’s work, but concerned about the direction of Axon's product development.

They went on to offer a set of recommendations.

One is that Axon should not enable real-time facial recognition analysis of live video captured by officer-worn body cameras. The letter says this technology threatens to chill constitutional freedoms, that research indicates it will never be perfectly accurate or reliable, and that its accuracy varies by gender and race.

Last year, Axon, then Taser, acquired two startup firms to bolster its artificial intelligence efforts.

It described one, Dextro, Inc., as providing "the first computer-vision and deep learning system to make the visual contents in video searchable in real time." The other acquisition was a team from Fossil Group, Inc., whose researchers focused on improving the accuracy, efficiency and speed of processing images and video.

The prospect of combining facial recognition software with body-worn cameras raises the possibility that police officers could learn in real time when they cross paths with a suspect or a missing person.

"The technology is not as far away as we might think," said Harlan Yu, executive director of Upturn, a nonprofit that concentrates on issues at the intersection of social justice and technology.

He added: "It's going to make mistakes. And putting it in an environment where you have a police officer, with a gun, who has to make a split-second decision based on a potentially error-prone technology is just a recipe for disaster."

Friedman said that he, too, has an "extraordinary amount of nervousness around real-time facial recognition on body cameras," and that he thinks even non-real-time use of the technology would require a great deal of thought and safeguards.

As of midday Thursday, he said, real-time facial recognition had not been discussed in the board's first meeting, but he expected it to come up before the end of the day.

The groups that sent the letter to the ethics board also urged that the panel incorporate input from people in heavily policed communities.

"In particular, survivors of mass incarceration, survivors of law enforcement harm and violence, and community members who live closely among both populations must be included," the letter says.

"One of our main concerns is that among the people who are on the board, no one directly represents the community, the individuals, who are most impacted by Axon’s technology," said Yu.

The letter also recommends that the board look for new ways, such as contract terms, to limit unethical uses of Axon technology. And it calls for the scope of the board's work to cover all of the company's products.

Laura Moy is deputy director of Georgetown Law's Center on Privacy and Technology, one of the groups that signed the letter. She said she thinks an ethical review of police technology is a positive idea.

"But we're reserving judgement on whether or not this particular implementation is a good one," she said. "At this point, it's very difficult to tell whether this board will actually be one that exacts close scrutiny and has power to limit the development of Axon's products."

Friedman said he would have been surprised if Axon had given the board any sort of power to block product development decisions that the company would have otherwise made. He also said he did not intend to endorse anything the company does.

"We just made that clear as a group," he noted.

"The company knows that it's at the cutting edge of a space that is important but also fraught and where there are risks. And it decided that before it moved into that space to get advice," Friedman said. 

"I know there are other vendors that are rushing ahead with facial recognition on body cameras and aren't having these conversations," he added. "I think that's a mistake."