Taser International, the company most associated with electrical weapons used by police, is adopting the freemium business model of Silicon Valley. It has renamed itself Axon, the name of its body camera cloud storage and analytics platform, and plans to give away its cameras and services for a year in hopes of persuading more police departments to eventually pay for its enterprise system for cataloging and analyzing camera footage.
Down the line, Axon envisions an automated system of police reports derived from cataloging and analyzing footage from the body-worn cameras.
CEO Rick Smith says these connected systems would allow police to spend more time doing their jobs rather than paperwork, with the added perk of building accountability into every interaction the police have with the public. The future of police work, Smith says, is a technologist’s dream, with cameras automating menial tasks like note-taking and report-typing so police can interact with the public more effectively.
“Eighty percent of American cops go out on the job with a gun, but no camera,” Smith says. “We want to accelerate the adoption of this technology. We think it’s wildly accepted by all sides of the political spectrum, that cops with cameras are good for society.”
Axon will give away free body cameras and access to its digital services to any accredited police force that asks, for one year. After that, police departments will have to decide whether the services are useful enough to begin paying for.
AI to Power Police Reports
The technology that could enable the future Smith imagines isn’t the cameras themselves, but artificial intelligence tasked with processing the video and audio—transcribing police interviews and automatically generating timelines, descriptions and testimonies surrounding crime. Algorithms considered intelligent today are mainly specialized to accomplish one task with great speed—an algorithm can identify objects in photos far faster than humans, or transcribe hours of speech into text in minutes.
But developing these tools, especially for a niche as specific as police body camera footage, isn’t easy. AI from big tech companies like Google and IBM isn’t necessarily applicable to specialized tasks. In February, while the company was still Taser, it acquired two AI teams focused on understanding video footage: the startup Dextro and the computer vision division of Fossil Group.
Smith may believe his company’s technology will revolutionize police work, but civil rights activists warn such tools could also be used for surveillance or other unlawful activities by police; Taser’s “nonlethal” weapons have been misused in the past.
“Body cameras and AI use with video and facial recognition are all tools, and they’re very powerful tools,” says Clare Garvie, associate at Georgetown Law and co-author of the Perpetual Lineup project. “So it really comes down to whether law enforcement can use them in a positive way while we have sufficiently put controls in place against them being misused.”
To assuage ethical concerns, Axon is forming an AI and privacy ethics advisory board, led by former Dextro CEO David Luan, with full oversight of the Axon product line. The board is not yet in place, but Smith says it will be before the company develops any AI with “substantial possibility for abuse.” Alongside the board, Axon will work with police departments to instruct them in proper use of the software.
“All of our development work on AI is on the noncontroversial stuff. It’s redaction of videos, identifying where faces are so they can be redacted, but not doing facial identification of individuals. Things like transcribing audio for the purposes of populating reports,” Smith says.
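The distinction Smith draws, finding where faces are so they can be hidden, without computing who they belong to, can be illustrated with a toy sketch. This is a hypothetical example, not Axon's implementation: it assumes face bounding boxes have already been produced by some detector (here they are simply passed in by hand) and blurs those regions of a frame, the step that obscures identity without ever identifying anyone.

```python
import numpy as np

def box_blur(region: np.ndarray, k: int = 7) -> np.ndarray:
    """Crude box blur: replace each pixel with the mean of its
    (2k+1) x (2k+1) neighborhood. Slow but dependency-free."""
    h, w = region.shape[:2]
    out = np.empty_like(region)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - k), min(h, y + k + 1)
            x0, x1 = max(0, x - k), min(w, x + k + 1)
            out[y, x] = region[y0:y1, x0:x1].mean(axis=(0, 1))
    return out

def redact_faces(frame: np.ndarray,
                 boxes: list[tuple[int, int, int, int]]) -> np.ndarray:
    """Blur each (x, y, w, h) box on a copy of the frame.
    The boxes would come from a face *detector*; no identity
    or recognition step is involved anywhere."""
    out = frame.copy()
    for x, y, w, h in boxes:
        out[y:y + h, x:x + w] = box_blur(out[y:y + h, x:x + w])
    return out
```

In a real pipeline the bounding boxes would be emitted per-frame by a detection model; the point of the sketch is that redaction only needs *where*, never *who*.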
He equivocated on whether Axon will add facial recognition functionality to its software. “It isn’t as simple as saying we’re not going to deploy facial identification technology,” Smith says. “The fact is that if we don’t, doesn’t mean no one else will. And similarly, although that technology could be abusive, you could imagine ways where it could be very very helpful. It comes down to how it’s being used and how it’s being used in a way that’s transparent, that it’s auditable.”
No matter the technology or oversight built into Axon, Carol Rose, executive director of the American Civil Liberties Union Massachusetts, says the onus to use this technology correctly still falls on the police departments themselves.
“An ethics board for a private company is still going to be an ethics board that’s serving a company that has profit motives and a bottom line,” Rose says. “What’s most important is that a police department that adopts body cameras have clear data minimization and privacy policies in place. That’s the role of the state, to protect people.”
A product on Axon’s roadmap might also help keep police accountable. Without AI tools, Smith says, abuse caught on body cameras might go unnoticed amid years of otherwise routine footage. Today, review is a manual slog: video is randomly sampled and checked for patterns of abuse.
The company is now training its video recognition AI on footage that has been reviewed for misconduct multiple times, to try to automatically detect scenes with potential abuse. With that technology, the company could also audit past video.
Footage in which camera movement indicates the officer is running, audio in which voices are raised, or moments when racial epithets are heard could all be indicators of an abusive situation.
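The kind of screening described above can be illustrated with a toy heuristic. This is a hypothetical sketch, not Axon's system: it flags video segments where frame-to-frame pixel change (a rough proxy for camera motion, such as a running officer) or audio loudness crosses a threshold, with thresholds chosen here arbitrarily for illustration.

```python
import numpy as np

def flag_segments(frames: np.ndarray, audio_rms: np.ndarray,
                  motion_thresh: float = 20.0,
                  audio_thresh: float = 0.5) -> list[int]:
    """Return indices of segments worth sending to a human reviewer.

    frames: (n, h, w) grayscale frames, one representative per segment.
    audio_rms: (n,) loudness per segment, normalized to [0, 1].
    Thresholds are illustrative placeholders, not calibrated values.
    """
    # Mean absolute difference between consecutive frames is a crude
    # proxy for camera motion; a running officer produces large values.
    motion = np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=(1, 2))
    motion = np.concatenate([[0.0], motion])  # first segment has no predecessor
    flagged = (motion > motion_thresh) | (audio_rms > audio_thresh)
    return [int(i) for i in np.flatnonzero(flagged)]
```

A production system would learn such signals from labeled misconduct reviews rather than hand-set thresholds, but the shape of the task, ranking hours of footage so humans review the riskiest minutes first, is the same.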
“That would be a way to automate looking for patterns of abusive behavior,” Smith says.