When Algorithms Take the Stand


A case soon to be decided by the Wisconsin Supreme Court considers the proper role of mathematical prediction in the courtroom—and beyond.

In February of 2013, Eric Loomis was found driving a car that had been used in a shooting. He was arrested; he pleaded guilty to eluding an officer and no contest to operating a vehicle without its owner’s consent. The judge in Loomis’s case gave him a six-year prison sentence for those offenses, a length determined not just by Loomis’s criminal record but also, in part, by his score on the COMPAS scale (Correctional Offender Management Profiling for Alternative Sanctions), an algorithmically generated assessment that aims, and claims, to predict an individual’s risk of recidivism.

Law enforcement in Wisconsin, where Loomis lived at the time of his arrest, relies on COMPAS and algorithms like it to augment human intuition and analysis with what officials claim is a more objective approach to justice. Loomis’s score suggested that he had a high risk of committing another crime; thus, his six-year sentence.

Loomis appealed the ruling, on the grounds that the judge’s use of the predictive algorithm in his sentencing decision violated due process. (COMPAS is a proprietary algorithm, and the inputs that inform its ultimate risk assessments are, to the public, largely opaque.) The case made its way to the Wisconsin Supreme Court, and a ruling is expected to come later this week.

For Julia Angwin, a technology reporter at ProPublica who has spent the past year reporting on COMPAS and other so-called “risk assessment algorithms,” the Loomis case is about more than the U.S. criminal justice system, and even more than the constitutional rights that inform that system.

The case and the ideas at stake in it also serve as yet another reminder that algorithms are increasingly shaping all areas of civic and commercial life. Credit scores. News. Policing.

As Angwin argued today at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic: In ways big and small, algorithms make judgments that, under the guise of cold, hard “data,” directly affect people’s lives—for better, often, but sometimes for worse.

The irony, Angwin argues, is that we—as a criminal justice system, as a political body, and as a culture—are taking an all-too-human approach to our algorithmic infrastructure: We trust it too much. We have not yet thought as rigorously or as strategically as we need to about its effects. We have not fully considered whether, and indeed how, to regulate the algorithms that are helping to regulate our lives.

“This is a wild and woolly area,” Angwin said. “It really feels like the Wild West at the moment.”

It makes some sense that we are inclined to trust algorithms as objective and, as such, unobjectionable. The appeal of a system like COMPAS is that it proposes to inject objectivity into a criminal justice system that has been compromised, too many times, by human failings.

What that appeal tends to elide, though, is the fact that algorithms, their science-y names notwithstanding, are as fallible as the people and the institutions that create them. Angwin researched the results of COMPAS in particular and found, after comparing the algorithm’s predictions with real-world outcomes, that it was accurate only about 60 percent of the time (“just slightly more than a coin toss”).
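To make that comparison concrete: validating a risk score of this kind comes down to checking, defendant by defendant, whether a “high risk” label was followed by a new offense within a follow-up window, and counting how often label and outcome agree. The Python sketch below illustrates the arithmetic; the scores and outcomes in it are invented for demonstration, not drawn from ProPublica’s data.

```python
# Illustrative check of risk-score accuracy against observed outcomes.
# All data here is invented for demonstration; ProPublica's actual analysis
# compared COMPAS scores with two-year recidivism records for thousands
# of real defendants.

# 1 = labeled high risk / did reoffend; 0 = labeled low risk / did not
predicted = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
reoffended = [1, 0, 0, 1, 1, 0, 0, 0, 1, 1]

# Accuracy is simply the share of cases where label and outcome agree.
correct = sum(p == r for p, r in zip(predicted, reoffended))
accuracy = correct / len(predicted)

print(f"Accuracy: {accuracy:.0%}")  # 60% on this toy data
```

By this logic, a coin flip would agree with the outcomes about half the time, which is why 60 percent accuracy clears that bar only barely.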

Worse still, the algorithm, as ProPublica’s article bluntly summed it up, “is biased against blacks.”
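“Biased,” in ProPublica’s analysis, had a specific operational meaning: among defendants who did not go on to reoffend, black defendants were roughly twice as likely as white defendants to have been labeled high risk. The algorithm’s false positive rate, in other words, differed sharply by race. The sketch below, again with invented numbers chosen only to show the shape of the computation, illustrates how such an error-rate gap is measured.

```python
# Illustrative sketch: bias measured as a gap in false positive rates.
# Each record is (group, labeled_high_risk, reoffended); the values are
# invented to show the shape of the computation, not ProPublica's data.

records = [
    ("black", 1, 0), ("black", 1, 1), ("black", 1, 0), ("black", 0, 0),
    ("white", 0, 0), ("white", 1, 1), ("white", 0, 0), ("white", 1, 0),
]

def false_positive_rate(group):
    """Share of non-reoffenders in a group who were labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and r[2] == 0]
    flagged = [r for r in non_reoffenders if r[1] == 1]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(group):.0%}")
```

The unsettling property this metric captures is that two groups can experience the same overall accuracy while one of them absorbs far more of the algorithm’s mistakes.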

This wasn’t the first time that algorithms had exhibited that kind of racial bias. And, for Angwin, what all of it amounts to is the need for a more skeptical approach to algorithms in general: what she calls “algorithmic accountability.”

We’re still living in a time of general tech-optimism, Angwin argued, a time in which new technologies (smartphones! Facebook! robots that help you unload the dishwasher!) promise to make our lives both more efficient and more enjoyable. Those technologies may help to make our justice system more equitable; they may not. The point is, we owe it to ourselves, and to Eric Loomis, and to every other person whose life might be altered by an algorithm, to find out.