Roberts: Legal field will be ‘significantly affected by AI’

Chief Justice John Roberts speaks at the funeral of retired Associate Justice Sandra Day O'Connor on Dec. 19, 2023, at Washington National Cathedral in Washington, D.C. Chip Somodevilla/Getty Images

Chief Justice John Roberts focused his end-of-year report on the intersection of technology and the legal system.

The U.S. Supreme Court is beginning the new year with scrutiny of artificial intelligence software, as Chief Justice John Roberts weighs both the possibilities and the dangers of the technology’s use in legal proceedings. 

“Machines cannot fully replace key actors in court,” Roberts wrote in his end-of-year report, which reflects on the growing role of emerging technologies like AI and machine learning in legal practice. 

Roberts raised concerns about AI’s use across the legal field, from law students relying on it to write essays and conduct research, to judges leveraging it to predict discretionary decisions such as flight risk and recidivism. In particular, he noted the problem of “hallucinations,” in which AI systems generate citations to nonexistent cases in legal briefs. 

But Roberts also acknowledged the inevitability of AI, writing that many AI applications “indisputably” assist aspects of the legal field and the litigation process, particularly at the trial level. Those changes will alter how judges perform their duties and how they understand the role of AI systems in the courtroom, he wrote.

In addition to court operations, Roberts also acknowledged the “welcome potential to smooth out any mismatch between available resources and urgent needs in our court system” by providing laypersons with better tools to access legal processes.

A key part of that will hinge on incorporating experts like technologists and programmers into the judicial system to help guide these changes and prevent misuse. 

“Gone are the days when the quill pen alone was sufficient to maintain a docket; courts could not do our work without technologists and cybersecurity experts in the Department of Technology Services at the Administrative Office of the U.S. Courts, at the circuit-wide level, and in individual courts,” Roberts concluded. 

Even as Roberts penned his year-end review on AI, court filings across the U.S. were echoing his warnings. For example, Michael Cohen, a former attorney for former President Donald Trump, acknowledged in a December 29 court filing that he and his own attorney had mistakenly included AI-generated case citations in an earlier motion. 

Those citations had been “hallucinated” by the technology and referred to cases that do not exist. 

Organizations and individuals have also brought legal claims against AI developers, alleging that training such algorithms on their content violates their copyrights. And lawmakers have introduced legislation that would require AI developers to disclose more about their training data, in part to assist such copyright claims.