The Independent UK
Graig Graziosi

Supreme Court chief justice warns of dangers of AI after blunder involving ex-Trump fixer

Supreme Court Chief Justice John Roberts discussed AI and its possible impact on the judicial system in a year-end report published over the weekend.

Mr Roberts acknowledged that the emerging tech was likely to play an increased role in the work of attorneys and judges, but said he did not expect human judges to be replaced anytime soon.

“Legal determinations often involve gray areas that still require application of human judgment. I predict that human judges will be around for a while,” Mr Roberts wrote, according to Reuters. “But with equal confidence I predict that judicial work – particularly at the trial level – will be significantly affected by AI.”

He said there may come a time when legal research will be "unimaginable" without the assistance of AI, for example, and noted that the tech could provide access to legal resources to Americans who may otherwise have been unable to afford them.

“These tools have the welcome potential to smooth out any mismatch between available resources and urgent needs in our court system,” he wrote.

The chief justice noted that the benefits of AI wouldn't come without some risks and drawbacks.

“AI obviously has great potential to dramatically increase access to key information for lawyers and non-lawyers alike. But just as obviously it risks invading privacy interests and dehumanizing the law,” he said.

In addition to those risks, popular LLM chatbots like ChatGPT and Google's Bard can produce false information – referred to as "hallucinations" rather than "mistakes" – which means users are rolling the dice anytime they trust the bots without checking their work first.

Michael Cohen, Donald Trump's former lawyer and fixer, admitted that he had used an AI tool to look up court case records, which he then passed along to his legal team as a list of citations. Unfortunately for Mr Cohen, the cases he cited didn't exist — they were an AI's dream.

Mr Roberts suggested it was "always a bad idea" to cite non-existent court cases in your legal filings.

Due to the potential pitfalls of AI reliance, Mr Roberts urged legal workers to exercise "caution and humility" when relying on the chatbots for their work.

The 5th US Circuit Court of Appeals in New Orleans seems to agree with Mr Roberts' view that AI-assisted lawyering may produce some less-than-satisfactory work. The court has proposed a rule that would require lawyers to certify either that they did not rely on AI software to draft briefs, or that a human fact-checked and edited any text generated by a chatbot.
