The Guardian - AU
Josh Taylor

Fake cases, judges’ headaches and new limits: Australian courts grapple with lawyers using AI

An illustration of a smartphone with a gavel
Australian lawyers have begun using AI but judges are worried about what this could mean for the veracity of some court documents. Photograph: Maksim Kabakou/Alamy

The lawyer had a back injury and was short on time. Submissions were due in an immigration case in which he was representing a client appealing a decision by the minister. He had heard about the use of artificial intelligence in legal work, so he logged into ChatGPT, entered some words relevant to the case and asked it to prepare a summary of cases. The summary read well, so he incorporated its details and references into the submissions without checking them.

Then, about two weeks later, the immigration minister’s outline of submissions highlighted 17 cases cited in the applicant’s documents that did not exist.

The lawyer was deeply embarrassed and apologetic when this was brought before the federal court last year. But the minister took the strong view, reflected in the subsequent federal circuit and family court ruling, that the use of generative AI was of concern to the public, and that it was important to set an example and “nip in the bud” the practice of lawyers using generative AI without checking their work.

The court referred the matter to the New South Wales legal services commissioner for review.

While AI is growing in popularity in some industries, the full extent of its use in the legal profession is unknown.

The media company Thomson Reuters, which offers its own AI software to lawyers, surveyed 869 private practice professionals in Australia last year. The survey suggested 40% worked at firms that were experimenting with the use of AI but proceeding with caution.

The results suggested 9% of lawyers were actively using AI in daily operations. Nearly one in three said they would like a generative AI legal assistant.

‘Correct and ethical use’

In a case last year, a Melbourne lawyer was referred to the Victorian legal complaints body after admitting using AI software in a family court case that generated false case citations and caused a hearing to be adjourned.

The solicitor, representing a husband in a dispute between a married couple, provided the court with a list of prior cases that had been requested by the judge.

Neither the judge nor her associates were able to identify the cases in the list. When the matter returned to court, the lawyer confirmed the list had been prepared using the legal software Leap, which has a generative AI component.

Leap gives legal practitioners a way to check the work output by its AI, but the lawyer did not do so in this instance, and the technology “hallucinated” the citations.

Christian Beck, Leap’s chief executive, says lawyers around the world are beginning to adopt AI technology.

“At Leap, we encourage the correct and ethical use of our integrated AI products, and have implemented a range of mitigation, education and professional development measures as part of our service offering,” Beck says.

“LawY provides users with a free lawyer verification process that is underpinned by experienced, local lawyers. We also take confidentiality incredibly seriously by deploying technology that is not used to train large language models, avoiding the risk of exposure of confidential material.”

It’s not just lawyers being criticised for their use of AI. Judges are also concerned about people who provide affidavits to courts or represent themselves.

In one case last year an offender tendered a personal reference from his brother; the Australian Capital Territory’s supreme court said its wording “strongly suggest[ed]” it had been written by a large language model such as ChatGPT.

A statement in the brother’s affidavit said he had known his brother “both personally and professionally for an extended period”, which the court suggested cast doubt on whether he had prepared the document himself.

And in a decision by the Victorian court of appeal regarding the dismissal of a student by a university, it was noted that some documents filed by the applicant “contained some case citations to cases that do not exist”.

“I have not reproduced those citations, lest they contribute to the problem of LLM [large language model] AI inventing case citations,” Justice Kristen Walker said.

Last month an expert witness called by the Minnesota attorney general to speak about misinformation was rebuked after he was found to have used fake article citations generated by AI to support the state’s arguments.

‘Still early days’

So far, there haven’t been more than a handful of complaints to the various state bodies that oversee lawyers in Australia.

“It is still early days and we do anticipate we may receive more complaints as the use of generative AI becomes more commonplace,” a spokesperson for the NSW Legal Services Commissioner said.

The Queensland legal services commissioner, Megan Mahon, says technology is changing the way the public engages with lawyers.

“A self-help approach is one thing, but ensuring AI does not further enable unlawful and unqualified delivery of legal services is of great concern,” Mahon says.

“It is therefore critically important that anyone seeking legal advice ensures they are engaging with a qualified and licensed legal practitioner.”

On Monday a practice note issued by the NSW supreme court put limits on the use of generative AI by lawyers, including stipulating that it must not be used to generate affidavits, witness statements, character references or other material tendered in evidence or used in cross-examination.

Jeannie Paterson, a professor of law and the director of the Centre for AI and Digital Ethics at the University of Melbourne, says such errors are more likely to come from lawyers with fewer resources or less experience, and show the need for lawyers to be trained in the use of AI.

“I think this is an AI literacy problem as much as a slack solicitor problem,” she says.

“Train people in where it is useful, because once you start using it in a conscious way, you realise it’s actually not good at these sorts of things.”

“Our legal system is going to implode if we don’t have that sort of literacy.”

The Victorian legal services board has identified lawyers’ improper use of AI as a key risk.

“It’s important for lawyers to remember that it’s their duty to provide accurate legal information, not the duty of an AI program,” a spokesperson said.

“Unlike a professionally trained lawyer, AI can’t exercise superior judgment, or provide ethical and confidential services.”
