Josh Taylor 

Melbourne lawyer referred to complaints body after AI generated made-up case citations in family court

Legal professional used software to generate a case citation list, but did not use documents that had undergone human verification

A lawyer who submitted unverified AI-assisted research to the family court has apologised and says he will ‘take the lessons learned to heart’. Photograph: Artist85/AAP

A Melbourne lawyer has been referred to the Victorian legal complaints body after admitting to using artificial intelligence software in a family court case that generated false case citations and caused a hearing to be adjourned.

At a hearing on 19 July 2024, a solicitor, who has not been named, representing the husband in a dispute between a married couple provided the court with a list of prior cases requested by Justice Amanda Humphreys in relation to an enforcement application in the case.

Humphreys said in a ruling that when she returned to her chambers, neither she nor her associates could identify the cases on the list. When the matter returned to court, the lawyer confirmed the list had been prepared using the legal software Leap, which he said included an AI element.

He acknowledged he did not verify the accuracy of the information before submitting it to the court.

In the initial ruling, first reported by Crikey, the lawyer was given a month to explain why he should not be referred to the Legal Services Board and Commissioner for investigation. In a ruling made in August and published this month, Humphreys referred the solicitor for investigation.

In her ruling, Humphreys said the solicitor admitted the AI software had been used to generate the list, that it had not been reviewed by him or anyone else, and that the cases were not real.

The lawyer offered an “unconditional apology” to the court, said he would “take the lessons learned to heart” and asked not to be referred for investigation. He said he did not fully understand how the software worked and acknowledged the need to verify AI-assisted research for accuracy. He made a payment to the other party’s solicitors for the costs thrown away by the adjourned hearing.

Humphreys said she accepted the apology and acknowledged that the stress the matter had caused the solicitor made a repeat unlikely, but said a referral for investigation was important because it was in the public interest for the Victorian Legal Services Board and Commissioner to examine professional conduct issues arising from the increasing use of AI tools in law.

Humphreys noted the family court had not yet issued guidelines on AI use, but guidelines from the supreme court of Victoria and the county court of Victoria state that practitioners using AI tools should understand how they work and their limitations.

A spokesperson for Leap told Guardian Australia that verifying the work was a key part of a lawyer’s ethical obligations. The company said it provides a free verification process in which an experienced, locally based lawyer examines the output from its software to ensure accuracy.

The spokesperson said that verification process was carried out in this case, but the resulting documents were not used by the lawyer. The software firm did not elaborate on why they were not used, and the solicitor has not been named.

“Despite the legal professional using [Leap software] LawY’s verification process, which sent the user the correct information just four hours after requesting it and well before appearing in court, the user unfortunately did not utilise this correct information in court,” the spokesperson said.

“This example provides a timely reminder that AI is a powerful tool but must be used appropriately by users to add value to legal practice.”

The company said 66,000 legal professionals worldwide use its software.

An August 2024 announcement from Leap states that firms can “generate precedents with ease using AI”, with AI-powered templates used to produce letters, emails and forms in less than a minute.

The case is not the first in which lawyers have been given false information by AI tools. In Canada in March, a lawyer in a custody case who used ChatGPT to find previous case law found the large language model had provided false cases.

It was also reported last year that lawyers representing a man who alleged he was injured on a flight used ChatGPT to find cases to support their argument, only to find the software had generated what the court called “bogus opinions”.
