As artificial intelligence (AI) tools become increasingly embedded in legal practice, courts worldwide are confronting the consequences of their misuse. Recent cases from the United Kingdom, United States, and Australia highlight the risks of unverified AI-generated content and remind lawyers that their professional obligations remain non-delegable.
United Kingdom: Duty to the Court Prevails
Ayinde v London Borough of Haringey [2024] EWHC 1799 (KB)
Facts
• Mr Ayinde applied for social housing and, after being refused urgent interim accommodation, brought judicial review proceedings under Housing Act 1996 (UK) s 188(3).
• His barrister filed grounds supported by five case citations, which Haringey’s solicitors discovered did not exist.
• When challenged, the barrister dismissed the issue as “minor citation errors” without explanation.
Decision
• The court found the cases were fictitious and the grounds also misstated legislation.
• The court made a wasted costs order in favour of Haringey.
• The court referred the barrister to the Bar Standards Board, and the solicitors to the Solicitors Regulation Authority.
• At a Hamid hearing, the barrister admitted negligence but denied deliberate use of generative AI, suggesting she may have inadvertently relied on AI-influenced Google results.
Key Points
• Submitting fake cases is a serious breach of duty; attempts to downplay such errors as “cosmetic” are unacceptable.
• The court considered contempt proceedings but declined to initiate them, given counsel’s inexperience.
• The judge warned that future breaches may attract “severe sanctions.”
United States: Fabricated Citations Carry Consequences
In Shahid v Essam, 368 Ga App 155 (2023), the Georgia Court of Appeals vacated a trial judgment that rested on AI-fabricated case citations, some of which were repeated in the appellate briefs. The husband’s lawyer had submitted fictitious authorities generated by AI, which the lower court accepted uncritically.
Shahid v Essam 368 Ga App 155 (2023) (US)
Facts
• After moving to Texas, the wife sought to reopen a divorce decree, arguing improper service by publication.
• The husband’s lawyer opposed, filing submissions in which 11 of the 15 cited authorities were fabricated, some of which were repeated in the trial judgment and on appeal.
• The lawyer also sought attorney’s fees on appeal, citing a case that did not exist and misstating Georgia law.
Decision
• The Georgia Court of Appeals found it could not meaningfully review the trial decision because fabricated authorities had been relied upon.
• The Court vacated the trial judgment and remanded the matter for rehearing.
• The Court imposed the maximum available sanction of $2,500 in costs against the husband’s lawyer.
Key Points
• The Court condemned the use of fictitious cases as damaging to litigants, the courts, and the justice system.
• Fake authorities waste resources, undermine confidence in judicial rulings, and harm the profession’s reputation.
• The Court imposed only costs sanctions, but the judgment stressed the seriousness of the misconduct and its potential systemic consequences.
Key Takeaway
The Court of Appeals emphasised that using reliable AI tools is not inherently improper, but lawyers remain personally accountable for the accuracy of everything they file. The costs order imposed on the husband’s lawyer underlines that failing to independently verify AI-produced citations is a breach of professional obligations.
This decision underscores the profound ethical breach involved in presenting fake cases to a court. While costs sanctions and reputational damage follow such misconduct, they may not be enough to deter it. The ruling highlights the need for the legal profession and regulators to take more decisive action, potentially through disciplinary proceedings, against lawyers who submit fabricated authorities. Protecting the integrity of the justice system demands more than a symbolic penalty.
Australia: Indemnity Costs for AI Errors
Facts
• In a native title proceeding, the applicant filed a summary document containing citations to anthropological and historical materials.
• The respondents (First Nations Legal and Research Services) attempted to source the cited materials and discovered that most did not exist.
• A junior solicitor working remotely had relied on Google Scholar; the court inferred that generative AI contributed to the fabricated citations.
• The solicitor’s supervising principal admitted the work had gone unchecked and accepted responsibility for inadequate supervision.
Decision
• Justice Murphy held that the solicitor had failed to verify the citations, a failure compounded by insufficient supervision.
• The applicant’s law firm was ordered to pay the respondents’ costs on an indemnity basis.
• Neither the junior solicitor nor the supervising principal was referred to the Victorian Legal Services Board, though that option remained open.
Key Points
• Generative AI increases the risk of false citations, but lawyers remain responsible for verification.
• Proper supervision of junior staff is essential; a failure to supervise attracts costs consequences.
• The Federal Court stressed that AI use must be consistent with practitioners’ overriding duty to the court and to the administration of justice.
As AI tools evolve, legal practice must adapt. These cases are cautionary tales and reminders that diligence, judgment, and ethical responsibility remain at the core of legal practice. AI can enhance the profession—but only if used with care and accountability.