“CHANGE AT THE BAR AND THE GREAT CHALLENGE OF GEN AI” 

The issue of hallucinations in generative AI (Gen AI) is no longer just a concern for self-represented litigants; it has also led to the disciplining of lawyers in multiple jurisdictions, including Australia, for relying on AI-generated false content. At least eight types of “hallucinated content” have been put before courts, and they rarely appear in isolation: most cases involve several categories at once.

In Ayinde v London Borough of Haringey [2025] EWHC 1383 (Admin) and Hamad Al-Haroun v Qatar National Bank QPSC and QNB Capital LLC (heard together), the English High Court confronted several of the categories listed below in a single set of proceedings, and its guidance on the appropriate judicial response is quoted after the list.

Fake Citations: AI’s ability to produce citations that do not exist is a serious concern. Such citations may be used to support false legal propositions or, less dangerously, to misrepresent true ones. The potential for these to slip past lawyers and courts unnoticed, even entering judgments, underscores the need for heightened vigilance.

Misattributed Citations: AI-generated citations that attach incorrect case names to real cases, or that point to irrelevant cases, can lead to confusion or misjudgment. Instances where litigants have relied on non-existent or unrelated case law highlight the need to scrutinise AI-generated content.

Fabricated Authorities: AI invents legal authorities or misrepresents real ones, citing cases that do not exist or giving incorrect citations for those that do, leading to unreliable legal arguments.

Irrelevant Authorities: AI cites legitimate but irrelevant cases, sometimes in a way that misleads or confuses the court, as where a litigant cites authorities that have no bearing on the legal issue in dispute.

Falsified Quotes: AI generates fake quotes or summaries from genuine cases or legislative documents, sometimes attributing them to the wrong sources, which can distort or fabricate legal arguments.

Fabricated External Materials: AI invents references to parliamentary debates, reports, or other documents—such as false citations from legislative debates or official reports—to support legal submissions.

Non-Existent Legislation or Incorrect References: AI references legislation or rules that either do not exist or are no longer in force, such as citing repealed laws or non-existent procedural rules.

Fabricated Evidence: AI can create fake evidence, including expert reports, affidavits, or other documentary exhibits, which may mislead courts or parties.

This proliferation of AI hallucinations raises serious concerns about the reliability of AI-assisted legal research and argumentation, and it underscores the responsibility and diligence required to verify AI-generated content in legal proceedings. In Ayinde, the court set out the factors that will guide its response when hallucinated material is put before it:

“[24] The court’s response will depend on the particular facts of the case. Relevant factors are likely to include: (a) the importance of setting and enforcing proper standards; (b) the circumstances in which false material came to be put before the court; (c) whether an immediate, full and truthful explanation is given to the court and to other parties to the case; (d) the steps taken to mitigate the damage, if any; (e) the time and expense incurred by other parties to the case, and the resources used by the court in addressing the matter; (f) the impact on the underlying litigation; and (g) the overriding objective of dealing with cases justly and at proportionate cost.”

The best practice for any party who has relied on hallucinated Gen AI material, whether a legal practitioner or a self-represented litigant, is to inform the court of the error promptly, admit to using AI, offer an unconditional apology, take full responsibility, and propose positive measures to rectify the mistake. Where appropriate, the party should also seek leave to file an affidavit that openly and honestly explains the circumstances in which the deponent obtained the material relied upon.

Source: Supreme Court of NSW
