The Use of Generative Artificial Intelligence (AI)

Guidelines for Responsible Use by Non-Lawyers

These guidelines apply across Queensland’s courts and tribunals (including the Supreme, District, Magistrates, Land, Children’s, Industrial, Planning and Environment Courts, QIRC and QCAT).

They are intended for non-lawyers (self-represented litigants, McKenzie friends, lay advocates, employment advocates) who may use AI chatbots (e.g. ChatGPT, Claude, Copilot, Gemini) in preparing for proceedings.

Key Point: Generative AI is not a substitute for legal advice. Chatbots often provide inaccurate information about Australian law. They should not be your only or primary source of legal information. Whenever possible, seek advice from a lawyer or consult reliable resources:

  • AustLII – www.austlii.edu.au
  • Queensland Judgments – www.queenslandjudgments.com.au
  • Queensland Legislation – www.legislation.qld.gov.au
  • Legal Aid Queensland – www.legalaid.qld.gov.au

1. Understanding Generative AI

  • Chatbots are built on Large Language Models (LLMs), which predict likely patterns of words. They do not reason like humans, and they have no concept of truth.
  • Their responses are probabilistic guesses, not authoritative answers.
  • They cannot provide tailored legal advice, assess the merits of your case, or take cultural or emotional factors into account.

What they can do:

  • Help organise information into structured documents.
  • Suggest headings, formatting, tone, grammar, or style improvements.
  • Provide general explanations of laws or concepts.

Limitations:

  • They are not trained on authoritative or up-to-date Australian law or court procedures.
  • They may produce fake cases, citations, or legislation.
  • They cannot reliably predict case outcomes or success rates.
  • Their responses may be biased, misleading, or incomplete, depending on their training data and your prompts.

2. Confidentiality, Suppression and Privacy

  • Do not enter private, confidential, privileged, or suppressed information into an AI chatbot.
  • Many AI tools retain what you type and may use it to train their models, so your information could surface in responses to other users.
  • Entering such material risks breaching a suppression order or exposing sensitive details to the public.

3. Ensuring Accuracy

  • You are responsible for checking the accuracy of anything submitted to the court.
  • AI-generated outputs may be outdated, wrong, or based on foreign law.
  • Courts may impose costs orders if inaccurate or fictitious material (e.g. fake citations) delays proceedings.
  • Always cross-check AI output with a lawyer (if possible) or against reliable legal databases.

4. Ethical Issues

  • AI reflects the errors and biases in its training data.
  • Copyright and plagiarism risks arise if AI is used to summarise or reframe intellectual property (e.g. textbooks).
  • AI-generated material must be checked for accuracy, meaning, and attribution.
  • For speeches or submissions, AI may help outline ideas, but sources should be verified and cited when appropriate.

5. Security

  • Follow standard security practices when using AI tools: use work devices and secure logins, and avoid public or shared platforms where possible.

Source: Queensland Courts
