A Victorian solicitor used an AI-powered research tool that generated fictitious legal authorities, which the solicitor then presented to the court. The incident highlights the risks of over-reliance on AI in the legal sector and the question of accountability when errors occur; similar cases have been reported in the UK, the US, and Canada. It underscores the need for stricter professional standards for lawyers using AI tools, better support from the firms that provide them, and vigilance about AI's tendency to produce incorrect or fabricated outputs. The episode also exposes Australia's lack of comprehensive AI regulation and the need for stronger safeguards around AI use in legal practice. The key lesson: lawyers must not rely solely on AI-generated outputs and must rigorously verify their accuracy before filing.
source: Lexology