AI is creating fake legal cases and making its way into real courtrooms, with disastrous results
AI has also played a hand in creating music, driverless race cars and spreading misinformation, among other things. It’s therefore highly concerning that fake law, invented by AI, is being used in legal disputes.
How do fake laws come about?
There is little doubt that generative AI is a powerful tool with transformative potential for society, including many aspects of the legal system. But its use comes with responsibilities and risks. Lawyers are trained to carefully apply professional knowledge and experience, and are generally not big risk-takers. However, some unwary lawyers (and self-represented litigants) have been caught out by artificial intelligence.
- Generative AI models are trained on massive data sets. When prompted by a user, they can create new content (both text and audiovisual).
- Although content generated this way can look very convincing, it can also be inaccurate.
- This is the result of the AI model attempting to “fill in the gaps” when its training data is inadequate or flawed, and is commonly referred to as “hallucination”.
It’s happening already
- In the best-known example, the 2023 US case Mata v Avianca, lawyers submitted a brief to a New York federal court containing fake case citations and quotes generated by ChatGPT.
- The lawyers, unaware that ChatGPT can hallucinate, failed to check that the cases actually existed.
- Michael Cohen, Donald Trump’s former lawyer, gave his own lawyer cases generated by Google Bard, another generative AI chatbot.
- He believed they were real (they were not) and that his lawyer would fact-check them (he did not).
What’s being done about it?
- Several US state bars and courts have issued guidance, opinions or orders on generative AI use, ranging from responsible adoption to an outright ban.
- Law societies in the UK and British Columbia, and the courts of New Zealand, have also developed guidelines.
- In Australia, the NSW Bar Association has a generative AI guide for barristers.
- Many lawyers and judges, like the public, will have some understanding of generative AI and can recognise both its limits and benefits.
Vicki McNamara is affiliated with the Law Society of NSW (as a member). Michael Legg does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.