
'Algorithmic Serendipity': Lessons From the World of AI Hallucination

Stories about attorneys and litigants using artificial intelligence (AI) to generate court documents have started popping up all over the country with increasingly negative results.

In one of the most recent tales, a pro se litigant, who is also running for a seat in the Missouri state legislature, was ordered to pay his opposing party $10,000 for filing a “frivolous appeal” because his AI-generated briefing contained citations to fictitious legal cases. The litigant, a business owner, was appealing an order that he and his associated companies pay a former employee $311,000 in unpaid wages, damages, and fees. In addition to not filing the required documents and not following the appellate rules generally, the business owner submitted an AI-generated appellate brief littered with false and inaccurate case citations. More specifically, only two of the business owner's 24 case citations were accurate as to case name and citation. Of the other 22, 17 were completely fictitious case names and citations, and five – those the court referred to as “algorithmic serendipity” – were real case names but paired with fictitious citations.

The business owner apologized to the court, blaming a retained consultant/attorney he allegedly hired to prepare his appellate brief. The court was not swayed and awarded damages despite recognizing that the business owner was not a lawyer and was unrepresented by counsel.

We have seen enough of these cases to identify some obvious lessons about the use of generative AI in law.

  • AI makes things up. Not all of the time and not always in the same way, but hallucinations are real.
  • Courts won't care what excuse you have for submitting false cases or citations to the court. Nor will they care if you are not a lawyer.
  • AI may be useful in identifying possibly relevant cases, but you have to read. You have to read the case, read the parentheticals, check the citation, and make sure it supports your argument.
  • Legal professionals are under pressure to use AI to increase efficiency and decrease costs. Neither of those objectives is achieved if hallucinations are introduced into the case. Use AI as a tool, not the tool.
  • “Algorithmic serendipity” may be the best new phrase of 2024.

Stay tuned. Despite the advancements and promise of AI, odds are there are more hallucinations – and lessons – to come.

Sherman and Howard’s AI Task Force is working to stay abreast of these new developments and the implications for our clients. This article is part of a series of client advisories that will discuss the potential benefits and drawbacks of generative AI in the workplace, relevant statutory and regulatory guidance, litigation risks, and other critical issues. If you’d like to subscribe to these advisories, click HERE.

"We regret that Appellant has given us our first opportunity to consider the impact of fictitious cases being submitted to our Court, an issue which has gained national attention in the rising availability of generative A.I." - Honorable Kurt S. Odenwald, Missouri Court of Appeals
