When a judge discovers that a cited case simply doesn’t exist, a certain kind of silence descends on the courtroom. No theatrical outrage, no courtroom drama. A brief pause, a pointed question, and a very long afternoon for the lawyer on the receiving end. That silence has been falling more often lately, and its cause is not incompetence in the traditional sense. It is something newer, less familiar, and harder to regulate: pressed for time, lawyers have begun delegating their thinking to chatbots.
The problem surfaced quietly. A solicitor here, a junior barrister there, each pasting questions into ChatGPT and copying out whatever looked plausible. Most of them probably assumed the technology understood what it was talking about. It usually sounds as if it does; in a sense, that is the entire trick. Generative AI writes like a self-assured lawyer, in clean sentences with the occasional Latin flourish, and the citations it invents look exactly like those found in real judgments. Until someone takes the trouble to look them up.
| Aspect | Detail |
|---|---|
| Issue | Misuse of generative AI in legal filings |
| Leading Case | Ayinde v London Borough of Haringey |
| Secondary Case | Al-Haroun v Qatar National Bank |
| Jurisdiction Cited | Hamid jurisdiction (High Court of England and Wales) |
| Presiding Judge | Dame Victoria Sharp, President of the King’s Bench Division |
| Regulatory Body Involved | Bar Standards Board and Solicitors Regulation Authority |
| Study Referenced | Stanford (2024) — chatbot hallucination rates of 58%–82% on legal queries |
| Working Group | Civil Justice Council AI working group |
| Key Public Statement | Ian Jeffery, Chief Executive of the Law Society |
| Technology at Centre | Large language models (GPT-based generative systems) |
| Sanction Risk | Professional misconduct referral, possible contempt proceedings |
During the judicial review proceedings in Ayinde, a junior barrister put five fictitious authorities before the court. Her explanations shifted almost by the hour, from innocent citation errors to a vague story about files pulled from a personal box. The judge was not impressed. The court found that she had either deliberately fabricated the cases or relied on a generative tool without checking a word of its output, and described the episode as a professional embarrassment.
In some respects, Al-Haroun was worse. A solicitor admitted filing AI-compiled research on behalf of his client without verifying any of it. The court called it a “lamentable failure,” which is about as close to open exasperation as English judges get.

There is a sense that the profession, despite repeated warnings, never quite believed them. A Stanford study last year found that general-purpose chatbots hallucinated on legal queries between 58 and 82 percent of the time, a figure so high it seems almost absurd until you picture it appearing in a filing. Sam Altman keeps promising the next version will be better; reports suggest newer models sometimes hallucinate more, not less. And as the technology improves, the errors become harder to spot, not easier.
The part clients notice is the cost. Every fictitious citation means extra hearings, extra fees, extra court time. Multiply that across a system already overstretched and the picture becomes uncomfortable.
Speaking during London International Disputes Week, Lord Justice Birss made the point plainly: you can use AI to summarize a document, but only after you have read it. Skipping the reading and letting the machine do both, he said, is absurd. It is hard to ignore how basic that advice is, and how necessary it has apparently become.
Outside the courts, regulators are scrambling to catch up. The Civil Justice Council has set up an AI working group. The Law Society and the Bar Council are under pressure to revise their guidance. Whether any of it slows the drift remains to be seen: the tools already sit on every lawyer’s desktop, marketed as efficient and cheap.
It’s still unclear if the legal system can handle the mess without losing some of its credibility. For now, the courts are improvising, the lawyers are nervous, and somewhere a chatbot is confidently inventing its next case.
