Artificial intelligence is rapidly changing the way legal work is carried out. From drafting documents to summarising evidence, AI tools are becoming a common feature of modern litigation.
The courts’ position is that, whilst AI can be helpful, it must be used carefully and that responsibility always remains with the lawyer.
Recent cases show that misuse of AI, particularly where it leads to inaccurate or fabricated legal material being put before the court, can have very serious consequences.
Use AI with caution, but always verify
The courts recognise that AI has legitimate uses in litigation. It can assist with document-heavy exercises such as disclosure and can help lawyers work more efficiently.
However, judges are increasingly alert to the risks, in particular the potential for AI to produce confident but incorrect answers, sometimes referred to as “hallucinations”. Judicial guidance emphasises that any material produced using AI must be carefully checked before it is relied upon.
Personal accountability
To enforce these standards, the courts rely on what is known as the Hamid jurisdiction, which allows judges to investigate and address poor conduct by lawyers directly.
This enables the court not only to determine the dispute between the parties but also to examine how the case has been conducted. Where there are concerns, such as misleading submissions or inaccurate legal authorities, the court can require the lawyers involved to explain themselves. This can lead to orders requiring lawyers to pay costs personally, referrals to professional regulators and, in serious cases, contempt proceedings.
Importantly, this process focuses on individual responsibility. It is no defence to say that an error arose from AI or from reliance on another source. Lawyers are expected to verify everything that goes before the court.
Recent cases
The leading authority, Ayinde v London Borough of Haringey [2025], best illustrates the courts’ approach.
In that case, court documents contained references to at least five cases that did not exist, and there were errors in the legal analysis presented to the court. The court considered that these issues may have arisen from the use of generative AI, although it made no definitive finding that AI had been used.
Under the Hamid jurisdiction, the court found there had been serious failures of competence and scrutiny and emphasised that putting false material before the court is a breach of duty, even if not deliberate. It expressed wider concerns about training and supervision of junior lawyers. The court said that the Claimant’s lawyers’ conduct should be referred to the relevant professional regulators.
More recent cases show the courts taking an increasingly proactive approach.
In Elden v HMRC [2026], the tribunal imposed practical safeguards to prevent similar issues arising, including requirements to provide full copies of any authorities relied upon, to use accurate quotations from judgments and to confirm that all references have been independently checked.
These developments demonstrate that the courts are moving beyond general warnings to actively policing how legal material is prepared and presented.
A widely reported US case, Mata v Avianca (2023), involved lawyers submitting a court document that relied on entirely fabricated case law generated by AI. The court imposed financial penalties and criticised the lawyers for failing to verify their sources. This case is frequently cited as a cautionary example of what can go wrong when AI outputs are trusted without proper checking.
What this means in practice
Parties involved in litigation should be reassured that the courts are actively safeguarding the integrity of the legal process as new technologies develop. For lawyers, AI must be used with caution: any legal authority or statement of law must be independently verified, and the duty to the court remains unchanged.
If you would like to discuss a dispute or litigation matter, please contact Jonathan Tyler, our Group Head of Litigation, at jonathan.tyler@wellerslawgroup.com or on 01732 446361.