ARTICLE
1 July 2025

AI And The Legal Sector From One Lawyer's Perspective

Hunters

Contributor

For over 300 years, we have worked with individuals, businesses, trusts and organisations of all kinds to advise on legal issues. Consistently recognised in the Times’ Best Law Firms, we offer comprehensive legal solutions, including litigation, tax and estate planning, family, property, and business services, with a dedicated, partner-led team.

It is widely accepted that AI has a future and that some uses of the technology can be not only useful but groundbreaking. New products are being developed to 'assist' professionals, and in certain situations AI is being used for more basic tasks normally given to juniors. This can mean those junior roles are no longer needed but, on the flip side, it makes business more automated and more cost-effective.

The legal sector is no exception. When you mention AI to lawyers, some react with excitement, some with reservations and some shudder with fear, thinking AI will soon make our roles redundant.

But, in my opinion, AI is not yet the refined support staff function that can be relied on fully.

There are several examples of lawyers around the world using AI software to prepare lawsuits and court submissions that inadvertently cite 'hallucinated' cases. In 2023, a US judge fined two lawyers and a law firm $5,000 after fake citations generated by ChatGPT were submitted to the court. In February this year, two other American lawyers included fictitious case citations in a lawsuit against Walmart, having used AI software. And in the Ontario Superior Court, a lawyer faced contempt of court proceedings for submitting and relying on cases that did not exist.

A similar situation arose in the UK High Court in the case of Frederick Ayinde v The London Borough of Haringey [2025] EWHC 1040 (Admin). The Claimant's lawyer in that case had put forward fake cases in their submission. The court determined that the claimant's legal representatives had acted in an improper, unreasonable or negligent manner. It was especially critical of their attempt to downplay the submission of fabricated cases as merely "cosmetic errors." The court found that presenting false summaries of five fictitious cases amounted to clear professional misconduct. As a result, the court ordered both the claimant's solicitors and counsel to pay wasted costs and directed that the transcripts of the proceedings be forwarded to the appropriate professional regulatory bodies.

A recent study also engaged with the UK judiciary to see whether AI could take on judicial tasks. The study by Erin Solovey, Brian Flanagan, and Daniel Chen (2025), 'Interacting with AI at Work: Perceptions and Opportunities from the UK Judiciary', is fascinating both in its results and in its insights into what AI needs to do to improve. For now, the study found that:

  • Justice is rooted in human decision making and reasoning.
  • The human component in justice holds value.
  • The value of the human component is case-dependent and role-dependent.
  • Decision making is collaborative with multiple points of view.

Though AI may not yet take over the judicial process, the participants in the study, who included five members of the Supreme Court, felt that 'there was a general sense of opportunity regarding the use of AI for various tasks within the judiciary'.

AI is also being used more and more within the property sector. New software is constantly being introduced to review titles, carry out searches and even compile reports for buyers. While this is sold as a way to make the process quicker, a full review is still required to ensure that these 'reports' are correct and contain all the relevant information. A simple 'copy and paste' of the information by AI is not good enough where interpretation and explanation are needed. Most AI experts accept that AI reviews of documents are not always 100% correct, and that the mistakes made are not consistent or predictable. So if a lawyer has to review the AI-generated report and cross-check all the information, is this really saving time or doubling the work required? Relying solely on AI reports would also increase the risk of negligence considerably, which is not a risk worth taking.

In other sectors too, colleagues fully expect that one day AI will be able to generate the contracts and agreements that take a lawyer years of training and experience to perfect. But it is not quite there yet, and certain nuances cannot be factored in. I hope the day AI is able to prepare legal documents does not come during my career.

There is no doubt, however, that AI has a future, and it would be foolish to think otherwise. As the technology continues to develop and improve, the errors we are finding today will be what the AI software learns from; as we continue to use the software, we will hone its skills. For legal practitioners, though, if you don't keep up with the changes, you could get left behind.

For now, however, our jobs are safe. But what does the future hold?

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
