ChatGPT for Legal Advice: The Risks
- dmrgordon
- Jul 9
- 4 min read

In recent years, artificial intelligence (AI) has changed how many tasks are performed across multiple sectors, including the legal field. In the UK, there has been a notable rise in individuals representing themselves in court, known as litigants in person (LiPs). With the increasing availability of AI tools for drafting legal documents, many LiPs are turning to these technologies to save time and money. With its ability to generate fluent, persuasive text in seconds, ChatGPT may seem like an ideal assistant for drafting legal submissions. However, despite its utility, relying on ChatGPT for legal drafting carries significant risks.
Overconfidence and Misplaced Trust
Because ChatGPT often produces text that sounds professional and confident, it can be tempting to assume that it is legally correct. However, just because something sounds right doesn’t mean it is right. LiPs in particular tend to focus on what they believe to be morally correct, rather than what is in fact legally correct (the two are not always aligned). As such, there is a danger that litigants may submit poorly argued or procedurally defective documents without realising it. This may result in wasted time, costs orders, or even losing the case entirely.
AI Can Generate False or Misleading Legal Information
ChatGPT is known to occasionally "hallucinate"—that is, invent information that sounds believable but is factually and legally incorrect. This includes:
Misstated legal principles: It may misrepresent what the law actually says, or apply the wrong standard for your type of case.
Fake case law: AI has shown a remarkable tendency to invent case names, precedent, and rulings that appear legitimate and read well, but do not exist.
Misrepresentation of precedent: Even when referencing real cases, it might misstate legal holdings or misapply principles.
In England and Wales, court submissions must be accurate, and statements of case must be verified by a statement of truth. Submitting false information - whether intentionally or not - can at best seriously undermine your credibility. At worst, it may lead to costs being awarded against you, or to your case being struck out or dismissed entirely.
A number of cases have already emerged in which LiPs have submitted fictitious case law (or pursued hopeless legal arguments), believing themselves to be in a strong position. It is important to understand that, although courts often allow LiPs some leeway, they are subject to the same rules and expectations as represented parties and their lawyers.
AI Doesn’t Understand the UK Legal System
ChatGPT was trained on a mixture of global sources, including US law, which differs significantly from the legal systems of England and Wales, Scotland, and Northern Ireland. As a result, it may:
Confuse UK and US law: For example, refer to a “motion to dismiss” instead of an “application to strike out”.
Misunderstand court procedure: For example, suggest that you file documents, serve evidence, or even bring claims in ways that don’t match UK practice.
Use inapplicable terminology: For example, rely on terms or concepts that simply don’t exist in your jurisdiction.
For example, if you’re bringing a claim in the County Court, the rules you must follow are set out in the Civil Procedure Rules (CPR). ChatGPT may not cite or apply these rules correctly, or may reference rules that don’t exist at all. Further, it may advise you to bring a standard Part 7 claim in circumstances where the alternative Part 8 procedure is more appropriate. It is important for LiPs to be aware of these pitfalls before relying too heavily on AI to navigate the legal system.
Lack of Personalisation
Every case is different. ChatGPT does not know the full details of your situation, nor does it ask the kind of follow-up questions a solicitor or barrister would. This means it cannot:
Tell you whether you have a strong case
Help you comply with specific court directions
Advise you on how to present evidence or cross-examine a witness
Without tailored advice, there’s a real risk of making serious procedural or legal errors (for example, filing documents late; addressing the wrong legal points; or failing to meet disclosure obligations).
Risk of Breaching Confidentiality
When using ChatGPT, you are often sending data to a server over the internet. If you input personal, sensitive, or confidential information about your case, you may inadvertently:
Violate data protection laws (including the UK GDPR)
Jeopardise your privacy or your opponent’s privacy
Expose sensitive strategy or information that should remain confidential
As a litigant in person, you are still expected to handle information responsibly, particularly if your case involves children, finances, or medical matters.
You Are Still Responsible for What You Submit
Even if you use AI to help you write your documents, you remain responsible for everything you submit to the court. UK judges will not excuse mistakes simply because a submission was drafted with the help of AI. In fact:
Courts may take a dim view of AI-generated content if it’s misleading, inaccurate, or overly argumentative.
If your documents contain false legal authorities or irrelevant arguments, this can damage your case and waste court time.
You may be seen as failing to comply with various duties that you owe the court.
Navigating the Risks of AI in Legal Drafting
The integration of AI tools into legal document drafting is a significant development for litigants in person in the UK. While AI can ease many challenges, LiPs should remain acutely aware of the risks involved.
ChatGPT and similar AI tools can offer basic assistance - helping you write in plain English or summarising your thoughts clearly - but they are no substitute for proper legal advice.
If you are a litigant in person, it is crucial to use these tools with care and always check the information yourself, ideally against reliable UK legal sources.
Whenever possible, consult a qualified solicitor or legal advisor before relying on AI-generated arguments or documents in court. The stakes in litigation are often too high to take chances.
If you need affordable legal advice to guide your journey through the legal system, please get in touch with us: info@advicehub.co.uk or message us on WhatsApp!


