How to spot ChatGPT output masquerading as legal analysis

Is your client using generative AI to do research?

Monica Goyal

Have you noticed, or suspected, that your clients are using ChatGPT to research or produce legal drafts and then handing the results to you to “look over”? Not only can this consume a great deal of time and introduce frustrating errors, but you may also realize that your new competitor for legal work has a name: ChatGPT.

I’ve recently come across several instances where I suspected a client had used ChatGPT or another generative AI tool. On one occasion, a client claimed it was perfectly legal to create a new class of shares through a resolution. When I advised them that this was not the case, they sent me their “research.” I could tell immediately that it had been produced by ChatGPT.

ChatGPT can be handy for creating drafts, but it can also hallucinate, producing inaccurate or misleading content that is harmful in a legal context. If you haven’t been following generative AI tools, however, you may not know what to look for, so I want to share some of the signs that help me identify when a client has used ChatGPT for research or document creation.

Look for language and style mismatches

Compare the document with the client’s previous communications. ChatGPT tends to write in a more formal tone that may differ from the client’s usual writing style. A sudden shift in tone or style may be a sign that the client used ChatGPT.

Sentence structure and headings

Another way to spot ChatGPT is to check sentence length and watch for atypical characters. ChatGPT often produces long sentences built around comma-separated lists, which may not match your client’s natural style (or anyone’s natural style, for that matter).

Heading formatting can also indicate the use of ChatGPT. The tool formats its responses in Markdown, wrapping bold text in asterisks (**) and prefixing headings with number signs (#). When a client copies ChatGPT’s output into a plain-text document or email, those formatting marks often come along with it, a telltale sign that the material was generated by ChatGPT. A simple script can scan for these artifacts, as the sketch below shows.
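For the technically inclined, here is a minimal sketch in Python of what scanning for leftover Markdown marks and long, comma-heavy sentences might look like. The patterns and thresholds are illustrative choices of mine, not established benchmarks:

```python
import re

# Markdown artifacts that often survive when ChatGPT output is pasted as
# plain text: **bold** markers, # heading prefixes, and leading bullet dashes.
MARKDOWN_PATTERNS = [
    (re.compile(r"\*\*[^*]+\*\*"), "bold asterisks (**)"),
    (re.compile(r"^#{1,6}\s", re.MULTILINE), "heading marks (#)"),
    (re.compile(r"^\s*[-*]\s", re.MULTILINE), "bullet markers"),
]

def flag_markdown_artifacts(text: str) -> list[str]:
    """Return labels for any leftover Markdown formatting found in the text."""
    return [label for pattern, label in MARKDOWN_PATTERNS if pattern.search(text)]

def flag_listy_sentences(text: str, max_words: int = 40, min_commas: int = 4) -> list[str]:
    """Return sentences that are unusually long and comma-heavy
    (thresholds are arbitrary illustrations, not calibrated values)."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if len(s.split()) > max_words and s.count(",") >= min_commas]

if __name__ == "__main__":
    sample = "## Share Classes\n**In conclusion**, a resolution alone may not suffice."
    print(flag_markdown_artifacts(sample))  # ['bold asterisks (**)', 'heading marks (#)']
```

Neither check is proof on its own; they simply surface passages worth a closer look.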

Overly generalized content

ChatGPT can generate well-structured but generalized content. The underlying GPT model was trained largely on publicly available internet text, and that information tends to be general and to lack depth. Even when you upload a specific document or point ChatGPT at a website, it tends to summarize the material in a generalized way. So, if a document or email from the client contains overly broad sections or lacks detailed, case-specific information, this may indicate AI involvement.

Lack of personal perspective

Law is very fact-dependent, and a legal response to a client query should be grounded in the client’s particular facts. Law school teaches us to apply the facts of decided cases to the scenario at hand. ChatGPT, by contrast, is a statistical model that works with probabilities: its answers are guesses that are not necessarily accurate and may misstate the client’s own information. The result is often too generic to meet the client’s needs. In my experience, ChatGPT can be fine for simple, general queries but not for complex, more nuanced issues.

Factual inaccuracies and legal misinterpretations

Interestingly, many clients are still unaware of the New York City incident, reported by The New York Times, in which a lawyer used ChatGPT to draft a legal brief that cited several fake cases. This should be a warning not only to lawyers but to everyone.

Unfortunately, many people don’t understand how ChatGPT works. They may see it as a time-saving magic pill and not realize that it can hallucinate and make things up; they assume the information, and the law it states, will be accurate. Of course, clients have always sought legal information online, but the legal websites they find are not specific to their situation and are usually insufficient to address their issues. Armed with tools like ChatGPT, some (or likely many) clients now turn to generative AI for legal information and document generation. However, they do not know how to frame the prompt or evaluate the output. Often, you will know that ChatGPT was used because the client is taking a legally incorrect position or the generated document is simply wrong.

If you suspect that AI generated something, you could run it through an AI detector such as Hive. These detectors let you copy and paste the content, and the tool then reports, with a confidence score, whether any of it appears to be AI-generated. Whichever detector you choose, check the reviews to see whether it performs well. The challenge for lawyers is to bridge the knowledge gap and educate clients on the potential issues with using generative AI in a legal context.
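For illustration only, programmatic use of such a detector might look like the sketch below. The endpoint, field names, and response shape are hypothetical placeholders, not Hive’s actual API; consult your chosen vendor’s documentation for the real interface:

```python
import json
import urllib.request

# Hypothetical detector endpoint and credentials, for illustration only.
DETECTOR_URL = "https://api.example-detector.com/v1/classify"
API_KEY = "YOUR_API_KEY"

def ai_likelihood(text: str) -> float:
    """Send text to a (hypothetical) detection service and return a 0-1
    confidence score that the text is AI-generated."""
    request = urllib.request.Request(
        DETECTOR_URL,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    return result["ai_confidence"]  # hypothetical response field
```

As with any detector, treat the score as one signal among several, not a verdict.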

As AI technology continues to evolve and improve, clients will undoubtedly turn to tools like ChatGPT and Microsoft Copilot to help them with legal work, and lawyers may find themselves spending time fixing the problems introduced by what I predict will be a growing number of do-it-yourselfers.

We must remain vigilant and encourage open communication with clients to ensure transparency around their use of AI so we can be extra careful to identify any resulting legal issues. Remember: ChatGPT is a tool, not an oracle, and it could be your new competitor for legal work.

Monica Goyal will appear at Canadian Lawyer’s LegalTech Summit Canada on June 12 in Toronto.
