Nothing But the Ruth!

Make AI Your Intern, Not Your Replacement as a Lawyer

By Ruth Carter

OK. Let’s get into the bogus ChatGPT brief debacle.


You might think that certain things don’t need to be said, like “verify that the cases you cite actually exist.”

You’d be wrong.

Of course, I’m referring to the recent revelation that a lawyer submitted a brief in federal court containing several nonexistent cases — thanks to ChatGPT and some seriously bad judgment.

Here’s What Happened in the ‘Bogus’ Case Citations Snafu

New York attorney Steven Schwartz of Levidow, Levidow & Oberman used ChatGPT to write a brief and submitted it to the U.S. District Court for the Southern District of New York without verifying that the cases ChatGPT cited and quoted were real.

Well, that’s not entirely accurate. Schwartz did ask ChatGPT whether the cases were real. Apparently, he didn’t know that ChatGPT has a track record of confidently making false statements. (Artificial intelligence is only as good as the information used to train it, and it will fill gaps with plausible-sounding fabrications.)

When did the lawyer realize his mistake?

Probably after the opposing party submitted a letter to the judge questioning the authenticity of the cases cited in the brief. In May, the judge confirmed that at least six cases cited “appear to be bogus judicial decisions with bogus quotes and bogus internal citations,” and set a date for a sanctions hearing.

Oh, geez. Is he a new lawyer?

Schwartz has over 30 years of experience.

What’s going to happen to him?

We’ll see. Schwartz submitted an affidavit to the court in which he admitted full responsibility for his actions. He also stated that this was the first time he had used ChatGPT for legal research and that he will not use it again without verifying its output.

His hearing is Thursday, June 8. We’ll see whether this misstep has further repercussions for his employment and his ability to practice law.

Writer, podcaster and all-around great guy Jay Acunzo suggests that you think of AI as your intern. It can supplement your efforts, but it should never be a replacement for your research, arguments and writing.

And just as with any intern, always verify the accuracy of the information an AI tool gives you.

Remember, clients hire a specific lawyer because they want to benefit from that professional’s education, skills and experience. If a client wanted AI to do the work, they could use ChatGPT themselves.

If I were going to integrate ChatGPT or a similar generative AI platform into my workflow, I’d write the first draft myself. Then I’d run it through the AI for suggestions on how to make my arguments more persuasive if I were arguing to the court — or on how to sound scarier if I were writing a demand letter. (To protect attorney-client privilege, I’d change the names of the parties and other identifying information as needed.)

As of this writing, I have not used ChatGPT or any generative AI for legal work. I’m actually more interested in using AI to generate the verbiage I use in contracts and letters on a regular basis, so I don’t have to rack my brain trying to remember which client file to look in when I need a particular block of text. (For more ideas on using AI in law practice, read my interview with Paul Roetzer, founder and CEO of Marketing AI Institute.)

Does the Bogus Cases Incident Signal a Bigger Problem?

As I pondered how a lawyer could use AI to write a brief and submit it to the court without double-checking its work, many follow-up questions zipped through my brain:

  • How did the lawyer not recognize so many of the cases cited by ChatGPT? Did he do any other research on this matter?
  • How little time did he give himself to write this brief that he couldn’t spare any to check the cases?
  • Does this situation indicate a bigger problem regarding attorney workloads? Did the lawyer have too many pressing client matters to give each one the time and attention it needed?
  • Is there a bigger issue to address besides time management and stress, like substance abuse or a mental health issue?

To be clear, I’m not suggesting that Steven Schwartz has a problem with addiction or a mental health disorder. This situation made me wonder what might be happening in a law firm or an attorney’s life that would lead them to look for a quick fix to get their work done. If an attorney doesn’t check an AI’s work before using it, I suspect it’s because they felt they didn’t have time to do it.

Hopefully, Schwartz will be the last attorney to make the mistake of letting artificial intelligence do the work instead of merely assisting with it.




Ruth Carter — lawyer, writer and professional speaker — is Of Counsel with Venjuris, focusing on intellectual property, business, internet and flash mob law. Named an ABA Journal Legal Rebel, Ruth is the author of “The Legal Side of Blogging for Lawyers,” as well as “Flash Mob Law: The Legal Side of Planning and Participating in Pillow Fights, No Pants Rides, and Other Shenanigans.” Ruth blogs at GeekLawFirm.com and UndeniableRuth.com.
