
ChatGPT Gone Wrong: US Lawyer Regrets Using ChatGPT After His Filing Was Found to Contain Imaginary References

Written by Abdullah Shahid
Representing a client who was suing an airline over an alleged personal injury, lawyer Peter LoDuca submitted a ChatGPT-generated brief citing several previous court cases, none of which were real

Ever since its release, ChatGPT has been used by people in a number of different professions, but the chatbot comes with a warning clearly stating that the information it provides can sometimes be false. Ignoring this warning is what led lawyers Peter LoDuca and Steven A. Schwartz into a huge problem.

Working for a client who was suing an airline over an alleged personal injury, LoDuca submitted a filing citing several similar past court cases so that his client's case could go forward. The cited cases, however, which had been researched through ChatGPT, turned out to be fabricated.

It was the airline's lawyers who wrote back to the court saying that they were unable to find any of the six cases cited in LoDuca's filing.

“Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations,” said Judge Castel, who then called upon the legal team to explain itself.

Further investigation revealed that the filing had not actually been prepared by LoDuca, who was leading the case, but by a fellow lawyer named Steven A. Schwartz, who works at the same law firm.

Schwartz, who is now facing a court hearing of his own over the incident, said he was “unaware that its (ChatGPT’s) content could be false.”

Having practiced law for more than 30 years, Schwartz submitted a written statement acknowledging his mistake, saying that he “greatly regrets” using the chatbot for legal research and vowing never to use it again.
