Latest

Attorney Faces Consequences for Using ChatGPT to Fabricate Evidence in Court Case

  • Now Themis doesn’t know whom to prosecute
  • The neural network is to blame for falsifying the facts
  • But it won’t go to jail

American attorney Steven A. Schwartz used ChatGPT to gather material for one of his cases.

We don’t know what made a lawyer with 30 years of experience resort to a neural network. But the results were scandalous.

Schwartz was helping a client seek compensation from Avianca Airlines.

The client had been injured during a flight: a flight attendant struck him with a serving cart during meal service. The case was heard in federal court in Manhattan.

Schwartz needed to provide compelling evidence that his client had been harmed. Rather than dig through the documents himself, he asked ChatGPT to do it.

Without verifying the facts, the attorney filed a 10-page brief in which the chatbot had supplied several similar in-flight incidents and examples of court decisions.

Specifically, the neural network cited Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines, and Varghese v. China Southern Airlines.

The problem is that no such litigants exist, and the court cases are pure inventions of the chatbot. Schwartz now faces fines and possible disbarment.

Last week, OpenAI released a version of ChatGPT for iOS and also offered a paid subscription that it says generates “fewer fakes.”