By Kathryn Armstrong, BBC News
A New York attorney faces a court hearing of his own after his firm used the AI tool ChatGPT for legal research.
A judge said the court was faced with an “unprecedented circumstance” after finding a case file referred to example legal cases that did not exist.
The lawyer using the tool told the court he “didn’t know that the content could be wrong”.
ChatGPT creates original texts on request, but comes with warnings that it may “provide inaccurate information”.
In the original case, a man was suing an airline for alleged bodily harm. His legal team filed a brief citing several previous court cases to provide precedent as to why the case should go ahead.
But the airline’s lawyers later wrote to the judge to say they could not find several of the cases mentioned in the brief.
“Six of the cases filed appear to be falsified court decisions with false citations and false internal citations,” Judge Castel wrote in an order asking the man’s legal team to explain themselves.
Over the course of several filings, it emerged that the research had not been carried out by Peter LoDuca, the plaintiff's lawyer, but by one of his colleagues at the same law firm. Steven A. Schwartz, who has practised law for more than 30 years, used ChatGPT to search for similar previous cases.
In his written statement, Mr. Schwartz clarified that Mr. LoDuca was not involved in the research and had no knowledge of how it had been carried out.
Mr. Schwartz added that he “greatly regrets” relying on the chatbot, which he had never used before for legal research and “didn’t know its content could be wrong”.
He has vowed never to use AI to “augment” his legal research in the future “without an absolute verification of its authenticity.”
Screenshots attached to the filing appear to show a conversation between Mr. Schwartz and ChatGPT.
“Is Varghese a real case,” reads one message, referring to Varghese v. China Southern Airlines Co Ltd, one of the cases that no other attorney could find.
ChatGPT responds that yes, it is, prompting "S" to ask: "What is your source?"
After a “double check,” ChatGPT replies that the case is real and can be found in legal reference databases such as LexisNexis and Westlaw.
It also says that the other cases it provided to Mr. Schwartz are real.
Both attorneys, who work for Levidow, Levidow & Oberman, were asked to explain at a hearing on June 8 why they should not be subject to disciplinary action.
Millions of people have used ChatGPT since its launch in November 2022.
It can answer questions in natural, human-like language and can mimic other writing styles. Its knowledge is drawn from the internet as it was in 2021.
There have been concerns about the potential risks of artificial intelligence (AI), including the possible spread of misinformation and bias.