Lawyers keep getting burned by artificial intelligence that fabricates cases and quotations. Now, some are naming the software they used.
Last month, a Louisiana personal injury lawyer apologized after filing a brief that cited a real court decision but quoted language that doesn't exist in it. The error appeared in two filings in the 19th Judicial District Court in Baton Rouge and was flagged by opposing attorneys.
“I am trying to understand how I made this mistake,” Ross LeBlanc, a partner at Dudley DeBosier, said in a letter to Judge William Jorden on March 27. He said he began using an artificial intelligence program called Eve to draft arguments earlier this year. At first, he frequently checked the citations. “They were always correct when I checked,” he wrote.
That consistency gave him confidence, he said, and he eventually stopped checking.
“I never thought something like this would happen to me,” LeBlanc wrote, adding that he didn't know whether the mistake stemmed from Eve's software or from something he copied and pasted in a hurry.
Eve CEO Jay Madheswaran told Business Insider on Thursday that after a thorough audit of the Dudley DeBosier matter, the company confirmed that Eve "did not hallucinate any case citations" in the case, including the fabricated quotations.
Courts have sanctioned lawyers for submitting briefs containing AI-generated errors, often called "hallucinations." Last week, Sullivan & Cromwell, one of the nation's oldest and most elite law firms, apologized to a federal judge for a similar blunder.
What's new here is the blame game. Naming the tools lawyers used could draw scrutiny to the companies behind the software and damage their reputations.
Legal software companies like Harvey, Legora, and Eve have raised billions of dollars on the promise that they can help lawyers work faster, with a level of reliability that general-purpose tools can't match. If their software starts humiliating customers in court, that trust erodes.
Damien Charlotin, a French researcher who tracks hallucinations in court filings, estimates that the software used can be identified in less than 10% of cases. He suspects many lawyers keep that detail private because they relied on free chatbots like ChatGPT and other off-the-shelf tools that may not be approved for client work.
Last year, a lawyer at Latham & Watkins defending Anthropic in a copyright lawsuit made headlines for citing a nonexistent article. The lawyer said the mistake arose from using Anthropic's own chatbot, Claude, which fabricated the article's title and author names.
Eve co-founders David Zeng, Jay Madheswaran, and Matt Noe. Eve
Eve uses large language models to build software for plaintiffs' attorneys, helping them draft documents, organize medical histories, and prepare and respond to discovery requests. The company was valued at $1 billion after raising a $103 million funding round about a year ago. Eve now generates more than 200,000 documents and other outputs a month, roughly 100 times more than a year ago, Madheswaran said.
LeBlanc told the judge he had been wary of the technology in general because of "horror stories" about hallucination cases. He said he was persuaded when Eve pitched the tool to his firm and assured lawyers that safeguards were in place to reduce mistakes. He believed that as long as he conducted his own legal research and directed the software to rely only on approved sources, the risks were limited.
Then opposing counsel in the personal injury case pointed out his mistake.
LeBlanc's apology follows a separate incident earlier this month involving a trip-and-fall at a Lowe's store. Opposing attorneys discovered hallucinations in briefs filed by Dudley DeBosier and included LeBlanc's letter in a request that the court expand its inquiry into possible sanctions.
Dudley DeBosier filed a motion to strike the opposing attorney's claims, arguing the cases are unrelated. The firm also said its lawyers used Claude to prepare the briefs in the Lowe's case.
It is a widely shared view among software companies and law firms that artificial intelligence can assist with research and drafting, but the responsibility for the final product rests with the humans who sign the filing.
Madheswaran said Eve makes that clear in its contracts and when onboarding new customers. The software also includes features designed to catch errors before they end up in court, but they don't always work. Some errors are harder to spot than others, he said: verifying that a case exists is easier than verifying that its quotations are accurate.
As the legal profession rushes to implement artificial intelligence, the likelihood of mistakes being discovered increases. Courts are getting smarter about technology, and opposing lawyers are adjusting their tactics. In addition to attacking legal arguments, lawyers are scrutinizing filings for errors that could undermine the credibility of the other side.
Chad Dudley, a founding partner at Dudley DeBosier, a firm with about 40 lawyers, said the firm trains its lawyers to carefully review AI-generated output and asks them to agree to use the technology responsibly.
LeBlanc said he hopes other lawyers learn from his mistake. He told Business Insider on Thursday that Eve helped him move faster under time pressure, but that he was "sick to his stomach" and couldn't sleep after the error surfaced.
“Whatever technology comes out, I have a responsibility to check everything,” he said.
He doesn't blame Eve for the blunder. Still, he's setting the tools aside for now.
“Given what happened, I think it makes sense to touch grass and have a cooling-off period,” he said.
Any tips? Contact this reporter via email at mrussell@businessinsider.com or on Signal at @MeliaRussell.01. Use a personal email address and a nonwork device. Here's a guide to sharing information securely.
