Sullivan & Cromwell apologizes to judge for AI hallucinations in court filing


A partner at elite Wall Street law firm Sullivan & Cromwell has written a letter of apology to a federal bankruptcy judge for a court filing that included AI hallucinations.

Andrew Dietderich, the firm’s co-head of global finance and restructuring, said in an April 18 letter that a previous filing contained inaccurate quotes and other errors, including AI hallucinations.

“‘Hallucinations’ are instances where artificial intelligence tools fabricate case citations, misquote authorities, or generate legal sources that do not exist,” Dietderich’s letter said. “We deeply regret that this happened.”

According to a table attached to the letter, the motion included an incorrect case name and number, and the quotes attributed to the case were fabricated.

Dietderich said the errors by Sullivan & Cromwell, which represents the bankrupt Prince Global Holdings, were discovered by Boies Schiller Flexner LLP, the law firm representing the creditors. He thanked that firm and apologized.

He said Sullivan & Cromwell, a 140-year-old firm with more than 1,000 lawyers, has comprehensive policies on the use of AI and safeguards designed to prevent exactly this scenario. Those policies were not followed, he said, and the errors also slipped past the firm’s citation-review process.

Dietderich wrote to Manhattan-based Chief Judge Martin Glenn, saying he and the firm are “acutely aware of our responsibility to ensure the accuracy of all filings.”

“I take responsibility for failing to do so,” he said, adding that he would submit a revised motion.

Representatives for the firm and for Dietderich did not respond to Business Insider’s requests for comment. A representative for the judge also did not respond.

When it comes to AI hallucinations in legal work, Sullivan & Cromwell is hardly alone.

Fabricated legal citations have become increasingly common since 2023, according to legal researcher Damien Charlotin, who maintains a public database of AI hallucinations in court cases.