Does AI belong in the courtroom? A Texas judge doesn’t think so.



A federal judge in Texas has ordered that AI-drafted legal briefs not be filed in his court without human verification. But one legal tech executive says AI has been lurking in lawsuits for years, and curbing it could be a big challenge.

“We’re trying to educate judges about things like this,” said Andy Wilson, CEO of legal technology firm Logikcull. “I think it’s a futile effort, just so you know.”

The order, issued by U.S. District Judge Brantley Starr of the Northern District of Texas, is believed to be the first of its kind. It requires attorneys filing documents in his cases to certify either that no portion of a filing was drafted by a generative AI large language model (LLM) tool, such as OpenAI’s ChatGPT, Harvey.AI, or Google Bard, or that any AI-generated content was checked by a human for accuracy.

“My order is an attempt to maintain the strengths of generative AI while managing the weaknesses,” Judge Starr told Yahoo Finance. “But judges are reactive; we rule on what is put in front of us, so the innovations we ultimately face are never quite cutting-edge.”

Starr said one of the many strengths of legal AI is its ability to search through mountains of data. The main drawback he sees is that these systems tend to “hallucinate,” fabricating case citations and the quotations offered to support them. Hallucinations are instances in which AI-generated text looks plausible but is factually, semantically, or syntactically wrong.

In a posting on the court’s website, Starr explained that there is no way to hold machines to the ethical requirements of practicing law, and no way around the fact that the technology’s creators programmed their own prejudices and beliefs into the systems.

“Thus, these systems have no allegiance to any client, to the rule of law, to the laws and constitutions of the United States, or to truth,” the judge wrote.

OpenAI CEO Sam Altman testifies before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law at a hearing titled “Oversight of AI: Rules for Artificial Intelligence,” May 16, 2023, at the U.S. Capitol in Washington. (Reuters/Elizabeth Frantz)

Rebecca Johnson, public relations director for the Texas Bar Association, said the group has not taken a position on the use of AI in the legal profession. Other bar associations, including the American Bar Association and the New York State Bar Association, also said they have not taken official positions on the use of AI.

But in February, the ABA passed a resolution calling on AI developers and users to maintain human oversight and control and to take responsibility for the harm their AI tools cause.

The resolution also asked Congress and government officials to consider these standards when adopting AI laws and regulations.

The New York State Bar Association is studying AI’s impact beyond LLMs. Susan DeSantis, the association’s communications director, said the group is also examining how AI used in facial recognition and in digital finance and currency will affect legal professionals.

Standing order issued by Judge Brantley Starr of the U.S. District Court for the Northern District of Texas, Dallas Division.

LLMs such as ChatGPT have enabled systems like Logikcull to discern nuances in communication that once required human analysis, Wilson said. Logikcull can comb through terabytes of electronically stored documents, databases, videos, emails, and Slack messages to flag data relevant to legal inquiries, he said.

Still, no one yet has an answer to Judge Starr’s concern that machines cannot be held to the ethical requirements of legal practice, he added.

“Not enough has been said about the ethical use of AI,” Wilson said, warning that sensitive personal data such as a person’s likeness or voice is no longer safe from forgery.

Starr’s order comes days after reports surfaced that a New York attorney cited fake, AI-generated case citations in court filings while representing a client.

For now, Judge Starr says he hopes his order will prompt attorneys to recognize that generative AI can make false statements, and that checking the AI’s work will spare them sanctions for misstatements.

“These platforms are incredibly powerful and have many uses in the law…” Judge Starr wrote. “But legal briefing is not one of them.”

Alexis Keenan is a legal reporter at Yahoo Finance. Follow Alexis on Twitter @alexiskweed.
