How lawyers can use AI in litigation and to help their clients

AI For Business


This article is part of “Build IT,” a series that highlights the digital technology trends disrupting industries.

Michael Cohen, the lawyer who gained notoriety working for Donald Trump, asked a federal judge in December to overlook his latest misstep: submitting a legal filing that cited cases fabricated by generative AI. Cohen had used Google Bard, the precursor to Google Gemini, which produced citations to lawsuits that didn't exist. Cohen claimed ignorance, saying he mistook the chatbot for a “super-sophisticated search engine.”

Cohen is not alone in this error: A federal judge fined two lawyers $5,000 last year for citing nonexistent cases, and in February a court imposed a $10,000 fine over an appeal that cited nearly two dozen fake cases.

These failures may suggest that AI is useless in the practice of law, but some lawyers and legal experts told BI that's not necessarily the case. While generative AI's accuracy problems are a real hazard, many lawyers are turning to it as the legal industry grows more complex.

Daniel Beneke, head of machine learning at international law firm Baker McKenzie, said AI models are becoming better at “interpreting and generating complex legal language,” which is at the heart of legal work.


Headshot of Daniel Beneke in a red blazer

Daniel Beneke, head of machine learning at Baker McKenzie.

Courtesy of Daniel Beneke



Lawyer's Co-Pilot

Founded in 1949, Baker McKenzie has more than 6,500 lawyers in 70 offices around the world. Beneke said the firm's interest in AI predated generative AI, but the recent emergence of large language models (LLMs) has sparked a wave of innovation. The firm is building a generative AI tool to draft legal advice for high volumes of employment-law questions, work that recently won an award from Law.com.

Beneke said AI tools are especially useful for handling the legal fallout from common issues like cybersecurity incidents, where even minor events can bury companies in regulatory compliance requirements, tying up a small team of lawyers for days and racking up expensive bills.

The pinnacle of AI applications over the next five to 10 years is lawyer augmentation.
Cecilia Ziniti, CEO and co-founder of GC AI

Beneke said the firm's tools are designed to deliver precise advice that significantly reduces the time lawyers spend untangling their clients' regulatory requirements.

Beneke stressed that the firm's goal is quality, not just efficiency: the time saved sorting through regulatory requirements is better spent developing clients' incident-response strategies.

Cecilia Ziniti, CEO and co-founder of GC AI, predicted that this trend will dominate the discussion of AI in the legal profession. “The pinnacle of AI applications over the next five to 10 years is lawyer augmentation,” she said. “It’s the lawyer’s co-pilot.”

Popular media tends to focus on the most romanticized aspects of the law: prosecutors grilling defendants in court and hard-working lawyers devising novel legal strategies. But Ziniti says the profession carries a “very long tail” of tedious work, and the reality is often less glamorous.


Headshot of Cecilia Ziniti wearing a gold necklace and a black blazer

Cecilia Ziniti, CEO and co-founder of GC AI.

Courtesy of Cecilia Ziniti



Like Beneke, Ziniti pointed to regulatory requirements: In January, the Federal Trade Commission sent requests for information to five companies, including Microsoft and OpenAI, asking for emails and potentially hundreds of other documents as part of its investigation into competition in the AI industry.

Complying with these requests can require hundreds of hours of work as lawyers sift through documents looking for relevant information. This is important work – failure to comply can result in severe penalties and further investigation – but it is also repetitive, tedious and time-consuming work.

Ziniti said an AI “co-pilot” would enable lawyers to “practice to the fullest extent of their qualifications” and “do what they are most capable of – which is what they enjoy.”

GPT-4 appears in court

The appeal of a tool that can tirelessly comb through documents for lawyers is huge, but it is overshadowed by AI's greatest weakness: hallucinations.

IBM describes hallucinations as what happens when an AI tool’s LLM perceives patterns where none exist and produces “meaningless or totally inaccurate output.” As Cohen discovered, this can happen when a chatbot is asked a specific query that isn’t well represented in its training data.

It may be surprising that AI tools built for lawyers generally don't use models trained specifically on legal data. Most rely on the same general-purpose LLMs anyone can access, with OpenAI's GPT by far the most popular. “Right now, there are no models out there that are more powerful than GPT-4,” Ziniti said.

AI legal assistant product CoCounsel says it takes several steps to reduce hallucinations: It uses retrieval-augmented generation (RAG), a technique that grounds the AI's responses in the documents provided, combined with prompts that instruct the LLM to keep its answers focused on the documents' content.
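The core idea of retrieval-augmented generation can be sketched in a few lines. This is a minimal, illustrative sketch only, not CoCounsel's actual implementation: the word-overlap scoring and all document text are invented for the example, and real systems use embedding-based retrieval plus an actual model call in place of the assembled prompt shown here.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# Illustrative only: scoring is crude word overlap, and the documents
# are made up. A real system would pass the prompt to an LLM.

def relevance(query, doc):
    """Crude relevance score: number of words the query and doc share."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query, documents, k=2):
    """Return the k documents most relevant to the query."""
    return sorted(documents, key=lambda d: relevance(query, d), reverse=True)[:k]

def build_prompt(query, documents):
    """Ground the model's answer in the retrieved documents only."""
    context = "\n\n".join(retrieve(query, documents))
    return (
        "Answer using ONLY the documents below. If the answer is not in "
        "them, say so.\n\n"
        f"Documents:\n{context}\n\nQuestion: {query}"
    )

docs = [
    "Breach notification must be filed within 72 hours of discovery.",
    "Quarterly tax filings are due on the 15th.",
    "The incident response plan assigns a breach coordinator.",
]
prompt = build_prompt("What is the breach notification deadline?", docs)
```

The instruction to answer only from the supplied documents is the prompting half of the approach; the retrieval step supplies the grounding material, so the model has far less room to invent citations.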

OpenAI operates a fleet of servers dedicated to CoCounsel, which gives CoCounsel's engineers more control over the model's output and helps with regulatory compliance, since information submitted to CoCounsel is not shared more broadly.

Jake Heller, head of CoCounsel product at Thomson Reuters, said the company has established a “trust team” of lawyers and AI engineers to ensure CoCounsel “is getting the answer right.” The AI assistant also provides citation links to ease concerns about accuracy.


Headshot of Jake Heller wearing a white button-up under a black blazer

Jake Heller, head of product for CoCounsel at Thomson Reuters.

Courtesy of Jake Heller



AI cannot replace lawyers

There's another fear driving lawyers to AI: other lawyers.

Heller said law firms and lawyers all operate within a “competitive dynamic”: firms compete for a limited pool of clients, and plaintiffs and defendants compete to win cases. Ziniti described the legal profession as an “adversarial system” that incentivizes each lawyer to make the best possible case for their client.

So AI is unlikely to eliminate lawyers' jobs. Instead, it may be seen as an extension of trends that took root at the dawn of the computer age.

“We physically reviewed every document in every case,” Heller said, “and physically printed out every email that might be relevant and stored them in bank boxes in the basement.”

In some ways, the problem and the solution are happening at the same time.
Daniel Beneke, head of machine learning at Baker McKenzie

Times have changed. Where practical, manual review has been replaced by electronic review, and the legal industry now has an entire subfield, e-discovery, dedicated to searching and classifying electronic documents.
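At its simplest, the search half of e-discovery is keyword filtering over a document collection. The toy example below illustrates only that core idea; the message IDs, email text, and search terms are all invented, and real e-discovery platforms layer on indexing, deduplication, and relevance ranking.

```python
# Toy e-discovery filter: flag documents that mention any search term.
# All document IDs and contents are made up for illustration.

def find_responsive(documents, terms):
    """Return IDs of documents containing at least one search term."""
    terms = [t.lower() for t in terms]
    return [
        doc_id
        for doc_id, text in documents.items()
        if any(t in text.lower() for t in terms)
    ]

emails = {
    "msg-001": "Re: pricing strategy for the Q3 launch",
    "msg-002": "Lunch on Friday?",
    "msg-003": "Forwarding the competitor pricing analysis",
}
hits = find_responsive(emails, ["pricing", "competitor"])
```

Even this crude filter shows why the field moved away from printing emails into bank boxes: a lawyer reviews only the flagged hits instead of every document.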

Lawyers may also turn to AI to help them cope with one of the forces seeking to rein AI in: regulation. The complexity of government regulation is “growing exponentially,” Beneke said, adding that “in some ways, the problem and the solution are happening at the same time.” That is especially true for an international firm like Baker McKenzie, which advises clients in dozens of countries.

Ultimately, the adoption of AI in the legal industry comes down to the fact that there are only so many hours in a day, and while manually reviewing every document that could be relevant to a case may sound diligent, it is often not the best use of a lawyer's time.

“In three to five years, not using AI for legal work will be like not using online search for legal work today,” Ziniti said.

She added that lawyers have a professional responsibility not to inflate billable hours, a responsibility that has been codified by many legal organizations, including the American Bar Association.


