With well-written prompts, AI tools can quickly produce documents that appear to contain reliable legal arguments.
The brief looks sophisticated, but are its claims legitimate and supported by law?
A new course, “AI and the Law,” taught by Associate Professor Terry O’Reilly at Willamette Law School, aimed to shed light on AI legal tools and their potential applications as students weigh whether and how to use them.
When the Dean’s Office asked faculty for course proposals on AI, O’Reilly saw a good opportunity: preparing a course outline would push him to learn more about a subject that had drawn intense interest since the release of ChatGPT 3.5 in late November 2022.
O’Reilly examined news articles, law and accounting firm reports, entertainment industry bargaining agreements containing AI clauses, technical reports, and legal filings from AI-related cases. He also experimented with AI tools, created quick handouts on topics he understood well, and evaluated the tools’ current strengths and weaknesses.
Given the novelty of the subject, he was able to work with the Dean’s Office to organize enough material for the course in about a week. He then made significant adjustments before and during the course to keep pace with the rapid development of AI and of AI tools for lawyers.
During the course, O’Reilly provided an overview of the technology behind large language models and the legal issues surrounding the development and application of AI.
To assess the quality of AI’s legal work products, students first reviewed the format and style of typical law firm work products: internal memos, client memos, legal opinions, and legal briefs. They then explored AI tools that are designed for, or can be adapted to, legal use and experimented with ways to assign tasks, structure queries, and review and adjust AI output for appropriate formatting.
Their assignments simulated a law firm using AI to draft a firm policy document, a memo to a client identifying potential environmental law issues in the acquisition of industrial property, and a document addressing legal issues arising from the use of AI.
“The focus of this course was to use existing artificial intelligence tools to obtain sound legal analysis, which is difficult for students, and indeed for anyone, to achieve,” O’Reilly says. “The ability of a particular AI tool to produce reliable legal documents may vary from session to session within the same week. Additionally, new versions of a tool may respond differently to instructions than previous versions did.”
Although there were occasional problems with some of the tools, students evaluated their progress and exchanged ideas and discoveries about AI.
The course will be offered again next year, but O’Reilly said he would be surprised if it were still offered unchanged five years from now. If AI development stalls, the course may become obsolete; if progress continues, there may be too much material for a single class.
In any case, he recognizes that AI demands attention, and he says it is wise to understand both its capabilities and its limitations.
“Using sophisticated techniques to craft AI instructions does not guarantee a satisfactory output,” O’Reilly says. “In any case, each claim and cited authority should continue to be checked and confirmed. This is not new and may never change. However, with care and experience using certain AI platforms, useful legal analyses and drafts can often be obtained, sometimes quickly and with impressive quality. So, for better or worse, be careful.”
