Recent empirical research has shown that in copyright infringement cases, people place greater responsibility on allegedly infringing works when they believe the creator is an AI.
“AI Artists on the Stand: Biases Against Artificially Generated Works in Copyright Law,” co-authored by Assistant Professor Patty of the University of Miami, Associate Professor Joseph Avery of the Alan Herbert School of Business, and Associate Professor Mike Schuster of the University of Georgia, is the first to examine how the presence of AI distorts legal outcomes in copyright infringement cases. The study, published in the UC Irvine Law Review, reveals a larger perceptual bias at work.
“Even if humans and AI do exactly the same thing with the same inputs and outputs, people will still react differently,” Avery says. “It's as if our vision has changed.”
Avery said the reaction points to a deeper puzzle: something intangible, the creative process, changes something tangible, the way we perceive the resulting work. This perceptual bias, which judges works created by AI more harshly than works created by humans, leads people to attribute more fault to AI-created works, producing what the researchers call the "AI litigation penalty."
“Humans are consistently tougher when AI is involved,” Avery said. “They see AI as more culpable and impose greater damages.”
The empirical study measured this litigation penalty directly. Participants were shown an original copyrighted work and asked to evaluate two identical, allegedly infringing works created under the same conditions: one attributed to a human creator, the other to an AI. Participants judged the AI-produced work to be more unethical, more unfair, and of lower quality. And when they served as mock jurors, they deemed the AI-generated work significantly more infringing, or more plagiaristic, than the identical human-produced work.
The AI litigation penalty is not limited to copyright. Avery's forthcoming research finds the same pattern in patent and trade secret disputes.
What causes the perceptual biases and resulting litigation penalties remains unclear, but is central to Avery's research.
“Perhaps we want to reward what feels human, but sometimes that instinct gets mixed up in legal decisions,” Avery said. “We think there may be a variety of reasons, and we're going to figure them out.”
The findings could serve as a legal warning for artists who use AI and the companies that employ them. More broadly, Avery suggests that the very reason copyright exists could also be threatened.
“The purpose of copyright is to encourage the production and dissemination of creative works,” Avery said. “If we start punishing works simply because AI was part of the process, we may risk chilling innovation and limiting what people can imagine.”
However, Avery notes that this perceptual bias could shift as quickly as AI itself evolves, a hypothesis he is currently testing.
“As we become more familiar with AI, our reactions may change.”
