Researchers affirm the value of human creativity in AI

James C. Kaufman examines questions about the impact of AI on education, equity, and creative work in two new books.

As generative artificial intelligence tools rapidly permeate classrooms, workplaces, and creative industries, the question of what these systems mean for human creativity becomes increasingly urgent. Can AI really be creative? Can expanding access to ideas and inspiration level the playing field, or do we risk weakening the very skills that education is meant to develop?

For James C. Kaufman, a professor of educational psychology at the University of Connecticut's Neag School of Education, the answer is complex and requires caution.

Recent research co-authored by Kaufman, along with a new academic book he co-edited on generative AI and creativity, suggests that while artificial intelligence can support some aspects of creative work, it cannot replace human creativity. Rather, it has the potential to amplify existing differences in skills, judgment, and expertise, raising important questions for education, equity, and the future of creative work.

Unlike many tech enthusiasts, Kaufman approaches AI with research-based skepticism rather than fear of the technology itself.

“Most creativity researchers tend to fall into two camps: those who are very excited about AI and those who are deeply concerned about it,” Kaufman says. “I'm in the second camp.”

A key source of his concern is how quickly generative AI systems have been released and adopted, often without the safeguards, testing, and regulatory frameworks that typically accompany innovative technologies.

“AI was being actively used by people before we had the time to study it carefully,” Kaufman says. “This is especially problematic when we're talking about learning, creativity, and long-term skill development.”


In a recent two-part study conducted with collaborators at other institutions, Kaufman and his colleagues investigated how people engage in creative tasks both alone and with the assistance of large language models (LLMs). Participants completed storytelling tasks on their own or with AI support. The researchers then assessed creativity, intelligence, and overall performance in both conditions. The study has not yet been peer-reviewed or accepted for publication.

“What we found is that creativity and intelligence still matter,” Kaufman says. “Participants who were more creative without AI also tended to perform better when collaborating with AI.”

Rather than leveling the creative gap, AI acted as an amplifier, benefiting those who already had stronger creative and cognitive skills.

“If you already have strength in an area, you should be able to use AI more effectively,” Kaufman says. “AI doesn’t suddenly make everyone equally creative.”

The reason, he explains, lies in how creativity actually works. Generating ideas is just part of the process. Creativity also requires evaluating ideas, refining them, and deciding which ones are worth pursuing.

“AI is much better at generating ideas than it is at evaluating them,” Kaufman said. “Human judgment is still required to decide what is meaningful, what is original, and what is worth pursuing.”

The evaluation stage relies heavily on experience, intelligence, and metacognition: awareness of one's own strengths, limitations, and goals.

“Knowing what kind of help you actually need from AI is a skill in itself,” Kaufman said, offering a simple example to illustrate the point. “If you think of AI as something that produces work at about a B or B-plus level, someone who is already working at an A level could use it selectively and still produce A-level work. But for someone operating below that level, the AI's output could simply become their ceiling.”

Implications for learning and equity

Nowhere are the implications of these findings more concerning than in education.

“The goal of an assignment is not the final product,” Kaufman said. “The goal is to learn how to do the work.”

When students rely heavily on AI to create essays, stories, or problem solutions, they may achieve acceptable results, but they risk bypassing the cognitive effort necessary for meaningful learning. Several recent studies suggest that creativity and learning gains are often lost when AI assistance is removed.

“This suggests that students are not necessarily acquiring lasting skills,” Kaufman said. “They outsource their work.”


Kaufman also points to evidence from other studies showing that students often overestimate how thoughtfully they collaborate with AI, reporting deep engagement even when usage data shows extensive copy-pasting.

One of the study's central findings challenges the common notion that AI will “democratize” creativity.

“Creativity is already one of the most democratic human traits we have,” he says. “There are generally no significant differences in creative potential across gender, culture, or socio-economic status.”

Instead, he warns, AI could lead to new inequalities.

“As the quality of the paid version increases and the quality of the free version decreases, access becomes increasingly important,” Kaufman says. “The most powerful tools will be available to those who can afford them.”

A changing creative landscape

AI's impact extends beyond the classroom to the creative industries themselves. Many entry-level creative jobs, such as caption writing, concept art, and freelance digital illustration, are already being replaced by AI systems, Kaufman said.

“These are the kinds of jobs that people looking to establish themselves in creative fields rely on,” Kaufman said. “When those are gone, the entire pipeline of talent is cut off.”

He worries that this will create a polarized creative landscape, with hobbyist creativity on one side and well-funded elites doing creative production on the other. These concerns are explored in more detail in his new edited book, Generative Artificial Intelligence and Creativity: Precautions, Prospects, and Possibilities, co-edited with Matthew J. Worwood, an adjunct assistant professor in UConn's School of Digital Media and Design. The book brings together scholars from psychology, education, computer science, philosophy, and related fields to explore the impact of AI on creative thinking, education, assessment, and ethics.

Matthew J. Worwood, adjunct assistant professor in UConn's School of Digital Media and Design, co-edited the new book Generative Artificial Intelligence and Creativity: Precautions, Prospects, and Possibilities. (Photo provided by the Department of Digital Media and Design)

Worwood said one of the most surprising aspects of working across disciplines on the project was the range of perspectives contributors brought to the conversation about generative AI and creativity.

“I was surprised by the diversity of thinking about generative AI,” says Worwood. “We rarely get to share perspectives within a single context, which made the conversations fun and insightful, but also challenging.”

Worwood said that diversity strengthens the book's central argument that AI should be treated as a tool rather than a replacement for human creativity, especially in education.

“Responsible and intentional use starts with teachers and learning professionals,” says Worwood. “Start with your learning goals, and take the time to consider how your choices in using AI will support or hinder your students in achieving them.”

He added that transparency is important and cautioned against allowing decisions about the use of AI in education to be determined primarily by technologists.


“At the lower grade levels, those decisions should be made by teachers, guided by administrative teams that consult subject matter experts and learning science scholars,” he says. “At the moment, I worry that too often we defer to advice from engineers who may not fully understand how learning works.”

At higher levels of education, Worwood envisions a gradual transition to student autonomy.

“We want to get to the point where students can decide when and how AI is used to support their learning,” he says. “But thoughtful guidance will be important during that transition period.”

Ultimately, Kaufman frames AI as a powerful tool that is neither inherently good nor inherently bad.

“Creativity itself is neutral,” Kaufman says. “Even if everyone becomes more creative, the world doesn't automatically become a better place. The same is true for intelligence and AI.”

What matters, he added, is who controls these tools, how they are used, and whether institutions invest in thoughtful oversight. The challenge for educators, policymakers, and creative professionals is not whether to use AI, but how to use it without sacrificing learning, equity, or human judgment.

“We live in interesting times,” Kaufman says. “And we're still deciding what kind of future we're going to create with these tools.”


