What was the last thing you asked an AI chatbot to do?
Perhaps you asked it to structure an essay answering a difficult question, to analyse a dense data set, or to check that your cover letter matched the job description.
Some experts worry that outsourcing these types of tasks can slow down the brain and negatively impact critical thinking and problem-solving skills.
Earlier this year, the Massachusetts Institute of Technology (MIT) published a study showing that people who used ChatGPT to write essays had decreased activity in brain networks associated with cognitive processing during the exercise.
They were also less able to quote from their own essays than study subjects who didn't use the AI chatbot.
The researchers said the study demonstrated “the pressing issue of exploring possible declines in learning ability.”
All 54 participants were from MIT and nearby universities. Their brain activity was recorded using electroencephalography (EEG), which involves placing electrodes on the scalp.
Some of the prompts participants used asked the AI to summarize the essay question, track sources, and refine grammar and style.
It was also used to generate and clarify ideas, although some users felt that AI was not very good at this.
“AI makes it too easy to find answers.”
Separately, researchers at Carnegie Mellon University and Microsoft, the maker of Copilot, found that people's ability to solve problems can decrease when they rely too heavily on AI.
They surveyed 319 white-collar workers who use AI tools at least once a week, asking how they apply critical thinking when working with them.
They looked at 900 examples of tasks given to AI, ranging from analyzing data for new insights to checking whether a piece of work meets certain rules.
The study found that greater confidence in a tool's ability to perform a task was associated with “less critical thinking effort.”
The researchers warned: “While GenAI can improve employee efficiency, it can hinder critical engagement in work, leading to long-term over-reliance on tools and diminished independent problem-solving skills.”
A study published in October by Oxford University Press (OUP) surveyed British schoolchildren.
Six out of 10 of those pupils felt that AI was having a negative impact on their academic skills.
So, with the massive explosion in AI use, are our cognitive skills at risk of decline?
Dr. Alexandra Tomescu, a generative AI expert at OUP who worked on the school survey, says this is not necessarily the case.
“Our research shows that 9 out of 10 students say AI has helped them develop at least one skill related to academics, such as problem-solving, creativity, and revision.
“But at the same time, around a quarter say the use of AI has made their work too easy… so [it's] quite a nuanced picture.”
She added that many students are seeking further guidance on how to use AI.
OpenAI, whose ChatGPT has more than 800 million weekly active users according to boss Sam Altman, has published a set of 100 prompts for students designed to help them get the most out of the technology.
But Wayne Holmes, professor of critical research on artificial intelligence and education at University College London (UCL), says this is not enough.
He would like to see more academic research done on the impact of AI tools on learning before encouraging students to use them.
He tells us: “Currently, there is no large-scale independent evidence for the effectiveness or safety of these tools in education, nor for the idea that they have a positive impact.”
Better results, but worse learning?
Professor Holmes points to research on cognitive atrophy, a decline in a person's abilities and skills after using AI.
This poses a problem for radiologists, who use AI tools to help interpret X-rays before diagnosing patients, he says.
A study published last year by Harvard Medical School found that while AI assistance did improve the performance of some clinicians, it harmed others for reasons researchers don't fully understand.
The authors called for more research into how humans interact with AI, so that we can find ways to use AI tools that “enhance human performance, rather than detract from it.”
Professor Holmes is concerned that students, whether in school or university, may become too reliant on AI to do the work for them and miss out on the fundamental skills that education provides.
With the help of AI, students' essays may score better, but the question is whether their understanding will suffer.
“They're doing better, but they're actually learning less,” says Professor Holmes.
Jaina Devani, head of international education at OpenAI, the company behind ChatGPT, who helped secure the deal with Oxford University, says the company is “very aware of this discussion right now.”

She told the BBC: “I definitely don't think students should use ChatGPT to outsource their work.”
In her view, the chatbot is best used as a tutor rather than simply a source of answers.
The example she gives is of a student doing an assignment late at night on a topic they don't quite understand.
Using ChatGPT's study mode, the student can go back and forth with the chatbot: type in a question that is difficult to answer, and it will break the question down into its components to help them understand it.
“[If] I'm going to give a presentation, but it's already midnight, I'm not going to email my [university] tutor and ask for help,” she says.
“I think ChatGPT really has the potential to accelerate learning when used in a targeted way.”
However, Professor Holmes insists that any student using AI tools needs to be aware of how the tools reach their outputs and how the companies providing them handle users' data. He stresses the need to always check the results.
“This is more than just the latest version of a calculator,” he says, explaining the far-reaching capabilities and impact of AI.
“I would never tell my students, 'You shouldn't use AI'… but what I'm saying is you need to understand all the different things about AI so you can make informed decisions.”

