A group of medical experts said in a recent editorial that artificial intelligence could undermine medical students' “critical thinking skills” if they are not taught how to use it properly.
“The goal of using AI to enhance education, rather than allowing it to erode independent reasoning, is a worthy pursuit,” the authors, professors at the University of Missouri School of Medicine, said in a Dec. 1 editorial in the medical journal BMJ Evidence-Based Medicine. “AI is disrupting traditional learning and assessment methods, requiring adjustments in medical school and training curricula.”
The editorial says medical schools have “largely inadequate institutional policies and guidance” regarding the use of AI in student homework and training. It added that if the use of AI is unchecked, medical professionals could become over-reliant on the technology and lose important skills.
“What happens if a server or AI service goes down?” the editorial asks. “This effect is especially ominous for learners who are working on developing skills in the first place, because they are not given that opportunity along the way.”
Students need to learn how to use AI tools effectively and how to verify their output, the editorial says.
“Medical training should include practice rejecting inappropriate AI advice and explaining why it is dangerous to follow it,” the authors wrote.
AI in healthcare
Artificial intelligence has quickly become mainstream in many clinics and hospitals. According to the American Medical Association, two-thirds of doctors were using AI in their practice in 2024, up from 38% the year before.
At the same time, the World Economic Forum said in a report earlier this year that AI adoption in healthcare is below average compared to other industries. One reason for the slow rate of adoption is “increasing mistrust” in AI's capabilities and effectiveness, the report said.
The authors of the BMJ editorial said the distrust was understandable, given that AI sometimes invents sources that don't exist.
“Confident falsehoods and hallucinated sources remain frequent failure modes for AI models,” they said.
Large language models are “very likely” to produce incorrect and potentially dangerous information when used in clinical practice, according to a study published earlier this year in the journal Communications Medicine.
Such cases were brought into the spotlight earlier this year when a report released by US Health Secretary Robert F. Kennedy Jr. cited research that doesn't exist.
Planning for AI in healthcare
Medical students should be evaluated on how they use AI in clinical practice, not just their final results, the editorial authors said.
“This can be accomplished by asking students to ‘show us their work’, provide a paper trail, and even submit the LLM prompts they used along with a written rationale for accepting or rejecting the AI output,” they said.
The editorial added that students should also be assessed in AI-free environments to ensure fundamental skills are honed.
“This may be feasible and is particularly important for bedside communication, physical examination, teamwork, and professional judgment,” the report said.
AI literacy should also be taught in the classroom, according to the editorial.
“While medical trainees may not need to fully understand the technical data engineering details or AI model training pipelines, they do need to understand the process in principle and grasp the concepts that underpin its strengths and weaknesses,” the report said.
This article AI could impair medical students' critical thinking skills, experts warn. “What happens if my server or AI service goes down?” originally appeared on Benzinga.com
© 2025 Benzinga.com. Benzinga does not provide investment advice. Unauthorized reproduction is prohibited.
