From ASCO Post Staff
Posted: 5/3/2023 10:25:00 AM
Incorrect advice provided by artificial intelligence (AI)-based decision support systems can compromise the performance of radiologists of all levels of expertise when reading mammograms, according to a new study published in Radiology.
Background
AI-based decision support systems for mammography, often touted as a “second set of eyes” for radiologists, could be one of the most promising applications of AI in radiology. However, as use of the technology expands, there are concerns that radiologists may become susceptible to automation bias (an individual’s propensity to favor suggestions from automated decision-making systems). Several studies have already shown that the introduction of computer-assisted detection into mammography workflows can degrade radiologist performance. However, no studies had examined the impact of AI-based decision support systems on radiologists’ ability to read mammograms accurately.
The Breast Imaging Reporting and Data System (BI-RADS) classification is a standard method used by radiologists to describe and classify breast imaging findings. The BI-RADS category is not diagnostic, but helps physicians determine next steps of care.
Research methods and results
In the new prospective study, researchers had 27 radiologists read 50 mammograms and asked them to assign a BI-RADS category to each while assisted by a purported AI-based decision support system. The goal was to determine how automation bias affects radiologists of varying levels of experience.
The researchers presented the mammograms in two randomized sets: a training set of 10 mammograms in which the AI suggested the correct BI-RADS category, and a set of 40 mammograms in which the purported AI suggested an incorrect BI-RADS category for 12 of the images.
After conducting their analysis, the researchers found that radiologists were significantly worse at assigning the correct BI-RADS category when the AI-based decision support system suggested an incorrect one. For example, inexperienced radiologists assigned the correct BI-RADS category in almost 80% of cases in which the AI system suggested the correct category; when the AI system suggested the wrong category, their accuracy fell below 20%. Among experienced radiologists, with an average of more than 15 years of experience, accuracy dropped from 82% to 45.5% when the AI system suggested the incorrect category.
Conclusion
“We expected that inaccurate AI predictions would influence the decisions made by radiologists in our study, especially less experienced radiologists,” explained lead study author Thomas Dratsch, MD, PhD, Professor of Medicine at the Institute of Diagnostic and Interventional Radiology, University Hospital Cologne. “Nevertheless, it was surprising to find that even experienced radiologists were adversely affected by the AI system’s judgments, albeit to a lesser extent than inexperienced radiologists,” he added.
The researchers hope their new findings underscore the need to carefully consider the effects of human-machine interaction when utilizing AI-based decision support systems, in order to ensure safe deployment and accurate diagnostic performance.
“Given the repetitive and highly standardized nature of mammography screening, automation bias may be a concern when AI systems are integrated into the workflow. Our findings highlight the need to implement appropriate safeguards when incorporating AI into the radiologic process to mitigate this bias,” emphasized Dr. Dratsch.
The researchers proposed potential safeguards to reduce automation bias. These include presenting radiologists with the confidence level of the AI-based decision support system by displaying the probability of each output, teaching radiologists about the system’s reasoning process, and ensuring that radiologists take responsibility for their own decisions.
Researchers plan to use tools such as eye-tracking technology to better understand the decision-making process of radiologists using AI systems.
“[W]e want to explore the most effective way to present the output of AI to radiologists in a way that promotes critical engagement while avoiding the pitfalls of automation bias,” concluded Dr. Dratsch.
Disclosure: For full study author disclosures, visit pubs.rsna.org.
The content of this post has not been reviewed by the American Society of Clinical Oncology (ASCO®) and does not necessarily reflect the views or opinions of ASCO®.