AI helps cancer patients better understand CT reports



Medical reports written in technical jargon can be difficult for patients to follow. A team from the Technical University of Munich (TUM) investigated how artificial intelligence can make CT findings easier to understand. In the study, reading time fell sharply, and patients rated the automatically simplified text as easier to understand and more informative.

To simplify the original reports, the researchers used an open-source large language model run on TUM University Hospital's own computers, in compliance with data protection regulations. For example, the passage "The cardiomediastinal silhouette is midline. The cardiac chambers are usually opaque. […] A small amount of pericardial effusion is observed" was simplified by the AI to: "Heart: The report shows a small amount of fluid around the heart. This is a common finding and your doctor will decide if it requires attention."
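The setup described above, a locally hosted open-source model given a fixed simplification instruction plus one report, can be sketched roughly as follows. The prompt wording, function names, and example are illustrative assumptions for this article; the team's actual prompt and model configuration are not published here.

```python
# Hypothetical sketch of a report-simplification prompt for a locally
# hosted open-source LLM. Instruction text and names are illustrative,
# not the study's actual implementation.

SYSTEM_INSTRUCTION = (
    "You are a medical communication assistant. Rewrite the CT report "
    "below in plain language for a patient: avoid jargon, keep every "
    "finding, and do not add, remove, or reinterpret information."
)

def build_prompt(report_text: str) -> str:
    """Combine the fixed instruction with a single radiology report."""
    return (
        f"{SYSTEM_INSTRUCTION}\n\n"
        f"CT report:\n{report_text}\n\n"
        f"Plain-language version:"
    )

# Example fragment from the article's sample report:
example_report = "A small amount of pericardial effusion is observed."
prompt = build_prompt(example_report)
# In the study's workflow, a prompt like this would be sent to a model
# running on hospital servers, and the output would be checked by
# medical staff before being given to the patient.
```

Running the model in-house, rather than via an external chatbot, is what allows the hospital to meet the data protection requirements the article mentions.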

Easy-to-understand language is essential in medicine

From the researchers' perspective, making medical terminology accessible is more than a convenience. "Ensuring patients understand their reports, tests, and treatments is a central pillar of modern medicine. This is the only way to ensure informed consent and strengthen health literacy," says Felix Busch, assistant physician at the Institute of Diagnostic and Interventional Radiology and last author of the study published in the journal Radiology.

Previous research has shown that AI models can make professional medical texts easier to understand, but little was known about their effect on real patients. The team therefore enrolled 200 patients who underwent CT imaging at TUM University Hospital because of a cancer diagnosis. Half received the original report; the other half received an automatically simplified version.

Shorter reading time, higher satisfaction

The results were clear: average reading time dropped from 7 minutes for the original report to 2 minutes. Patients who received the simplified findings were far more likely to rate them as easy to read (81% vs. 17%) and easy to understand (80% vs. 9%). They were also much more likely to rate them as helpful (82% vs. 29%) and useful (82% vs. 27%). "A variety of objective measurements confirmed that the simplified reports are more readable," says Felix Busch.

Future research is needed to determine whether these benefits translate into measurable improvements in patient health outcomes. From the researchers' perspective, however, the study clearly shows that patients understand more and benefit from AI-powered simplification of medical reports. "Alongside the expert report, we could offer automatically simplified reports as an additional service, provided an optimized and safe AI solution is available in the clinic," says Felix Busch.

Review by medical experts still required

The team advises patients against using chatbots such as ChatGPT on their own to simplify their reports. "Aside from data protection concerns, language models always carry the risk of factual errors," says the study's lead author, Dr. Philip Plucker. In the study, 6% of the AI-generated reports contained factual inaccuracies, 7% omitted information, and 3% added information. Before a report was given to a patient, however, it was checked for errors and corrected where necessary. "Language models are useful tools, but they cannot replace medical staff. In the worst case, patients could receive incorrect information about their disease if the findings are not verified by trained experts," Plucker concludes.

/This material from the originating organization/author may be of a point-in-time nature and has been edited for clarity, style, and length. Mirage.News does not take institutional positions; all views, positions, and conclusions expressed herein are those of the authors alone.
