RSNA 2022 Day 4: Back to the Basics of AI, ABUS Training, and ML at RO

AI Basics


November 30, 2022 — On Day 4 of RSNA 2022, an estimated 260 individual events took place over 12 hours within Chicago’s McCormick Place, with approximately 650 vendor exhibits highlighting the latest products and technologies. The Imaging Technology News (ITN) editorial team has once again provided summaries of key sessions that serve as a representative sample of the quality programming at the core of the 108th Scientific Assembly and Annual Meeting of the Radiological Society of North America.

Attending three in-depth sessions, participants heard from leading authorities on the fundamentals of artificial intelligence (AI), received tutorials aimed at understanding how automated breast ultrasound systems work, and gained broad insight into machine learning development, evaluation, and practice in radiation oncology. Here are some key highlights, with more detail to come in upcoming issues in the weeks ahead.

Back to Basics: Radiation AI in 2022

This in-depth session will be covered more extensively by ITN in a separate article, but the presence of this session (and a panel of respected AI experts) on the RSNA 2022 agenda is newsworthy in and of itself. Its place on the schedule is a logical follow-up to the popular AI Fireside Chat introduced at RSNA 2021, reflecting the rapid growth of the technology seen on the show floor and in current practice.

The panel included four highly regarded AI specialists:

RSNA 2022 Gold Medal winner Katherine P. Andriole, PhD, is Associate Professor of Radiology at Harvard Medical School, Brigham and Women’s Hospital, and Director of Academic Research and Education at the Brigham Data Science Office in Massachusetts. A longtime member of the RSNA Radiology Informatics Committee, Dr. Andriole currently serves on the RSNA Machine Learning Steering and Data Standards subcommittees. She is a faculty member of the RSNA Imaging AI Certificate Program, a subject matter expert on the Grant Oversight Committee of the R&E Foundation, and co-director of the Imaging AI in Practice Demonstration.

Linda Moy, MD, is a Professor of Radiology at the NYU Grossman School of Medicine with appointments at the NYU Center for Advanced Imaging Innovation and Research and the NYU Vilcek Institute of Graduate Biomedical Sciences. She is the Director of Breast MRI (Clinical and Research) across the NYU Health Network and co-leads the international AI team across the five NYU Langone Health sites.

Dania Daye, MD, PhD, is an interventional radiologist at Massachusetts General Hospital and faculty at Harvard Medical School and the MGH/HST Martinos Center for Biomedical Imaging. Among other leadership roles, she has served as co-chair of MGH’s Women in Radiology Steering Committee, and she currently co-chairs the MGH Radiology Department’s Diversity, Equity and Inclusion Committee.

Walter F. Wiggins, MD, PhD, is a board-certified neuroradiologist and Assistant Professor of Radiology, Neuroradiology at Duke Health. A strategic advisor to Qure.ai, Wiggins focuses on the use of advanced image processing and image analysis techniques in diagnostic imaging of the brain, head, neck, and spine, with a particular emphasis on the clinical implementation of artificial intelligence (AI) techniques for medical imaging.

Daye also referenced the recently published report in the RSNA journal Radiology, “Implementation of Clinical Artificial Intelligence in Radiology: Who Decides and How?” As lead author, along with Wiggins and others, Daye identified the imperatives of clinical AI: data science, cross-modality, and delivery. She and the panel further addressed the need for AI governance and shared key takeaways for successful AI implementations.

ITN spoke with Daye before the program to ask about the goals of the session. She shared the following insights:

“I think there has been an explosion of AI applications that we have seen in the market in the last few years. There are still a lot of radiologists out there who are unfamiliar with these tools, and we’ve seen a lot of interest in getting more familiar and actually learning the basics of implementation. The goal is to teach people who are just starting to dive in the very basics: to learn a little more about data science, implementation, and governance, see what’s out there, and understand what infrastructure is needed to get started. This is certainly the beginning of a very long journey, but we want to help people get there by dipping their feet in the water, so to speak.”

Classroom-style teaching on cutting-edge AI

An interactive hands-on reading workshop was led by Georgia Giakoumis Spear, MD. With widely reported real-world results showing that adding an FDA-approved AI tool to automated breast ultrasound can boost reading speeds while increasing physician confidence, Spear addressed the advantages of ABUS in supporting radiologists’ workloads, with a laser focus on detection and patient care.

“The whole point of ABUS is to complement mammography and detect interval cancers that go undetected by mammography and have a high mortality rate,” Spear told training participants, adding, “We are saving lives.”

A follow-up to her session at RSNA 2021, featured on ITN, this GE Healthcare vendor workshop presented eight different case studies, with Spear as instructor guiding participants through the ABUS system, which incorporates QView CAD technology. QView Medical, a breast cancer detection pioneer and manufacturer of FDA-approved artificial intelligence (AI) software systems for breast cancer screening, and GE Healthcare announced in November 2021 the launch of QView’s QVCAD software on GE Healthcare’s Invenia ABUS 2.0 (Automated Breast Ultrasound), an open platform technology that allows the integration of third-party AI tools to optimize the reading workflow. Invenia ABUS 2.0 is the first FDA-approved ultrasound-assisted breast screening technology specifically designed to detect cancer in dense breast tissue. A powerful AI assistant leverages intelligent algorithms to help detect breast lesions, helping physicians increase reading speed with confidence.

A passionate advocate and widely recognized leader in advancing ABUS for breast cancer screening and diagnosis, Spear enthusiastically shared her insights, experiences, and procedural tips with her “students.” She stressed the importance of having the automated breast ultrasound done on the same day as the mammogram, which helps improve cancer detection while lowering recall rates.

Plenary on Machine Learning in RO Clinical Trials and Clinical Practice

A key question asked and answered by high-level panelists during a plenary session focused on machine learning in radiation oncology was, “How can AI be used in clinical trials?”

Session presenters, introduced by RSNA Past President Bruce Haffty, MD, discussed ongoing initiatives to advance trials in support of radiation oncology and the wide range of work done to date, and provided an update on the future outlook. The four panelists, led by Quynh-Thu Le, MD, Professor and Chair of the Department of Radiation Oncology at Stanford, pointed out the fundamental ways AI is applied to radiation oncology: imaging (simulation); treatment planning; plan approval and QA; delivery of radiation therapy; and follow-up care. She outlined the hierarchical relationships and definitions among deep learning, machine learning, and artificial intelligence.

Addressing the possibilities and pitfalls of ML

“There is data to suggest that AI can optimize eligibility criteria, match patients to trials, and address health equity in trial participation and enrollment,” said Le. She explored the role of machine learning in optimizing radiotherapy workflows, with gains in education, time savings, and consistency.

Presenters covered automated segmentation, AI in image processing for personalized care, Trial Pathfinder, the development of the minimal Common Oncology Data Elements (mCODE), the CodeX community that formed around it, and NRG/RTOG.

Other panelists included Felix Feng, MD, of the University of California, San Francisco, who focused on pathology and spoke about the value of clinical trials in generating evidence. Feng shared the ongoing mapping work within the EHR and the importance of the involvement and support of federal agencies, radiologists, and physicists. He addressed some of the reasons why AI tools are promising as biomarkers, focusing in particular on performance, turnaround time, digital format, and access, since digital tools are potentially easier to implement internationally.

Michael Gensheimer, MD, Clinical Associate Professor of Radiation Oncology-Radiation Therapy at Stanford, discussed how machine learning on digital data, including images, digital pathology, and electronic medical records, is being used for patient risk stratification, and contributed a detailed technical evaluation of its use in predicting treatment efficacy as well as in clinical trials.

Ruijiang Li, PhD, DABR, Associate Professor in the Department of Radiation Oncology (Medical Physics) at Stanford, described advances in machine learning and image analysis for personalized cancer treatment. He noted a phrase regularly employed at the Stanford Institute for Human-Centered Artificial Intelligence (HAI), one that underscores the collective sentiment seen and heard at RSNA 2022: “Human in Charge, Machine in the Loop.”

All were focused on the common goals of cancer detection and diagnosis, response and treatment prediction, and prognosis, and acknowledged the collective consensus that everyone involved in data collection needs to do a better job.

More information: https://www.rsna.org/annual-meeting

Learn more about RSNA22 here




