In a groundbreaking study published in Gut, Mukherjee et al. developed and validated the radiomics-based early detection model (REDMOD), an automated artificial intelligence (AI) framework that identifies subtle preclinical imaging signs of pancreatic ductal adenocarcinoma on routine computed tomography (CT) scans. The model targets “visually occult” disease (changes undetectable to radiologists) and addresses a major barrier to early detection of this cancer, more than 85% of cases of which are diagnosed at an advanced stage.
Research details
The REDMOD framework was trained and validated on a large multi-institutional dataset that reflects the real-world clinical situation and low prevalence of early-stage pancreatic ductal adenocarcinoma. The study included 1,462 CT scans, including 219 prediagnostic scans from patients later diagnosed with pancreatic ductal adenocarcinoma and 1,243 control scans from cancer-free individuals, all with at least 3 years of follow-up. These were divided into a training cohort (n = 969) and an independent testing cohort (n = 493).
The fully automated pipeline integrates deep learning-based pancreas segmentation and radiological feature extraction, initially producing 968 quantitative image features per scan. These were reduced to 40 key features using minimum redundancy, maximum relevance selection and incorporated into a heterogeneous ensemble model combining logistic regression and gradient boosting algorithms.
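The shape of this pipeline can be sketched with off-the-shelf components. The following is an illustrative reconstruction, not the authors' code: the dataset is synthetic, and a simple mutual-information ranking stands in for the paper's minimum-redundancy maximum-relevance (mRMR) selection, capturing only the relevance half of that criterion.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for 968 radiomic features per scan (not the study data).
X, y = make_classification(n_samples=969, n_features=968, n_informative=40,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pipeline = make_pipeline(
    StandardScaler(),
    # Reduce 968 features to 40 (the paper uses mRMR; univariate mutual
    # information is a simpler stand-in for illustration).
    SelectKBest(mutual_info_classif, k=40),
    # Heterogeneous ensemble: logistic regression + gradient boosting,
    # combined by averaging their predicted probabilities.
    VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("gb", GradientBoostingClassifier(random_state=0))],
        voting="soft"),
)
pipeline.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, pipeline.predict_proba(X_te)[:, 1])
```

Soft voting is one common way to realize a heterogeneous ensemble; the published model may weight or stack its components differently.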
Main results
Radiomics analysis revealed that 90% of the selected features were derived from multiscale wavelet-filtered images, which yielded significantly better discrimination than unfiltered images (area under the curve [AUC] = 0.82 vs 0.74; P = .007). The model was also evaluated for robustness across institutions and imaging platforms, as well as for longitudinal stability on repeat imaging.
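The dominance of wavelet-derived features reflects how such radiomics pipelines work: the image is decomposed into low- and high-frequency bands at several scales, and texture statistics are then computed on each band rather than on the raw image. A minimal one-level Haar transform on a 1-D signal (a sketch of the general idea, not the study's actual filter bank) illustrates the decomposition:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages give the
    approximation (low-frequency) band, pairwise differences the detail
    (high-frequency) band."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0])
a, d = haar_step(signal)
# Wavelet radiomic features are statistics (mean, entropy, texture matrices)
# computed on bands like `a` and `d`; 2-D/3-D images apply the same filters
# along each axis, producing multiple sub-bands per scale.
```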
In the independent test cohort, REDMOD achieved an AUC of 0.82 (95% confidence interval [CI] = 0.81 to 0.83), a sensitivity of 73.0% (95% CI = 60.0% to 78.7%), and a specificity of 81.1% (95% CI = 75.2% to 93.1%). This performance significantly exceeded that of radiologists, whose pooled sensitivity was 38.9% (P < .001); the AI model's detection rate was nearly twice as high. The advantage widened with longer lead times: more than 24 months before diagnosis, sensitivity was 68.0% for REDMOD versus 23.0% for radiologists.
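Sensitivity and specificity like these are computed from a confusion matrix, with binomial confidence intervals around each proportion. The sketch below uses hypothetical counts chosen to roughly reproduce the headline percentages (the study's actual counts and CI method are not given here), with a Wilson score interval as one standard choice:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical counts for illustration only, not the study's data.
tp, fn = 46, 17            # prediagnostic scans flagged / missed
tn, fp = 352, 82           # control scans cleared / falsely flagged

sensitivity = tp / (tp + fn)   # ≈ 0.730
specificity = tn / (tn + fp)   # ≈ 0.811
sens_lo, sens_hi = wilson_ci(tp, tp + fn)
```

Note the trade-off these two numbers encode: the operating threshold on the model's probability output moves sensitivity and specificity in opposite directions, which is why the AUC (threshold-free) is reported alongside them.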
Sensitivity remained at 75.0% across the prediagnostic windows of both 3–12 months and 12–24 months before diagnosis. The model detected cancer with a median lead time of 475 days and performed consistently in the internal and external validation cohorts, with specificities of 81.3% and 87.5%, respectively. REDMOD also demonstrated strong longitudinal stability, showing 90% to 92% concordance on repeat imaging.
The authors concluded that “REDMOD is an automated, mechanistically grounded, longitudinally stable, externally validated AI framework that outperforms radiologists in detecting pancreatic ductal adenocarcinoma at a visually occult prediagnostic stage. These properties make REDMOD suitable for prospective validation in high-risk cohorts, a necessary step toward shifting the paradigm from late-stage symptomatic diagnosis to proactive preclinical interception.”
Dr. Sovanlal Mukherjee, of the Mayo Clinic Department of Radiology in Rochester, Minnesota, is the corresponding author of the Gut article.
Disclosure: This research was funded by the National Institutes of Health, the Mayo Clinic Comprehensive Cancer Center, and others. For full author disclosures, please visit bmj.com.
