AI Training Enables Robot to Perform Complex Surgery Autonomously

Machine Learning


Summary: A surgical robot trained on video of real procedures autonomously performed a critical phase of gallbladder removal, adapting to unexpected situations and responding to voice commands. The breakthrough shows how artificial intelligence can combine precision with the flexibility needed for real-world medicine.

Using a machine learning framework similar to the one behind ChatGPT, the robot demonstrated expert-level performance even under challenging and varied conditions. Researchers see this as a key step toward clinically viable autonomous surgical systems that can operate reliably without step-by-step human guidance.

Important facts:

  • Autonomous adaptability: The robot adapted in real time to anatomical differences, unexpected events, and spoken corrections.
  • Imitation learning: It learned by watching videos of human surgeons, enhanced with captions describing each task.
  • Surgical milestone: It successfully performed a complex, 17-task phase of gallbladder removal, with results comparable to those of an expert surgeon.

Source: Johns Hopkins University

A robot trained on videos of surgeries performed a lengthy phase of a gallbladder removal without human help. Operating for the first time on a realistic patient model, the robot also responded to and learned from voice commands given by the team during the procedure.

The robot performed unflappably across trials, with the expertise of a skilled human surgeon, even during unexpected scenarios typical of real-life medical emergencies.

A robot performing surgery. The robot took longer to do the work than a human surgeon, but its results were comparable to those of an expert surgeon. Credit: Neuroscience News

The federally funded research, led by researchers at Johns Hopkins University, is a transformative advance in surgical robotics: it allows robots to combine mechanical precision with human-like adaptability and understanding.

“This advancement moves us from robots that can execute specific surgical tasks to robots that truly understand surgical procedures,” said medical roboticist Axel Krieger.

“This is a critical distinction that brings us significantly closer to clinically viable autonomous surgical systems that can work in the messy, unpredictable reality of real-world patient care.”

The findings are published today in Science Robotics.

In 2022, STAR (Smart Tissue Autonomous Robot), the autonomous robot from Krieger's lab, performed the first autonomous robotic surgery on a living animal: a laparoscopic procedure on a pig. However, that robot required specially marked tissue, operated in a highly controlled environment, and followed a rigid, predetermined surgical plan. Krieger likened it to teaching a robot to drive along one carefully mapped route.

But his new system, he says, is like “teaching a robot to navigate any road, in any condition, responding intelligently to whatever it encounters.”

The new system, Surgical Robot Transformer-Hierarchy (SRT-H), truly performs surgery: it adapts to individual anatomical features in real time, makes decisions on the fly, and self-corrects when things don't go as expected.

Built with the same machine learning architecture that powers ChatGPT, SRT-H is interactive: it can respond to spoken commands (“grab the gallbladder head”) and corrections (“move the left arm a bit to the left”), and it learns from this feedback.
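The idea of a policy conditioned on both camera input and a spoken instruction can be sketched in a few lines. This is a hypothetical toy illustration, not the actual SRT-H code: the text encoder, the fixed linear "policy head", and the 3-dimensional action are stand-ins for the learned transformer components described in the article.

```python
# Toy sketch of a language-conditioned policy (hypothetical; not SRT-H's code).
# The next action depends on both visual features and the latest instruction.
import numpy as np

EMBED_DIM = 8

def embed_text(instruction: str) -> np.ndarray:
    """Stand-in for a learned language encoder: hash words into a unit vector."""
    vec = np.zeros(EMBED_DIM)
    for word in instruction.lower().split():
        vec[hash(word) % EMBED_DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def policy(image_features: np.ndarray, instruction: str) -> np.ndarray:
    """Concatenate vision and language features; map to an action with a toy
    fixed linear head (a trained network would sit here instead)."""
    x = np.concatenate([image_features, embed_text(instruction)])
    W = np.random.default_rng(42).standard_normal((3, x.size))
    return W @ x  # e.g. a 3-DoF end-effector displacement

action = policy(np.ones(EMBED_DIM), "move the left arm a bit to the left")
print(action.shape)
```

The point of the sketch is the conditioning: the same camera view produces a different action when the surgeon's spoken correction changes, which is what makes interactive feedback like "move the left arm a bit to the left" actionable.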

“This work represents a major leap from previous efforts because it tackles some of the fundamental barriers to deploying autonomous surgical robots in the real world,” said Ji Woong “Brian” Kim, a former doctoral researcher at Johns Hopkins, now at Stanford University.

“Our work shows that AI models can be made reliable enough for surgical autonomy.”

Last year, Krieger's team used the system to train a robot to perform three fundamental surgical tasks: manipulating a needle, lifting body tissue, and suturing. Each task took only a few seconds.

The gallbladder removal procedure is far more complex: a minutes-long sequence of 17 tasks. The robot had to identify specific ducts and arteries, place clips accurately and strategically, and sever parts with scissors.

SRT-H learned to perform the gallbladder work by watching videos of Johns Hopkins surgeons doing it on pig cadavers. The team reinforced the visual training with captions describing each task. After watching the videos, the robot performed the surgery with 100% accuracy.

The robot took longer to do the work than a human surgeon, but its results were comparable to those of an expert surgeon.

“Just as surgical residents often learn different parts of an operation at different speeds, this work illustrates the promise of developing autonomous robotic systems in a similarly modular and progressive manner,” says co-author Jeff Jopling, a surgeon at Johns Hopkins.

The robot performed accurately across non-uniform anatomical conditions and unexpected detours, such as when researchers changed its starting position or added blood-like dyes that altered the appearance of the gallbladder and surrounding tissue.

“For me, it shows that it is possible to do really complicated surgical procedures autonomously,” Krieger said.

“This is a proof of concept that it's possible, and that this imitation learning framework can automate such complex procedures with a high degree of robustness.”

Next, the team wants to train and test the system on more types of surgeries and expand its capabilities to perform fully autonomous operations.

The authors include Johns Hopkins PhD student Juo-Tung Chen; Johns Hopkins visiting graduate student Pascal Hansen; Stanford University PhD student Lucy X. Shi; Johns Hopkins undergraduate Antony Goldenberg; Johns Hopkins PhD student Samuel Schmidgall; former Johns Hopkins postdoc Paul Maria Scheikl; Johns Hopkins research engineer Anton Deguet; surgical resident Brandon M. White; Stanford University assistant professor Chelsea Finn; and De Ru Tsai and Richard Cha of Optosurgical.

About this robotics and AI research news

Author: Jill Rosen
Source: Johns Hopkins University
Contact: Jill Rosen – JHU
Image: The image is credited to Neuroscience News

Original research: Closed access.
“SRT-H: A hierarchical framework for autonomous surgery via language-conditioned imitation learning” by Axel Krieger et al. Science Robotics


Abstract

SRT-H: A hierarchical framework for autonomous surgery via language-conditioned imitation learning

Research on autonomous surgery has primarily focused on automating simple tasks in controlled environments. Practical surgical applications, however, require long-horizon, flexible manipulation and robust generalization to the inherent variability of human tissue.

These challenges remain difficult to address with existing logic-based or conventional end-to-end learning strategies.

To address this gap, we propose a hierarchical framework for performing dexterous, long-horizon surgical procedures. Our approach uses a high-level policy for task planning and a low-level policy for generating robot trajectories.

The high-level planner operates in language space, generating task-level or corrective instructions that guide the robot through the long-horizon steps and allow it to recover from errors made by the low-level policy.
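The hierarchy described above can be illustrated with a minimal control loop. This is an illustrative sketch under stated assumptions, not the published implementation: the three-step plan, the `low_level_policy` stub, and the simulated failure are all hypothetical stand-ins for the paper's 17-task procedure and learned policies.

```python
# Illustrative sketch (hypothetical, not the published SRT-H implementation):
# a high-level planner steps through instructions in language space, a
# low-level policy executes each one, and on failure the planner issues a
# corrective retry before advancing to the next step.
from typing import List

TASK_PLAN: List[str] = [      # simplified stand-in for the 17-step procedure
    "clip the cystic duct",
    "cut the cystic duct",
    "clip the cystic artery",
]

def low_level_policy(instruction: str, attempt: int) -> bool:
    """Toy trajectory executor: simulate a failure on the first cutting attempt."""
    return not (instruction.startswith("cut") and attempt == 0)

def run_procedure(plan: List[str]) -> List[str]:
    log = []
    for step in plan:
        attempt = 0
        while not low_level_policy(step, attempt):
            log.append(f"correction: retry '{step}'")  # high-level recovery
            attempt += 1
        log.append(f"done: {step}")
    return log

for entry in run_procedure(TASK_PLAN):
    print(entry)
```

The design point is the division of labor: the low-level policy only ever sees one instruction at a time, while the planner owns the long-horizon sequencing and error recovery, which is what the abstract credits for the improved robustness.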

We validated the framework and evaluated key components of the system through ablation studies and ex vivo experiments on cholecystectomy, a commonly performed minimally invasive surgery.

Our method achieved a 100% success rate across eight different ex vivo gallbladders, operating fully autonomously without human intervention. The hierarchical approach improved the policy's ability to recover from the suboptimal states that are inevitable in the highly dynamic environment of realistic surgical applications.

This study demonstrates step-level autonomy in a surgical procedure and marks a milestone toward the clinical deployment of autonomous surgical systems.


