Scott Detrow, host:
Scientists are using artificial intelligence to help bionic limbs behave more like their natural counterparts. NPR's John Hamilton reports on an experimental hand that shares control with the user to perform difficult tasks, such as lifting a Styrofoam coffee cup.
JOHN HAMILTON, BYLINE: Modern bionic hands can rotate, move individual fingers and manipulate objects. They can also detect the electrical signals given off by the muscles used to control their movements. But University of Utah researcher Marshall Trout says most prosthetic arms still aren't very smart.
MARSHALL TROUT: The user has to sit there and really concentrate on what they're doing. They have to keep their eyes on whatever they're trying to handle, which is very different from how an intact hand works.
HAMILTON: That's one reason why many people who get fancy prosthetic arms end up not using them. So Trout and a team of scientists set out to create a smarter prosthetic limb that would function more like a human hand.
TROUT: I know where the coffee cup is, so I can reach for it without paying much attention. Then, as my hand gets closer, I can feel where it is and naturally close around it. That's what we wanted to recreate in our system.
HAMILTON: The team turned to artificial intelligence to take on some of these subconscious functions. That meant detecting not just the signals coming from the muscles, but the intentions behind them. For example, Trout says, the AI control system has learned to detect the smallest twitches in the muscles that bend the hand.
TROUT: The moment it detects that slight flex, the machine controller kicks in and says, oh, they're trying to grab something, rather than just sitting still.
HAMILTON: To make this approach work, the scientists modified the bionic hand by adding sensors that can see and feel. These allow the AI system to measure the distance to an object and assess its shape. Meanwhile, pressure sensors in the fingertips let the user know how tightly the prosthetic hand was gripping an object. To test the system, Trout's team asked four people who had lost a hand to drink water from a cup using the bionic hand, both with and without AI assistance.
TROUT: With the machine's assistance, they were able to pick up these cups very reliably and mimic taking a sip of water from them. Without the machine's help, they would crush the cup or drop it almost every time.
HAMILTON: The results appear in the journal Nature Communications. And Jacob George, who oversaw the research at Utah's Neurorobotics Institute, says the findings address a problem that often comes up with advanced prosthetic technology.
JACOB GEORGE: We can create robotic hands that perform certain tasks better than a human hand can. But when you actually give one to someone, they don't like it. You know, they actually hate it.
HAMILTON: That's because the hand doesn't feel like part of them; it's outside their control. George says the artificial intelligence system shares control with the user.
GEORGE: The machine is doing some things, the human is doing some things, and we're blending the two.
HAMILTON: George says this is an important step toward prosthetic limbs that feel like extensions of your body.
GEORGE: Ultimately, when you create an embodied robotic hand, it becomes part of the user's experience. It becomes more than just a tool; it becomes part of yourself.
HAMILTON: One reason we feel this way about our hands is that they are partially controlled by reflexes in the brain stem and spinal cord. John Downey of the University of Chicago, who was not involved in the study, says this means the thinking part of our brains doesn't have to worry about the details of every action.
JOHN DOWNEY: All of our motor control involves subconscious reflexes. So it will be important to provide a robotic mimic of those reflex loops.
HAMILTON: But Downey says sometimes we need the human brain to do things that artificial intelligence can't yet do.
DOWNEY: You can do tasks that require very little force, like threading a needle, or grip and lift something as delicate as a child with great precision. That dynamic range is far beyond what robots can typically handle.
HAMILTON: For now, anyway.
John Hamilton, NPR News.
(SOUNDBITE OF MUSIC)
Copyright © 2025 NPR. All rights reserved. Visit our website's Terms of Use and Permissions pages at www.npr.org for further information.
Accuracy and availability of NPR transcripts may vary. Transcript text may be revised to correct errors or match updates to audio. Audio on npr.org may be edited after its original broadcast or publication. The authoritative record of NPR's programming is the audio record.
