Robotics integrates AI and sensing to enable systems that perceive, interpret, and interact with the world. Aude Billard of the Swiss Federal Institute of Technology Lausanne (EPFL) is primarily interested in how sensors can help control such robotic systems. Her team studies soft robots, which are flexible and deformable and often rely on tactile sensors rather than vision.
Tactile sensors used to be rigid and measured only pressure at the sensor’s surface, Billard explains. Today they are flexible and stretchable and can detect not only strain but also forces in other directions. Such sensors enable fine manipulation and can be placed throughout the robot’s body, including at bending joints.
Tactile sensors generate data two orders of magnitude faster than vision. However, although pressure sensors are highly accurate, changes in temperature can cause their readings to drift from calibration. There were two options for addressing this problem, Billard explains. “Either we stop using sensors and wait until someone designs a better sensor, or we try to understand them by modeling their drift over time,” she says.
As AI techniques have become capable of modeling complex nonlinear effects, researchers increasingly choose the latter option, which improves sensor accuracy and reliability. Billard explains that when her team began using long short-term memory (LSTM) networks around 2000, the approach showed potential for predicting tactile sensors’ pressure values and how they would drift over time.
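As a minimal sketch of that idea, assuming a PyTorch setup rather than the EPFL team’s actual system, a small LSTM can take a window of raw readings and temperatures and output a drift-corrected pressure estimate; the training targets (for example, from a reference gauge during calibration runs) and all dimensions are assumptions.

```python
# Hypothetical sketch: an LSTM maps sequences of [raw_pressure, temperature]
# to a drift-corrected pressure estimate. Names and sizes are illustrative.
import torch
import torch.nn as nn

class DriftCompensator(nn.Module):
    def __init__(self, n_features: int = 2, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # corrected pressure

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features); use the hidden state at the last step
        out, _ = self.lstm(x)
        return self.head(out[:, -1])

model = DriftCompensator()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(seq: torch.Tensor, true_pressure: torch.Tensor) -> float:
    # seq: (batch, time, 2); true_pressure: (batch, 1) from a reference gauge
    optim.zero_grad()
    loss = loss_fn(model(seq), true_pressure)
    loss.backward()
    optim.step()
    return loss.item()
```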
Another problem with tactile sensors is that when a robot touches something, the robot itself also deforms on contact, which affects the measurements, says Cecilia Laschi of the National University of Singapore. She explains that the human brain anticipates sensory input to minimize the need for continuous sensing, and researchers can emulate this process in robots to improve performance while reducing computational load. Using a predictive system based on traditional modeling rather than AI, Laschi and colleagues applied tactile sensing to enable a soft robot to recognize touch, make real-time decisions, and autonomously maneuver around a maze’s walls [7]. This approach helps distinguish deformations caused by the robot’s own movement from those caused by contact with another surface, and it could also be realized with AI, she explains.
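To make the anticipate-then-compare idea concrete, here is a hedged sketch rather than Laschi’s actual controller: an assumed linear model predicts the strain that the robot’s own bending should produce, and an unexplained residual in the measured signal is treated as external contact. The model form, gain, and threshold are all illustrative.

```python
# Hypothetical forward-model sketch: predict self-deformation from the
# commanded motion, then flag contact when the measurement deviates.

def predicted_self_strain(bend_angle_rad: float, k: float = 0.8) -> float:
    """Assumed linear self-deformation model: strain ~ k * bend angle."""
    return k * bend_angle_rad

def detect_contact(measured_strain: float, bend_angle_rad: float,
                   threshold: float = 0.05) -> bool:
    # Residual = deformation not explained by the robot's own movement
    residual = measured_strain - predicted_self_strain(bend_angle_rad)
    return abs(residual) > threshold

# Example: a 0.3 rad bend should yield ~0.24 strain; 0.31 implies contact
print(detect_contact(measured_strain=0.31, bend_angle_rad=0.3))  # True
```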
Laschi’s team also uses continuous learning algorithms, which train a model on new tasks sequentially while preserving previously learned ones. These allow soft robots to adapt to deterioration and changes in morphology, ensuring long-term autonomy. Soft actuators can puncture or leak, but continuously learning controllers adapt to the damage and readapt as self-healing materials recover. The result is a resilient soft robot that learns, endures wear and tear, and relearns tasks.
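One common way to realize this kind of sequential learning is experience replay, in which samples from earlier tasks are rehearsed while training on a new one. The sketch below illustrates that general recipe in PyTorch; it is not Laschi’s specific algorithm, and the network shape, task format, and buffer settings are assumptions.

```python
# Hypothetical continual-learning sketch: train on tasks sequentially while
# replaying stored examples from earlier tasks to limit forgetting.
import random
import torch
import torch.nn as nn

controller = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 4))
optim = torch.optim.Adam(controller.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
replay_buffer = []  # (state, target_action) pairs from earlier tasks

def train_on_task(task_data, replay_batch=8, buffer_cap=10_000):
    """task_data: iterable of (state, target_action) tensor pairs."""
    for state, target in task_data:
        batch = [(state, target)]
        # Rehearse earlier tasks alongside the new sample
        n = min(len(replay_buffer), replay_batch)
        batch += random.sample(replay_buffer, n)
        xs = torch.stack([s for s, _ in batch])
        ys = torch.stack([t for _, t in batch])
        optim.zero_grad()
        loss_fn(controller(xs), ys).backward()
        optim.step()
        if len(replay_buffer) < buffer_cap:
            replay_buffer.append((state, target))
```

If the actuator’s behavior shifts, for instance after a puncture or as a self-healing material recovers, running further training steps on data from the changed hardware lets the controller readapt without erasing earlier skills.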
As sensing and AI systems become more common, continuous learning and other approaches will be essential for enabling legacy systems to adapt to new hardware. When adapting algorithms to different sensors, Billard explains, it is necessary to separate the basic perception the sensor performs from the specific form of the information it generates.
Billard explains that what an AI system learned on an older V1 sensor may not carry over to a newer V2 sensor with higher resolution, different sensitivity, or entirely new capabilities. “We need to determine which aspects of V1 data are also valid for V2 and develop ways to reconcile the discrepancies, essentially bridging the sensory gap between the two generations of hardware,” she emphasizes.
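One illustrative way to bridge such a gap, not necessarily Billard’s, is to learn a small adapter that maps readings from the hypothetical V2 sensor into the representation a V1-trained model expects, using paired recordings of both sensors observing the same stimuli. The channel counts and names below are assumptions.

```python
# Hypothetical adapter sketch: map V2 readings into V1's input space so a
# frozen V1-trained model can be reused on the newer hardware.
import torch
import torch.nn as nn

V1_DIM, V2_DIM = 16, 64  # assumed channel counts for the two generations

adapter = nn.Sequential(nn.Linear(V2_DIM, 32), nn.ReLU(), nn.Linear(32, V1_DIM))
optim = torch.optim.Adam(adapter.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_adapter(paired_data):
    """paired_data: iterable of (v2_reading, v1_reading) for the same stimulus."""
    for v2, v1 in paired_data:
        optim.zero_grad()
        loss_fn(adapter(v2), v1).backward()
        optim.step()

# At deployment, the frozen V1-era model consumes adapter(v2_reading),
# reusing knowledge learned on V1 data on the V2 hardware.
```

Anticipating such problems through data analysis points to the second important direction for AI in sensing: device design.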
