Using adaptive motion systems, robots learn human touch with less data

Japanese researchers have developed an adaptive motion reproduction system that allows robots to generate human-like movements using surprisingly small amounts of training data.

Despite rapid advances in robotic automation, most systems struggle when objects vary in weight, stiffness, and texture. Pre-trained movements often fail outside of controlled environments, limiting robots to predictable tasks on the factory floor.

This limitation becomes more pressing as robots move into real-world environments such as kitchens, hospitals, and homes, where they must constantly adjust their grip and applied force the way humans do instinctively.

Unlike human hands, robotic systems cannot intuitively adapt to unfamiliar objects. This gap represents one of the biggest barriers to deploying robots in dynamic, unstructured environments.

Teaching robots senses

To address this challenge, a Japanese research team developed a new adaptive motion reproduction system based on Gaussian process regression.

The research was led by Akira Takakura of Keio University.

Motion reproduction systems typically record human movements through teleoperation and replay them on a robot.

However, these systems fail when the physical properties of the objects differ from the original training data.
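To see why fixed replay breaks down, consider a minimal sketch in which each object is modeled as a linear spring (force = stiffness × compression). The stiffness and force values here are illustrative assumptions, not numbers from the study:

```python
# A trajectory recorded on one object is replayed open-loop on another.
# Linear-spring model of grasping: force = stiffness * compression.

recorded_stiffness = 200.0   # N/m, object used during the human demonstration
recorded_force = 4.0         # N, force the human applied while grasping

# The replayed position command encodes only the compression achieved:
target_compression = recorded_force / recorded_stiffness   # 0.02 m

# Replaying that same position command on a softer object:
new_stiffness = 100.0        # N/m
replayed_force = new_stiffness * target_compression        # 2.0 N, half the intended force
```

Because the replayed command fixes position rather than force, a softer object receives too little force (and a stiffer one too much), which is exactly the failure mode the adaptive system targets.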

The new approach goes beyond linear models by using Gaussian process regression, a technique that can map complex nonlinear relationships with limited data.

By recording human grasping motions across objects of varying stiffness levels, the model learns how the object’s properties relate to the force and position applied by the human.

This allows the system to infer the intent behind human actions and generate appropriate motions for objects it has never encountered before.
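The idea of mapping an object property to an applied force with Gaussian process regression can be sketched as follows. This is a minimal illustration, not the authors' implementation: the stiffness-to-force training pairs, kernel parameters, and noise level are all hypothetical:

```python
import numpy as np

def rbf_kernel(a, b, length=0.5, variance=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

# Hypothetical demonstrations: normalized object stiffness -> grasp force (N)
stiffness = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
force = np.array([1.0, 2.2, 3.1, 4.5, 5.2])

# GP posterior: solve (K + noise*I) alpha = y once, then predict at new inputs.
noise = 1e-4
K = rbf_kernel(stiffness, stiffness) + noise * np.eye(len(stiffness))
alpha = np.linalg.solve(K, force)

x_new = np.array([0.4])                  # stiffness not seen during training
k_star = rbf_kernel(x_new, stiffness)    # cross-covariance with training inputs

mean = k_star @ alpha                    # predicted grasp force (posterior mean)
var = rbf_kernel(x_new, x_new) - k_star @ np.linalg.solve(K, k_star.T)
```

The posterior mean interpolates smoothly between the demonstrated forces, and the posterior variance quantifies uncertainty, which grows for stiffness values far from the training data; this nonlinear interpolation from only a handful of samples is what the article's "limited data" claim refers to.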

“Developing the ability to manipulate common objects in robots is essential if we are to enable robots to interact with objects in everyday life and respond appropriately to the forces they encounter,” explains Dr. Takahiro Nozaki.

Powerful Results, Widespread Impact

The team tested the system against traditional motion reproduction systems, linear interpolation methods, and typical imitation learning models.

For interpolation tasks where object stiffness was within the training range, the system reduced position errors by at least 40 percent and force errors by 34 percent.

For extrapolation tasks involving objects outside the training range, the system reduced position error by 74 percent.

In all scenarios, the Gaussian process regression-based system significantly outperformed existing methods.

The ability to recreate accurate human-like movements using minimal data has the potential to significantly reduce the cost and complexity of deploying adaptive robots across industries.

“Because this technology works with small amounts of data and lowers the cost of machine learning, it has the potential to be applied in a wide range of industries, such as life support robots that need to adapt their movements to different goals each time. It also lowers the hurdles for companies that have been unable to implement machine learning because they require large amounts of learning data,” Takakura said.

This research builds on Keio University's years of work in haptic feedback and motion modeling.

The group’s previous research on highly sensitive robotic arms and avatar robots has been recognized by the IEEE, the Japanese government, and Forbes.

By enabling robots to adapt their touch and movement the way humans do, this research brings automation one step closer to working reliably in the unpredictable real world.

This research was published in IEEE Transactions on Industrial Electronics.
