News

[Press Release] Bringing Human Dexterity to Robots by Combining Human Motion and Tactile Sensation

2026.01.08

Researchers develop an adaptive motion system that allows robots to generate human-like movements with minimal data.

Despite rapid advances in robotic automation, most systems struggle to adapt their pretrained movements to dynamic environments containing objects of varying stiffness or weight. To tackle this challenge, researchers from Japan have developed an adaptive motion reproduction system based on Gaussian process regression. By learning the relationship between human motion and object properties, their method enables robots to accurately replicate human grasping behaviors from small training datasets and to manipulate unfamiliar objects with remarkable precision and efficiency.

Accelerating progress in robotic automation promises to revolutionize industries and improve our lives by replacing humans in risky, physically demanding, or repetitive tasks. While existing robots already excel in controlled environments such as assembly lines, the ultimate frontier of automation lies in dynamic environments found in tasks such as cooking, assisting the elderly, and exploration. A key barrier to realizing this goal is giving robots the ability to adapt based on touch. Unlike human hands, which intuitively adjust their grip for objects of unknown weight, friction, or stiffness, most robotic systems lack this crucial form of adaptability.

To transfer sophisticated human dexterity to machines, researchers have developed various motion reproduction systems (MRSs), which are centered on accurately recording human movements and recreating them in robots via teleoperation. However, MRSs tend to fail when the properties of the object being handled change or differ from those of the object used during recording. This limits the versatility of MRSs and, in turn, the broader applicability of robots.

To address this fundamental challenge, a research team from Japan has developed a novel system designed to adaptively model and reproduce complex human motions. The study was led by Mr. Akira Takakura, a Master's student at the Graduate School of Science and Technology, Keio University, and co-authored by Associate Professor Takahiro Nozaki of the Department of System Design Engineering, Doctoral student Kazuki Yane, and Professor Emeritus Shuichi Adachi, also from Keio University, along with Assistant Professor Tomoya Kitamura from Tokyo University of Science, Japan. Their paper was published in IEEE Transactions on Industrial Electronics, a world-leading international academic journal in the field, on December 30, 2025.

The team's core breakthrough was moving past linear modeling strategies and instead using Gaussian process regression (GPR), a regression technique that can accurately map complex nonlinear relationships even with a small amount of training data. The researchers recorded human grasping motions for multiple objects and trained the GPR model to identify the relationship between each object's 'environmental stiffness' and the position and force commands issued by the human. This process effectively reveals the human's underlying motion intention, or 'human stiffness', allowing the robot to generate appropriate motions for objects it has never encountered. "Developing the ability to manipulate commonplace objects in robots is essential for enabling them to interact with objects in daily life and respond appropriately to the forces they encounter," explains Dr. Nozaki.
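To make the idea concrete, the following is a minimal sketch in Python, not the authors' implementation: it uses scikit-learn's Gaussian process regression to map an object's stiffness to the position and force commands a human might issue while grasping it. All stiffness values and commands below are invented for illustration only.

# Minimal sketch (assumed example, not the paper's code): learn a nonlinear
# mapping from environmental stiffness to human position/force commands
# with Gaussian process regression, using a deliberately small dataset.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical training data: stiffness of a few grasped objects (N/m) and
# the recorded human commands (grip position in mm, grip force in N).
stiffness = np.array([[200.0], [500.0], [900.0], [1500.0]])
commands = np.array([[12.0, 1.5],
                     [10.5, 2.8],
                     [9.0, 4.6],
                     [8.2, 6.9]])

# An RBF kernel lets GPR capture the nonlinear stiffness-to-command
# relationship even from this small dataset; alpha adds numerical jitter.
gpr = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=300.0),
    alpha=1e-6,
    normalize_y=True,
)
gpr.fit(stiffness, commands)

# Predict commands for unseen objects, including one stiffer than anything
# in the training set (extrapolation).
for k in [700.0, 2000.0]:
    pos, force = gpr.predict(np.array([[k]]))[0]
    print(f"stiffness {k:6.0f} N/m -> position {pos:5.2f} mm, force {force:4.2f} N")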

To validate their approach, the researchers tested it against conventional MRSs, linear interpolation, and a typical imitation learning model. The proposed GPR system reproduced accurate motion commands for both interpolation and extrapolation. For interpolation, which involves handling objects whose stiffness falls within the range of the training set, it reduced the average root-mean-square error (RMSE) by at least 40% for position and 34% for force. For extrapolation to objects harder or softer than any in the training set, the results were equally robust, with a 74% reduction in position RMSE. Across all conditions, the GPR-based method markedly outperformed the alternatives.
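For readers unfamiliar with the metric, the figures above are root-mean-square errors between the reproduced and recorded command trajectories. The short sketch below shows how such an error could be computed for a force trajectory; the sample values are placeholders, not the paper's measurements.

# Assumed illustration of the RMSE metric used to compare a reproduced
# trajectory against the recorded human demonstration.
import numpy as np

def rmse(reference: np.ndarray, reproduced: np.ndarray) -> float:
    """Root-mean-square error between two equally sampled trajectories."""
    return float(np.sqrt(np.mean((reference - reproduced) ** 2)))

recorded_force = np.array([1.0, 2.1, 3.0, 3.2, 3.1])    # human demonstration
reproduced_force = np.array([1.1, 2.0, 2.8, 3.3, 3.0])  # robot reproduction
print(f"force RMSE: {rmse(recorded_force, reproduced_force):.3f} N")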

By accurately modeling human-object interactions with minimal training data, this new take on MRSs will help generate dexterous motion commands for a wide range of objects. This ability to capture and recreate complex human skills will ultimately enable robots to move beyond rigid contexts and toward providing more sophisticated services. "Since this technology works with a small amount of data and lowers the cost of machine learning, it has potential applications across a wide range of industries, including life-support robots, which must adapt their movements to different targets each time, and it can lower the bar for companies that have been unable to adopt machine learning due to the need for large amounts of training data," comments Mr. Takakura.

Notably, this research group at Keio University has long been engaged in research on the transmission, preservation, and reproduction of force-tactile feedback. Their previous efforts in this field have covered a wide range of topics, such as the reduction of data traffic, motion modeling, and haptic transplant technology. Their groundbreaking work on sensitive robotic arms and 'avatar' robots has been widely recognized by professional organizations such as the IEEE, as well as by the Government of Japan and Forbes.

Image


Title: Robotic avatar replicating human motion
Caption: This image depicts the real-time transfer of a human's motion to a robotic avatar, enabling the latter to perform a dexterous task.
Credit: Associate Professor Takahiro Nozaki from Keio University, Japan
Image source link: N/A
License type: Original content
Usage restrictions: Cannot be reused without permission.

Reference
Title of original paper: Motion Reproduction System for Environmental Impedance Variation via Data-driven Identification of Human Stiffness
Journal: IEEE Transactions on Industrial Electronics
DOI: https://doi.org/10.1109/TIE.2025.3626633

About Associate Professor Takahiro Nozaki from Keio University
Dr. Takahiro Nozaki received his B.E., M.E., and Ph.D. from Keio University, Yokohama, Japan, in 2010, 2012, and 2014, respectively. In 2014, he joined Yokohama National University, Yokohama, Japan, as a Research Associate. In 2015, he joined Keio University, where he is currently an Associate Professor. He was also a Visiting Scientist with the Massachusetts Institute of Technology, Cambridge, USA, from 2019 to 2021. He was one of the winners of the IEEE Industrial Electronics Society Under-35 Innovators Contest in 2019.
https://nozaki-lab.jp/
https://k-ris.keio.ac.jp/html/100011714_en.html

About Mr. Akira Takakura from Keio University
Mr. Akira Takakura received a B.E. degree in System Design Engineering from Keio University, Yokohama, Japan, in 2024. He is currently working toward an M.E. degree. His research interests include adaptive control, system identification, robotics, and haptics.


[Enquiry]
Contact: Takahiro Nozaki
E-mail: nozaki@sd.keio.ac.jp