Japanese study reveals AI can track pianist's muscle activity from video alone

Researchers from Tokyo University of Science and Sony Computer Science Laboratories recently published a study presenting an AI system that can reconstruct fine muscle activity in the hands using only video footage of a musician playing the piano.

Previous studies of muscle movement have required electrodes placed on the skin. That technique can only measure gross motor movements driven by large muscles, and anatomical differences between individuals have made it difficult to draw generalized conclusions from such datasets.

The researchers built a deep learning framework for this task and trained it on a comprehensive dataset of recordings from professional pianists. The new system provides a low-cost, non-invasive method to analyze fine motor control, which could help optimize rehabilitation strategies, enhance performance training, and support future developments in human-machine interaction.

This dataset, named PianoKPM, captures the precision with which professional pianists move, press, and control their hands. It contains 12.6 hours of synchronized data from 20 pianists performing seven different musical tasks. Each performance was recorded with 60 frames/s multi-view video, 3D hand poses, 1 kHz keystroke data, audio, and 2 kHz EMG signals from six small hand muscles. The dataset includes more than 5 million pose frames and 28 million EMG samples, creating the first detailed map linking visible movement to internal muscle activity.
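Because the modalities run at very different rates (60 frames/s video and pose versus 2 kHz EMG), working with such a dataset means mapping each video frame onto its corresponding slice of EMG samples. The sketch below illustrates that alignment step; the array shapes, variable names, and indexing scheme are illustrative assumptions, not the published dataset format.

```python
import numpy as np

POSE_FPS = 60   # multi-view video / 3D hand-pose frame rate
EMG_HZ = 2000   # surface EMG sampling rate

def emg_window_for_frame(frame_idx: int, emg: np.ndarray) -> np.ndarray:
    """Return the EMG samples that fall within one video frame.

    Assumes both streams start at t=0 and share a common clock.
    """
    samples_per_frame = EMG_HZ // POSE_FPS  # ~33 EMG samples per frame
    start = frame_idx * samples_per_frame
    return emg[start:start + samples_per_frame]

# Toy data: one second of recording, six hand-muscle channels
emg = np.random.randn(EMG_HZ, 6)
window = emg_window_for_frame(10, emg)
print(window.shape)  # (33, 6)
```

In practice, synchronized recordings like these are usually aligned with hardware triggers or shared timestamps rather than index arithmetic, but the frame-to-window mapping is the same idea.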

“Using this dataset, we propose PianoKPM Net, which estimates high-frequency electromyography from posture data,” says Professor Hideki Koike, who led the research.

“The combination of PianoKPM Net and the PianoKPM dataset creates a foundation for affordable access to physiological and muscle activity signals within the body, supporting advances in human augmentation and advanced human-machine interaction.”
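The core idea of estimating muscle signals from posture can be reduced to a supervised regression problem: given hand-pose features per frame, predict the activity of each EMG channel. The toy sketch below uses ordinary least squares on synthetic data purely to illustrate that mapping; PianoKPM Net is a deep network, and the dimensions here (21 joints x 3 coordinates, six channels) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FRAMES = 500   # training pose frames (toy value)
POSE_DIM = 63    # e.g. 21 hand joints x 3 coordinates (assumed)
EMG_DIM = 6      # six hand-muscle channels, as in the dataset

# Synthetic data: EMG as a noisy linear function of pose features
poses = rng.standard_normal((N_FRAMES, POSE_DIM))
true_w = rng.standard_normal((POSE_DIM, EMG_DIM))
emg = poses @ true_w + 0.01 * rng.standard_normal((N_FRAMES, EMG_DIM))

# Fit a linear map from posture to muscle activity
w_hat, *_ = np.linalg.lstsq(poses, emg, rcond=None)
pred = poses @ w_hat
print(pred.shape)  # (500, 6)
```

A real pose-to-EMG model would additionally exploit temporal context across frames, since the 2 kHz EMG signal varies much faster than the 60 frames/s pose stream.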


