summary: An international team of researchers has developed a new method that uses motion capture and EMOKINE software to decipher emotions from movement. The team recorded dancers performing choreographed movements expressing different emotions and analyzed the kinematic characteristics of their movements.
The EMOKINE software, freely available on Zenodo and GitHub, provides an innovative tool to study emotion expression through whole-body movements. This interdisciplinary approach benefits experimental psychology, affective neuroscience, and AI-driven analysis of visual media.
Key Facts:
- EMOKINE software analyzes the kinematic characteristics of emotional movements.
- Motion capture technology was used to record the dancers' movements as they expressed six different emotions.
- EMOKINE is open source and can be adapted to a variety of motion capture systems.
Source: Max Planck Institute
Is it possible to interpret how we feel from the way we move? How can we use empirical methods to study emotions “from the outside”?
To answer these questions, a large international and interdisciplinary research team led by the Max Planck Institute for Empirical Aesthetics (MPIEA) in Frankfurt am Main, Germany, has developed an integrated scientific methodology.
The researchers developed the EMOKINE software to measure objective kinematic characteristics of movements expressing emotions using artistic and digital means, such as motion capture technology.
The results of the study were recently published in the journal Behavior Research Methods.

The team had professional dancers repeat short choreographed dance routines in front of a green screen, where they were asked to express a range of emotions through their movements, including anger, contentment, fear, happiness, neutrality and sadness.
To capture the dance movements as “data”, the scientists dove into the MPIEA technology pool: dancers wore XSENS® full-body motion capture suits equipped with a total of 17 highly sensitive sensors.
In combination with a film camera, the dancers' dynamic body movements were measured and recorded. The researchers then programmed the EMOKINE software to extract objective kinematic characteristics (movement parameters) and deliver them from the dataset at the touch of a button.
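Several of these parameters can be derived directly from recorded joint positions. As a rough illustration only (not the EMOKINE code itself), and assuming the positions are available as a NumPy array of shape (frames, joints, 3), speed and acceleration can be obtained by finite differences:

```python
import numpy as np

def speed_and_acceleration(positions, fps=240.0):
    """Per-joint speed (m/s) and acceleration magnitude (m/s^2) by finite differences.

    positions: array of shape (n_frames, n_joints, 3) with joint positions in metres
               (the array layout is an assumption made for this illustration).
    fps: capture rate; the study reports 240 frames per second for the Xsens suits.
    """
    dt = 1.0 / fps
    velocity = np.diff(positions, axis=0) / dt        # (n_frames - 1, n_joints, 3)
    acceleration = np.diff(velocity, axis=0) / dt     # (n_frames - 2, n_joints, 3)
    speed = np.linalg.norm(velocity, axis=-1)         # scalar speed per joint
    accel_mag = np.linalg.norm(acceleration, axis=-1)
    return speed, accel_mag
```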
Computer-tracked whole-body movements
A total of 32 statistics from 12 movement parameters were collected and extracted from the pilot dance dataset. The kinematic parameters recorded included, for example, limb speed, acceleration, and contraction.
“We identified 12 kinematic features of emotional whole-body movements that had previously been discussed individually in the literature. We then extracted them all from one and the same dataset and subsequently input the features into the EMOKINE software,” reports lead author Julia F. Christensen from MPIEA.
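Two of the features mentioned, distance to the centre of mass and limb contraction, are easy to illustrate. The following is only a hedged sketch, not the published feature definitions: the centre of mass is approximated here by the mean joint position, and the contraction measure by a hand-to-head distance, both of which are simplifying assumptions.

```python
import numpy as np

def distance_to_com(positions):
    """Per-frame distance of every joint to an approximate centre of mass.

    The centre of mass is approximated by the unweighted mean joint position;
    the published EMOKINE definition may weight body segments differently.
    """
    com = positions.mean(axis=1, keepdims=True)            # (n_frames, 1, 3)
    return np.linalg.norm(positions - com, axis=-1)        # (n_frames, n_joints)

def limb_contraction(positions, hand_idx, head_idx):
    """Illustrative contraction measure: hand-to-head distance per frame.

    hand_idx and head_idx are joint indices in the marker layout; both the
    indices and the choice of reference point are assumptions made here.
    """
    return np.linalg.norm(positions[:, hand_idx] - positions[:, head_idx], axis=-1)
```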
Motion tracking has been used in many fields in recent years because objectively recording motion parameters can provide insight into people's intentions, emotions, and states of mind. However, this research requires a theory-based methodology to draw meaningful conclusions from the recorded data.
“This study shows how artistic practice, psychology and computer science can work together in an ideal way to develop methods for studying human cognition,” said co-lead author Andres Fernandez of the Max Planck Institute for Intelligent Systems in Tübingen, Germany.
The methodological framework that accompanies the software package explicitly uses dance movements to study emotion, setting it apart from previous research approaches that have often used video clips of “emotional movements” such as waving or walking.
“We are particularly excited about the publication of this work, which involved the participation of many experts, including Goethe University in Frankfurt am Main, the University of Glasgow, and the film team from WiseWorld Ai in Portugal.
“It brings together psychology, neuroscience, computer science, and empirical aesthetics, but also the fields of dance and film,” summarizes senior author Gemma Roig, professor of Computer Science and head of the Computational Vision and AI Lab at Goethe University.
Open Source Software Packages
EMOKINE is freely available on ZENODO and GitHub and can be adapted to other motion capture systems with minor modifications. These freely available digital tools can be used to analyze the emotional expressions and everyday movements of dancers and other groups of artists.
The researchers now hope that the EMOKINE software they developed will be used in experimental psychology, affective neuroscience and computer vision, specifically the AI-assisted analysis of visual media, a branch of AI that enables computers and systems to extract meaningful information from digital images, videos and other visual inputs.
EMOKINE can help scientists answer the research question of how the kinematic parameters of whole-body movements communicate different intentions, emotions, and states of mind to observers.
About this AI research news
Author: Kivan Sarkos
Source: Max Planck Institute
Contact: Kivan Sarkos – Max Planck Institute
Image: Image courtesy of Neuroscience News
Original Research: Closed access.
“EMOKINE: A Software Package and Computational Framework for Scaling Up the Creation of Highly Controlled Emotional Whole-Body Movement Datasets,” by Julia F. Christensen et al. Behavior Research Methods
Abstract
EMOKINE: A software package and computational framework for scaling up the creation of highly controlled emotional whole-body movement datasets.
EMOKINE is a software package and dataset creation suite for the study of emotional whole-body movements in experimental psychology, affective neuroscience, and computer vision.
A computational framework, comprehensive procedures, pilot datasets, observer evaluations, and kinematic feature extraction code are provided to facilitate future large-scale dataset creation.
Furthermore, the EMOKINE framework shows how complex movement sequences can advance emotion research. Traditionally, such studies have often used stimuli based on simple emotional “actions,” such as waving or walking.
Here, instead, the pilot dataset provides a short dance choreography that the dancers repeated several times, expressing a different emotional intention in each repetition: anger, contentment, fear, joy, neutrality, and sadness.
The dataset was professionally filmed and recorded using XSENS® motion capture technology (17 sensors, 240 frames per second).
Thirty-two statistics were extracted offline from 12 kinematic features which were, for the first time, all drawn from a single dataset: velocity, acceleration, angular velocity, angular acceleration, limb contraction, distance to centre of mass, movement volume, dimensionless jerk (integral), head angle (with respect to the vertical axis and to the back), and space (convex hull 2D and 3D). Means, median absolute deviations (MAD), and maxima were calculated as appropriate.
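The summary statistics themselves are standard. A minimal sketch of computing mean, median absolute deviation (MAD), and maximum over a per-frame feature series (illustrative only, not the package's own implementation):

```python
import numpy as np

def summarise_feature(series):
    """Mean, median absolute deviation (MAD), and maximum of a per-frame feature."""
    series = np.asarray(series, dtype=float)
    mad = np.median(np.abs(series - np.median(series)))
    return {"mean": float(series.mean()), "mad": float(mad), "max": float(series.max())}
```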
The EMOKINE software is also applicable to other motion capture systems and is publicly available in the Zenodo repository.
The release on GitHub includes (i) the code to extract the 32 statistics, (ii) a rigging plugin for Python to convert MVNX files into Blender format (MVNX is the output file format of the XSENS® system), and (iii) custom Python-script-based software to assist with face blurring; the last two are provided under the GPLv3 license.
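Since MVNX is an XML-based export format, reading it requires only standard tooling. The sketch below is an assumption-laden illustration rather than the plugin shipped with EMOKINE: the element names "frame" and "position" and the space-separated x y z layout should be verified against the actual Xsens export.

```python
import numpy as np
import xml.etree.ElementTree as ET

def read_mvnx_positions(path):
    """Read per-frame segment positions from an MVNX (XML) motion-capture export.

    Assumes each <frame> element has a <position> child containing
    space-separated x y z values for every segment; element names,
    namespaces, and units should be checked against the actual file.
    """
    def local(tag):
        # Drop a possible "{namespace}" prefix from the tag name.
        return tag.rsplit("}", 1)[-1]

    frames = []
    for element in ET.parse(path).getroot().iter():
        if local(element.tag) != "frame":
            continue
        for child in element:
            if local(child.tag) == "position" and child.text:
                values = np.array(child.text.split(), dtype=float)
                frames.append(values.reshape(-1, 3))   # (n_segments, 3)
    return np.stack(frames) if frames else np.empty((0, 0, 3))
```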