AI decodes emotions from body movements.

Abstract: A new method developed by an international research team uses motion capture and EMOKINE software to decode emotions from movements. The team recorded a dancer performing choreography expressing different emotions and analyzed the dynamic characteristics of her movements.

The EMOKINE software, freely available on ZENODO and GitHub, provides an innovative tool for studying emotional expression through whole-body movements. This interdisciplinary approach can benefit experimental psychology, affective neuroscience, and AI-assisted analysis of visual media.

Key facts:

  1. EMOKINE software analyzes the dynamic characteristics of emotional movements.
  2. Motion capture technology recorded the dancer's movements as she expressed six different emotions.
  3. EMOKINE is open source and compatible with various motion capture systems.

Source: Max Planck Institute

Is it possible to decode how we feel from our movements? How can emotions be studied “from the outside” using experimental methods?

To answer these questions, a large international and interdisciplinary research team led by the Max Planck Institute for Empirical Aesthetics (MPIEA) in Frankfurt am Main, Germany, has developed an integrated scientific methodology.

Using artistic and digital means such as motion capture technology, the researchers developed EMOKINE software to measure objective kinematic properties of movements that express emotion.

The results of this research were recently published in the journal Behavior Research Methods.


The team had a professional dancer repeat short dance choreographies in front of a green screen. She was asked to express different emotions through her movements: anger, contentment, fear, happiness, neutrality, and sadness.

To capture the dance movements as “data”, the researchers drew on the MPIEA's technology pool: the dancer wore a full-body motion capture suit from XSENS®, equipped with a total of 17 highly sensitive sensors.

The dynamic whole-body movements were measured and recorded in parallel with a film camera. The researchers then extracted the objective kinematic characteristics (movement parameters) and programmed the EMOKINE software, which provides these movement parameters from the dataset at the touch of a button.

Computerized tracking of whole-body movement

A total of 32 statistics were compiled from 12 movement parameters extracted from the pilot dance dataset. The kinematic parameters recorded included, for example, velocity, acceleration, and limb contraction.
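As an illustration of what such kinematic parameters look like numerically, the following sketch derives per-frame speed and acceleration from 3D joint trajectories by finite differences, along with a simple limb-contraction proxy. This is not EMOKINE's own code: the array layout, the 240 frames/s default, and the contraction definition (distance from the extremities to the head) are assumptions made for the example.

```python
import numpy as np

def basic_kinematics(positions, fps=240.0):
    """Finite-difference kinematics from motion-capture joint trajectories.

    positions: array of shape (n_frames, n_joints, 3), in metres (assumed layout).
    Returns per-frame, per-joint speed and acceleration magnitudes.
    """
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)        # (n_frames, n_joints, 3)
    acceleration = np.gradient(velocity, dt, axis=0)
    speed = np.linalg.norm(velocity, axis=-1)            # (n_frames, n_joints)
    accel_magnitude = np.linalg.norm(acceleration, axis=-1)
    return speed, accel_magnitude

def limb_contraction(positions, head_idx, extremity_idx):
    """One possible contraction proxy: mean distance of hands/feet to the head."""
    head = positions[:, head_idx, :]                      # (n_frames, 3)
    extremities = positions[:, extremity_idx, :]          # (n_frames, n_extremities, 3)
    return np.linalg.norm(extremities - head[:, None, :], axis=-1).mean(axis=1)
```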

“We identified 12 kinematic features of emotional whole-body movements that had been discussed separately in the previous research literature. We then extracted them all from the same dataset and fed the features into the EMOKINE software,” reports first author Julia F. Christensen of the MPIEA.

Movement tracking has been used in many fields in recent years because the objective recording of movement parameters can provide insight into people's intentions, feelings, and state of mind. However, this research requires a theory-based methodology to draw meaningful conclusions from the recorded data.

“This work shows how artistic practice, psychology, and computer science can work together in an exemplary way to develop methods for studying human cognition,” says co-first author Andres Fernandez of the Max Planck Institute for Intelligent Systems in Tübingen, Germany.

The methodological framework that accompanies the software package, and which explicitly uses dance movements to study emotion, differs from previous research approaches, which have often used video clips of “emotional actions” such as waving or walking.

“We are particularly excited about the publication of this work, which involved many experts, for example a film team from Goethe University Frankfurt am Main, the University of Glasgow, and WiseWorld Ai in Portugal.

“It brought together disciplines from psychology, neuroscience, computer science, and empirical aesthetics, but also from dance and film,” summarizes senior author Gemma Roig, professor of computer science and head of the Computational Vision and AI Lab at Goethe University.

Open source software package

EMOKINE is freely available on ZENODO and GitHub and can be adapted to other motion capture systems with minor modifications. These freely available digital tools can be used to analyze the emotional expression and everyday movements of dancers and other groups of performers.
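To give a sense of how the extracted movement parameters can be summarized regardless of which motion capture system produced them, here is a minimal sketch that reduces each feature time series to the mean, median absolute deviation (MAD), and maximum reported by the team. The feature names and the dictionary layout are illustrative assumptions, not EMOKINE's actual output format.

```python
import numpy as np

def summarize(series):
    """Reduce one kinematic feature time series to mean, MAD, and maximum."""
    x = np.asarray(series, dtype=float)
    mad = np.median(np.abs(x - np.median(x)))
    return {"mean": float(x.mean()), "mad": float(mad), "max": float(x.max())}

def summarize_all(features):
    """features: dict mapping feature name -> 1-D time series from any mocap source."""
    return {name: summarize(values) for name, values in features.items()}

# Illustrative usage with random placeholder data:
rng = np.random.default_rng(0)
stats = summarize_all({"speed": rng.random(1000), "acceleration": rng.random(1000)})
```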

The researchers now hope that the EMOKINE software they developed will be used in experimental psychology, affective neuroscience, and computer vision, particularly in AI-assisted analysis of visual media, a branch of AI that enables computers and systems to extract meaningful information from digital images, videos, and other visual input.

EMOKINE will help scientists answer research questions about how the kinematic parameters of whole-body movements convey different intentions, feelings, and states of mind to observers.

About this artificial intelligence research news

Author: Kevan Sarkhush
Source: Max Planck Institute
Contact: Kevan Sarkhush – Max Planck Institute
Image: The image is credited to Neuroscience News.

Original Research: Closed access.
“EMOKINE: A Software Package and Computational Framework to Enhance the Generation of Highly Controlled Emotional Whole-Body Movement Datasets” by Julia F. Christensen et al. Behavior Research Methods


Abstract

EMOKINE: a software package and computational framework to enhance the generation of highly controlled emotional full-body movement datasets.

EMOKINE is a software package and dataset generation suite for researching emotional whole-body movements in experimental psychology, affective neuroscience, and computer vision.

A computational framework, comprehensive instructions, a pilot dataset, observer classification, and kinematic feature extraction code are provided to facilitate scale-up of future dataset creations.

In addition, the EMOKINE framework outlines how complex sequences of movements can advance emotion research. Traditionally, such research has often used emotional 'action' based stimuli, such as hand waving or walking movements.

Here instead, a pilot dataset is provided with short dance choreographies, repeated several times by a dancer who expressed different emotional intentions at each repetition: anger, contentment, fear, happiness, neutrality, and sadness.

The dataset was simultaneously professionally filmed, and recorded using XSENS® motion capture technology (17 sensors, 240 frames/s).

Thirty-two statistics from 12 kinematic properties were extracted offline, for the first time from a single dataset: velocity, acceleration, angular velocity, angular acceleration, limb contraction, distance to the center of mass, quantity of motion, dimensionless jerk (integral), head angle (with respect to the vertical axis and to the back), and space (convex hull 2D and 3D). The mean, median absolute deviation (MAD), and maximum value were calculated as applicable.

The EMOKINE software is applicable to other motion capture systems and is openly available on the Zenodo Repository.

Releases on GitHub include: (i) code to extract the 32 statistics, (ii) a Python rigging plugin to convert MVNX files to a Blender format (MVNX = the output file of the XSENS® system), and (iii) a Python script with custom software to help blur faces; the latter two are released under the GPLv3 license.
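As a rough illustration of what working with an XSENS® export might involve, the sketch below reads per-frame positions from an MVNX file, which is XML-based. The element names searched for (“frame” and “position”) are assumptions about the file layout rather than a documented schema, and this code is independent of the rigging plugin mentioned above.

```python
import numpy as np
import xml.etree.ElementTree as ET

def read_mvnx_positions(path):
    """Collect per-frame 3D segment positions from an MVNX (XML) export.

    Assumes each motion frame contains a 'position' child whose text is a
    whitespace-separated list of x y z values for every body segment.
    """
    tree = ET.parse(path)
    frames = []
    for elem in tree.iter():
        if elem.tag.split("}")[-1] != "frame":          # tolerate XML namespaces
            continue
        for child in elem:
            if child.tag.split("}")[-1] == "position" and child.text:
                values = np.array(child.text.split(), dtype=float)
                frames.append(values.reshape(-1, 3))    # (n_segments, 3)
    return np.stack(frames) if frames else np.empty((0, 0, 3))
```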
