RAMAS: Russian Multimodal Corpus of Dyadic Interaction for studying emotion recognition
Emotion expression encompasses several types of information, including facial and eye movement, voice, and body motion. Most studies of automated affect recognition use faces as stimuli; speech is included less often, and gestures more rarely still. Emotions drawn from real conversations are difficult to classify from a single channel, which is why multimodal techniques have recently gained popularity in automatic emotion recognition. Multimodal databases that combine audio, video, 3D motion capture, and physiological data remain rare. We collected the Russian Acted Multimodal Affective Set (RAMAS), the first multimodal corpus in the Russian language. Our database contains approximately 7 hours of high-quality close-up video recordings of subjects' faces, speech, motion-capture data, and physiological signals such as electrodermal activity and photoplethysmogram. The subjects were 10 actors who played out interactive dyadic scenarios. Each scenario involved one of the basic emotions (Anger, Sadness, Disgust, Happiness, Fear, or Surprise) and characteristics of social interaction such as Domination and Submission. To capture the emotions subjects actually felt during the process, we asked them to fill in short questionnaires (self-reports) after each played scenario. The recordings were annotated by 21 annotators, with at least five annotators labeling each scenario. We present our multimodal data collection, the annotation process, an inter-rater agreement analysis, and a comparison between the self-reports and the collected annotations. RAMAS is an open database that provides the research community with multimodal data on the interrelation of faces, speech, gestures, and physiology. Such material is useful for a variety of investigations and for the development of automatic affective systems.
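The abstract does not specify which agreement statistic was used; for a setup like this one (a fixed number of raters per item and nominal emotion labels), a common choice is Fleiss' kappa. The sketch below is an illustration of that statistic, not the paper's reported analysis, and the ratings matrix in it is invented for the example:

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for nominal ratings.

    counts: array of shape (n_items, n_categories) where counts[i, j]
    is the number of raters who assigned category j to item i.
    Every row must sum to the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]  # raters per item (assumed constant)
    # Observed agreement per item: fraction of rater pairs that agree.
    P_i = np.sum(counts * (counts - 1), axis=1) / (n * (n - 1))
    P_bar = P_i.mean()
    # Chance agreement from the marginal category proportions.
    p_j = counts.sum(axis=0) / counts.sum()
    P_e = np.sum(p_j ** 2)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 3 scenario clips, 5 annotators,
# 6 emotion categories (Anger, Sadness, Disgust, Happiness, Fear, Surprise).
ratings = np.array([
    [5, 0, 0, 0, 0, 0],   # unanimous: Anger
    [0, 5, 0, 0, 0, 0],   # unanimous: Sadness
    [0, 0, 3, 2, 0, 0],   # split: Disgust vs. Happiness
])
print(f"{fleiss_kappa(ratings):.3f}")  # 0.722
```

Kappa is 1.0 under perfect agreement and near 0 when agreement is at chance level, which makes it a convenient single number to report alongside per-emotion confusion patterns.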