Annotating Multimodal Behaviors Occurring During Non Basic Emotions

  • Jean-Claude Martin
  • Sarkis Abrilian
  • Laurence Devillers
Conference paper

DOI: 10.1007/11573548_71

Part of the Lecture Notes in Computer Science book series (LNCS, volume 3784)
Cite this paper as:
Martin JC., Abrilian S., Devillers L. (2005) Annotating Multimodal Behaviors Occurring During Non Basic Emotions. In: Tao J., Tan T., Picard R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2005. Lecture Notes in Computer Science, vol 3784. Springer, Berlin, Heidelberg

Abstract

The design of affective interfaces, such as credible expressive characters in storytelling applications, requires understanding and modeling the relations between realistic emotions and behaviors in different modalities such as facial expressions, speech, hand gestures, and body movements. Yet research on emotional multimodal behaviors has so far focused on individual modalities during acted basic emotions. In this paper we describe the coding scheme that we designed for annotating multimodal behaviors observed during mixed and non-acted emotions. We explain how we used it to annotate videos from a corpus of emotionally rich TV interviews, and we illustrate how the annotations can be used to compute expressive profiles of videos and relations between non-basic emotions and multimodal behaviors.
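The idea of an expressive profile can be made concrete with a small computation over the annotations. The following is a minimal sketch, not the paper's actual coding-scheme tooling: it assumes annotations have been exported as (modality, emotion label, start time, end time) tuples, and every name in it is hypothetical. For one emotion label, it computes the share of annotated time contributed by each modality.

```python
from collections import defaultdict

# Hypothetical export format: (modality, emotion_label, start_s, end_s).
# The paper's actual coding scheme is richer (per-track annotations of
# TV-interview videos); this sketch only illustrates the idea of an
# "expressive profile": the fraction of annotated time each modality
# contributes while a given emotion is present.

def expressive_profile(annotations, emotion):
    """Return {modality: fraction of total annotated duration} for one emotion."""
    durations = defaultdict(float)
    for modality, label, start, end in annotations:
        if label == emotion:
            durations[modality] += end - start
    total = sum(durations.values())
    return {m: d / total for m, d in durations.items()} if total else {}

if __name__ == "__main__":
    demo = [
        ("facial_expression", "despair", 0.0, 2.5),
        ("hand_gesture", "despair", 1.0, 3.0),
        ("speech", "despair", 0.0, 4.0),
        ("hand_gesture", "anger", 5.0, 6.0),
    ]
    print(expressive_profile(demo, "despair"))
    # {'facial_expression': 0.294..., 'hand_gesture': 0.235..., 'speech': 0.470...}
```

Comparing such profiles across videos, or conditioning them on blended emotion labels, is one way the annotations could support the kind of emotion-to-behavior analysis the abstract describes.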

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Jean-Claude Martin (1)
  • Sarkis Abrilian (1)
  • Laurence Devillers (1)

  1. LIMSI-CNRS, Orsay, France
