Multimodal Labeling

  • Leon Rothkrantz
  • Pascal Wiggers
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5729)

Abstract

This paper concerns the automated labeling of emotions. Prototypes have been developed for extracting features from facial expressions and speech; training such systems requires data. We report on recordings of semi-spontaneous emotions, in which multimodal emotional reactions were evoked in 21 controlled contexts. The database is intended as a benchmark for current and future emotion recognition studies, so that results from different research groups can be compared. The recorded data were validated online: over 60 users scored the apex images (1,272 ratings), audio clips (201 ratings), and video clips (503 ratings) on the valence and arousal scales. Textual validation was based on Whissell’s Dictionary of Affect in Language. A comparison of the scores from all four validation methods revealed clusters for distinct emotions.
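The validation procedure described above amounts to collecting (valence, arousal) ratings per stimulus from several modalities and comparing their averages per emotion. The following is a minimal illustrative sketch of that aggregation step; the data values and the `aggregate` helper are hypothetical and not taken from the actual database.

```python
# Hypothetical sketch of the rating aggregation described in the abstract:
# each validation method (image, audio, video, text) yields (valence, arousal)
# ratings per stimulus; we average them per emotion to compare methods.
# All data below are illustrative, not from the actual database.
from collections import defaultdict
from statistics import mean

# (emotion, modality) -> list of (valence, arousal) ratings on a [-1, 1] scale
ratings = {
    ("happy", "image"): [(0.8, 0.6), (0.7, 0.5)],
    ("happy", "audio"): [(0.6, 0.4)],
    ("sad", "image"): [(-0.7, -0.3), (-0.6, -0.4)],
    ("sad", "video"): [(-0.5, -0.2)],
}

def aggregate(ratings):
    """Mean (valence, arousal) per emotion, pooled over all modalities."""
    pooled = defaultdict(list)
    for (emotion, _modality), points in ratings.items():
        pooled[emotion].extend(points)
    return {e: (mean(v for v, _ in pts), mean(a for _, a in pts))
            for e, pts in pooled.items()}

centres = aggregate(ratings)
# Distinct emotions land in different quadrants of the valence-arousal plane:
assert centres["happy"][0] > 0 > centres["sad"][0]
```

Such per-emotion centres make it straightforward to check whether the four validation methods place the same emotion in the same region of the circumplex.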


References

  1. Boyle, E., Anderson, A.H., Newlands, A.: The effects of visibility on dialogue and performance in a co-operative problem solving task. Language and Speech 37, 1–20 (1994)
  2. Russell, J.A., Fernandez-Dols, J.M.: The Psychology of Facial Expression 9(3), 185–211 (1990)
  3. Ekman, P.: Emotions Revealed. Times Books, New York (2003)
  4. Darwin, C.: The Expression of the Emotions in Man and Animals (1872)
  5. Martin, O., Kotsia, I., Macq, B., Pitas, I.: The eNTERFACE 2005 Audio-Visual Emotion Database, Atlanta (April 2006)
  6. Ekman, P., Friesen, W.V.: The Facial Action Coding System: A technique for the measurement of facial movement (1978)
  7. Cowie, R., Douglas-Cowie, E.: Emotion Recognition in Human-Computer Interaction. IEEE Signal Processing Magazine (January 2001)
  8. Russell, J.A.: A circumplex model of affect. Journal of Personality and Social Psychology 39, 1167–1178 (1980)
  9. Desmet, P.M.A.: Designing Emotions. PhD dissertation, Delft University of Technology, Delft (2002)
  10. Wojdel, A.: Knowledge Driven Facial Modelling. PhD dissertation, Delft University of Technology, Delft (2005)
  11. Whissell, C.M., Dewson, M.J.: The Dictionary of Affect in Language. In: Emotion: Theory, Research and Experience, vol. 18, pp. 113–131. Academic Press, New York (1989)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Leon Rothkrantz (1, 2)
  • Pascal Wiggers (1)
  1. Man-Machine Interaction Group, Delft University of Technology, Delft, The Netherlands
  2. SEWACO, The Netherlands Defence Academy, Den Helder, The Netherlands