What Should a Generic Emotion Markup Language Be Able to Represent?

  • Marc Schröder
  • Laurence Devillers
  • Kostas Karpouzis
  • Jean-Claude Martin
  • Catherine Pelachaud
  • Christian Peter
  • Hannes Pirker
  • Björn Schuller
  • Jianhua Tao
  • Ian Wilson
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4738)


Working with emotion-related states in technological contexts requires a standard representation format. Based on that premise, the W3C Emotion Incubator group was created to lay the foundations for such a standard. This paper reports on two results of the group’s work: a collection of use cases, and the requirements derived from them. We compiled a rich collection of use cases and grouped them into three types: data annotation, emotion recognition, and generation of emotion-related behaviour. From these, a structured set of requirements was distilled, comprising the representation of the emotion-related state itself, meta-information about that representation, various kinds of links to the “rest of the world”, and several kinds of global metadata. We summarise this work and provide pointers to the working documents containing full details.
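To make the requirement groups named in the abstract concrete, the following sketch shows how a single annotated emotion episode might combine an emotion-related state, meta-information about the annotation, and a link to the annotated media. All element and attribute names here are invented for illustration; the group's actual markup syntax was still under discussion at the time of writing. The sketch uses Python's standard XML tooling:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup illustrating the requirement groups from the abstract;
# all element and attribute names are invented for illustration only.
emotion = ET.Element("emotion")

# 1. The emotion-related state itself, here a category plus an intensity.
ET.SubElement(emotion, "category", name="anger", intensity="0.7")

# 2. Meta-information about the representation, e.g. annotator confidence.
ET.SubElement(emotion, "info", confidence="0.8", annotator="rater1")

# 3. A link to the "rest of the world": the media fragment being annotated.
ET.SubElement(emotion, "link", uri="clip03.wav", start="1.2s", end="3.4s")

# (Global metadata, the fourth requirement group, would sit at the level of
# the whole annotation document rather than on an individual episode.)

markup = ET.tostring(emotion, encoding="unicode")
print(markup)
```

The point of the sketch is structural: each requirement group maps naturally onto a child element of a common `<emotion>` container, which is the kind of design a generic markup language could standardise.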


Keywords: Emotion Recognition, Emotion Word, Data Annotation, Emotion Category, Emotional Speech (machine-generated, not provided by the authors)





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Marc Schröder (1)
  • Laurence Devillers (2)
  • Kostas Karpouzis (3)
  • Jean-Claude Martin (2)
  • Catherine Pelachaud (4)
  • Christian Peter (5)
  • Hannes Pirker (6)
  • Björn Schuller (7)
  • Jianhua Tao (8)
  • Ian Wilson (9)

  1. DFKI GmbH, Saarbrücken, Germany
  2. LIMSI-CNRS, Paris, France
  3. Image, Video and Multimedia Systems Lab, Nat. Tech. Univ. Athens, Greece
  4. Univ. Paris VIII, France
  5. Fraunhofer IGD, Rostock, Germany
  6. OFAI, Vienna, Austria
  7. Tech. Univ. Munich, Germany
  8. Chinese Acad. of Sciences, Beijing, China
  9. Emotion AI, Tokyo, Japan
