Perception of Blended Emotions: From Video Corpus to Expressive Agent

  • Stéphanie Buisine
  • Sarkis Abrilian
  • Radoslaw Niewiadomski
  • Jean-Claude Martin
  • Laurence Devillers
  • Catherine Pelachaud
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4133)


Real-life emotions are often blended: several emotions may occur simultaneously, superposed on or masking one another. This paper reports on a study of the perception of multimodal emotional behaviors in Embodied Conversational Agents. The experiment evaluates whether people properly detect the signs of emotion in different modalities (speech, facial expressions, gestures) when emotions are superposed or masked. We compared the perception of emotional behaviors annotated in a corpus of TV interviews with that of the same behaviors replayed by an expressive agent at different levels of abstraction. The results provide insights into the use of such protocols for studying the effect of various models and modalities on the perception of complex emotions.
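The comparison described above requires scoring how closely participants' perceived emotion labels match the blended labels annotated in the corpus. As a minimal illustrative sketch (the function name and the Jaccard-overlap measure are assumptions for illustration, not the paper's actual metric):

```python
# Hypothetical sketch: comparing a participant's perceived emotion labels
# against the corpus annotation of a blended (superposed/masked) emotion.
# The similarity measure (Jaccard overlap) is an illustrative choice,
# not necessarily the one used in the study.

def blend_similarity(annotated, perceived):
    """Jaccard overlap between two sets of emotion labels (0.0 to 1.0)."""
    a, p = set(annotated), set(perceived)
    if not a and not p:
        return 1.0  # two empty label sets agree trivially
    return len(a & p) / len(a | p)

# Example: a clip annotated as anger masked by joy,
# perceived by a participant as joy plus irritation.
annotated = ["anger", "joy"]
perceived = ["joy", "irritation"]
print(round(blend_similarity(annotated, perceived), 2))  # 0.33
```

Averaging such per-clip scores over participants would give one simple way to compare recognition of the original videos against the agent replays at each level of abstraction.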


Keywords (machine-generated, not provided by the authors): Facial Expression, Similarity Score, Emotion Recognition, Nonverbal Behavior, Original Video





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Stéphanie Buisine (1)
  • Sarkis Abrilian (2)
  • Radoslaw Niewiadomski (3, 4)
  • Jean-Claude Martin (2)
  • Laurence Devillers (2)
  • Catherine Pelachaud (3)

  1. LCPI-ENSAM, Paris, France
  2. LIMSI-CNRS, Orsay, France
  3. LINC, IUT of Montreuil, Univ. Paris 8, Montreuil, France
  4. Department of Mathematics and Computer Science, University of Perugia, Italy
