Realistic Emotional Gaze and Head Behavior Generation Based on Arousal and Dominance Factors

  • Cagla Cig
  • Zerrin Kasap
  • Arjan Egges
  • Nadia Magnenat-Thalmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6459)


Current state-of-the-art virtual characters fall far short of characters produced by skilled animators in terms of behavioral adequacy. This is due in large part to the lack of emotional expressivity in physical behaviors. Our approach is to develop emotionally expressive gaze and head movement models that are driven parametrically in real-time by the instantaneous mood of an embodied conversational agent (ECA). A user study was conducted to test the perceived emotional expressivity of the facial animation sequences generated by these models. The results showed that changes in gaze and head behavior combined can be used to express changes in arousal and/or dominance level of the ECA successfully.
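The abstract describes gaze and head movement models driven parametrically by the agent's instantaneous arousal and dominance levels. As a purely illustrative sketch (not the authors' actual model), one might map the two mood factors, each in [-1, 1], onto a handful of animation parameters; all parameter names, ranges, and coefficients below are invented for the example.

```python
# Hypothetical sketch of a parametric mood-to-behavior mapping for an ECA.
# Not the paper's model: parameter names, ranges, and coefficients are
# assumptions made up for illustration.
from dataclasses import dataclass


@dataclass
class GazeHeadParams:
    blink_rate_hz: float        # blinks per second
    head_tilt_deg: float        # positive = upward tilt (reads as dominant)
    gaze_aversion_ratio: float  # fraction of time spent looking away
    head_speed_scale: float     # relative head-movement velocity


def params_from_mood(arousal: float, dominance: float) -> GazeHeadParams:
    """Linearly interpolate gaze/head parameters from mood factors in [-1, 1]."""
    a = max(-1.0, min(1.0, arousal))
    d = max(-1.0, min(1.0, dominance))
    return GazeHeadParams(
        blink_rate_hz=0.3 + 0.2 * a,         # higher arousal -> faster blinking
        head_tilt_deg=10.0 * d,              # dominant -> head tilted upward
        gaze_aversion_ratio=0.4 - 0.25 * d,  # dominant -> more direct gaze
        head_speed_scale=1.0 + 0.5 * a,      # aroused -> faster head motion
    )


p = params_from_mood(arousal=0.8, dominance=-0.5)
print(p.blink_rate_hz, p.head_tilt_deg)  # -> 0.46 -5.0
```

In a real-time system such a mapping would be evaluated every frame from the ECA's current mood state, so behavior shifts continuously as arousal or dominance changes.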


Keywords: emotional gaze and head behavior · expressive animation · virtual humans · facial animation · natural motion simulation





Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Cagla Cig (2)
  • Zerrin Kasap (1)
  • Arjan Egges (2)
  • Nadia Magnenat-Thalmann (1)

  1. MIRALab, University of Geneva, Switzerland
  2. Department of Information and Computing Sciences, Universiteit Utrecht, The Netherlands
