An Intermediate Expressions’ Generator System in the MPEG-4 Framework

  • Amaryllis Raouzaiou
  • Evaggelos Spyrou
  • Kostas Karpouzis
  • Stefanos Kollias
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3893)

Abstract

A lifelike human face can enhance interactive applications by providing straightforward feedback to and from users and by stimulating emotional responses from them. An expressive, realistic avatar should not "express itself" only within the narrow confines of the six archetypal expressions. In this paper, we present a system that generates intermediate expression profiles (sets of Facial Animation Parameters, FAPs) by combining profiles of the six archetypal expressions, utilizing concepts included in the MPEG-4 standard.
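The core idea of combining archetypal-expression profiles can be illustrated with a short sketch. This is a hypothetical illustration, not the paper's exact algorithm: FAP profiles are assumed to be mappings from FAP indices to displacement values, and the blending rule shown is a simple weighted average; the FAP numbers and values below are placeholders.

```python
# Hypothetical sketch: blend two archetypal-expression FAP profiles into an
# intermediate profile. The weighted-average rule and the FAP indices/values
# are illustrative assumptions, not the paper's exact method.

def blend_profiles(profile_a, profile_b, weight_a=0.5):
    """Combine two FAP profiles (dicts: FAP index -> value) by weighted average.

    A FAP present in only one profile keeps its value scaled by that
    profile's weight, so the intermediate expression inherits traits of both.
    """
    weight_b = 1.0 - weight_a
    fap_ids = set(profile_a) | set(profile_b)
    return {
        fap: weight_a * profile_a.get(fap, 0.0) + weight_b * profile_b.get(fap, 0.0)
        for fap in fap_ids
    }

# Toy "fear" and "surprise" profiles (placeholder FAP numbers and values).
fear = {3: 120.0, 31: -80.0, 35: 60.0}
surprise = {3: 300.0, 31: -160.0, 53: 40.0}

# An intermediate expression leaning 75% toward "fear".
intermediate = blend_profiles(fear, surprise, weight_a=0.75)
print(intermediate)
```

Varying the weight sweeps the avatar continuously between the two archetypal expressions, which is the kind of intermediate-profile generation the abstract describes.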

Keywords

Facial Expression, Intermediate Expression, Group Fear, Narrow Confine, Facial Expression Analysis

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Amaryllis Raouzaiou¹
  • Evaggelos Spyrou¹
  • Kostas Karpouzis¹
  • Stefanos Kollias¹

  1. Image, Video and Multimedia Systems Laboratory, School of Electrical and Computer Engineering, National Technical University of Athens, Athens, Greece
