Evaluating Emotional Content of Acted and Algorithmically Modified Motions

  • Klaus Lehtonen
  • Tapio Takala
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6758)

Abstract

Motion capture is a common method for creating expressive motions for animated characters. To add flexibility when reusing motion data, many ways of modifying its style have been developed. However, thorough evaluation of the resulting motions is often omitted. In this paper we present a questionnaire-based method for evaluating visible emotions and styles in animated motion, and a set of algorithmic modifications to add emotional content to captured motion. Modifications were made by adjusting posture, motion path lengths, and timings. The evaluation method was then applied to a set of acted and modified motions. The results show that a simple questionnaire is a useful tool for comparing motions, and that the expressivity of some emotions can be controlled by the proposed algorithms. However, we also found that motions should be evaluated along several descriptive dimensions simultaneously, as a single modification may have complex visible effects on the motion.
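The abstract names three kinds of algorithmic modification: posture, motion path length, and timing. As a rough illustration of that idea (this is a minimal sketch under assumed data layout, not the authors' actual implementation), a clip can be represented as a sequence of joint-angle frames; scaling each joint's deviation from a neutral rest pose changes posture and spatial extent, and resampling the frame sequence changes timing:

```python
# Illustrative sketch only: the data layout, function names, and parameters
# are assumptions, not the method described in the paper.
# A motion clip is a list of frames; each frame is a list of joint angles.

def scale_amplitude(clip, factor, rest_pose):
    """Exaggerate or dampen posture by scaling each joint's deviation
    from a neutral rest pose (factor > 1 -> more expansive motion)."""
    return [
        [rest + factor * (angle - rest)
         for angle, rest in zip(frame, rest_pose)]
        for frame in clip
    ]

def retime(clip, speed):
    """Uniformly retime a clip by linear resampling
    (speed > 1 -> faster, i.e. fewer output frames)."""
    n_out = max(2, round(len(clip) / speed))
    out = []
    for i in range(n_out):
        t = i * (len(clip) - 1) / (n_out - 1)   # fractional source index
        lo = int(t)
        hi = min(lo + 1, len(clip) - 1)
        w = t - lo                               # interpolation weight
        out.append([(1 - w) * a + w * b
                    for a, b in zip(clip[lo], clip[hi])])
    return out

# Example: a 5-frame, 2-joint clip made wider and faster.
rest = [0.0, 0.0]
clip = [[0.1 * i, -0.05 * i] for i in range(5)]
modified = retime(scale_amplitude(clip, 1.5, rest), 2.0)
```

Real modifications of this kind would operate on skeletal hierarchies with joint limits and smooth (e.g. spline-based) time warps rather than uniform linear resampling; the sketch only shows the basic shape of amplitude and timing edits.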

Keywords

computer animation · motion capture · emotional motion · evaluation of motion style · motion editing



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Klaus Lehtonen¹
  • Tapio Takala¹

  1. School of Science, Department of Media Technology, Aalto University, Espoo, Finland
