Corpus Creation and Perceptual Evaluation of Expressive Theatrical Gestures

  • Pamela Carreno-Medrano
  • Sylvie Gibet
  • Caroline Larboulette
  • Pierre-François Marteau
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8637)

Abstract

While human communication involves rich, complex, and expressive gestures, the motion-capture corpora currently available for animating virtual characters mostly contain actions ranging from locomotion to everyday activities. We aim to create a novel corpus of expressive and meaningful gestures, focusing on the body movements and gestures involved in theatrical scenarios. In this paper we propose a methodology for building a corpus of full-body theatrical gestures based on a magician's show enriched with affective content.
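To make the corpus-building step concrete, the sketch below shows one way such a collection of captured takes could be organized for later study. It is a minimal illustration, not the paper's format: the clip names, emotion labels, and file paths are all assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CorpusClip:
    """One captured take: a theatrical gesture performed with a given affect."""
    gesture: str     # e.g. "card_flourish"; the gesture names are illustrative
    emotion: str     # affective variant of the take, e.g. "happy"
    actor_id: str
    mocap_file: str  # path to the recording (e.g. a .c3d or .bvh file)

# A toy corpus; real entries would come from the captured magician takes.
corpus = [
    CorpusClip("card_flourish", "happy", "A1", "takes/a1_flourish_happy.c3d"),
    CorpusClip("card_flourish", "sad",   "A1", "takes/a1_flourish_sad.c3d"),
]

# Index clips by emotion so perceptual-study stimuli can be sampled per affect.
by_emotion: dict[str, list[CorpusClip]] = {}
for clip in corpus:
    by_emotion.setdefault(clip.emotion, []).append(clip)
```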

We then validate the constructed corpus of theatrical gestures and sequences through several perceptual studies that assess both the complexity of the produced movements and the recognizability of the added affective content.
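As an illustration of how such a recognizability study might be scored, the sketch below aggregates forced-choice emotion judgments into a confusion matrix and per-emotion recognition rates. The response format, emotion set, and data are assumptions for illustration, not the paper's actual protocol.

```python
from collections import Counter

# Hypothetical forced-choice responses as (intended, perceived) emotion pairs;
# the emotion set and the data below are illustrative, not the paper's.
EMOTIONS = ["neutral", "happy", "sad", "angry"]
responses = [
    ("happy", "happy"), ("happy", "neutral"), ("sad", "sad"),
    ("sad", "sad"), ("angry", "angry"), ("angry", "sad"),
]

counts = Counter(responses)

# Confusion matrix: rows = intended emotion, columns = perceived emotion.
matrix = {
    intended: [counts[(intended, perceived)] for perceived in EMOTIONS]
    for intended in EMOTIONS
}

# Per-emotion recognition rate: share of trials in which viewers named the
# intended emotion.
for intended, row in matrix.items():
    total = sum(row)
    if total:
        hits = row[EMOTIONS.index(intended)]
        print(f"{intended}: {hits}/{total} recognized ({hits / total:.0%})")
```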

Keywords

Emotional State · Video Clip · Motion Capture · Daily Gesture · Virtual Character

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Pamela Carreno-Medrano¹
  • Sylvie Gibet¹
  • Caroline Larboulette¹
  • Pierre-François Marteau¹

  1. Université de Bretagne Sud, IRISA, Bâtiment ENSIBS, Vannes, France
