Multimodal Analysis of Expressive Gesture in Music and Dance Performances

  • Antonio Camurri
  • Barbara Mazzarino
  • Matteo Ricchetti
  • Renee Timmers
  • Gualtiero Volpe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2915)


This paper presents ongoing research on the modelling of expressive gesture in multimodal interaction and on the development of multimodal interactive systems that explicitly take into account the role of non-verbal expressive gesture in the communication process. In this perspective, a particular focus is on dance and music as first-class conveyors of expressive and emotional content. Research outputs include (i) computational models of expressive gesture, (ii) validation by means of continuous ratings collected from spectators exposed to real artistic stimuli, and (iii) novel hardware and software components for the EyesWeb open platform, such as the recently developed Expressive Gesture Processing Library. The paper starts with a definition of expressive gesture. A unifying framework for the analysis of expressive gesture is then proposed. Finally, two experiments on expressive gesture in dance and music are discussed. This research work has been supported by the EU IST project MEGA (Multisensory Expressive Gesture Applications) and the EU MOSART TMR Network.
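The computational models mentioned above extract expressive cues from movement data. As a purely illustrative sketch (not the paper's actual method, and not the EyesWeb library's API), a very simple such cue is the fraction of silhouette pixels that change between consecutive video frames, in the spirit of the motion-template work cited in the references; the function name and thresholds below are hypothetical, and NumPy is assumed:

```python
import numpy as np

def quantity_of_motion(frames, threshold=30):
    """Fraction of pixels changing between consecutive grayscale frames.

    Hypothetical, simplified cue: the Expressive Gesture Processing
    Library computes far richer features; this frame-differencing sketch
    only illustrates the general idea of a movement-activity measure.
    """
    activity = []
    for prev, curr in zip(frames, frames[1:]):
        # Signed difference, then count pixels exceeding the noise threshold.
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        activity.append(float((diff > threshold).sum()) / diff.size)
    return activity

# Synthetic example: a bright "dancer" blob appears, then shifts slightly.
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(3)]
frames[1][10:20, 10:20] = 255   # blob appears (large change)
frames[2][12:22, 12:22] = 255   # blob shifts by 2 px (smaller change)
activity = quantity_of_motion(frames)
```

A continuous curve of such per-frame values is the kind of time series that can then be compared against spectators' continuous ratings.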


Keywords: Correct Classification, Basic Emotion, Automatic Classification, Emotional Intensity, Music Performance




References

  1. Bobick, A.F., Davis, J.: The Recognition of Human Movement Using Temporal Templates. IEEE Transactions on Pattern Analysis and Machine Intelligence 23(3), 257–267 (2001)
  2. Bradski, G., Davis, J.: Motion segmentation and pose recognition with motion history gradients. Machine Vision and Applications 13, 174–184 (2002)
  3. Cadoz, C., Wanderley, M.: Gesture – Music. In: Wanderley, M., Battier, M. (eds.) Trends in Gestural Control of Music (Edition électronique). IRCAM, Paris (2000)
  4. Camurri, A., Lagerlof, I., Volpe, G.: Emotions and cue extraction from dance movements. International Journal of Human Computer Studies 59(1–2), 213–225 (2003)
  5. Camurri, A., De Poli, G., Leman, M.: MEGASE – A Multisensory Expressive Gesture Applications System Environment for Artistic Performances. In: Proc. Intl. Conf. CAST, GMD, St. Augustin-Bonn, pp. 59–62 (2001)
  6. Camurri, A., Hashimoto, S., Ricchetti, M., Trocca, R., Suzuki, K., Volpe, G.: EyesWeb – Toward Gesture and Affect Recognition in Interactive Dance and Music Systems. Computer Music Journal 24(1), 57–69 (2000)
  7. Canazza, S., De Poli, G., Drioli, C., Rodà, A., Vidolin, A.: Audio morphing different expressive intentions for Multimedia Systems. IEEE Multimedia 3, 79–83 (2000)
  8. Chi, D., Costa, M., Zhao, L., Badler, N.: The EMOTE model for Effort and Shape. In: ACM SIGGRAPH 2000, New Orleans, LA, pp. 173–182 (2000)
  9. Clarke, E.F., Davidson, J.W.: The body in music as mediator between knowledge and action. In: Thomas, W. (ed.) Composition, Performance, Reception: Studies in the Creative Process in Music, pp. 74–92. Oxford University Press, Oxford (1998)
  10. Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., Taylor, J.: Emotion Recognition in Human-Computer Interaction. IEEE Signal Processing Magazine (1) (2001)
  11. Friberg, A., Colombo, V., Frydén, L., Sundberg, J.: Generating Musical Performances with Director Musices. Computer Music Journal 24(3), 23–29 (2000)
  12. Hashimoto, S.: KANSEI as the Third Target of Information Processing and Related Topics in Japan. In: Camurri, A. (ed.) Proceedings of the International Workshop on KANSEI: The Technology of Emotion. AIMI (Italian Computer Music Association) and DIST – University of Genova, pp. 101–104 (1997)
  13. Kilian, J.: Simple Image Analysis By Moments. OpenCV library documentation (2001)
  14. Krumhansl, C.L.: A perceptual analysis of Mozart's piano sonata K.282: Segmentation, tension and musical ideas. Music Perception 13(3), 401–432 (1996)
  15. Krumhansl, C.L.: Can dance reflect the structural and expressive qualities of music? A perceptual experiment on Balanchine's choreography of Mozart's Divertimento. Musicae Scientiae 1(15), 63–85 (1997)
  16. Kurtenbach, Hulteen: Gesture in Human-Computer Interaction. In: Laurel, B. (ed.) The Art of Human-Computer Interface Design (1990)
  17. Laban, R., Lawrence, F.C.: Effort. Macdonald & Evans Ltd., London (1947)
  18. Laban, R.: Modern Educational Dance. Macdonald & Evans Ltd., London (1963)
  19. Lagerlof, I., Djerf, M.: On cue utilization for emotion expression in dance movements. Manuscript in preparation, Department of Psychology, University of Uppsala (2001)
  20. Liu, Y., Collins, R.T., Tsin, Y.: Gait Sequence Analysis using Frieze Patterns. In: Heyden, A., Sparr, G., Nielsen, M., Johansen, P. (eds.) ECCV 2002. LNCS, vol. 2351, pp. 657–671. Springer, Heidelberg (2002)
  21. Lucas, B., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proceedings of the International Joint Conference on Artificial Intelligence (1981)
  22. McNeill, D.: Hand and Mind: What Gestures Reveal About Thought. University of Chicago Press, Chicago (1992)
  23. Palmer, C.: Music Performance. Annual Review of Psychology 48, 115–138 (1997)
  24. Pollick, F.E., Paterson, H., Bruderlin, A., Sanford, A.J.: Perceiving affect from arm movement. Cognition 82, B51–B61 (2001)
  25. Scherer, K.R.: Why music does not produce basic emotions: pleading for a new approach to measuring the emotional effects of music. In: Proc. Stockholm Music Acoustics Conference SMAC 2003, KTH, Stockholm, Sweden, pp. 25–28 (2003)
  26. Sloboda, J.A., Lehmann, A.C.: Tracking performance correlates of changes in perceived intensity of emotion during different interpretations of a Chopin piano prelude. Music Perception 19(1), 87–120 (2001)
  27. Timmers, R.: Freedom and constraints in timing and ornamentation: investigations of music performance. Shaker Publishing, Maastricht (2002)
  28. Wanderley, M., Battier, M. (eds.): Trends in Gestural Control of Music (Edition électronique). IRCAM, Paris (2000)
  29. Zhao, L.: Synthesis and Acquisition of Laban Movement Analysis Qualitative Parameters for Communicative Gestures. Ph.D. Dissertation, University of Pennsylvania (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

All authors: InfoMus Lab, DIST – University of Genova, Genova, Italy
