Automatic Classification of Expressive Hand Gestures on Tangible Acoustic Interfaces According to Laban’s Theory of Effort

  • Antonio Camurri
  • Corrado Canepa
  • Simone Ghisio
  • Gualtiero Volpe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5085)

Abstract

Tangible Acoustic Interfaces (TAIs) exploit the propagation of sound in physical objects in order to localize touch positions and to analyse users’ gestures on the object. Designing and developing TAIs consists of exploring how physical objects, augmented surfaces, and spaces can be transformed into tangible-acoustic embodiments of natural, seamless, unrestricted interfaces. Our research focuses on Expressive TAIs, i.e., TAIs able to process expressive user gestures and to provide users with natural multimodal interfaces that fully exploit expressive, emotional content. This paper presents a concrete example of expressive gesture analysis in TAIs: hand gestures on a TAI surface are classified according to the Space and Time dimensions of Rudolf Laban’s Theory of Effort. This research started in the EU-IST Project TAI-CHI (Tangible Acoustic Interfaces for Computer-Human Interaction) and is continuing in the EU-ICT Project SAME (Sound and Music for Everyone, Everyday, Everywhere, Every way, www.sameproject.eu). Expressive gesture analysis and multimodal and cross-modal processing are achieved in the new EyesWeb XMI open platform (available at www.eyesweb.org) by means of a new version of the EyesWeb Expressive Gesture Processing Library.
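The classification along Laban’s Space (direct vs. flexible) and Time (sudden vs. sustained) Effort dimensions can be illustrated with a minimal sketch. This is not the paper’s actual algorithm: the directness index (straight-line distance over travelled path length) and a peak-to-mean velocity ratio are common proxies in expressive gesture analysis, and the function name and thresholds below are arbitrary placeholders chosen for illustration.

```python
import math

def classify_laban_effort(points, timestamps):
    """Classify a 2D touch trajectory along Laban's Space and Time
    Effort dimensions.

    points     -- list of (x, y) touch positions on the TAI surface
    timestamps -- strictly increasing times (seconds), one per point
    Returns a (space, time) pair, e.g. ("direct", "sudden").
    Thresholds (0.8, 2.0) are illustrative, not from the paper.
    """
    # Path length: sum of segment lengths along the trajectory.
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    # Straight-line distance between first and last touch position.
    chord = math.dist(points[0], points[-1])
    # Directness index: 1.0 for a perfectly straight stroke.
    directness = chord / path if path > 0 else 1.0

    # Per-segment speeds from consecutive timestamps.
    speeds = [math.dist(points[i], points[i + 1]) /
              (timestamps[i + 1] - timestamps[i])
              for i in range(len(points) - 1)]
    mean_speed = sum(speeds) / len(speeds)
    # A sharp velocity peak relative to the mean suggests suddenness.
    peak_ratio = max(speeds) / mean_speed if mean_speed > 0 else 1.0

    space = "direct" if directness > 0.8 else "flexible"
    time = "sudden" if peak_ratio > 2.0 else "sustained"
    return space, time
```

For example, a straight stroke with one abrupt burst of speed would be labelled ("direct", "sudden"), while a slow zig-zag would come out ("flexible", "sustained").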

Keywords

expressive gesture, tangible acoustic interfaces, natural interfaces, multimodal interactive systems, multimodal analysis of expressive movement


Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Antonio Camurri (1)
  • Corrado Canepa (1)
  • Simone Ghisio (1)
  • Gualtiero Volpe (1)

  1. InfoMus Lab, DIST, University of Genova, Genova, Italy
