User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

  • Ginevra Castellano
  • Roberto Bresin
  • Antonio Camurri
  • Gualtiero Volpe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4738)

Abstract

In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control the generation of audio-visual feedback in real time. The system analyses the user's full-body movement and gesture in real time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real time is projected on a screen in front of the users: their silhouette is coloured according to the emotion their movement communicates. Human movement analysis and visual feedback generation were implemented with the EyesWeb software platform, and the music performance rendering with pDM. Evaluation tests were conducted with human participants to assess the usability of the interface and the effectiveness of the design.
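The abstract describes a pipeline that maps expressive motion features onto acoustic performance parameters. The sketch below is purely illustrative: the paper's actual system uses EyesWeb for motion analysis and pDM for rendering, and the feature names (quantity of motion, contraction index) and the linear mapping rules here are hypothetical stand-ins, not the authors' mapping.

```python
# Illustrative sketch only. The real system maps EyesWeb motion features to
# pDM performance-rule parameters; the rules below are invented examples.

def map_motion_to_performance(quantity_of_motion, contraction_index):
    """Map normalised motion features (0..1) to acoustic control values.

    quantity_of_motion: overall amount of detected movement (0 = still).
    contraction_index: how contracted the body posture is (1 = contracted).
    Returns a dict of control parameters for a performance renderer.
    """
    if not (0.0 <= quantity_of_motion <= 1.0 and 0.0 <= contraction_index <= 1.0):
        raise ValueError("features must be normalised to [0, 1]")

    # Hypothetical rule: more movement -> faster tempo and louder dynamics.
    tempo_scale = 0.7 + 0.6 * quantity_of_motion        # 0.7x .. 1.3x
    sound_level_db = -12.0 + 18.0 * quantity_of_motion  # -12 dB .. +6 dB
    # Hypothetical rule: an open (expanded) posture -> more detached articulation.
    articulation = 0.4 + 0.6 * (1.0 - contraction_index)  # 0.4 .. 1.0
    return {
        "tempo_scale": round(tempo_scale, 3),
        "sound_level_db": round(sound_level_db, 3),
        "articulation": round(articulation, 3),
    }

print(map_motion_to_performance(0.5, 0.5))
```

In a real-time setting such a mapping would be re-evaluated on every analysis frame, with smoothing between frames to avoid audible jumps in the rendered performance.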

Keywords

Affective interaction · Expressive gesture · Multimodal environments · Interactive music systems


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Ginevra Castellano (1)
  • Roberto Bresin (2)
  • Antonio Camurri (1)
  • Gualtiero Volpe (1)
  1. InfoMus Lab, DIST - University of Genova, Viale Causa 13, I-16145 Genova, Italy
  2. KTH, CSC School of Computer Science and Communication, Dept. of Speech, Music and Hearing, Stockholm, Sweden
