
Affective Computing and Intelligent Interaction

Lecture Notes in Computer Science, vol. 4738, pp. 501-510

User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

  • Ginevra Castellano, InfoMus Lab, DIST - University of Genova, Viale Causa 13, I-16145, Genova
  • Roberto Bresin, KTH, CSC School of Computer Science and Communication, Dept. of Speech Music and Hearing, Stockholm
  • Antonio Camurri, InfoMus Lab, DIST - University of Genova, Viale Causa 13, I-16145, Genova
  • Gualtiero Volpe, InfoMus Lab, DIST - University of Genova, Viale Causa 13, I-16145, Genova



Abstract

In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control the generation of audio-visual feedback in real time. The system analyses the user's full-body movement and gesture in real time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real time is projected on a screen in front of the users: their coloured silhouette, whose colour depends on the emotion their movement communicates. Human movement analysis and visual feedback generation were implemented with the EyesWeb software platform, and the music performance rendering with pDM. Evaluation tests were carried out with human participants to assess the usability of the interface and the effectiveness of the design.
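The abstract outlines a pipeline from expressive motion features to acoustic rendering parameters. The Python sketch below illustrates that idea under stated assumptions: quantity of motion and a contraction index as example features, and tempo and sound level as the controlled parameters. The function names, feature definitions, and mapping coefficients are illustrative assumptions, not the authors' actual EyesWeb/pDM implementation.

import numpy as np

def quantity_of_motion(silhouettes):
    """Approximate overall motion energy as the mean fraction of
    silhouette pixels that change between consecutive binary frames."""
    diffs = [np.mean(a != b) for a, b in zip(silhouettes, silhouettes[1:])]
    return float(np.mean(diffs)) if diffs else 0.0

def map_to_performance(qom, contraction):
    """Map expressive motion features (both assumed in [0, 1]) onto
    acoustic rendering parameters via one plausible linear mapping."""
    tempo_scale = 0.8 + 0.6 * qom                 # more motion -> faster tempo
    level_db = -12.0 + 18.0 * (1.0 - contraction)  # expanded posture -> louder
    return {"tempo_scale": tempo_scale, "level_db": level_db}

# Usage with synthetic binary silhouette frames (stand-ins for the
# real-time video analysis stage described in the abstract):
frames = [np.random.rand(120, 160) > 0.5 for _ in range(10)]
print(map_to_performance(quantity_of_motion(frames), contraction=0.4))

The resulting parameter dictionary stands in for the control messages that, in the described system, would drive the real-time music performance rendering.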

Keywords

Affective interaction · expressive gesture · multimodal environments · interactive music systems