Gesture-Based Communication in Human-Computer Interaction

Volume 2915 of the series Lecture Notes in Computer Science pp 20-39

Multimodal Analysis of Expressive Gesture in Music and Dance Performances

  • Antonio Camurri — InfoMus Lab, DIST, University of Genova
  • Barbara Mazzarino — InfoMus Lab, DIST, University of Genova
  • Matteo Ricchetti — InfoMus Lab, DIST, University of Genova
  • Renee Timmers — InfoMus Lab, DIST, University of Genova
  • Gualtiero Volpe — InfoMus Lab, DIST, University of Genova

Abstract

This paper presents ongoing research on the modelling of expressive gesture in multimodal interaction and on the development of multimodal interactive systems that explicitly take into account the role of non-verbal expressive gesture in the communication process. From this perspective, a particular focus is on dance and music as first-class conveyors of expressive and emotional content. Research outputs include (i) computational models of expressive gesture, (ii) validation by means of continuous ratings collected from spectators exposed to real artistic stimuli, and (iii) novel hardware and software components for the EyesWeb open platform (www.eyesweb.org), such as the recently developed Expressive Gesture Processing Library. The paper starts with a definition of expressive gesture. A unifying framework for the analysis of expressive gesture is then proposed. Finally, two experiments on expressive gesture in dance and music are discussed. This research has been supported by the EU IST project MEGA (Multisensory Expressive Gesture Applications, www.megaproject.org) and by the EU MOSART TMR Network.