Automatic Classification of Expressive Hand Gestures on Tangible Acoustic Interfaces According to Laban’s Theory of Effort
- Cite this paper as:
- Camurri A., Canepa C., Ghisio S., Volpe G. (2009) Automatic Classification of Expressive Hand Gestures on Tangible Acoustic Interfaces According to Laban’s Theory of Effort. In: Sales Dias M., Gibet S., Wanderley M.M., Bastos R. (eds) Gesture-Based Human-Computer Interaction and Simulation. GW 2007. Lecture Notes in Computer Science, vol 5085. Springer, Berlin, Heidelberg
Tangible Acoustic Interfaces (TAIs) exploit the propagation of sound in physical objects in order to localize touch positions and to analyse users' gestures on the object. Designing and developing TAIs consists of exploring how physical objects, augmented surfaces, and spaces can be transformed into tangible-acoustic embodiments of natural, seamless, unrestricted interfaces. Our research focuses on Expressive TAIs, i.e., TAIs able to process users' expressive gestures and to provide users with natural multimodal interfaces that fully exploit expressive, emotional content. This paper presents a concrete example of analysis of expressive gesture in TAIs: hand gestures on a TAI surface are classified according to the Space and Time dimensions of Rudolf Laban's Theory of Effort. Research started in the EU-IST Project TAI-CHI (Tangible Acoustic Interfaces for Computer-Human Interaction) and is currently continuing in the EU-ICT Project SAME (Sound and Music for Everyone, Everyday, Everywhere, Every way, www.sameproject.eu). Expressive gesture analysis and multimodal and cross-modal processing are carried out in the new EyesWeb XMI open platform (available at www.eyesweb.org) by means of a new version of the EyesWeb Expressive Gesture Processing Library.
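The Space/Time classification of a hand-gesture trajectory could be sketched roughly as below. The directness index (straight-line distance over travelled path length) is a feature used in Camurri et al.'s expressive gesture analysis work for the Space dimension; the mean-speed proxy for the Time dimension and both threshold values here are illustrative assumptions, not the features or parameters reported in the paper.

```python
import math

def directness_index(points):
    """Space dimension cue: ratio of the straight-line distance between
    the first and last touch positions to the total path length.
    Values near 1 suggest a Direct gesture; lower values, Flexible."""
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    if path == 0:
        return 1.0
    return math.dist(points[0], points[-1]) / path

def classify_space_time(points, duration, space_thr=0.8, speed_thr=0.5):
    """Classify a 2-D touch trajectory on Laban's Space and Time axes.
    `points` is a list of (x, y) samples; `duration` is in seconds.
    Both thresholds are hypothetical placeholders for illustration."""
    space = "Direct" if directness_index(points) >= space_thr else "Flexible"
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    # Time dimension cue: mean speed as a crude proxy for Sudden vs Sustained.
    speed = path / duration if duration > 0 else 0.0
    time = "Sudden" if speed >= speed_thr else "Sustained"
    return space, time
```

For example, a fast straight stroke would come out as ("Direct", "Sudden"), while a slow zigzag would come out as ("Flexible", "Sustained"); in the actual system such features are extracted from acoustically localized touch positions on the TAI surface.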
Keywords: expressive gesture · tangible acoustic interfaces · natural interfaces · multimodal interactive systems · multimodal analysis of expressive movement