Analysis of expressiveness in human gesture can lead to new paradigms for the design of improved human–machine interfaces, thus enhancing users' participation and experience in mixed reality applications and context-aware mediated environments. The development of expressive interfaces that decode the highly affective information gestures convey opens novel perspectives for the design of interactive multimedia systems in several application domains: performing arts, museum exhibits, edutainment, entertainment, therapy, and rehabilitation. This paper describes recent developments in our research on expressive interfaces by presenting computational models and algorithms for the real-time analysis of expressive gestures in human full-body movement. Such analysis is discussed both as an example and as a basic component for the development of effective expressive interfaces. As a concrete result of our research, a software platform named EyesWeb was developed (http://www.eyesweb.org). Besides supporting research, EyesWeb has also been employed as a practical tool and open platform for developing real-time interactive applications.
Keywords: Expressive gesture · Interactive multimedia systems · Expressiveness in performing arts · Human–computer interaction
We thank our colleagues at the DIST – InfoMus Lab, particularly Paolo Coletta, Massimiliano Peri, Matteo Ricchetti, Andrea Ricci, and Riccardo Trocca. We also thank Ingrid Lagerlöf and Marie Djerf from Uppsala University, who collaborated in the analysis of basic emotions in dance fragments.
This research is partially funded by the EU IST Project MEGA (Multisensory Expressive Gesture Applications) no. IST-1999–20410 (http://www.megaproject.org).