Cognition, Technology & Work

Volume 6, Issue 1, pp 15–22

Expressive interfaces

  • Antonio Camurri
  • Barbara Mazzarino
  • Gualtiero Volpe
Original Article

Abstract

Analysis of expressiveness in human gesture can lead to new paradigms for the design of improved human-machine interfaces, thus enhancing users’ participation and experience in mixed reality applications and context-aware mediated environments. The development of expressive interfaces able to decode the affective information that gestures convey opens novel perspectives in the design of interactive multimedia systems in several application domains: performing arts, museum exhibits, edutainment, entertainment, therapy, and rehabilitation. This paper describes recent developments in our research on expressive interfaces by presenting computational models and algorithms for the real-time analysis of expressive gestures in human full-body movement. Such analysis is discussed both as an example and as a basic component for the development of effective expressive interfaces. As a concrete result of our research, a software platform named EyesWeb was developed (http://www.eyesweb.org). Besides supporting research, EyesWeb has also been employed as a tool and open platform for developing real-time interactive applications.
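To give a flavour of the kind of low-level movement cue that such real-time analysis can build on, the sketch below estimates a simple "quantity of motion" measure from binary silhouette frames by accumulating frame-to-frame differences over a short window. This is an illustrative sketch only: EyesWeb itself is a visual dataflow environment, and the function name, window size, and normalisation used here are assumptions for illustration, not the platform's implementation.

```python
import numpy as np

def quantity_of_motion(silhouettes, window=5):
    """Coarse motion cue from a sequence of binary silhouette frames.

    The cue is the area covered by frame-to-frame changes over the last
    `window` frames, normalised by the current silhouette area so that it
    is roughly independent of the subject's distance from the camera.
    Illustrative sketch only, not the EyesWeb implementation.
    """
    # Pixel-wise changes between consecutive silhouettes in the window
    diffs = [np.logical_xor(a, b)
             for a, b in zip(silhouettes[-window - 1:-1],
                             silhouettes[-window:])]
    motion_area = np.logical_or.reduce(diffs).sum()   # union of changed pixels
    body_area = max(silhouettes[-1].sum(), 1)         # avoid division by zero
    return motion_area / body_area

# Usage example: a synthetic rectangular "silhouette" shifting to the right
frames = [np.zeros((120, 160), dtype=bool) for _ in range(8)]
for t, f in enumerate(frames):
    f[40:80, 50 + t:90 + t] = True
print(quantity_of_motion(frames))   # larger values indicate more overall motion
```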

Keywords

Expressive gesture · Interactive multimedia systems · Expressiveness in performing arts · Human–computer interaction

Acknowledgments

We thank our colleagues at the DIST – InfoMus Lab and particularly Paolo Coletta, Massimiliano Peri, Matteo Ricchetti, Andrea Ricci, and Riccardo Trocca. We thank Ingrid Lagerlöf and Marie Djerf from the University of Uppsala who collaborated in the analysis of basic emotions in dance fragments.

This research is partially funded by the EU IST Project MEGA (Multisensory Expressive Gesture Applications) no. IST-1999-20410 (http://www.megaproject.org).


Copyright information

© Springer-Verlag London Limited 2004

Authors and Affiliations

  • Antonio Camurri (1)
  • Barbara Mazzarino (1)
  • Gualtiero Volpe (1)

  1. InfoMus Lab – Laboratorio di Informatica Musicale, DIST – University of Genova, Genova, Italy
