Gesture Synthesis in a Real-World ECA

  • Patrick Olivier
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3068)

Abstract

We address the issue of spontaneous gesture synthesis for embodied conversational agents (ECAs): the generation of appropriate gestures and their coordination with spoken utterances. After characterizing the application constraints, we establish the principal requirements of the gesture generation framework. We then demonstrate how these requirements can be met by formulating gesture generation as a real-time search through gesture space (in fact, gesture and facial expression space) under constraints arising from the graphical model of the character and the linguistic properties of the utterance.
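The abstract's formulation of gesture generation as constrained search can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the gesture inventory, the semantic-tag matching, and all names (`Gesture`, `select_gesture`, `GESTURE_SPACE`) are hypothetical assumptions standing in for the actual character model and linguistic constraints.

```python
# Hypothetical sketch: gesture selection as a search over a gesture space,
# constrained by the character model (animation duration must fit the
# speech slot) and the utterance (semantic overlap). Not the paper's code.
from dataclasses import dataclass

@dataclass(frozen=True)
class Gesture:
    name: str
    semantic_tags: frozenset  # meanings the gesture can convey
    duration: float           # seconds the animation takes to play

# Assumed gesture inventory for the character model (illustrative only).
GESTURE_SPACE = [
    Gesture("beat", frozenset({"emphasis"}), 0.4),
    Gesture("point", frozenset({"deixis", "location"}), 0.6),
    Gesture("open_palms", frozenset({"offer", "emphasis"}), 0.8),
]

def select_gesture(utterance_tags, slot_duration):
    """Search the gesture space for the gesture that best matches the
    utterance's linguistic properties and fits the available time slot."""
    # Character-model constraint: the animation must fit the speech slot.
    candidates = [g for g in GESTURE_SPACE if g.duration <= slot_duration]
    # Linguistic constraint: score by overlap with the utterance's tags.
    scored = [(len(g.semantic_tags & utterance_tags), g) for g in candidates]
    scored = [(s, g) for s, g in scored if s > 0]
    if not scored:
        return None  # no appropriate gesture; remain still
    return max(scored, key=lambda sg: sg[0])[1]
```

For example, an utterance tagged with deixis ("over *there*") would select the pointing gesture, while a short emphatic slot would fall back to a beat gesture. A real-time system would run such a search incrementally, synchronized with the speech timeline.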



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Patrick Olivier
  1. Lexicle, Innovation Centre, York Science Park, York, UK
