Human Centered Robot Systems pp 173-182

Part of the Cognitive Systems Monographs book series (COSMOS, volume 6)

Towards Meaningful Robot Gesture

  • Maha Salem
  • Stefan Kopp
  • Ipke Wachsmuth
  • Frank Joublin

Abstract

Humanoid robot companions intended to engage in natural and fluent human-robot interaction should combine speech with non-verbal modalities to achieve comprehensible and believable behavior. We present an approach that enables the humanoid robot ASIMO to flexibly produce and synchronize speech and co-verbal gestures at run-time, without being limited to a predefined repertoire of motor actions. Since this research challenge has already been tackled in various ways within the domain of virtual conversational agents, we build upon the experience gained from developing a speech and gesture production model for our virtual human Max. As one of the most sophisticated multi-modal schedulers, the Articulated Communicator Engine (ACE) replaces lexicons of canned behaviors with on-the-spot production of flexibly planned behavior representations. Using ACE as the underlying action generation architecture, we explain how it draws upon a tight, bi-directional coupling of ASIMO's perceptuo-motor system with multi-modal scheduling, via both efferent control signals and afferent feedback.
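The efferent/afferent coupling described above can be illustrated with a minimal sketch: a scheduler issues a cross-modal timing plan (efferent command) and revises its prediction when the motor system reports the measured execution time (afferent feedback). All class and method names here are hypothetical illustrations, not the actual ACE API.

```python
class MultiModalScheduler:
    """Toy scheduler coupling speech and gesture timing via efferent
    commands and afferent feedback (hypothetical sketch, not ACE)."""

    def __init__(self, speech_duration, gesture_duration):
        self.speech_duration = speech_duration    # planned speech time (s)
        self.gesture_duration = gesture_duration  # predicted gesture time (s)

    def plan(self):
        # Efferent step: delay the gesture onset so its stroke
        # co-occurs with the end of its speech affiliate.
        start_delay = max(0.0, self.speech_duration - self.gesture_duration)
        return {"gesture_start": start_delay}

    def on_feedback(self, actual_gesture_duration):
        # Afferent step: update the duration prediction with the
        # measured execution time, then replan the next utterance.
        self.gesture_duration = actual_gesture_duration
        return self.plan()


sched = MultiModalScheduler(speech_duration=1.2, gesture_duration=0.8)
print(sched.plan())            # initial cross-modal timing
print(sched.on_feedback(1.0))  # replanned after motor feedback
```

The key design point the sketch mirrors is that planning and execution form a closed loop: timing is not fixed at behavior-lexicon lookup time but adjusted continuously from the robot's own motor feedback.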


Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Maha Salem (1)
  • Stefan Kopp (2)
  • Ipke Wachsmuth (3)
  • Frank Joublin (4)

  1. Research Institute for Cognition and Robotics, Bielefeld University, Germany
  2. Sociable Agents Group, Bielefeld University, Germany
  3. Artificial Intelligence Group, Bielefeld University, Germany
  4. Honda Research Institute Europe, Offenbach, Germany
