
RUNA: a multimodal command language for home robot users

  • Original Article
  • Journal: Artificial Life and Robotics

Abstract

This article describes a multimodal command language for home robot users, and a robot system that interprets users’ messages in the language through microphones, visual and tactile sensors, and control buttons. The command language comprises a set of grammar rules, a lexicon, and nonverbal events detected in hand gestures, in readings of tactile sensors attached to the robots, and in buttons on the controllers in the users’ hands. Prototype humanoid systems that immediately execute commands in the language are also presented, along with preliminary experiments on face-to-face interaction and teleoperation. Subjects unfamiliar with the language were able to command the humanoids and complete their tasks with only brief documents at hand, given a short demonstration beforehand. The command understanding system, running on PCs, responded to multimodal commands without significant delay.
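The abstract describes commands that combine a spoken utterance (parsed against grammar rules and a lexicon) with time-aligned nonverbal events such as pointing gestures, tactile-sensor readings, and button presses. The sketch below is a hypothetical illustration of that kind of fusion, not the authors' implementation: a deictic spoken command ("bring that") is resolved by the nearest nonverbal event within a small time window. All names (`SpokenCommand`, `NonverbalEvent`, `fuse`, the two-second window) are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of multimodal command fusion: a spoken command
# whose target is deictic ("that") is completed by a nearby nonverbal
# event (hand gesture, tactile-sensor reading, or button press).

@dataclass
class NonverbalEvent:
    kind: str          # e.g. "point", "touch", "button"
    value: str         # e.g. an object identifier the event designates
    timestamp: float   # seconds

@dataclass
class SpokenCommand:
    action: str                # verb from the lexicon, e.g. "bring"
    target: Optional[str]      # noun, or None for a deictic word ("that")
    timestamp: float           # seconds

def fuse(cmd: SpokenCommand, events: list[NonverbalEvent],
         window: float = 2.0) -> Optional[dict]:
    """Resolve a deictic target using the nearest qualifying nonverbal
    event within `window` seconds of the utterance; return None if the
    command stays incomplete (the robot would ask the user to repeat)."""
    if cmd.target is not None:
        return {"action": cmd.action, "target": cmd.target}
    candidates = [e for e in events
                  if abs(e.timestamp - cmd.timestamp) <= window
                  and e.kind in ("point", "touch")]
    if not candidates:
        return None
    nearest = min(candidates, key=lambda e: abs(e.timestamp - cmd.timestamp))
    return {"action": cmd.action, "target": nearest.value}

cmd = SpokenCommand(action="bring", target=None, timestamp=10.0)
events = [NonverbalEvent(kind="point", value="red_cup", timestamp=9.4)]
print(fuse(cmd, events))  # → {'action': 'bring', 'target': 'red_cup'}
```

A real system would replace the flat event list with sensor streams and add a dialogue step when `fuse` returns `None`, but the windowed nearest-event rule captures the basic idea of pairing speech with nonverbal input.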



Author information


Corresponding author

Correspondence to Tetsushi Oka.

Additional information

This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008

About this article

Cite this article

Oka, T., Abe, T., Sugita, K. et al. RUNA: a multimodal command language for home robot users. Artif Life Robotics 13, 455–459 (2009). https://doi.org/10.1007/s10015-008-0603-8

