The Use of Scripts Based on Conceptual Dependency Primitives for the Operation of Service Mobile Robots

  • Jesus Savage
  • Alfredo Weitzenfeld
  • Francisco Ayala
  • Sergio Cuellar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5399)

Abstract

This paper describes a human-robot interaction subsystem that is part of the ViRbot, a robotics architecture used to control the operation of service mobile robots. The human-robot interface subsystem consists of three modules: Natural Language Understanding, Speech Generation, and Robot Facial Expressions. To demonstrate the utility of this human-robot interaction subsystem, a set of applications is presented that allows a user to command a mobile robot through spoken commands. The mobile robot accomplishes the required commands using an action planner and reactive behaviors. In the ViRbot architecture, the action planner module uses Conceptual Dependency (CD) primitives as the basis for representing the problem domain. After a command is spoken, a CD representation of it is generated; a rule-based system takes this CD representation and, using the state of the environment, generates further subtasks, also represented as CDs, to accomplish the command. The paper also presents how to represent context through scripts. Scripts make it easy to draw inferences about events for which the information is incomplete or ambiguous; they encode common-sense knowledge and fill the gaps between seemingly unrelated events.
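
To make the pipeline described above concrete, the sketch below (Python, not taken from the paper) shows how a spoken command such as "bring the cup to the user" might be encoded as a CD structure built on Schank's primitives and then expanded by a rule base into subtasks, with script-like common-sense steps filling the gaps. The data structures, the single rule, and the toy world model are illustrative assumptions, not the ViRbot's actual representation.

    # Illustrative sketch only (not the paper's implementation): a spoken
    # command encoded as a Conceptual Dependency (CD) structure and expanded
    # into subtasks by a small rule base. The CD class, expand() rule, and
    # world model are assumptions; only the primitive names (ATRANS, PTRANS,
    # GRASP) come from Schank's CD theory.
    from dataclasses import dataclass

    @dataclass
    class CD:
        primitive: str          # e.g. "PTRANS" = physical transfer of location
        actor: str
        obj: str
        destination: str = ""

    def expand(cd: CD, world: dict) -> list:
        """Toy rule base: rewrite a high-level CD into executable subtasks,
        filling script-like gaps (go to the object, grasp it, return, hand over)."""
        if cd.primitive != "ATRANS":        # only one rule in this sketch
            return [cd]
        obj_location = world.get(cd.obj, "unknown")
        return [
            CD("PTRANS", cd.actor, cd.actor, destination=obj_location),
            CD("GRASP",  cd.actor, cd.obj),
            CD("PTRANS", cd.actor, cd.actor, destination=cd.destination),
            CD("ATRANS", cd.actor, cd.obj,   destination=cd.destination),
        ]

    # "Robot, bring the cup to the user" -> ATRANS(robot, cup, -> user)
    world = {"cup": "kitchen"}
    for step in expand(CD("ATRANS", "robot", "cup", destination="user"), world):
        print(step)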

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Jesus Savage (1)
  • Alfredo Weitzenfeld (2)
  • Francisco Ayala (1)
  • Sergio Cuellar (1)
  1. Bio-Robotics Laboratory, Department of Electrical Engineering, Universidad Nacional Autonoma de Mexico (UNAM), Mexico City, Mexico
  2. CANNES, Department of Computer Engineering, ITAM, Mexico City, Mexico
