Autonomous Robots, 22:87

A humanoid robot that pretends to listen to route guidance from a human

  • Takayuki Kanda
  • Masayuki Kamasima
  • Michita Imai
  • Tetsuo Ono
  • Daisuke Sakamoto
  • Hiroshi Ishiguro
  • Yuichiro Anzai

Abstract

This paper reports findings for a humanoid robot that expresses its listening attitude and understanding to humans by effectively using its body properties in a route guidance situation. A human teaches a route to the robot, and the robot behaves like a human listener, using both temporal and spatial cooperative behaviors to demonstrate that it is indeed listening to its human counterpart. The robot's software consists of many communicative units and rules for selecting appropriate communicative units. A communicative unit realizes a particular cooperative behavior, such as eye contact or nodding, identified in previous HRI research. The rules for selecting communicative units were derived from our preliminary experiments using a Wizard-of-Oz (WOZ) method. An experiment was conducted to verify the effectiveness of the robot; the results revealed that a robot displaying cooperative behavior received the highest subjective evaluation, one rather similar to that given to a human listener. A detailed analysis showed that this evaluation was due to body movements as well as utterances. On the other hand, subjects' utterances to the robot were encouraged by the robot's utterances but not by its body movements.
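The abstract describes the software as a set of communicative units (each realizing one cooperative behavior) plus rules that select among them. The following is a minimal sketch of how such an architecture might be organized; all names (CommunicativeUnit, SelectionRule, select_unit) and the first-match rule evaluation are hypothetical illustrations, not the paper's actual implementation.

```python
# Sketch of a communicative-unit architecture: units encapsulate cooperative
# behaviors; rules map the sensed situation to an appropriate unit.
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class CommunicativeUnit:
    """One cooperative behavior, e.g. eye contact or nodding."""
    name: str
    behavior: Callable[[], None]  # command issued to the robot's actuators


@dataclass
class SelectionRule:
    """Pairs a predicate over the sensed situation with a communicative unit."""
    condition: Callable[[dict], bool]
    unit: CommunicativeUnit


def select_unit(situation: dict,
                rules: List[SelectionRule]) -> Optional[CommunicativeUnit]:
    """Return the unit of the first rule whose condition matches."""
    for rule in rules:
        if rule.condition(situation):
            return rule.unit
    return None


# Example: nod while the human is speaking; make eye contact when addressed.
nod = CommunicativeUnit("nod", lambda: print("robot nods"))
eye_contact = CommunicativeUnit("eye-contact", lambda: print("robot looks at speaker"))

rules = [
    SelectionRule(lambda s: s.get("human_speaking", False), nod),
    SelectionRule(lambda s: s.get("human_facing_robot", False), eye_contact),
]

unit = select_unit({"human_speaking": True}, rules)
if unit:
    unit.behavior()  # prints "robot nods"
```

In this sketch the rules are evaluated in priority order and only one unit fires per cycle; whether the paper's system selects one unit or blends several concurrently is not specified in the abstract.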

Keywords

Human-robot interaction · Embodied communication · Cooperative body movement · Humanoid robot · Communication robot

Acknowledgments

This research was supported in part by the National Institute of Information and Communications Technology of Japan and in part by the Japan Society for the Promotion of Science, Grants-in-Aid for Scientific Research No. 18500024.


Copyright information

© Springer Science+Business Media, LLC 2006

Authors and Affiliations

  • Takayuki Kanda (1)
  • Masayuki Kamasima (1, 2)
  • Michita Imai (1, 2)
  • Tetsuo Ono (1, 3)
  • Daisuke Sakamoto (1, 3)
  • Hiroshi Ishiguro (1, 4)
  • Yuichiro Anzai (2)

  1. ATR Intelligent Robotics and Communication Laboratories, Kyoto, Japan
  2. Department of Information & Computer Science, Faculty of Science and Technology, Keio University, Kanagawa, Japan
  3. Department of Media Architecture, School of Systems Information Science, Future University-Hakodate, Hokkaido, Japan
  4. Department of Adaptive Machine Systems, Graduate School of Engineering, Osaka University, Osaka, Japan
