Analyzing the Human-Robot Interaction Abilities of a General-Purpose Social Robot in Different Naturalistic Environments

  • J. Ruiz-del-Solar
  • M. Mascaró
  • M. Correa
  • F. Bernuy
  • R. Riquelme
  • R. Verschae
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5949)


The main goal of this article is to report and analyze the applicability of a general-purpose social robot, developed in the context of the RoboCup @Home league, in three different naturalistic environments: (i) home, (ii) school classroom, and (iii) public space settings. The evaluation of the robot's performance relies on its degree of social acceptance and on its abilities to express emotions and to interact with humans using human-like codes. The reported experiments show that the robot is widely accepted by both expert and non-expert human users, and that it can successfully interact with humans through human-like interaction mechanisms such as speech and visual cues (particularly face information). Remarkably, the robot can even teach children in a real classroom.


Keywords: Human-Robot Interaction · Social Robots



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • J. Ruiz-del-Solar (1, 2)
  • M. Mascaró (1)
  • M. Correa (1, 2)
  • F. Bernuy (1)
  • R. Riquelme (1)
  • R. Verschae (1)
  1. Department of Electrical Engineering, Universidad de Chile
  2. Center for Mining Technology, Universidad de Chile
