Multimodal People Engagement with iCub

  • Salvatore M. Anzalone
  • Serena Ivaldi
  • Olivier Sigaud
  • Mohamed Chetouani
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 196)

Abstract

In this paper we present an engagement system for the iCub robot that is able to arouse in human partners a sense of “co-presence” during human-robot interaction. This sensation is naturally triggered by simple reflexes of the robot, which speaks to its partners and gazes at the current “active partner” (e.g. the partner who is talking) during interaction tasks. The active partner is perceived through a multimodal approach: a commercial RGB-D sensor is used to recognize the presence of humans in the environment, using both 3D information and sound source localization, while the iCub’s cameras are used to perceive the partner’s face.
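The fusion described in the abstract, matching a localized sound direction against tracked people and preferring those whose face is visible, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the Person structure, the select_active_partner function, the angular threshold, and the example coordinates are all hypothetical placeholders for the RGB-D tracker, audio localization, and face-detection modules of the real system.

    import math
    from dataclasses import dataclass

    @dataclass
    class Person:
        """A person hypothesis from the RGB-D tracker (position in the robot frame, metres)."""
        pid: int
        x: float        # lateral offset, positive to the robot's left
        z: float        # distance in front of the robot
        has_face: bool  # face confirmed by the robot's own cameras

    def azimuth_deg(p: Person) -> float:
        """Horizontal angle of the person with respect to the robot's forward axis."""
        return math.degrees(math.atan2(p.x, p.z))

    def select_active_partner(people, sound_azimuth_deg, max_angle_err=20.0):
        """Pick the 'active' (talking) partner: the tracked person whose direction
        best matches the localized sound source, preferring people with a visible face."""
        candidates = [p for p in people
                      if abs(azimuth_deg(p) - sound_azimuth_deg) < max_angle_err]
        if not candidates:
            return None
        # Visible face first (False sorts before True when negated), then smallest angular error.
        return min(candidates,
                   key=lambda p: (not p.has_face,
                                  abs(azimuth_deg(p) - sound_azimuth_deg)))

    if __name__ == "__main__":
        people = [Person(1, -0.6, 1.5, True), Person(2, 0.4, 1.2, False)]
        active = select_active_partner(people, sound_azimuth_deg=-20.0)
        print("gaze at partner", active.pid if active else "none")

In this toy run the sound source at -20 degrees matches the person at roughly -22 degrees, so the robot would direct its gaze there; in the real system the selected target would drive the iCub's gaze controller rather than a print statement.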

Keywords

engagement, attention, personal robots

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Salvatore M. Anzalone (1)
  • Serena Ivaldi (1)
  • Olivier Sigaud (1)
  • Mohamed Chetouani (1)

  1. Institut des Systèmes Intelligents et de Robotique, CNRS UMR 7222 & Université Pierre et Marie Curie, Paris, France