Impacts of Robot Head Gaze on Robot-to-Human Handovers

Abstract

In this paper, we investigate the use of a robot’s gaze to improve the timing and subjective experience of face-to-face robot-to-human handovers. Based on observations of human gaze behaviors during face-to-face human–human handovers, we implemented several gaze behaviors on a PR2 humanoid robot and conducted two consecutive robot-to-human handover studies. Results show that when the robot continually gazes at the projected handover position while handing over an object, human receivers reach for the object significantly earlier than when the robot looks down, away from the handover location; further, when the robot continually gazes at the receiver’s face instead of the handover position, receivers reach for the object earlier still. When the robot transitions its gaze from the handover position to the receiver’s face, or vice versa, rather than gazing continually at a single location, the receivers’ reach times do not improve; however, receivers perceive these gaze transitions as communicating handover timing better than continual gazes do. Finally, receivers perceive the robot as more likeable and anthropomorphic when it looks at their face than when it does not. Findings from our studies indicate that a robot’s use of gaze can improve both the fluency and the subjective experience of robot-to-human handover interactions.

Acknowledgments

Financial support for this research was provided by the Natural Sciences and Engineering Research Council of Canada, the Canada Foundation for Innovation and the UBC Institute for Computing, Information and Cognitive Systems. This project was partially supported by RGC GRF CUHK415512 awarded to Max Meng. This study was approved by the University of British Columbia Behavioral Research Ethics Board. We are grateful to all participants in our experiments. We also want to thank the following individuals who provided technical and other assistance throughout our work: Brian Gleeson, Daniel Troniak, Matthew Pan, Benjamin Blumer, Bikram Adhikari, Fernando Ramirez, Lars Bollmann, Katelyn Currie, Jake Rose, Junaed Sattar, Hongtao Chang and Lei Wang.

Author information

Corresponding author

Correspondence to Minhua Zheng.

About this article

Cite this article

Zheng, M., Moon, A., Croft, E.A. et al. Impacts of Robot Head Gaze on Robot-to-Human Handovers. Int J of Soc Robotics 7, 783–798 (2015). https://doi.org/10.1007/s12369-015-0305-z

Keywords

  • Robot head gaze
  • Robot-to-human handover
  • Shared attention gaze
  • Face gaze