It’s in the Eyes: The Engaging Role of Eye Contact in HRI

  • Kyveli Kompatsiari
  • Francesca Ciardo
  • Vadim Tikhanoff
  • Giorgio Metta
  • Agnieszka Wykowska

Abstract

This paper reports a study in which we examined how users evaluated a humanoid robot, depending on whether it established eye contact with them. In two experiments, the robot was programmed either to establish eye contact with the user or to look elsewhere. Across the experiments, we altered the predictiveness of the robot’s gaze direction with respect to a subsequent target stimulus (in Exp. 1 the gaze direction was non-predictive; in Exp. 2 it was counter-predictive). Subjective reports showed that participants were sensitive to eye contact. Moreover, participants felt more engaged with the robot when it established eye contact, and the majority attributed a higher degree of human-likeness to the robot in the eye contact condition, relative to the no eye contact condition. This was independent of the predictiveness of the gaze cue. Our results suggest that eye contact established by an embodied humanoid robot has a positive impact on the robot’s perceived socialness and on the quality of human–robot interaction (HRI). Establishing eye contact should therefore be considered when designing robot behaviors for social HRI.
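To make the two experimental factors concrete, the following is a minimal sketch of one trial of the gaze manipulation described above. It is plain Python with hypothetical helper names (look_at_user_eyes, look_down, look_at) standing in for the robot's actual gaze controller; it is not the authors' implementation, which runs on the iCub platform.

```python
import random

# Hypothetical stand-ins for the robot's gaze controller;
# the study itself used the iCub humanoid robot.
def look_at_user_eyes():
    print("robot: establishing eye contact")

def look_down():
    print("robot: looking down (no eye contact)")

def look_at(side):
    print(f"robot: shifting gaze to the {side}")

def run_trial(eye_contact, predictiveness):
    """One trial of the gaze-cueing procedure, as described in the abstract.

    eye_contact:    True  -> robot first looks at the user's eyes;
                    False -> robot looks elsewhere.
    predictiveness: 'non-predictive' (Exp. 1): target side is unrelated
                    to the gaze side (50% congruent trials);
                    'counter-predictive' (Exp. 2): the target reliably
                    appears opposite to where the robot looked.
    """
    # Phase 1: eye-contact manipulation.
    look_at_user_eyes() if eye_contact else look_down()

    # Phase 2: lateral gaze cue.
    gaze_side = random.choice(["left", "right"])
    look_at(gaze_side)

    # Phase 3: target placement depends on the predictiveness regime.
    if predictiveness == "non-predictive":
        target_side = random.choice(["left", "right"])
    else:  # counter-predictive
        target_side = "left" if gaze_side == "right" else "right"
    return gaze_side, target_side

if __name__ == "__main__":
    print(run_trial(eye_contact=True, predictiveness="counter-predictive"))
```

The key point the sketch illustrates is that the eye-contact manipulation (Phase 1) is orthogonal to the gaze cue's predictiveness (Phase 3), which is why the reported eye-contact effects can be assessed independently of the cue's informational value.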

Keywords

Eye contact · Social human–robot interaction · Social attention · iCub

Notes

Acknowledgements

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant awarded to A. Wykowska, titled “InStance: Intentional Stance for Social Attunement”, Grant Agreement No. 715058).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Supplementary material

Supplementary material 1 (MP4 8460 kb)


Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Istituto Italiano di Tecnologia, Social Cognition in Human-Robot Interaction, Centre for Human Technologies, Genoa, Italy
  2. Ludwig Maximilian University, Planegg, Germany
  3. Istituto Italiano di Tecnologia, iCub Facility, Genoa, Italy
  4. University of Plymouth, Plymouth, UK
  5. Luleå University of Technology, Luleå, Sweden