Abstract
Mutual gaze is a powerful cue for communicating social attention and intention. Numerous studies have demonstrated its fundamental role in establishing communicative links between humans and in enabling non-verbal communication of social attention and intention. The amount of mutual gaze between two partners regulates human–human interaction and signals social engagement. This paper investigates whether implementing mutual gaze in robotic systems can achieve similar social effects and thereby improve human–robot interaction. Drawing on insights from existing studies of human face-to-face interaction, we implemented an interactive mutual gaze model in an embodied agent, the social robot head Furhat. We evaluated the mutual gaze prototype with 24 participants across three applications. Our results show that the mutual gaze model improves social connectedness between robots and users.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Zhang, Y., Beskow, J., Kjellström, H. (2017). Look but Don't Stare: Mutual Gaze Interaction in Social Robots. In: Kheddar, A., et al. (eds.) Social Robotics. ICSR 2017. Lecture Notes in Computer Science, vol. 10652. Springer, Cham. https://doi.org/10.1007/978-3-319-70022-9_55
DOI: https://doi.org/10.1007/978-3-319-70022-9_55
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70021-2
Online ISBN: 978-3-319-70022-9
eBook Packages: Computer Science, Computer Science (R0)