The Effects of Social Gaze in Human-Robot Collaborative Assembly

  • Kerstin Fischer
  • Lars Christian Jensen
  • Franziska Kirstein
  • Sebastian Stabinger
  • Özgür Erkent
  • Dadhichi Shukla
  • Justus Piater
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9388)


In this paper we explore how social gaze in an assembly robot affects how naïve users interact with it. In a controlled experimental study, 30 participants instructed an industrial robot to fetch parts needed to assemble a wooden toolbox. Participants interacted either with a robot whose gaze simply followed the movements of its own arm, or with a robot that followed its own movements during tasks but also gazed at the participant between instructions. Our qualitative and quantitative analyses show that people in the social gaze condition engage the robot significantly more quickly, smile significantly more often, and can better account for where the robot is looking. In addition, we find that people in the social gaze condition feel more responsible for the task performance. We conclude that social gaze in assembly scenarios fulfills floor-management functions and provides an indicator of the robot's affordances, yet does not influence the robot's likability, mutual interest, or suspected competence.


Keywords: Human-robot interaction · Gaze · Conversation analysis · Smile





Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Authors and Affiliations

  • Kerstin Fischer (1)
  • Lars Christian Jensen (1)
  • Franziska Kirstein (2)
  • Sebastian Stabinger (3)
  • Özgür Erkent (3)
  • Dadhichi Shukla (3)
  • Justus Piater (3)

  1. Department for Design & Communication, University of Southern Denmark, Sonderborg, Denmark
  2. Blue Ocean Robotics, Odense M, Denmark
  3. Intelligent and Interactive Systems, Institute of Computer Science, University of Innsbruck, Innsbruck, Austria
