Impact of Robot Actions on Social Signals and Reaction Times in HRI Error Situations

Part of the Lecture Notes in Computer Science book series (LNAI, volume 9388)


Human-robot interaction experiments featuring error situations are often excluded from analysis. We argue that much value lies hidden in this discarded data. We analyzed a corpus of 201 videos showing error situations in human-robot interaction experiments. The aim of our analysis was to investigate (a) whether, and which, social signals experiment participants show in reaction to error situations; (b) how long it takes participants to react in error situations; and (c) whether different robot actions elicit different social signals. We found that participants showed social signals in 49.3% of error situations, more often during social norm violations and less often during technical failures. Task-related actions by the robot elicited fewer social signals from the participants, while participants showed more social signals when the robot did not react. Finally, the participants had an overall reaction time of 1.64 seconds before they showed a social signal in response to a robot action. Reaction times are particularly long (4.39 seconds) for task-related actions that go wrong during execution.


  • Human-robot interaction
  • Robot feedback
  • Social robots




Author information

Corresponding author

Correspondence to Nicole Mirnig.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Mirnig, N., Giuliani, M., Stollnberger, G., Stadler, S., Buchner, R., Tscheligi, M. (2015). Impact of Robot Actions on Social Signals and Reaction Times in HRI Error Situations. In: Tapus, A., André, E., Martin, J.-C., Ferland, F., Ammi, M. (eds) Social Robotics. ICSR 2015. Lecture Notes in Computer Science, vol 9388. Springer, Cham.

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25553-8

  • Online ISBN: 978-3-319-25554-5

  • eBook Packages: Computer Science (R0)