International Conference on Social Robotics

Social Robotics, pp. 461–471

Impact of Robot Actions on Social Signals and Reaction Times in HRI Error Situations

  • Nicole Mirnig
  • Manuel Giuliani
  • Gerald Stollnberger
  • Susanne Stadler
  • Roland Buchner
  • Manfred Tscheligi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9388)

Abstract

Human-robot interaction experiments featuring error situations are often excluded from analysis. We argue that much value lies hidden in this discarded data. We analyzed a corpus of 201 videos showing error situations in human-robot interaction experiments. The aim of our analysis was to investigate (a) whether and which social signals the experiment participants show in reaction to error situations, (b) how long it takes the participants to react in the error situations, and (c) whether different robot actions elicit different social signals. We found that participants showed social signals in 49.3% of the error situations, more often during social norm violations and less often during technical failures. Task-related robot actions elicited fewer social signals from the participants, whereas participants showed more social signals when the robot did not react at all. Finally, the participants had an overall mean reaction time of 1.64 seconds before they showed a social signal in response to a robot action. Reaction times were particularly long (4.39 seconds) for task-related actions that went wrong during execution.
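The figures reported above can be derived once each error situation has been annotated with its error type, the preceding robot action, and timestamps for the robot action and the participant's first social signal. The following Python sketch illustrates one possible way to compute such statistics; the record fields, category labels, and sample values are hypothetical and do not reproduce the authors' actual coding scheme or data.

```python
from statistics import mean

# Hypothetical annotation records, one per error situation. The field
# names and values are illustrative only, not the authors' coding scheme.
# Timestamps are in seconds from the start of the video.
annotations = [
    {"error": "social_norm_violation", "action": "no_reaction",
     "signal": True,  "t_action": 12.0, "t_signal": 13.2},
    {"error": "technical_failure",     "action": "task_related",
     "signal": False, "t_action": 40.5, "t_signal": None},
    {"error": "technical_failure",     "action": "task_related",
     "signal": True,  "t_action": 71.1, "t_signal": 75.6},
]

def signal_rate(records):
    """Share of error situations in which a social signal was observed."""
    return sum(r["signal"] for r in records) / len(records)

def mean_reaction_time(records):
    """Mean delay between the robot action and the first social signal."""
    delays = [r["t_signal"] - r["t_action"] for r in records if r["signal"]]
    return mean(delays) if delays else None

for error_type in sorted({r["error"] for r in annotations}):
    subset = [r for r in annotations if r["error"] == error_type]
    rt = mean_reaction_time(subset)
    rt_text = f"{rt:.2f} s" if rt is not None else "n/a"
    print(f"{error_type}: social signals in {signal_rate(subset):.1%} "
          f"of situations, mean reaction time {rt_text}")
```

The same grouping can be applied per robot action category to compare, for example, task-related actions against situations in which the robot did not react.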

Keywords

Human-robot interaction · Robot feedback · Social robots



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Nicole Mirnig (1)
  • Manuel Giuliani (1)
  • Gerald Stollnberger (1)
  • Susanne Stadler (1)
  • Roland Buchner (1)
  • Manfred Tscheligi (1)

  1. Center for Human-Computer Interaction, University of Salzburg, Salzburg, Austria
