Engaging with Robots While Giving Simple Instructions

  • Terry Tritton
  • Joanna Hall
  • Angela Rowe
  • Sophie Valentine
  • Alicja Jedrzejewska
  • Anthony G. Pipe
  • Chris Melhuish
  • Ute Leonards
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7429)

Abstract

To facilitate fluent interaction with humans, socially assistive robots need to communicate in a way that can be intuitively understood. To investigate the effects of important nonverbal gestures on human experience in human-robot interactions, participants read a series of instructions to a robot which responded with nods, blinks, changes in gaze direction, or a combination of these. Participants then rated their engagement in the task as well as the perceived robot engagement, perceived robot comprehension and the robot’s likability. Unbeknown to the participants, the robot had no form of speech processing or gesture recognition, but simply measured speech volume levels and responded with a gesture or combination of gestures when it detected a lull in sound. Engagement of the human participants was not differentially affected by the different responses of the robot. However, the participants’ perception of the robot’s engagement in the task and its understanding of the instructions being communicated as well as its likability depended on the nonverbal gesture presented, with nodding being the most effective response.

Keywords

Human-robot interaction · Nonverbal communication · Experienced engagement

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Terry Tritton (1)
  • Joanna Hall (2)
  • Angela Rowe (2)
  • Sophie Valentine (2)
  • Alicja Jedrzejewska (2)
  • Anthony G. Pipe (1)
  • Chris Melhuish (1)
  • Ute Leonards (1, 2)
  1. Bristol Robotics Laboratory, T Block, University of the West of England, Bristol, UK
  2. School of Experimental Psychology, University of Bristol, Bristol, UK
