
Effects of Eye Contact and Iconic Gestures on Message Retention in Human-Robot Interaction

International Journal of Social Robotics

Abstract

The effects of iconic gestures and eye contact on message retention in human-robot interaction were investigated in a series of experiments. A humanoid robot gave short verbal messages to participants, accompanied either by iconic gestures or by no gestures, while the robot either made eye contact with the participant or looked away. Results show that iconic gestures aid retention of the verb to which the action-depicting gesture pertains. The expected effects of eye contact were not supported by the data. Implications of these results for the design of interaction modalities between robots and people are discussed.
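The abstract describes a 2×2 manipulation: gesture (iconic vs. none) crossed with gaze (eye contact vs. averted). As a purely hypothetical illustration, not the authors' actual procedure, such a design and a simple rotation of participants through its cells might be sketched as:

```python
from itertools import product

# Hypothetical sketch of the 2x2 design implied by the abstract:
# gesture (iconic vs. none) crossed with gaze (eye contact vs. averted).
GESTURE = ("iconic", "none")
GAZE = ("eye_contact", "averted")

def conditions():
    """All four gesture x gaze cells of the factorial design."""
    return list(product(GESTURE, GAZE))

def assign(participant_id):
    """Rotate successive participants through the four cells."""
    cells = conditions()
    return cells[participant_id % len(cells)]

print(conditions())          # the four (gesture, gaze) cells
print(assign(0), assign(5))  # two example assignments
```

The function names, assignment scheme, and condition labels here are assumptions for illustration only; the paper's actual counterbalancing procedure is not described in this excerpt.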


Notes

  1. Participants were elderly people because this research is part of the KSERA project, which aims at introducing socially assistive robots into seniors' homes (http://www.ksera.ieis.tue.nl).

  2. Translated from the original Dutch “De docent wenkte de lange leerling naar het bedompte kantoor” (“The teacher beckoned the tall student to the stuffy office”).

  3. Synoniemen.net is an online Dutch dictionary which provides synonyms. URL: http://synoniemen.net/.

References

  1. Adams RB, Pauker K, Weisbuch M (2010) Looking the other way: the role of gaze direction in the cross-race memory effect. J Exp Soc Psychol 46(2):478–481

  2. Aldebaran Robotics (2012) NAO datasheet H25. Aldebaran Robotics

  3. Aldebaran Robotics (2012) ALFaceTracker API. NAO software 1.12.5 documentation

  4. Andric M, Small SL (2012) Gesture’s neural language. Front Psychol 3(99). doi:10.3389/fpsyg.2012.00099

  5. Bartneck C, Kulić D, Croft E, Zoghbi S (2009) Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int J Soc Robot 1(1):71–81

  6. Begg I (1971) Recognition memory for sentence meaning and wording. J Verbal Learn Verbal Behav 10(2):176–181

  7. Begg I, Paivio A (1969) Concreteness and imagery in sentence meaning. J Verbal Learn Verbal Behav 8(6):821–827

  8. Chovil N (1991) Discourse-oriented facial displays in conversation. Res Lang Soc Interact 25(1–4):163–194

  9. Cohen J (1988) Statistical power analysis for the behavioral sciences, 2nd edn. Routledge, London

  10. Fullwood C, Doherty-Sneddon G (2006) Effect of gazing at the camera during a video link on recall. Appl Ergon 37(2):167–175

  11. Holmes VM, Langford J (1976) Comprehension and recall of abstract and concrete sentences. J Verbal Learn Verbal Behav 15(5):559–566

  12. Carlton TJ, Abrahamson AA (1977) Recognition memory for active and passive sentences. J Psycholinguist Res 6(1):37–47

  13. Jansen E (2011) The effect of natural head and iconic hand gestures on message recall in human-robot interaction. HTI research project thesis

  14. Murdock BB Jr (1960) The distinctiveness of stimuli. Psychol Rev 67:16–31

  15. Kanda T, Ishiguro H, Ono T, Imai M, Nakatsu R (2002) Development and evaluation of an interactive humanoid robot “Robovie”. In: Proceedings of the IEEE international conference on robotics and automation (ICRA’02), vol 2, pp 1848–1855

  16. Kelly SD, Barr DJ, Church RB, Lynch K (1999) Offering a hand to pragmatic understanding: the role of speech and gesture in comprehension and memory. J Mem Lang 40(4):577–592

  17. Kita S, Özyürek A (2003) What does cross-linguistic variation in semantic coordination of speech and gesture reveal? Evidence for an interface representation of spatial thinking and speaking. J Mem Lang 48(1):16–32

  18. Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100(1):78

  19. Krämer NC, Bente G (2010) Personalizing e-learning: the social effects of pedagogical agents. Educ Psychol Rev 22(1):71–87

  20. Martin E, Roberts KH (1966) Grammatical factors in sentence retention. J Verbal Learn Verbal Behav 5(3):211–218

  21. Martin E, Roberts KH, Collins AM (1968) Short-term memory for sentences. J Verbal Learn Verbal Behav 7:560–566

  22. Moreno R, Mayer RE, Spires HA, Lester JC (2001) The case for social agency in computer-based teaching: do students learn more deeply when they interact with animated pedagogical agents? Cogn Instr 19(2):177–213

  23. Mutlu B, Forlizzi J, Hodgins J (2006) A storytelling robot: modeling and evaluation of human-like gaze behavior. In: 6th IEEE-RAS international conference on humanoid robots, pp 518–523

  24. Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human-robot conversations: how robots might shape participant roles using gaze cues. In: Proceedings of the 4th ACM/IEEE international conference on human-robot interaction (HRI’09), pp 61–68

  25. Neath I (1993) Distinctiveness and serial position effects in recognition. Mem Cogn 21:689–698

  26. Neath I, Brown GDA (2006) SIMPLE: further applications of a local distinctiveness model of memory. Psychol Learn Motiv 46:201–243

  27. Perfetti CA (1969) Lexical density and phrase structure depth as variables in sentence retention. J Verbal Learn Verbal Behav 8(6):719–724

  28. Riek LD, Rabinowitch T-C, Bremner P, Pipe AG, Fraser M, Robinson P (2010) Cooperative gestures: effective signaling for humanoid robots. In: Proceedings of the 5th ACM/IEEE international conference on human-robot interaction (HRI’10), pp 61–68

  29. Patricia R (1996) The effects of vocal variation on listener recall. J Psycholinguist Res 25:431–441

  30. Salem M, Rohlfing K, Kopp S, Joublin F (2011) A friendly gesture: investigating the effect of multimodal robot behavior in human-robot interaction. In: IEEE RO-MAN, pp 247–252

  31. Sidner CL, Kidd CD, Lee C, Lesh N (2004) Where to look: a study of human-robot engagement. In: Proceedings of the 9th international conference on intelligent user interfaces (IUI’04), pp 78–84

  32. Sidner CL, Lee C, Kidd CD, Lesh N (2005) Explorations in engagement for humans and robots. Artif Intell 166(1–2):140–164

  33. Singer MA, Goldin-Meadow S (2005) Children learn when their teacher’s gestures and speech differ. Psychol Sci 16(2):85–89

  34. Staudte M, Crocker MW (2011) Investigating joint attention mechanisms through spoken human-robot interaction. Cognition 120(2):268–291

  35. Valenzeno L, Alibali MW, Klatzky R (2003) Teachers’ gestures facilitate students’ learning: a lesson in symmetry. Contemp Educ Psychol 28(2):187–204

  36. Zimmerman BJ, Pons MM (1986) Development of a structured interview for assessing student use of self-regulated learning strategies. Am Educ Res J 23(4):614–628


Acknowledgements

The research leading to these results is part of the KSERA project (http://www.ksera-project.eu) and has received funding from the European Commission under the 7th Framework Programme (FP7) for Research and Technological Development, under grant agreement no. 2010-248085.

The authors would like to thank Eline Jansen for her contribution to this project. Eline’s work in her Research Project Thesis [13] forms the foundation of this study.

Author information

Corresponding author

Correspondence to Elena Torta.

Appendix: Messages Used During the Experiment

The messages presented by Nao can be found in Table 2. For each sentence, both the original Dutch sentence and an English translation (in brackets) are provided, together with the action depicted in the accompanying gesture.

Table 2 The messages presented by Nao during the experiments


Cite this article

van Dijk, E.T., Torta, E. & Cuijpers, R.H. Effects of Eye Contact and Iconic Gestures on Message Retention in Human-Robot Interaction. Int J Soc Robot 5, 491–501 (2013). https://doi.org/10.1007/s12369-013-0214-y