Non-verbal Signals in HRI: Interference in Human Perception

  • Wafa Johal
  • Gaëlle Calvary
  • Sylvie Pesty
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9388)

Abstract

Non-verbal cues of communication can influence how humans understand verbal signals in human-human communication. We present two illustrative experimental studies showing how non-verbal cues can both interfere with and facilitate communication when a robot conveys a message to a user in HRI. In the first study, participants reported that the cues enabling them to discriminate between two conditions, a permissive and an authoritative robot, were mainly verbal. The verbal message, however, was identical in both conditions; in fact, the non-verbal cues of communication (gestures, posture, voice tone and gaze) stood in for the neutral verbal message. The second study highlights that verbal and non-verbal communication can facilitate the understanding of messages when combined appropriately. It is based on a Stroop task in which participants identify the colour of a robot's LEDs while the robot says words that are either facilitating, neutral or disturbing for the participant. Together, these studies underline the importance of understanding the interrelations between non-verbal and verbal signals in HRI.
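The Stroop-style design of the second study can be sketched as a trial generator that pairs an LED colour with a spoken word under congruent (facilitating), neutral, or incongruent (disturbing) conditions. This is a hypothetical sketch: the colour set, word set, and function names below are illustrative assumptions, not the authors' actual stimuli or code.

```python
import random

# Hypothetical stimulus sets (not the authors' actual materials).
COLORS = ["red", "green", "blue", "yellow"]           # LED colours to identify
NEUTRAL_WORDS = ["chair", "cloud", "paper", "spoon"]  # colour-unrelated words

def make_trial(condition, rng=random):
    """Build one trial: the robot lights an LED in `led_colour`
    while speaking `spoken_word`."""
    led = rng.choice(COLORS)
    if condition == "congruent":      # word matches the LED colour (facilitating)
        word = led
    elif condition == "neutral":      # word unrelated to any colour
        word = rng.choice(NEUTRAL_WORDS)
    elif condition == "incongruent":  # word names a *different* colour (disturbing)
        word = rng.choice([c for c in COLORS if c != led])
    else:
        raise ValueError(f"unknown condition: {condition}")
    return {"led_colour": led, "spoken_word": word, "condition": condition}

def make_block(n_per_condition, seed=0):
    """Generate a shuffled block with equal numbers of each condition."""
    rng = random.Random(seed)
    trials = [make_trial(c, rng)
              for c in ("congruent", "neutral", "incongruent")
              for _ in range(n_per_condition)]
    rng.shuffle(trials)  # interleave conditions within the block
    return trials
```

Comparing response times across the three conditions would then reveal whether the spoken word facilitates or interferes with identifying the LED colour, in line with the classic Stroop effect.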

Keywords

Parenting Style, Stroop Task, Humanoid Robot, Incongruence Condition, Social Robot
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Authors and Affiliations

  1. Univ. Grenoble Alpes, LIG, Grenoble, France
