International Journal of Social Robotics, Volume 10, Issue 5, pp 569–582

The Effects of Humanlike and Robot-Specific Affective Nonverbal Behavior on Perception, Emotion, and Behavior

  • Astrid M. Rosenthal-von der Pütten
  • Nicole C. Krämer
  • Jonathan Herrmann


Abstract

Research has demonstrated that humans are able to interpret humanlike (affective) nonverbal behavior (HNB) in artificial entities (e.g. Beck et al., in: Proceedings of the 19th IEEE international symposium on robot and human interactive communication, IEEE Press, Piscataway, 2010; Bente et al. in J Nonverbal Behav 25:151–166, 2001; Mumm and Mutlu, in: Proceedings of the 6th international conference on human–robot interaction, HRI, ACM Press, New York, 2011). However, some robots lack the ability to produce HNB. Using robot-specific nonverbal behavior (RNB), such as different eye colors, to convey emotional meaning might be a fruitful mechanism for enhancing HRI experiences, but it is unclear whether RNB is as effective as HNB. We present a review of affective nonverbal behaviors in robots and an experimental study. We experimentally tested the influence of HNB and RNB (colored LEDs) on users’ perception of the robot (e.g. likeability, animacy), their emotional experience, and their self-disclosure. In a between-subjects design, users (n = 80) interacted with either (a) a robot displaying no nonverbal behavior, (b) a robot displaying affective RNB, (c) a robot displaying affective HNB, or (d) a robot displaying both affective HNB and RNB. Results show that HNB, but not RNB, has a significant effect on the perceived animacy of the robot, participants’ emotional state, and self-disclosure. However, RNB still slightly influenced participants’ perception, emotion, and behavior: planned contrasts revealed that having any type of nonverbal behavior significantly increased perceived animacy, positive affect, and self-disclosure. Moreover, observed linear trends indicate that these effects increased with the addition of nonverbal behaviors (control < RNB < HNB). In combination, our results suggest that HNB is more effective than RNB in conveying the robot’s communicative message.
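As a purely illustrative sketch of the analysis described above (a planned linear-trend contrast over ordered conditions), the following Python code computes such a contrast on simulated data. All group means, standard deviations, and sample sizes are invented for the example; this is not the study's data or analysis script.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated ratings for four ordered conditions (invented values,
# not data from the study): more nonverbal behavior -> higher mean.
groups = {
    "control": rng.normal(3.0, 1.0, 20),
    "RNB":     rng.normal(3.3, 1.0, 20),
    "HNB":     rng.normal(3.8, 1.0, 20),
    "HNB+RNB": rng.normal(4.0, 1.0, 20),
}

# Linear-trend contrast weights for four equally spaced groups.
weights = np.array([-3.0, -1.0, 1.0, 3.0])

means = np.array([g.mean() for g in groups.values()])
ns = np.array([g.size for g in groups.values()])

# Pooled within-group variance (the one-way ANOVA error term, MS_within).
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())
df_within = ns.sum() - len(groups)
ms_within = ss_within / df_within

# Contrast estimate, its standard error, and the two-tailed t-test.
L = weights @ means
se = np.sqrt(ms_within * (weights ** 2 / ns).sum())
t = L / se
p = 2 * stats.t.sf(abs(t), df_within)
print(f"linear trend: t({df_within}) = {t:.2f}, p = {p:.4f}")
```

The weights (-3, -1, 1, 3) sum to zero and increase in equal steps, so a significantly positive contrast value indicates that ratings rise monotonically with the amount of nonverbal behavior, which is the pattern the abstract reports.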


Keywords: Humanoid robot · Human–robot interaction · Experimental study · Affective nonverbal behavior · Self-disclosure · Emotional state


References

  1. Banse R, Scherer KR (1996) Acoustic profiles in vocal emotion expression. J Pers Soc Psychol 70:614–636
  2. Bartneck C, Kulić D, Croft E, Zoghbi S (2009) Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int J Soc Robot 1:71–81
  3. Beck A, Canamero L, Bard KA (2010) Towards an affect space for robots to display emotional body language. In: Proceedings of the 19th IEEE international symposium on robot and human interactive communication, RO-MAN. IEEE Press, Piscataway, pp 464–469
  4. Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F, Sommavilla G (2013) Interpretation of emotional body language displayed by a humanoid robot: a case study with children. Int J Soc Robot 5:325–334
  5. Becker-Asano C, Ishiguro H (2011) Evaluating facial displays of emotion for the android robot Geminoid F. In: 2011 IEEE workshop on affective computational intelligence. IEEE Press, Piscataway, pp 1–8
  6. Bente G, Krämer NC (2003) Integrierte Registrierung und Analyse verbaler und nonverbaler Kommunikation [Integrated recording and analysis of verbal and nonverbal communication]. In: Herrmann T, Grabowski J (eds) Sprachproduktion. Enzyklopädie der Psychologie, Themenbereich C, Serie 3, Band 1. Hogrefe, Göttingen, pp 219–246
  7. Bente G, Krämer NC, Petersen A, de Ruiter JP (2001) Computer animated movement and person perception: methodological advances in nonverbal behavior research. J Nonverbal Behav 25:151–166
  8. Breazeal C, Kidd C, Thomaz A, Hoffman G, Berlin M (2005) Effects of nonverbal communication on efficiency and robustness in human–robot teamwork. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, IROS, pp 708–713
  9. Burgoon JK, Bacue AE (2003) Nonverbal communication skills. In: Greene JO, Burleson BR (eds) Handbook of communication and social interaction skills. LEA’s communication series. Lawrence Erlbaum Associates, Mahwah, pp 179–220
  10. Burgoon JK, Guerrero LK, Manusov V (2011) Nonverbal signals. In: Knapp ML, Daly JA (eds) The SAGE handbook of interpersonal communication. Sage, Thousand Oaks, pp 239–282
  11. Cicchetti DV (1994) Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychol Assess 6:284–290
  12. Collins EC, Prescott TJ, Mitchinson B (2015) Saying it with light: a pilot study of affective communication using the MIRO robot. In: Wilson SP, Verschure PF, Mura A, Prescott TJ (eds) Biomimetic and biohybrid systems. Lecture notes in computer science. Springer, Cham, pp 243–255
  13. Dael N, Mortillaro M, Scherer KR (2012) Emotion expression in body action and posture. Emotion 12:1085–1101
  14. Ekman P (1993) Facial expression and emotion. Am Psychol 48:384–392
  15. Embgen S, Luber M, Becker-Asano C, Ragni M, Evers V, Arras KO (2012) Robot-specific social cues in emotional body language. In: The 21st IEEE international symposium on robot and human interactive communication, RO-MAN, pp 1019–1025
  16. Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: The 20th IEEE international symposium on robot and human interactive communication, RO-MAN, pp 204–209
  17. Hurlbert AC, Ling Y (2007) Biological components of sex differences in color preference. Curr Biol 17:R623
  18. Johnson DO, Cuijpers RH, van der Pol D (2013) Imitating human emotions with artificial facial expressions. Int J Soc Robot 5:503–513
  19. Kang SH, Gratch J (2010) Virtual humans elicit socially anxious interactants’ verbal self-disclosure. Comput Anim Virtual Worlds 21:473–482
  20. Kishi T, Endo N, Nozawa T et al (2010) Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively. In: Proceedings of the IEEE international conference on robotics and automation, ICRA, pp 1965–1970
  21. Krämer NC, Kopp S, Becker-Asano C, Sommer N (2013) Smile and the world will smile with you: the effects of a virtual agent’s smile on users’ evaluation and behavior. Int J Hum Comput Stud 71:335–349
  22. Leite I, Martinho C, Pereira A, Paiva A (2008) iCat: an affective game buddy based on anticipatory mechanisms. In: Proceedings of the 7th international conference on autonomous agents and multiagent systems, AAMAS, Estoril, pp 1229–1232
  23. Leite I, Mascarenhas S, Pereira A, Martinho C, Prada R, Paiva A (2010) “Why can’t we be friends?” An empathic game companion for long-term interaction. In: Hutchison D et al (eds) Intelligent virtual agents. Lecture notes in computer science. Springer, Berlin, pp 215–321
  24. Li J, Chignell M (2011) Communication of emotion in social robots through simple head and arm movements. Int J Soc Robot 3:125–142
  25. Manav B (2007) Color-emotion associations and color preferences: a case study for residences. Color Res Appl 32:144–150
  26. Manstead ASR, Fischer AH, Jakobs EB (1999) The social and emotional functions of facial displays. In: Philippot P, Feldman RS, Coats EJ (eds) The social context of nonverbal behavior. Cambridge University Press, Cambridge, pp 287–316
  27. McGraw KO, Wong SP (1996) Forming inferences about some intraclass correlation coefficients. Psychol Methods 1:30–46
  28. Mumm J, Mutlu B (2011) Human–robot proxemics. In: Proceedings of the 6th international conference on human–robot interaction, HRI. ACM Press, New York
  29. Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human–robot conversations. In: Proceedings of the 4th ACM/IEEE international conference on human–robot interaction. ACM Press, New York, p 61
  30. Mutlu B, Yamaoka F, Kanda T, Ishiguro H, Hagita N (2009) Nonverbal leakage in robots. In: Proceedings of the 4th ACM/IEEE international conference on human–robot interaction. ACM Press, New York, p 69
  31. Nass C, Moon Y (2000) Machines and mindlessness: social responses to computers. J Soc Issues 56:81–103
  32. Nomura T, Suzuki T, Kanda T et al (2006) Measurement of negative attitudes toward robots. Interact Stud 7(3):437–454
  33. Nomura T, Suzuki T, Kanda T et al (2007) Measurement of anxiety toward robots. In: Proceedings of the 16th IEEE international symposium on robot and human interactive communication, RO-MAN. IEEE Press, Piscataway, pp 372–377
  34. Pereira A, Leite I, Mascarenhas S, Martinho C, Paiva A (2011) Using empathy to improve human–robot relationships. In: Lamers MH, Verbeek FJ (eds) Human-robot personal relationships. Lecture notes of the institute for computer sciences, social informatics and telecommunications engineering. Springer, Berlin, pp 130–138
  35. Press C (2011) Action observation and robotic agents: learning and anthropomorphism. Neurosci Biobehav Rev 35:1410–1418
  36. Rosenthal-von der Pütten AM, Krämer NC, Hoffmann L, Sobieraj S, Eimler SC (2013) An experimental study on emotional reactions towards a robot. Int J Soc Robot 5:17–34
  37. Rosenthal-von der Pütten AM, Schulte FP, Eimler SC, Sobieraj S, Hoffmann L, Maderwald S, Brand M, Krämer NC (2014) Investigations on empathy towards humans and robots using fMRI. Comput Hum Behav 33:201–212
  38. Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F (2011) Effects of gesture on the perception of psychological anthropomorphism: a case study with a humanoid robot. In: Hutchison D et al (eds) Social robotics. Lecture notes in computer science. Springer, Berlin, pp 31–41
  39. Scheutz M, Schermerhorn P, Kramer J (2006) The utility of affect expression in natural language interactions in joint human–robot tasks. In: Proceedings of the 1st ACM SIGCHI/SIGART conference on human–robot interaction, HRI
  40. Suzuki Y, Galli L, Ikeda A et al (2015) Measuring empathy for human and robot hand pain using electroencephalography. Sci Rep 5:15924
  41. Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: The 21st IEEE international symposium on robot and human interactive communication, RO-MAN, pp 314–321
  42. Tsai J, Bowring E, Marsella S, Wood W, Tambe M (2012) A study of emotional contagion with virtual characters. In: Proceedings of the 12th international conference on intelligent virtual agents, IVA. Lecture notes in computer science, vol 7502. Springer, Berlin, pp 81–88
  43. Valdez P, Mehrabian A (1994) Effects of color on emotions. J Exp Psychol Gen 123:394–409
  44. von der Pütten AM, Klatt J, Hoffmann L, Krämer NC (2011) Quid pro quo? Reciprocal self-disclosure and communicative accommodation towards a virtual interviewer. In: Lecture notes in computer science, vol 6895. Springer, Berlin, pp 183–194
  45. von der Pütten AM, Krämer NC, Gratch J, Kang S-H (2010) It doesn’t matter what you are! Explaining social effects of agents and avatars. Comput Hum Behav 26:1641–1650
  46. Wallbott HG (1988) In and out of context: influences of facial expression and context information on emotion attributions. Br J Soc Psychol 27:357–369
  47. Watson D, Clark LA, Tellegen A (1988) Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol 54:1063–1070
  48. Wu Y, Babu SV, Armstrong R, Bertrand JW, Luo J, Roy T, Daily SB, Dukes LC, Hodges LF, Fasolino T (2014) Effects of virtual human animation on emotion contagion in simulated inter-personal experiences. IEEE Trans Vis Comput Graph 20:626–635
  49. Xu J, Broekens J, Hindriks K, Neerincx MA (2014) Robot mood is contagious: effects of robot body language in the imitation game. In: Proceedings of the 2014 international conference on autonomous agents and multi-agent systems, AAMAS, Paris, pp 973–980
  50. Zatsiorsky VM, Prilutsky BI (2012) Biomechanics of skeletal muscles. Human Kinetics, Champaign
  51. Zecca M, Endo N, Momoki S, Itoh K, Takanishi A (2008) Design of the humanoid robot KOBIAN: preliminary analysis of facial and whole body emotion expression capabilities. In: 2008 8th IEEE-RAS international conference on humanoid robots, Humanoids, pp 487–492

Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  • Astrid M. Rosenthal-von der Pütten (1, 2)
  • Nicole C. Krämer (2)
  • Jonathan Herrmann (2)
  1. RWTH Aachen University, Aachen, Germany
  2. University of Duisburg-Essen, Duisburg, Germany