
Child’s Perception of Robot’s Emotions: Effects of Platform, Context and Experience

Published in the International Journal of Social Robotics.

Abstract

Social robots may comfort and support children who have to cope with chronic diseases such as diabetes. In social interactions, the ability to express recognizable emotions is important. Studies show that the iCat robot, with its humanoid facial features, has this capability. In this paper we examine whether a Nao robot, which lacks humanoid facial features but has a full body and colored eyes, can also express recognizable emotions, and we compare emotion recognition rates between the Nao and the iCat. First, a set of bodily expressions of the Nao for five basic emotions (angry, fear, happy, sad, surprise) was created and evaluated. Using a signal detection task, the most recognizable bodily expression for each emotion was selected for the final set. Then, fourteen children between 8 and 9 years old interacted with both the Nao and the iCat and were asked to recognize the emotions within context, in a story-telling session, and without context. These interactions were repeated one week later to study the learning effect. For both robots, recognition rates for the expressions were relatively high (between 68 and 99 % accuracy). Only for sadness was recognition significantly higher for the iCat (95 %) than for the Nao (68 %). Emotions shown within context had higher recognition rates than those shown without context, and for both robots recognition was significantly higher in the second session than in the first. To conclude: we succeeded in designing a set of well-recognized dynamic emotional expressions for a robot platform, the Nao, that lacks facial features. These expressions were recognized better when placed in a context and when shown again a week later. This set provides useful ingredients for social robot dialogs with children.
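The signal detection task mentioned above scores each candidate expression by how well children discriminate the target emotion from the others, typically via the sensitivity index d′. A minimal sketch of the standard d′ computation follows; the log-linear correction and the example counts are assumptions for illustration, not the study's actual procedure or data:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) keeps the rates
    away from 0 and 1, where the z-transform is undefined. Whether
    the original study applied this correction is an assumption.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts for one candidate "sad" pose: 11 of 14 children
# labeled it sad (hits), 3 did not (misses); on trials showing other
# emotions, 2 responses mistook them for sad (false alarms), 40 did not.
print(round(d_prime(11, 3, 2, 40), 2))
```

The expression with the highest d′ per emotion would then be kept for the final set; a higher d′ means the pose is both often recognized and rarely confused with the other emotions.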


Figs. 1–7 (figures not shown in this preview)



Acknowledgments

This work is partially funded by the EU FP7 ALIZ-E project (Grant No. 248116). Thanks to Stella Donker and Linda van Ooijen from Utrecht University, and to Bert Bierman, for their contributions. Thanks to OBS de Watersnip in Zoetermeer, The Netherlands, and to the participating children and their parents. Thanks to Andrea Kleinsmith and Nadia Bianchi-Berthouze for access to their database of affective postures (A. Kleinsmith, R. De Silva, N. Bianchi-Berthouze, "Cross-Cultural Differences in Recognizing Affect from Body Posture", Interacting with Computers, 18(6), 2006, 1371–1389), and to Mark Coulson for the pictures of affective body postures from his research.

Author information

Corresponding author

Correspondence to I. Cohen.

Appendix

See Appendix Table 4.

Table 4 Parameters of the joint values during maximum emotional expression of the Nao’s emotional movements
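Table 4 specifies, per emotion, the joint angles at the peak of each of the Nao's dynamic expressions. Such an expression can be sketched as an interpolation from a neutral pose to the peak values and back; the joint names follow Nao's naming convention, but the angle values below are illustrative placeholders, not the parameters from Table 4:

```python
def keyframes(neutral, peak, steps=10):
    """Linearly interpolate each joint from the neutral pose to its
    peak value and back, yielding one {joint: angle} dict per step."""
    frames = []
    for i in list(range(steps + 1)) + list(range(steps - 1, -1, -1)):
        t = i / steps  # 0 -> 1 -> 0 over the whole movement
        frames.append({j: neutral[j] + t * (peak[j] - neutral[j]) for j in neutral})
    return frames

# Placeholder joint targets (radians) for a "sad" pose: head down,
# shoulders dropped. Not the values reported in Table 4.
neutral = {"HeadPitch": 0.0, "LShoulderPitch": 1.4, "RShoulderPitch": 1.4}
sad_peak = {"HeadPitch": 0.5, "LShoulderPitch": 1.6, "RShoulderPitch": 1.6}

motion = keyframes(neutral, sad_peak)
print(len(motion), motion[10]["HeadPitch"])
```

On a physical robot, each frame would be sent to the motion controller at a fixed rate; ramping out to the peak and back to neutral keeps the movement readable as a single emotional gesture.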


Cite this article

Cohen, I., Looije, R. & Neerincx, M.A. Child’s Perception of Robot’s Emotions: Effects of Platform, Context and Experience. Int J of Soc Robotics 6, 507–518 (2014). https://doi.org/10.1007/s12369-014-0230-6

