
Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces

Published in: International Journal of Social Robotics

Abstract

This study explores deriving the minimal features a robotic face needs to convey information (via facial expressions) that people can perceive and understand. Recent research in computer vision has shown that a small number of moving points/lines can capture the majority of the information (~95%) in human facial expressions. Here, we apply those findings to a minimalist robot face design, which we ran through a series of experiments with human subjects (n = 75) exploring the effects of various factors, including added neck motion and degree of expression. Facial expression identification rates were similar to those of more complex robots. In addition, added neck motion significantly improved identification rates, reaching 100% for all expressions except Fear. The Negative Attitudes towards Robots Scale (NARS) and Godspeed scales were also collected to examine user perceptions, e.g., perceived animacy and intelligence. The project aims to answer a number of fundamental questions about robotic face design, as well as to develop inexpensive and replicable robotic faces for experimental purposes.
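To make the idea of a minimal feature set concrete, the sketch below models a face with just three control points (brow angle, eye openness, mouth curvature) plus an optional neck-pitch channel, and scales each expression toward neutral by an intensity factor to vary degree of expression. This is an illustrative assumption, not the paper's implementation: the names (`FacePose`, `BASIC_EXPRESSIONS`, `expression_pose`) and all pose values are hypothetical, chosen only to show how few degrees of freedom such a design requires.

```python
# Illustrative sketch of a minimalist robot face (not the authors' system).
# Three facial control points plus optional neck pitch; pose values below
# are hypothetical guesses, not calibrated parameters from the study.

from dataclasses import dataclass


@dataclass(frozen=True)
class FacePose:
    brow_angle: float        # degrees; positive tilts inner brows upward
    eye_openness: float      # 0.0 (closed) to 1.0 (wide open); 0.5 neutral
    mouth_curve: float       # -1.0 (full frown) to 1.0 (full smile)
    neck_pitch: float = 0.0  # degrees of head tilt; 0 means no neck motion


NEUTRAL = FacePose(brow_angle=0.0, eye_openness=0.5, mouth_curve=0.0)

# Hypothetical target poses for the six basic expressions.
BASIC_EXPRESSIONS = {
    "happiness": FacePose(5.0, 0.6, 1.0),
    "sadness":   FacePose(20.0, 0.4, -0.7, neck_pitch=-15.0),
    "anger":     FacePose(-25.0, 0.8, -0.6),
    "fear":      FacePose(15.0, 1.0, -0.3, neck_pitch=-5.0),
    "surprise":  FacePose(25.0, 1.0, 0.2, neck_pitch=10.0),
    "disgust":   FacePose(-10.0, 0.5, -0.5),
}


def expression_pose(name: str, intensity: float = 1.0) -> FacePose:
    """Interpolate from neutral toward a target expression.

    `intensity` in [0, 1] models "degree of expression": 0 returns the
    neutral face, 1 returns the full target pose.
    """
    target = BASIC_EXPRESSIONS[name]
    lerp = lambda a, b: a + (b - a) * intensity
    return FacePose(
        brow_angle=lerp(NEUTRAL.brow_angle, target.brow_angle),
        eye_openness=lerp(NEUTRAL.eye_openness, target.eye_openness),
        mouth_curve=lerp(NEUTRAL.mouth_curve, target.mouth_curve),
        neck_pitch=lerp(NEUTRAL.neck_pitch, target.neck_pitch),
    )


if __name__ == "__main__":
    # A half-intensity smile with no neck motion.
    print(expression_pose("happiness", intensity=0.5))
```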



Acknowledgments

The authors would like to thank Amyra Asamoah, Kay Jessee, and Matthew R. Francisco for their assistance in performing this research. Funding was provided by Indiana University’s School of Informatics and Computing.

Author information

Corresponding author

Correspondence to Casey C. Bennett.

Electronic supplementary material

Below are the links to the electronic supplementary material.

Multimedia video extension 1 (PDF 241 KB)

Multimedia video extension 2 (GIF 22 MB)


Cite this article

Bennett, C.C., Šabanović, S. Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces. Int J of Soc Robotics 6, 367–381 (2014). https://doi.org/10.1007/s12369-014-0237-z
