Abstract
This study explores deriving minimal features for a robotic face to convey information (via facial expressions) that people can perceive and understand. Recent research in computer vision has shown that a small number of moving points/lines can capture the majority of information (\(\sim \)95 %) in human facial expressions. Here, we apply such findings to a minimalist robot face design, which we evaluated in a series of experiments with human subjects (n = 75) exploring the effects of various factors, including added neck motion and degree of expression. Facial expression identification rates were comparable to those reported for more complex robots. In addition, added neck motion significantly improved facial expression identification rates, reaching 100 % for all expressions except Fear. The Negative Attitudes towards Robots (NARS) and Godspeed scales were also administered to examine user perceptions, e.g., perceived animacy and intelligence. The project aims to answer a number of fundamental questions about robotic face design, as well as to develop inexpensive and replicable robotic faces for experimental purposes.
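For illustration only, the sketch below shows one way a minimal-feature robot face of the kind described here could be parameterized in software: a handful of control values (brow angle, eye openness, mouth curvature, plus neck pitch) mapped to basic emotions, with a scaling function standing in for a degree-of-expression manipulation. The parameter names and numeric values are hypothetical assumptions, not taken from the paper.

```python
# Minimal sketch (hypothetical values): a minimalist face driven by a few control parameters.
from dataclasses import dataclass

@dataclass
class FacePose:
    brow_angle: float    # degrees; positive = inner brows raised
    eye_openness: float  # 0.0 (closed) to 1.0 (wide open)
    mouth_curve: float   # -1.0 (frown) to 1.0 (smile)
    neck_pitch: float    # degrees of head nod; 0.0 = level

# Hypothetical mapping from basic emotions to minimal feature settings.
EXPRESSIONS = {
    "happiness": FacePose(brow_angle=5.0,   eye_openness=0.7, mouth_curve=0.9,  neck_pitch=0.0),
    "sadness":   FacePose(brow_angle=-15.0, eye_openness=0.4, mouth_curve=-0.8, neck_pitch=-10.0),
    "anger":     FacePose(brow_angle=-25.0, eye_openness=0.9, mouth_curve=-0.6, neck_pitch=5.0),
    "surprise":  FacePose(brow_angle=20.0,  eye_openness=1.0, mouth_curve=0.2,  neck_pitch=8.0),
    "fear":      FacePose(brow_angle=15.0,  eye_openness=1.0, mouth_curve=-0.4, neck_pitch=-5.0),
}

def scale_expression(pose: FacePose, degree: float) -> FacePose:
    """Scale an expression toward a neutral pose; degree in [0, 1]."""
    return FacePose(
        brow_angle=pose.brow_angle * degree,
        eye_openness=0.5 + (pose.eye_openness - 0.5) * degree,  # neutral openness = 0.5
        mouth_curve=pose.mouth_curve * degree,
        neck_pitch=pose.neck_pitch * degree,
    )

if __name__ == "__main__":
    # Example: a half-intensity "happiness" pose for a reduced degree-of-expression condition.
    print(scale_expression(EXPRESSIONS["happiness"], 0.5))
```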
Acknowledgments
The authors would like to thank Amyra Asamoah, Kay Jessee, and Matthew R. Francisco for their assistance in performing this research. Funding was provided by Indiana University’s School of Informatics and Computing.
Electronic supplementary material
Multimedia video extension 2 (GIF 22 MB)
Cite this article
Bennett, C.C., Šabanović, S. Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces. Int J of Soc Robotics 6, 367–381 (2014). https://doi.org/10.1007/s12369-014-0237-z