
Imitating Human Emotions with Artificial Facial Expressions

International Journal of Social Robotics

Abstract

Humans convey information about their emotional state through facial expressions. Robots typically cannot show facial expressions the way humans do, which makes it hard for them to imitate emotions. Here we investigate how LED patterns around the eyes of Aldebaran’s Nao robot can be used to imitate human emotions. We performed two experiments. In the first experiment we examined which LED color, intensity, frequency, sharpness, and orientation humans associate with different emotions. Based on the results, 12 LED patterns were created. The second experiment measured how well humans recognized those LED patterns as the emotions intended by the design. We used a ROC (Receiver Operating Characteristic) graph to determine which of the 12 LED patterns were best suited for the Nao robot to imitate emotions. The same ROC-graph technique applies equally to selecting the best among other methods of imitating human emotions (e.g., gestures, speech).
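As a concrete illustration of the kind of pattern studied here, the sketch below renders a single blinking color on the Nao’s eye LEDs. It is a minimal sketch, not the authors’ implementation: the color, intensity, and frequency values are hypothetical, and the sharpness and orientation parameters from Experiment 1 are omitted; only the NAOqi ALLeds calls (fadeRGB, setIntensity) and the FaceLeds group name come from Aldebaran’s SDK.

    import time
    from naoqi import ALProxy  # NAOqi Python SDK (assumed available)

    NAO_IP = "nao.local"  # hypothetical robot address
    NAO_PORT = 9559       # default NAOqi port

    def show_pattern(leds, rgb, intensity, frequency_hz, duration_s, group="FaceLeds"):
        """Show one LED pattern: a color at some brightness, steady or blinking.

        rgb          -- 0x00RRGGBB color value
        intensity    -- overall brightness in 0.0..1.0
        frequency_hz -- blink rate; 0 means steady on
        """
        leds.setIntensity(group, intensity)
        if frequency_hz <= 0:
            leds.fadeRGB(group, rgb, 0.1)  # fade to the color and hold
            time.sleep(duration_s)
        else:
            half = 0.5 / frequency_hz  # ALProxy calls block for the fade duration
            for _ in range(int(duration_s * frequency_hz)):
                leds.fadeRGB(group, rgb, half)         # fade up to the color
                leds.fadeRGB(group, 0x00000000, half)  # fade back down to off
        leds.fadeRGB(group, 0x00FFFFFF, 0.1)  # restore default white eyes

    leds = ALProxy("ALLeds", NAO_IP, NAO_PORT)
    show_pattern(leds, 0x00FF0000, 1.0, 2.0, 3.0)  # hypothetical "anger": red, 2 Hz, 3 s

Likewise, the ROC analysis amounts to plotting each candidate pattern as a point (false-positive rate, true-positive rate) and preferring points near the ideal corner (0, 1). Below is a minimal sketch with invented recognition tallies; the distance-to-corner criterion is one common way to read such a graph, not necessarily the paper’s exact rule.

    def roc_point(hits, misses, false_alarms, correct_rejections):
        """Return (FPR, TPR) for one pattern's recognition tallies."""
        tpr = hits / float(hits + misses)
        fpr = false_alarms / float(false_alarms + correct_rejections)
        return fpr, tpr

    # Invented tallies: hits = intended emotion chosen when the pattern was shown;
    # false alarms = that emotion chosen when a different pattern was shown.
    tallies = {
        "anger_red_2hz": (18, 2, 3, 57),
        "sad_blue_steady": (12, 8, 6, 54),
    }

    for name, (h, m, fa, cr) in sorted(tallies.items()):
        fpr, tpr = roc_point(h, m, fa, cr)
        dist = (fpr ** 2 + (1.0 - tpr) ** 2) ** 0.5  # distance to ideal corner (0, 1)
        print("%s: FPR=%.2f TPR=%.2f dist=%.2f" % (name, fpr, tpr, dist))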





Acknowledgements

The research leading to these results is part of the KSERA project (http://www.ksera-project.eu) and has received funding from the European Commission under the 7th Framework Programme (FP7) for Research and Technological Development, grant agreement no. 2010-248085.

We would also like to thank Dennis Hulsen, Jeremy Karouta, Mike Vogel, and Daniel Lakens for their contributions to this work.

Author information

Corresponding author

Correspondence to David O. Johnson.

Appendix: Experiment 1 Questionnaire

For each emotion (A–F), mark the color and intensity that best fit the emotion; then, for each pair of lines, pick the one that best fits the emotion. There is no right or wrong answer, just how you feel.
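The paper derives its 12 patterns from such responses. One plausible aggregation, shown below as a hypothetical sketch (the response data and the modal-choice rule are invented for illustration, not taken from the paper), is to take the most frequent choice per attribute for each emotion.

    from collections import Counter

    # Invented questionnaire responses: emotion -> (color, intensity) per participant.
    responses = {
        "anger": [("red", "high"), ("red", "high"), ("orange", "high")],
        "sadness": [("blue", "low"), ("blue", "low"), ("purple", "low")],
    }

    for emotion, choices in sorted(responses.items()):
        color = Counter(c for c, _ in choices).most_common(1)[0][0]
        intensity = Counter(i for _, i in choices).most_common(1)[0][0]
        print("%s: color=%s, intensity=%s" % (emotion, color, intensity))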


About this article

Cite this article

Johnson, D.O., Cuijpers, R.H. & van der Pol, D. Imitating Human Emotions with Artificial Facial Expressions. Int J of Soc Robotics 5, 503–513 (2013). https://doi.org/10.1007/s12369-013-0211-1
