
It’s All in the Eyes: Designing Facial Expressions for an Interactive Robot Therapy Coach for Children

  • Conference paper
Designing Around People

Abstract

An important aspect of child-robot interaction in therapy scenarios is the robot’s ability to convey emotions to the child. Because an estimated 93% of human communication is non-verbal, socially interactive robots need the ability to communicate non-verbal cues to the child, particularly through facial expressions. In this paper, we discuss the ability of a socially interactive robot to convey emotions through a minimal set of features, i.e. solely through the eyes. In a study with five participants, we evaluate participants’ ability to recognize emotions based on the Plutchik emotion scale and the universal emotions of happiness, sadness, anger, and fear. Results indicate that participants’ recognition of an emotion is highest when the intensity of the emotion is not at the extreme ends of the Plutchik emotion scale.
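To make the abstract’s setup concrete: Plutchik arranges each primary emotion along an intensity axis (e.g. pensiveness < sadness < grief), and an eyes-only display would vary a few rendering parameters with that intensity. The sketch below is a minimal illustration of such a mapping; the axis labels follow Plutchik’s wheel, but the parameter names (`eyelid_openness`, `brow_tilt_deg`) and values are illustrative assumptions, not the implementation evaluated in the paper.

```python
# Illustrative sketch: Plutchik-style intensity tiers for the four emotions
# studied, mapped to hypothetical eyes-only display parameters.
PLUTCHIK_AXES = {
    "happiness": ["serenity", "joy", "ecstasy"],
    "sadness":   ["pensiveness", "sadness", "grief"],
    "anger":     ["annoyance", "anger", "rage"],
    "fear":      ["apprehension", "fear", "terror"],
}

def eye_parameters(emotion: str, intensity: int) -> dict:
    """Return hypothetical eye-rendering parameters for an emotion at
    intensity 0 (mild), 1 (basic), or 2 (extreme)."""
    if emotion not in PLUTCHIK_AXES or intensity not in (0, 1, 2):
        raise ValueError("unknown emotion or intensity")
    label = PLUTCHIK_AXES[emotion][intensity]
    scale = 0.5 + 0.25 * intensity  # exaggerate the display with intensity
    # Assumed per-emotion brow tilt in degrees (positive = raised outward).
    brow_tilt = {"happiness": 10, "sadness": -20,
                 "anger": -35, "fear": 25}[emotion]
    return {
        "label": label,
        "eyelid_openness": round(scale, 2),   # 0 = closed, 1 = wide open
        "brow_tilt_deg": round(brow_tilt * scale, 1),
    }
```

A recognition study like the one described would then show participants the rendered output of `eye_parameters(e, i)` for each emotion–intensity pair and score their labeling accuracy across the three tiers.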



References

  • Baron-Cohen S, O’Riordan M, Stone V, Jones R, Plaisted K (1999) Recognition of faux pas by normally developing children and children with Asperger syndrome or high-functioning autism. Journal of Autism and Developmental Disorders 29: 407-418

  • Bassili J (1979) Emotion recognition: The role of facial movement and the relative importance of upper and lower areas of the face. Journal of Personality and Social Psychology 37(11): 2049-2058

  • Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F et al. (2013) Interpretation of emotional body language displayed by a humanoid robot: A case study with children. International Journal of Social Robotics 5(3): 325-334

  • Bennett CC, Šabanović S (2014) Deriving minimal features for human-like facial expressions in robotic faces. International Journal of Social Robotics 6(3): 367-381

  • Black M, Yacoob Y (1997) Recognizing facial expressions in image sequences using local parameterized models of image motion. International Journal of Computer Vision 25(1): 23-48

  • Delaunay F, De Greeff J, Belpaeme T (2009) Towards retro-projected robot faces: An alternative to mechatronic and android faces. In: Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, pp. 306-311, Toyama International Conference Center, Japan

  • Ekman P (1992) Facial expressions of emotion: New findings, new questions. Psychological Science 3(1): 34-38

  • Ekman P, Friesen WV (1971) Constants across cultures in the face and emotion. Journal of Personality and Social Psychology 17(2): 124-129

  • Howard A (2013) Robots learn to play: Robots emerging role in pediatric therapy. In: Proceedings of the 26th International Florida Artificial Intelligence Research Society Conference, St. Pete Beach, FL, USA

  • Li J, Chignell M (2011) Communication of emotion in social robots through simple head and arm movements. International Journal of Social Robotics 3(2): 125-142

  • Metta G, Sandini G, Vernon D, Natale L, Nori F (2008) The iCub humanoid robot: An open platform for research in embodied cognition. In: Proceedings of the 8th Workshop on Performance Metrics for Intelligent Systems, pp. 50-56, Washington DC, USA

  • Plutchik R (2001) The nature of emotions. American Scientist 89(4): 344-350

  • Saerbeck M, Schut T, Bartneck C, Janse M (2010) Expressive robots in education: Varying the degree of social supportive behavior of a robotic tutor. In: Proceedings of the ACM CHI 2010 Conference on Human Factors in Computing Systems, pp. 1613-1622, Atlanta, GA, USA

  • Simmons R, Goldberg D, Goode A, Montemerlo M, Roy N, Sellner B et al. (2003) GRACE: An autonomous robot for the AAAI Robot Challenge. Carnegie Mellon University, Pittsburgh, PA, USA

  • Tiberius R, Billson J (1991) The social context of teaching and learning. New Directions for Teaching and Learning 1991(45): 67-86


Author information


Corresponding author

Correspondence to A. Howard .



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Cloutier, P., Park, H.W., MacCalla, J., Howard, A. (2016). It’s All in the Eyes: Designing Facial Expressions for an Interactive Robot Therapy Coach for Children. In: Langdon, P., Lazar, J., Heylighen, A., Dong, H. (eds) Designing Around People. Springer, Cham. https://doi.org/10.1007/978-3-319-29498-8_17


  • DOI: https://doi.org/10.1007/978-3-319-29498-8_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-29496-4

  • Online ISBN: 978-3-319-29498-8

  • eBook Packages: Engineering, Engineering (R0)
