Development and Evaluation of Emotional Robots for Children with Autism Spectrum Disorders

  • Myounghoon Jeon (email author)
  • Ruimin Zhang
  • William Lehman
  • Seyedeh Fakhrhosseini
  • Jaclyn Barnes
  • Chung Hyuk Park
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 528)


Individuals with Autism Spectrum Disorders (ASD) often have difficulty recognizing emotional cues in ordinary interaction. To address this, we are developing a social robot that teaches children with ASD to recognize emotions in the simpler and more controlled context of interaction with a robot. An emotion recognition program using the Viola-Jones algorithm for face detection is in development. To better understand emotion expression by social robots, a study was conducted in which 11 college students matched animated facial expressions, and emotionally neutral sentences spoken in affective voices, to various emotions. Overall, facial expressions were recognized more accurately and perceived as more intense than voices. Future work will test recognition of combined facial and vocal expressions.


Keywords: Social robotics · Emotion · Autism spectrum disorders



This material is based upon work supported by the National Institutes of Health under grant No. 1 R01 HD082914-01.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Myounghoon Jeon 1 (email author)
  • Ruimin Zhang 1
  • William Lehman 1
  • Seyedeh Fakhrhosseini 1
  • Jaclyn Barnes 1
  • Chung Hyuk Park 2
  1. Michigan Technological University, Houghton, USA
  2. New York Institute of Technology, New York, USA
