Development and Evaluation of Emotional Robots for Children with Autism Spectrum Disorders
Individuals with Autism Spectrum Disorders (ASD) often have difficulty recognizing emotional cues in ordinary interaction. To address this, we are developing a social robot that teaches children with ASD to recognize emotion in the simpler and more controlled context of interaction with a robot. An emotion recognition program using the Viola-Jones algorithm for face detection is in development. To better understand emotion expression by social robots, a study was conducted in which 11 college students matched animated facial expressions, and emotionally neutral sentences spoken in affective voices, to various emotions. Overall, facial expressions were recognized more accurately and rated as more intense than voices. Future work will test recognition of combined faces and voices.
Keywords: Social robotics · Emotion · Autism spectrum disorders
This material is based upon work supported by the National Institutes of Health under grant No. 1 R01 HD082914-01.