Universal Access in the Information Society, Volume 4, Issue 2, pp 121–134

The emotional hearing aid: an assistive tool for children with Asperger syndrome

  • R. el Kaliouby
  • P. Robinson
Long paper


People diagnosed along the autistic spectrum often have difficulties interacting with others in natural social environments. The emotional hearing aid is a portable assistive computer-based technology designed to help children with Asperger syndrome read and respond to the facial expressions of people they interact with. The tool implements the two principal elements that constitute one's ability to empathize with others: the ability to identify a person's mental state, a process known as mind-reading or theory of mind, and the ability to react appropriately to it (known as sympathizing). An automated mind-reading system attributes a mental state to a person by observing that person's behaviour in real time. A reaction advisor then suggests to the user of the emotional hearing aid an appropriate reaction to the recognized mental state. This paper describes progress in the development and validation of the emotional hearing aid on two fronts. First, the implementation of the reaction advisor is described, showing how it takes into account the persistence, intensity and degree of confidence of a mental state inference. Second, the paper presents an experimental evaluation of the automated mind-reading system on six classes of complex mental states. In light of this progress, the paper concludes with a discussion of the challenges that still need to be addressed in developing and validating the emotional hearing aid.
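The reaction advisor, as summarized above, only suggests a reaction once a mental-state inference is sufficiently persistent, intense and confident. A minimal sketch of that gating logic is shown below; the state labels, thresholds, reaction phrases and class name are all illustrative assumptions, not the paper's actual implementation:

```python
from collections import deque

# Illustrative only: labels and suggested reactions are assumptions,
# not taken from the paper.
REACTIONS = {
    "agreeing": "They seem to agree -- you can carry on.",
    "disagreeing": "They may disagree -- try asking what they think.",
    "interested": "They look interested -- keep explaining.",
    "unsure": "They seem unsure -- offer to clarify.",
}

class ReactionAdvisor:
    """Gate mental-state inferences by persistence and confidence
    before suggesting a reaction (hypothetical sketch)."""

    def __init__(self, confidence_threshold=0.7, persistence=5, window=10):
        self.confidence_threshold = confidence_threshold  # min per-frame probability
        self.persistence = persistence                    # consecutive frames required
        self.window = deque(maxlen=window)                # recent frames of inferences

    def update(self, state_probabilities):
        """Add one frame of {mental state: probability} and return a
        suggested reaction, or None if the evidence is not yet strong enough."""
        self.window.append(state_probabilities)
        if len(self.window) < self.persistence:
            return None
        recent = list(self.window)[-self.persistence:]
        # The same state must dominate every recent frame (persistence)...
        tops = [max(frame, key=frame.get) for frame in recent]
        if len(set(tops)) != 1:
            return None
        state = tops[0]
        # ...and stay above the confidence threshold throughout (intensity/confidence).
        if all(frame[state] >= self.confidence_threshold for frame in recent):
            return REACTIONS.get(state)
        return None
```

For example, five consecutive frames in which "agreeing" is inferred with probability 0.8 would trigger the agreeing suggestion, whereas a single high-confidence frame would not.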


Autism · Asperger syndrome · Facial expression analysis · Mind-reading · Dynamic Bayesian Networks



The authors would like to thank Alex Birkby for implementing the reaction advisor in the context of his Computer Science Diploma dissertation at the University of Cambridge. We would also like to thank Professor Simon Baron-Cohen and Ofer Golan at the Autism Research Centre, University of Cambridge, for inspiring discussions about the automated mind-reading system and for making the Mind-reading DVD available to our research, and the anonymous reviewers for their valuable input. This research was funded by the Computer Laboratory's Wiseman Fund, the Overseas Research Student Award, the Cambridge Overseas Trust, and the Newnham College Studentship Research Award.



Copyright information

© Springer-Verlag 2005

Authors and Affiliations

Computer Laboratory, University of Cambridge, Cambridge, UK
