
The emotional hearing aid: an assistive tool for children with Asperger syndrome


People diagnosed along the autistic spectrum often have difficulty interacting with others in natural social environments. The emotional hearing aid is a portable, computer-based assistive technology designed to help children with Asperger syndrome read and respond to the facial expressions of the people they interact with. The tool implements the two principal elements that constitute the ability to empathize with others: identifying a person’s mental state, a process known as mind-reading or theory of mind, and reacting appropriately to it, known as sympathizing. An automated mind-reading system attributes a mental state to a person by observing that person’s behaviour in real time. A reaction advisor then suggests to the user of the emotional hearing aid an appropriate reaction to the recognized mental state. This paper describes progress in the development and validation of the emotional hearing aid on two fronts. First, the implementation of the reaction advisor is described, showing how it takes into account the persistence, intensity and degree of confidence of a mental state inference. Second, the paper presents an experimental evaluation of the automated mind-reading system on six classes of complex mental states. In light of this progress, the paper concludes with a discussion of the challenges that remain in developing and validating the emotional hearing aid.
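The abstract states that the reaction advisor weighs the persistence, intensity, and degree of confidence of a mental-state inference before suggesting a reaction to the wearer. The idea could be sketched roughly as follows; this is a non-authoritative illustration, and the class name, mental-state labels, thresholds, and suggestion strings are all hypothetical rather than taken from the paper:

```python
from collections import deque

class ReactionAdvisor:
    """Illustrative sketch of a reaction advisor: it suggests a reaction
    only when a mental-state inference is both persistent and confident.
    All names, states, and thresholds here are hypothetical."""

    def __init__(self, window=5, persistence_threshold=3, intensity_threshold=0.7):
        self.history = deque(maxlen=window)   # recent (state, confidence) pairs
        self.persistence_threshold = persistence_threshold
        self.intensity_threshold = intensity_threshold
        # hypothetical mapping from inferred mental state to suggested reaction
        self.suggestions = {
            "interested": "Keep going; they seem engaged.",
            "bored": "Change the topic or ask a question.",
            "confused": "Pause and explain again more simply.",
        }

    def advise(self, state, confidence):
        """Record one inference; return a suggestion, or None."""
        self.history.append((state, confidence))
        # persistence: how many recent inferences agree with the current state
        agreeing = [c for s, c in self.history if s == state]
        persistence = len(agreeing)
        # intensity: mean confidence of the agreeing inferences
        intensity = sum(agreeing) / persistence
        if persistence >= self.persistence_threshold and intensity >= self.intensity_threshold:
            return self.suggestions.get(state)
        return None
```

Under this sketch, several consecutive confident inferences of the same state trigger a suggestion, whereas a single momentary spike does not, which captures the gating on persistence and confidence that the paper describes.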






Acknowledgements

The authors would like to thank Alex Birkby for implementing the reaction advisor in the context of his Computer Science Diploma dissertation at the University of Cambridge. We would also like to thank Professor Simon Baron-Cohen and Ofer Golan at the Autism Research Centre, University of Cambridge, for inspiring discussions about the automated mind-reading system and for making the Mind-reading DVD available to our research, and the anonymous reviewers for their valuable input. This research was funded by the Computer Laboratory’s Wiseman Fund, the Overseas Research Student Award, the Cambridge Overseas Trust, and the Newnham College Studentship Research Award.

Author information



Corresponding author

Correspondence to R. el Kaliouby.


Cite this article

el Kaliouby, R., Robinson, P. The emotional hearing aid: an assistive tool for children with Asperger syndrome. Univ Access Inf Soc 4, 121–134 (2005).



Keywords

  • Autism
  • Asperger syndrome
  • Facial expression analysis
  • Mind-reading
  • Dynamic Bayesian Networks