An Android-Based Mobile Eye Gaze Point Estimation System for Studying the Visual Perception in Children with Autism

  • J. Amudha
  • Hitha Nandakumar
  • S. Madhura
  • M. Parinitha Reddy
  • Nagabhairava Kavitha
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 32)


Autism is a neurodevelopmental disorder characterized by poor social interaction, communication impairments and repetitive behaviour. The reasons for these behavioural differences can be understood by studying differences in sensory processing. This paper proposes a mobile application that uses visual tasks to study visual perception in children with autism, offering insight into why they see and perceive things differently from typically developing children. The application records eye movements and estimates the child's region of gaze in order to understand where the child's attention is focused during the visual tasks. This work provides experimental evidence that children with autism outperform typically developing children in certain visual tasks, suggesting that they may have higher IQ levels than their peers.


Autism spectrum disorder · Mobile application · Cognitive visual tasks · Human–computer interaction



We would like to thank the "Apoorva Center of Autism", Bangalore, for generously allowing us to interact with autistic children, which helped us gain a deeper understanding of the problem and complete this work successfully.



Copyright information

© Springer India 2015

Authors and Affiliations

  • J. Amudha (1)
  • Hitha Nandakumar (1)
  • S. Madhura (1)
  • M. Parinitha Reddy (1)
  • Nagabhairava Kavitha (1)

  1. Department of Computer Science, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Bangalore, India
