Using Eye Movement to Assess Auditory Attention

  • Alaa Bakry
  • Radwa Al-khatib
  • Randa Negm
  • Eslam Sabra
  • Mohamed Maher
  • Zainab Mohamed
  • Doaa Shawky (corresponding author)
  • Ashraf Badawi
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 921)


Eye movement has been found to be one of the factors that strongly influence attention. In this paper, a study of the relationship between eye movement and auditory attention is presented. Forty-three participants attended four sessions presenting different auditory stimuli while wearing a 14-channel wireless headset that collected their EEG signals. The participants were asked to keep their eyes fixed in two of the sessions and were allowed to move them freely in the other two. Their attention during each session was estimated using questionnaires that assessed the information they were able to retain afterwards. Different classifiers were trained to predict the attention scores under the free-movement and fixed-gaze conditions. Among the trained classifiers, k-nearest-neighbors classifiers yielded the best classification accuracy, which improved from about 72% to 87% when eye-movement features were added. The obtained results thus indicate that eye movement affects attention, and that it is possible to detect subjects' attention from their eye-movement patterns.
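The classification setup described in the abstract can be illustrated with a minimal sketch (this is not the authors' code; the feature names, synthetic data, and accuracy gap are illustrative assumptions). It trains a k-nearest-neighbors classifier on EEG features alone and again with eye-movement features appended, mirroring the reported accuracy comparison:

```python
# Hedged sketch: k-NN attention classification with and without
# eye-movement features. All data and feature names are synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical EEG band-power features (e.g., theta/alpha/beta averages).
eeg = rng.normal(size=(n, 3))
# Hypothetical eye-movement features (e.g., fixation count, saccade rate).
eye = rng.normal(size=(n, 2))
# Synthetic binary attention labels that depend partly on the eye features,
# so appending them should raise classification accuracy.
labels = (eeg[:, 0] + 2.0 * eye[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

knn = KNeighborsClassifier(n_neighbors=5)
acc_eeg = cross_val_score(knn, eeg, labels, cv=5).mean()
acc_both = cross_val_score(knn, np.hstack([eeg, eye]), labels, cv=5).mean()
print(f"EEG only:           {acc_eeg:.2f}")
print(f"EEG + eye features: {acc_both:.2f}")
```

On this synthetic data the combined feature set scores higher, which is the same qualitative pattern the paper reports; the actual study used questionnaire-derived attention scores and real EEG/eye-movement recordings.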


Keywords: Attention measurement · EEG · Eye movement · Eye gaze



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Alaa Bakry (1)
  • Radwa Al-khatib (1)
  • Randa Negm (1)
  • Eslam Sabra (1)
  • Mohamed Maher (1)
  • Zainab Mohamed (1)
  • Doaa Shawky (2, corresponding author)
  • Ashraf Badawi (1)
  1. Center for Learning Technologies, University of Science and Technology, Zewail City, Giza, Egypt
  2. Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza, Egypt
