Assessing Levels of Attention Using Low Cost Eye Tracking

  • Per Bækgaard
  • Michael Kai Petersen
  • Jakob Eg Larsen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9737)


The emergence of mobile eye trackers embedded in next-generation smartphones or VR displays will make it possible to trace not only which objects we look at but also the level of attention in a given situation. To explore whether we can quantify the engagement of a user interacting with a laptop, we apply mobile eye tracking in an in-depth study over two weeks with nearly 10,000 observations, assessing pupil size changes related to the attentional aspects of alertness, orientation and conflict resolution. By visually presenting conflicting cues and targets, we hypothesize that it is feasible to measure the effort allocated when responding to confusing stimuli. Although such experiments are normally carried out in a lab, we have initial indications that we can differentiate between sustained alertness and complex decision making even with low-cost eye tracking “in the wild”. From a quantified-self perspective of individual behavioural adaptation, the correlations between pupil size and the task-dependent reaction time and error rates may, in the longer term, provide a foundation for adapting smartphone content and interaction to the user’s perceived level of attention.
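The three attentional aspects named in the abstract (alertness, orientation, conflict resolution) are conventionally derived from reaction-time contrasts between cue and flanker conditions in the Attention Network Test. A minimal sketch of that computation, using made-up condition names and illustrative reaction times rather than the study's data:

```python
# Illustrative sketch of Attention Network Test (ANT) scoring:
# each network effect is a difference between mean reaction times (ms)
# of two conditions. Condition names and values here are hypothetical.

def ant_scores(rt):
    """rt: dict mapping condition name -> mean reaction time in ms."""
    return {
        # benefit of any warning cue over no cue at all
        "alerting": rt["no_cue"] - rt["double_cue"],
        # benefit of knowing where the target will appear
        "orienting": rt["center_cue"] - rt["spatial_cue"],
        # cost of resolving conflicting flanker arrows
        "conflict": rt["incongruent"] - rt["congruent"],
    }

example = {
    "no_cue": 620, "double_cue": 580,
    "center_cue": 600, "spatial_cue": 555,
    "congruent": 560, "incongruent": 660,
}
print(ant_scores(example))  # {'alerting': 40, 'orienting': 45, 'conflict': 100}
```

Larger conflict scores indicate greater effort spent on resolving the confusing stimuli, which is the quantity the pupil-size measurements are hypothesized to track.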


Keywords: Eye tracking · Attention network



This work is supported in part by the Innovation Fund Denmark through the project Eye Tracking for Mobile Devices.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Per Bækgaard, Michael Kai Petersen, Jakob Eg Larsen
  1. Cognitive Systems, Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kongens Lyngby, Denmark
