Eye Movement Recordings in Natural Settings

  • Benjamin W. Tatler
  • Dan Witzner Hansen
  • Jeff B. Pelz
Part of the Studies in Neuroscience, Psychology and Behavioral Economics book series (SNPBE)


In this chapter we consider why it is important to study eye movements in natural, real-world settings, and the practical considerations that need to be taken into account when collecting eye movement data in these situations. Conducting a study in a real-world environment poses very different challenges from those present in laboratory-based settings. Variables are hard to control, and this restricts the kinds of paradigms that can be employed. Careful consideration of the research question, and of whether it is appropriate for real-world study, is therefore a necessary first step.

Mobile eye trackers are often the most obvious choice for studying eye movements in real-world settings. A range of mobile eye tracking systems is available commercially, and researchers also build their own mobile eye trackers to solve particular problems. Selecting the right system is an important decision, and in this chapter we highlight some of the key choices that the researcher should consider when buying or building a mobile eye tracker.

Care must be taken to ensure that the data collected are sufficiently reliable to address the research question. While the principles of eye tracking - how features of the eye are detected and used to estimate where someone is looking - are much the same for all eye trackers, the challenges of collecting data in natural settings are far greater than those in a controlled laboratory setting. We consider the key threats that real-world settings pose for collecting reliable eye tracking data, in the hope not only of raising awareness of these issues but also of offering suggestions for how they can be minimised, or at least identified in the data. For those considering whether to run an eye tracking study in the real world, we hope that this chapter will help you weigh your research questions and your choice of mobile eye tracker.
For those already committed to studying eye movement behaviour in a real-world setting, we hope that our discussion of the problems, and of possible solutions to them, will help minimise data loss from the common challenges faced when conducting mobile eye tracking studies.
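The principle mentioned above - detecting features of the eye and using them to estimate where someone is looking - can be illustrated with the regression-based calibration used in many video-based eye trackers: the participant fixates known targets while the tracker records pupil-centre positions, and a polynomial mapping from pupil centre to gaze point is fitted to that calibration data. The sketch below is a minimal, hypothetical illustration of this idea (the function names and the second-order polynomial form are illustrative assumptions, not the method of any particular commercial system):

```python
import numpy as np

def poly_features(px, py):
    """Second-order polynomial terms of the detected pupil centre (px, py)."""
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_gaze_mapping(pupil_xy, target_xy):
    """Least-squares fit of the pupil-centre-to-gaze mapping.

    pupil_xy:  (n, 2) pupil centres recorded while fixating calibration targets
    target_xy: (n, 2) known positions of those calibration targets
    Returns a (6, 2) coefficient matrix, one column per gaze coordinate.
    """
    X = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(X, target_xy, rcond=None)
    return coeffs

def estimate_gaze(coeffs, pupil_xy):
    """Apply the fitted mapping to new pupil-centre measurements."""
    X = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    return X @ coeffs
```

In a mobile tracker the "target" coordinates are typically points in the scene-camera image rather than on a screen, but the fitting step is the same; the real-world complications discussed in this chapter (slippage of the headgear, changing lighting, parallax) degrade exactly this mapping after calibration.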



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Benjamin W. Tatler (1)
  • Dan Witzner Hansen (2)
  • Jeff B. Pelz (3)
  1. School of Psychology, University of Aberdeen, Aberdeen, UK
  2. IT University Copenhagen, Copenhagen, Denmark
  3. Carlson Center for Imaging Science, Rochester Institute of Technology, Rochester, USA