Methods and Models of Eye-Tracking in Natural Environments

  • Protocol
Eye Tracking

Part of the book series: Neuromethods (NM, volume 183)

Abstract

Mobile head-free eye-tracking is one of the most valuable methods in vision science for understanding the distribution and dynamics of attention in natural, real-world tasks. However, mobile eye-tracking is still a nascent field, and experimental setups with such devices are not yet mature enough for consistently reliable investigation of real-world gaze behavior. Here, we review the development of eye-trackers from their inception to the current state of the art and discuss the experimental methodologies and technologies available for investigating natural, goal-directed, real-world gaze behavior in fully ecological setups. Building on these experimental approaches, we then discuss the modelling approaches applied to eye-tracking data, from conventional 2D saliency modelling to more fully embodied approaches that incorporate both gaze and motor behavior and allow gaze dynamics to be predicted in fully head-free setups.
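
To give a flavour of what "conventional 2D saliency modelling" means in practice, the sketch below computes a crude bottom-up saliency map from centre-surround (difference-of-Gaussians) contrast on an intensity channel, in the spirit of classic Itti-Koch-style models. This is an illustrative toy, not the chapter's method: the function name `toy_saliency` and the sigma values are invented for the example, and real models add colour and orientation channels, multi-scale pyramids, and normalisation schemes.

```python
# Minimal sketch of a centre-surround saliency map (illustrative only).
import numpy as np
from scipy.ndimage import gaussian_filter

def toy_saliency(image: np.ndarray) -> np.ndarray:
    """Crude bottom-up saliency map for an RGB image (H x W x 3, values in [0, 1])."""
    intensity = image.mean(axis=2)                   # collapse colour to a luminance proxy
    center = gaussian_filter(intensity, sigma=2)     # fine-scale response ("centre")
    surround = gaussian_filter(intensity, sigma=8)   # coarse-scale response ("surround")
    saliency = np.abs(center - surround)             # centre-surround contrast
    return saliency / (saliency.max() + 1e-8)        # normalise to [0, 1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((240, 320, 3))                # stand-in for a scene-camera frame
    smap = toy_saliency(frame)
    print(smap.shape, float(smap.max()))             # (240, 320), max near 1.0
```

In practice such maps are evaluated against measured fixations (e.g. with AUC or information-gain metrics) before being trusted as predictors, and for mobile, head-free data they must be combined with head and body kinematics, which is the move toward embodied gaze models the abstract describes.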

Acknowledgements

We are very grateful for the support of the EPSRC in funding this work.

Author information

Correspondence to A. Aldo Faisal.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Science+Business Media, LLC, part of Springer Nature

About this protocol

Cite this protocol

Harston, J.A., Faisal, A.A. (2022). Methods and Models of Eye-Tracking in Natural Environments. In: Stuart, S. (ed.) Eye Tracking. Neuromethods, vol 183. Humana, New York, NY. https://doi.org/10.1007/978-1-0716-2391-6_4

  • DOI: https://doi.org/10.1007/978-1-0716-2391-6_4

  • Publisher Name: Humana, New York, NY

  • Print ISBN: 978-1-0716-2390-9

  • Online ISBN: 978-1-0716-2391-6

  • eBook Packages: Springer Protocols
