Cognitive Computation, Volume 3, Issue 1, pp 79–88

Eye Movements Show Optimal Average Anticipation with Natural Dynamic Scenes

  • Eleonora Vig
  • Michael Dorr
  • Thomas Martinetz
  • Erhardt Barth

Abstract

A less studied component of gaze allocation in dynamic real-world scenes is the time lag of eye movements in responding to dynamic attention-capturing events. Despite the vast amount of research on anticipatory gaze behaviour in natural situations, such as action execution and observation, little is known about the predictive nature of eye movements when viewing different types of natural or realistic scene sequences. In the present study, we quantify the degree of anticipation during the free viewing of dynamic natural scenes. The cross-correlation analysis of image-based saliency maps with an empirical saliency measure derived from eye movement data reveals the existence of predictive mechanisms responsible for a near-zero average lag between dynamic changes of the environment and the responding eye movements. We also show that the degree of anticipation is reduced when moving away from natural scenes by introducing camera motion, jump cuts, and film-editing.
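The lag analysis summarised above can be illustrated with a minimal sketch (not the authors' code): assuming the per-frame model saliency and the gaze-derived empirical saliency have each been reduced to a scalar time series, the lag at which their cross-correlation peaks estimates how far gaze runs ahead of or behind the scene dynamics. All names, parameters, and the reduction to scalar series are illustrative assumptions.

```python
import numpy as np

def estimate_lag(model_saliency, empirical_saliency, fps=30.0, max_lag_s=1.0):
    """Estimate the lag (in seconds) at which a model saliency signal
    best matches an empirical, gaze-derived saliency signal.

    Both inputs are 1-D arrays sampled once per video frame (a
    hypothetical scalar reduction of the per-frame saliency maps).
    A negative lag means gaze changes precede (anticipate) the scene
    dynamics captured by the model; a positive lag means gaze follows them.
    """
    # z-score both signals so the cross-correlation is scale-invariant
    model = (model_saliency - model_saliency.mean()) / model_saliency.std()
    emp = (empirical_saliency - empirical_saliency.mean()) / empirical_saliency.std()

    max_lag = int(max_lag_s * fps)
    lags = np.arange(-max_lag, max_lag + 1)
    # correlation of model[t] with emp[t + k] for each candidate lag k
    corr = np.array([
        np.mean(model[max(0, -k):len(model) - max(0, k)] *
                emp[max(0, k):len(emp) - max(0, -k)])
        for k in lags
    ])
    best_lag_frames = lags[np.argmax(corr)]
    return best_lag_frames / fps, corr
```

A near-zero estimated lag, as reported in the study, would indicate that on average gaze neither trails nor leads the dynamic events by a measurable delay.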

Keywords

Eye movements · Anticipatory gaze behaviour · Natural dynamic scenes · Saliency maps


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Eleonora Vig (1)
  • Michael Dorr (1, 2)
  • Thomas Martinetz (1)
  • Erhardt Barth (1)
  1. Institute for Neuro- and Bioinformatics, University of Lübeck, Lübeck, Germany
  2. Schepens Eye Research Institute, Department of Ophthalmology, Harvard Medical School, Boston, USA
