Abstract
Over the last 30 years, eye tracking has grown in popularity as a method to understand attention during visual search, principally because it provides a means to characterize the spatiotemporal properties of selective operations across a trial. In the present chapter, we review the motivations, methods, and measures for using eye tracking in visual search experiments. This includes a discussion of the advantages (and some disadvantages) of eye tracking data as a measure of spatial attention, compared with more traditional reaction time paradigms. In addition, we discuss stimulus and design considerations for implementing experiments of this type. Finally, we discuss the major measures that can be extracted from an eye tracking record and the inferences that each allows. In the course of this discussion, we address both experiments using abstract arrays and experiments using real-world scene stimuli.
Notes
- 1. Note that instructions are rarely sufficient to ensure that participants do not make eye movements. Thus, even if the goal is to eliminate eye movements, gaze still needs to be monitored. Ideally, an eye tracker can be used, but there is another option. In covert attention studies, we often use a simple video camera to display a large image of one of the eyes, and the experimenter monitors this image throughout the experiment (a human eye tracker). Movements of the eyes are quite easy to observe, and the experimenter both notes trials with eye movements and reminds the participant, when an eye movement is observed, to keep gaze focused on the relevant reference point. With appropriate, well-timed feedback of this sort, most participants quickly learn how to keep gaze focused centrally and rarely make eye movements after an initial practice session.
- 2. Note that a present/absent design is not always ideal for an eye tracking study, as the mere presence or absence of the target can often be determined without foveation.
- 3. Note, however, that in this example experiment, if we were to manipulate photographic images, we would need to use an object image from a different source and then paste and integrate it into both the plausible and implausible locations within the experimental scene; this would control our two conditions for artifacts generated by the process of adding an object to a particular scene location.
- 4. An alternative is to make trial initiation contingent on central fixation.
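A fixation-contingent trigger of this sort can be implemented as a simple check on the most recent gaze samples. The sketch below is illustrative only: the function name, screen center, region radius, and sample count are all hypothetical values, and a real implementation would read samples from the tracker's own API (e.g., a vendor SDK or a package such as PsychoPy) rather than from a list.

```python
import math

def gaze_within(sample, center, radius_px):
    """True if a single (x, y) gaze sample falls within radius_px of center."""
    dx, dy = sample[0] - center[0], sample[1] - center[1]
    return math.hypot(dx, dy) <= radius_px

def fixation_trigger(samples, center=(512, 384), radius_px=40, n_required=50):
    """Return True once the last n_required consecutive gaze samples all fall
    inside the central region (e.g., 100 ms of stable fixation at 500 Hz).
    The experiment loop would poll this and start the trial when it fires."""
    if len(samples) < n_required:
        return False
    return all(gaze_within(s, center, radius_px) for s in samples[-n_required:])
```

In practice, the polling loop would append each new sample from the tracker and begin the trial on the first call that returns True.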
- 5. It is possible that, on a small proportion of trials, a participant fixates the target, fails to recognize it as such, leaves the target region to fixate other objects, and returns only later during search, leading to the manual response. Thus, elapsed time to target fixation should be the time until the entry that immediately precedes the response and not necessarily the elapsed time to the very first entry.
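The scoring rule described above can be made explicit: given the times at which gaze entered the target region and the time of the manual response, take the last entry preceding the response. This is a minimal sketch; the function name and the millisecond units are assumptions, and real data would come from interest-area reports generated by the eye tracking software.

```python
def time_to_final_target_entry(entry_times_ms, response_time_ms):
    """Elapsed time (from trial onset) to the target-region entry that
    immediately precedes the manual response. Returns None if no entry
    preceded the response (e.g., the target was never fixated)."""
    preceding = [t for t in entry_times_ms if t < response_time_ms]
    return preceding[-1] if preceding else None
```

For a trial with entries at 800 ms and 2400 ms and a response at 2900 ms, this rule scores elapsed time to target fixation as 2400 ms, not 800 ms.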
- 6. An alternative would be to consider each entry and exit from an object as a single event, collapsing across multiple fixations between entry and exit.
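Collapsing fixations into visit events amounts to run-length merging of the fixation-by-fixation region labels. The sketch below assumes each fixation has already been assigned to a region of interest (labels here are hypothetical):

```python
def fixations_to_visits(fixation_regions):
    """Collapse consecutive fixations on the same region into single
    'visit' events: an entry followed by any number of within-object
    fixations counts as one event until gaze exits the region."""
    visits = []
    for region in fixation_regions:
        if not visits or visits[-1] != region:
            visits.append(region)
    return visits
```

For example, a fixation sequence of target, target, distractor, target would yield three visits: one to the target, one to the distractor, and a return visit to the target.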
- 7. It can also be useful to examine saccade latency in this context. For trials without oculomotor capture, several studies have observed that saccades directed to the target were delayed when a critical distractor was present versus when it was not, indicating that the programming of the saccade required additional time to resolve the competition between the salient distractor and the target.
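The latency analysis described above reduces to a per-condition mean of (saccade onset minus array onset), restricted to trials on which the first saccade went to the target. A minimal sketch, assuming a hypothetical tuple format for trial records:

```python
def mean_saccade_latency(trials):
    """Mean saccade latency (ms) per condition, excluding trials on which
    the first saccade was captured by the distractor.
    Each trial: (condition, array_onset_ms, saccade_onset_ms, captured)."""
    sums, counts = {}, {}
    for cond, onset, sacc, captured in trials:
        if captured:
            continue  # analyze only target-directed first saccades
        sums[cond] = sums.get(cond, 0) + (sacc - onset)
        counts[cond] = counts.get(cond, 0) + 1
    return {cond: sums[cond] / counts[cond] for cond in sums}
```

A longer mean latency in the distractor-present condition than in the distractor-absent condition would indicate the extra time needed to resolve competition before the saccade is launched.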
© 2019 Springer Science+Business Media, LLC
Hollingworth, A., Bahle, B. (2019). Eye Tracking in Visual Search Experiments. In: Pollmann, S. (eds) Spatial Learning and Attention Guidance. Neuromethods, vol 151. Humana, New York, NY. https://doi.org/10.1007/7657_2019_30
Print ISBN: 978-1-4939-9947-7
Online ISBN: 978-1-4939-9948-4