Eye Tracking in Visual Search Experiments

Spatial Learning and Attention Guidance

Part of the book series: Neuromethods (NM, volume 151)

Abstract

Over the last 30 years, eye tracking has grown in popularity as a method to understand attention during visual search, principally because it provides a means to characterize the spatiotemporal properties of selective operations across a trial. In the present chapter, we review the motivations, methods, and measures for using eye tracking in visual search experiments. This includes a discussion of the advantages (and some disadvantages) of eye tracking data as a measure of spatial attention, compared with more traditional reaction time paradigms. In addition, we discuss stimulus and design considerations for implementing experiments of this type. Finally, we discuss the major measures that can be extracted from an eye tracking record and the inferences that each allows. In the course of this discussion, we address both experiments using abstract arrays and experiments using real-world scene stimuli.

Notes

  1.

    Note that instructions are rarely sufficient to ensure that participants do not make eye movements. Thus, even if the goal is to eliminate eye movements, gaze still needs to be monitored. Ideally, an eye tracker can be used, but there is another option. In covert attention studies, we often use a simple video camera to display a large image of one of the eyes, and the experimenter monitors this image throughout the experiment (a human eye tracker). Movements of the eyes are quite easy to observe, and the experimenter both notes trials with eye movements and reminds the participant, when an eye movement is observed, to keep gaze focused on the relevant reference point. With appropriate, well-timed feedback of this sort, most participants quickly learn to keep gaze focused centrally and rarely make eye movements after an initial practice session.

  2.

    Note that a present/absent design is not always ideal for an eye tracking study, as the mere presence or absence of the target can often be determined without foveation.

  3.

    Note, however, that in this example experiment, if we were to manipulate photographic images, we would need to use an object image from a different source and then paste and integrate it into both the plausible and implausible locations within the experimental scene; this equates the two conditions with respect to artifacts generated by the process of adding an object to a particular scene location.
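
    As a concrete illustration, here is a minimal sketch of this kind of stimulus construction, assuming the Pillow imaging library and hypothetical file names: the same cut-out object image is composited into the scene by the same operation in both conditions, so any blending artifacts are matched across conditions.

      from PIL import Image

      def place_object(scene_path, object_path, location_xy, out_path):
          # Composite the same cut-out object (transparent background) into a scene
          # at the given (x, y) position, so that the plausible and implausible
          # conditions share identical paste artifacts.
          scene = Image.open(scene_path).convert("RGBA")
          obj = Image.open(object_path).convert("RGBA")
          scene.alpha_composite(obj, dest=location_xy)
          scene.convert("RGB").save(out_path)

      # Hypothetical example: one object, one scene, two placement conditions.
      place_object("kitchen.png", "mug.png", (420, 610), "kitchen_plausible.png")
      place_object("kitchen.png", "mug.png", (420, 120), "kitchen_implausible.png")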

  4.

    An alternative is to make trial initiation contingent on central fixation.
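
    For illustration, a minimal sketch of fixation-contingent trial initiation, assuming a generic get_gaze_sample() callable (a stand-in for whatever sample-access call a given tracker's API provides) that returns the current gaze position in screen pixels:

      import math
      import time

      def wait_for_central_fixation(get_gaze_sample, center, radius_px=50,
                                    hold_ms=200, timeout_ms=5000):
          # Return True once gaze has stayed within radius_px of the central
          # reference point for hold_ms continuous milliseconds; False on timeout.
          start = time.monotonic()
          hold_start = None
          while (time.monotonic() - start) * 1000 < timeout_ms:
              x, y = get_gaze_sample()
              if math.hypot(x - center[0], y - center[1]) <= radius_px:
                  if hold_start is None:
                      hold_start = time.monotonic()
                  elif (time.monotonic() - hold_start) * 1000 >= hold_ms:
                      return True    # stable central fixation; initiate the trial
              else:
                  hold_start = None  # gaze left the window; reset the hold timer
          return False               # timed out; repeat the drift check or recalibrate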

  5.

    It is possible that, on a small proportion of trials, a participant fixates the target, fails to recognize it as such, leaves the target region to fixate other objects, and returns only later during search, leading to the manual response. Thus, elapsed time to target fixation should be the time until the entry that immediately precedes the response and not necessarily the elapsed time to the very first entry.
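
    A minimal sketch of this scoring rule, assuming a hypothetical fixation record in which each fixation carries start and end times (in ms from search-display onset) and a region label:

      def time_to_final_target_entry(fixations, response_time_ms):
          # fixations: chronologically ordered dicts with 'start', 'end', and
          # 'region' (e.g., 'target', 'distractor_3'). Returns the start time of
          # the target-region entry that immediately precedes the manual response,
          # which is not necessarily the very first entry.
          entry_times = []
          prev_region = None
          for fix in fixations:
              if fix["start"] >= response_time_ms:
                  break
              if fix["region"] == "target" and prev_region != "target":
                  entry_times.append(fix["start"])  # a fresh entry into the target region
              prev_region = fix["region"]
          return entry_times[-1] if entry_times else None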

  6.

    An alternative would be to consider each entry and exit from an object as a single event, collapsing across multiple fixations between entry and exit.
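
    A minimal sketch of this alternative, using the same hypothetical fixation record as above: consecutive fixations on the same object are collapsed into a single visit whose dwell time runs from entry to exit.

      def collapse_into_visits(fixations):
          # Group consecutive fixations on the same object into one visit,
          # summing fixation durations between entry and exit.
          visits = []
          for fix in fixations:
              if visits and visits[-1]["region"] == fix["region"]:
                  visits[-1]["dwell_ms"] += fix["end"] - fix["start"]
                  visits[-1]["n_fixations"] += 1
              else:
                  visits.append({"region": fix["region"],
                                 "entry_ms": fix["start"],
                                 "dwell_ms": fix["end"] - fix["start"],
                                 "n_fixations": 1})
          return visits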

  7.

    It can also be useful to examine saccade latency in this context. For trials without oculomotor capture, several studies have observed that saccades directed to the target were delayed when a critical distractor was present versus when it was not, indicating that the programming of the saccade required additional time to resolve the competition between the salient distractor and the target.
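
    A minimal sketch of this latency comparison, assuming a hypothetical per-trial summary that records the destination and latency of the first saccade and whether the critical distractor was present:

      from statistics import mean

      def target_saccade_latency_by_condition(trials):
          # trials: dicts with 'first_saccade_dest' ('target', 'distractor', ...),
          # 'first_saccade_latency_ms' (first-saccade onset relative to display
          # onset), and 'distractor_present' (bool). Capture trials are excluded.
          latencies = {True: [], False: []}
          for t in trials:
              if t["first_saccade_dest"] == "target":
                  latencies[t["distractor_present"]].append(t["first_saccade_latency_ms"])
          return {"distractor_present": mean(latencies[True]) if latencies[True] else None,
                  "distractor_absent": mean(latencies[False]) if latencies[False] else None}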

Author information

Corresponding author

Correspondence to Andrew Hollingworth.

Copyright information

© 2019 Springer Science+Business Media, LLC

About this protocol

Cite this protocol

Hollingworth, A., Bahle, B. (2019). Eye Tracking in Visual Search Experiments. In: Pollmann, S. (eds) Spatial Learning and Attention Guidance. Neuromethods, vol 151. Humana, New York, NY. https://doi.org/10.1007/7657_2019_30

  • DOI: https://doi.org/10.1007/7657_2019_30

  • Publisher Name: Humana, New York, NY

  • Print ISBN: 978-1-4939-9947-7

  • Online ISBN: 978-1-4939-9948-4

  • eBook Packages: Springer Protocols
