Attention, Perception, & Psychophysics, Volume 77, Issue 1, pp 128–149

Target templates: the precision of mental representations affects attentional guidance and decision-making in visual search

Abstract

When people look for things in the environment, they use target templates—mental representations of the objects they are attempting to locate—to guide attention and to assess incoming visual input as potential targets. However, unlike laboratory participants, searchers in the real world rarely have perfect knowledge of a target's potential appearance. In seven experiments, we examined how the precision of target templates affects the ability to conduct visual search. Specifically, we degraded template precision in two ways: (1) by contaminating searchers' templates with inaccurate features, and (2) by adding extraneous, unhelpful features to the template. We recorded eye movements to allow inferences regarding the relative extents to which attentional guidance and decision-making are hindered by template imprecision. Our findings support a dual-function theory of the target template and highlight the importance of examining template precision in visual search.

Keywords

Visual search · Eye movements · Target templates · Attentional guidance · Decision making

Supplementary material

ESM 1: 13414_2014_764_MOESM1_ESM.pdf (PDF, 1.9 MB)

Copyright information

© The Psychonomic Society, Inc. 2014

Authors and Affiliations

  1. Department of Psychology, New Mexico State University, Las Cruces, USA
  2. Department of Psychology, Arizona State University, Tempe, USA
