Attention, Perception, & Psychophysics, Volume 80, Issue 5, pp 1110–1126

Learning efficient visual search for stimuli containing diagnostic spatial configurations and color-shape conjunctions

  • Eric A. Reavis
  • Sebastian M. Frank
  • Peter U. Tse


Abstract

Visual search is often slow and difficult for complex stimuli such as feature conjunctions. Search efficiency, however, can improve with training. Search for stimuli that can be identified by the spatial configuration of two elements (e.g., the relative position of two colored shapes) improves dramatically within a few hundred trials of practice. Several recent imaging studies have identified neural correlates of this learning, but it remains unclear which stimulus properties participants learn to use to search efficiently. Influential models, such as reverse hierarchy theory, propose two major possibilities: learning to use information contained in low-level image statistics (e.g., single features at particular retinotopic locations) or in high-level characteristics (e.g., feature conjunctions) of the task-relevant stimuli. In a series of experiments, we tested these two hypotheses, which make different predictions about the effect of various stimulus manipulations after training. We find relatively small effects of manipulating low-level properties of the stimuli (e.g., changing their retinotopic location) and some conjunctive properties (e.g., color-position), whereas the effects of manipulating other conjunctive properties (e.g., color-shape) are larger. Overall, the findings suggest that conjunction learning with such stimuli may be an emergent phenomenon reflecting multiple learning processes, each of which capitalizes on a different type of information contained in the stimuli. We also show that both targets and distractors are learned, and that reversing learned target and distractor identities impairs performance. This suggests that participants do not merely learn to discriminate target from distractor stimuli; they also learn stimulus identity mappings that contribute to performance improvements.


Keywords: Perceptual learning · Visual search



Acknowledgments

The authors wish to thank numerous undergraduate research assistants for their help with data collection. E.R. was supported by a National Science Foundation Graduate Research Fellowship during data collection for this project (DGE-1313911). The research was supported by internal Dartmouth funding, Templeton Foundation Grant 14316 to P.T., and National Science Foundation Grant 1632738 to P.T. E.R. is currently supported by a postdoctoral Ruth L. Kirschstein National Research Service Award from the National Institutes of Health (F32MH108317).

Supplementary material

ESM 1: 13414_2018_1516_MOESM1_ESM.pdf (PDF, 80 kb)


References

  1. Ahissar, M., & Hochstein, S. (1997). Task difficulty and the specificity of perceptual learning. Nature, 387, 401–406.
  2. Ahissar, M., & Hochstein, S. (2004). The reverse hierarchy theory of visual perceptual learning. Trends in Cognitive Sciences, 8(10), 457–464.
  3. Ahissar, M., Nahum, M., Nelken, I., & Hochstein, S. (2009). Reverse hierarchies and sensory learning. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 364, 285–299.
  4. Ball, K., & Sekuler, R. (1982). A specific and enduring improvement in visual motion discrimination. Science, 218(4573), 697–698.
  5. Ball, K., & Sekuler, R. (1987). Direction-specific improvement in motion discrimination. Vision Research, 27(6), 953–965.
  6. Beck, M. R., Peterson, M. S., Boot, W. R., Vomela, M., & Kramer, A. F. (2006). Explicit memory for rejected distractors during visual search. Visual Cognition, 14(2), 150–174.
  7. Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10(4), 433–436.
  8. Bravo, M. J., & Farid, H. (2009). The specificity of the search template. Journal of Vision, 9(1), 1–9.
  9. Bravo, M. J., & Farid, H. (2012). Task demands determine the specificity of the search template. Perception & Psychophysics, 74(1), 124–131.
  10. Carrasco, M., Ponte, D., Rechea, C., & Sampedro, M. J. (1998). “Transient structures”: The effects of practice and distractor grouping on within-dimension conjunction searches. Perception & Psychophysics, 60(7), 1243–1258.
  11. Chelazzi, L., Miller, E. K., Duncan, J., & Desimone, R. (1993). A neural basis for visual search in inferior temporal cortex. Nature, 363, 345–347.
  12. Czerwinski, M., Lightfoot, N., & Shiffrin, R. M. (1992). Automatization and training in visual search. The American Journal of Psychology, 105(2), 271–315.
  13. Dosher, B., & Lu, Z.-L. (2017). Visual perceptual learning and models. Annual Review of Vision Science, 3, 343–363.
  14. Duncan, J., & Humphreys, G. W. (1989). Visual search and stimulus similarity. Psychological Review, 96(3), 433–458.
  15. Duncan, R., & Boynton, G. (2003). Cortical magnification within human primary visual cortex correlates with acuity thresholds. Neuron, 38, 659–671.
  16. Egeth, H., & Dagenbach, D. (1991). Parallel versus serial processing in visual search: Further evidence from subadditive effects of visual quality. Journal of Experimental Psychology: Human Perception and Performance, 17(2), 551–560.
  17. Ellison, A., & Walsh, V. (1998). Perceptual learning in visual search: Some evidence of specificities. Vision Research, 38(3), 333–345.
  18. Fahle, M. (1997). Specificity of learning curvature, orientation, and vernier discriminations. Vision Research, 37(14), 1885–1895.
  19. Fahle, M. (2004). Perceptual learning: A case for early selection. Journal of Vision, 4, 879–890.
  20. Fahle, M., Edelman, S., & Poggio, T. (1995). Fast perceptual learning in hyperacuity. Vision Research, 35(21), 3003–3013.
  21. Fahle, M., & Morgan, M. (1996). No transfer of perceptual learning between similar stimuli in the same retinal position. Current Biology, 6(3), 292–297.
  22. Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191.
  23. Fortier-Gauthier, U., Dell’Acqua, R., & Jolicœur, P. (2013). The “red-alert” effect in visual search: Evidence from human electrophysiology. Psychophysiology, 50, 671–679.
  24. Frank, S. M., Greenlee, M. W., & Tse, P. U. (2018). Long time no see: Enduring behavioral and neuronal changes in perceptual learning of motion trajectories 3 years after training. Cerebral Cortex, 28(4), 1260–1271.
  25. Frank, S. M., Reavis, E. A., Greenlee, M. W., & Tse, P. U. (2016). Pretraining cortical thickness predicts subsequent perceptual learning rate in a visual search task. Cerebral Cortex, 26, 1–10.
  26. Frank, S. M., Reavis, E. A., Tse, P. U., & Greenlee, M. W. (2014). Neural mechanisms of feature conjunction learning: Enduring changes in occipital cortex after a week of training. Human Brain Mapping, 35(4), 1201–1211.
  27. Gibson, E. (1963). Perceptual learning. Annual Review of Psychology, 14, 29–56.
  28. Gold, J. I., & Watanabe, T. (2010). Perceptual learning. Current Biology, 20(2), 46–48.
  29. Goldstone, R. L. (1998). Perceptual learning. Annual Review of Psychology, 49, 585–612.
  30. Harris, H., Gliksberg, M., & Sagi, D. (2012). Generalized perceptual learning in the absence of sensory adaptation. Current Biology, 22(19), 1813–1817.
  31. Heathcote, A., & Mewhort, D. J. K. (1993). Representation and selection of relative position. Journal of Experimental Psychology: Human Perception and Performance, 19(3), 488–516.
  32. Hickey, C., Kaiser, D., & Peelen, M. V. (2015). Reward guides attention to object categories in real-world scenes. Journal of Experimental Psychology: General, 144(2), 264–273.
  33. Hillstrom, A. P., & Logan, G. D. (1998). Decomposing visual search: Evidence of multiple item-specific skills. Journal of Experimental Psychology: Human Perception and Performance, 24(5), 1385–1398.
  34. Hochstein, S., & Ahissar, M. (2002). View from the top: Hierarchies and reverse hierarchies in the visual system. Neuron, 36, 791–804.
  35. Karni, A., & Sagi, D. (1991). Where practice makes perfect in texture discrimination: Evidence for primary visual cortex plasticity. Proceedings of the National Academy of Sciences of the United States of America, 88(11), 4966–4970.
  36. Karni, A., & Sagi, D. (1993). The time course of learning a visual skill. Nature, 365(6443), 250–252.
  37. Kwak, H. W., Dagenbach, D., & Egeth, H. (1991). Further evidence for a time-independent shift of the focus of attention. Perception & Psychophysics, 49(5), 473–480.
  38. Lindsey, D. T., Brown, A. M., Reijnen, E., Rich, A. N., Kuzmova, Y. I., & Wolfe, J. M. (2010). Color channels, not color appearance or color categories, guide visual search for desaturated color targets. Psychological Science, 21(9), 1208–1214.
  39. Logan, G. D. (1988). Toward an instance theory of automatization. Psychological Review, 95(4), 492–527.
  40. Logothetis, N. K., Pauls, J., & Poggio, T. (1995). Shape representation in the inferior temporal cortex of monkeys. Current Biology, 5(5), 552–563.
  41. Maniglia, M., & Seitz, A. R. (2018). Towards a whole brain model of perceptual learning. Current Opinion in Behavioral Sciences, 20, 47–55.
  42. Pelli, D. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10(4), 437–442.
  43. Poggio, T., Fahle, M., & Edelman, S. (1992). Fast perceptual learning in visual hyperacuity. Science, 256(5059), 1018–1021.
  44. Reavis, E. A., Frank, S. M., Greenlee, M. W., & Tse, P. U. (2016). Neural correlates of context-dependent feature-conjunction learning in visual search tasks. Human Brain Mapping, 37, 2319–2330.
  45. Rosenthal, R. (1991). Meta-analytic procedures for social research. Newbury Park, CA: SAGE.
  46. Sasaki, Y., Náñez, J. E., & Watanabe, T. (2010). Advances in visual perceptual learning and plasticity. Nature Reviews Neuroscience, 11(1), 53–60.
  47. Schneider, W., & Shiffrin, R. M. (1977). Controlled and automatic human information processing: I. Detection, search, and attention. Psychological Review, 84(1), 1–66.
  48. Shiffrin, R. M., & Schneider, W. (1977). Controlled and automatic human information processing: II. Perceptual learning, automatic attending, and a general theory. Psychological Review, 84(2), 127–190.
  49. Shiu, L. P., & Pashler, H. (1992). Improvement in line orientation discrimination is retinally local but dependent on cognitive set. Perception & Psychophysics, 52(5), 582–588.
  50. Sireteanu, R., & Rettenbach, R. (1995). Perceptual learning in visual search: Fast, enduring, but non-specific. Vision Research, 35(14), 2037–2043.
  51. Sireteanu, R., & Rettenbach, R. (2000). Perceptual learning in visual search generalizes over tasks, locations, and eyes. Vision Research, 40(21), 2925–2949.
  52. Sripati, A. P., & Olson, C. R. (2010). Global image dissimilarity in macaque inferotemporal cortex predicts human visual search efficiency. Journal of Neuroscience, 30(4), 1258–1269.
  53. Su, Y., Lai, Y., Huang, W., Tan, W., Qu, Z., & Ding, Y. (2014). Short-term perceptual learning in visual conjunction search. Journal of Experimental Psychology: Human Perception and Performance, 40(4), 1415–1424.
  54. Townsend, J., & Ashby, F. (1978). Methods of modeling capacity in simple processing systems. In N. Castellan & F. Restle (Eds.), Cognitive theory (Vol. 3, pp. 199–239). Hillsdale, NJ: Erlbaum.
  55. Treisman, A., & Gelade, G. (1980). A feature-integration theory of attention. Cognitive Psychology, 12, 97–136.
  56. Treisman, A., Vieira, A., & Hayes, A. (1992). Automaticity and preattentive processing. The American Journal of Psychology, 105(2), 341–362.
  57. Vickery, T. J., King, L.-W., & Jiang, Y. (2005). Setting up the target template in visual search. Journal of Vision, 5, 81–92.
  58. Walsh, V., Ashbridge, E., & Cowey, A. (1998). Cortical plasticity in perceptual learning demonstrated by transcranial magnetic stimulation. Neuropsychologia, 36(4), 363–367.
  59. Wang, Q., Cavanagh, P., & Green, M. F. (1994). Familiarity and pop-out in visual search. Perception & Psychophysics, 56(5), 495–500.
  60. Wang, R., Wang, J., Zhang, J.-Y., Xie, X.-Y., Yang, Y.-X., Luo, S.-H., … Li, W. (2016). Perceptual learning at a conceptual level. Journal of Neuroscience, 36(7), 2238–2246.
  61. Wang, R., Zhang, J., Klein, S. A., & Levi, D. M. (2014). Vernier perceptual learning transfers to completely untrained retinal locations after double training: A “piggybacking” effect. Journal of Vision, 14, 1–10.
  62. Watanabe, T., Náñez, J. E., & Sasaki, Y. (2001). Perceptual learning without perception. Nature, 413(6858), 844–848.
  63. Watanabe, T., & Sasaki, Y. (2015). Perceptual learning: Toward a comprehensive theory. Annual Review of Psychology, 66, 197–221.
  64. Wolfe, J. M. (1998). Visual search. In H. Pashler (Ed.), Attention. London, UK: University College London Press.
  65. Wolfe, J. M., Cave, K. R., & Franzel, S. L. (1989). Guided search: An alternative to the feature integration model for visual search. Journal of Experimental Psychology: Human Perception and Performance, 15(3), 419–433.
  66. Wolfe, J. M., & Horowitz, T. S. (2017). Five factors that guide attention in visual search. Nature Human Behaviour, 1(58), 1–8.
  67. Wyble, B., Folk, C., & Potter, M. C. (2013). Contingent attentional capture by conceptually relevant images. Journal of Experimental Psychology: Human Perception and Performance, 39(3), 861–871.
  68. Xiao, L.-Q., Zhang, J.-Y., Wang, R., Klein, S. A., Levi, D. M., & Yu, C. (2008). Complete transfer of perceptual learning across retinal locations enabled by double training. Current Biology, 18(24), 1922–1926.
  69. Yashar, A., & Carrasco, M. (2016). Rapid and long-lasting learning of feature binding. Cognition, 154, 130–138.

Copyright information

© The Psychonomic Society, Inc. 2018

Authors and Affiliations

  • Eric A. Reavis (1, 2, 3)
  • Sebastian M. Frank (1, 4)
  • Peter U. Tse (1)

  1. Department of Psychological & Brain Sciences, Dartmouth College, Hanover, USA
  2. Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles, Los Angeles, USA
  3. Desert Pacific Mental Illness Research, Education, and Clinical Center, Greater Los Angeles Veterans Affairs Healthcare System, Los Angeles, USA
  4. Department of Cognitive, Linguistic & Psychological Sciences, Brown University, Providence, USA