Attention, Perception, & Psychophysics, Volume 79, Issue 6, pp 1578–1592

Categorical templates are more useful when features are consistent: Evidence from eye movements during search for societally important vehicles

  • Michael C. Hout
  • Arryn Robbins
  • Hayward J. Godwin
  • Gemma Fitzsimmons
  • Collin Scarince
Short Report

Abstract

Unlike in laboratory visual search tasks—wherein participants are typically presented with a pictorial representation of the item they are asked to seek out—in real-world searches, the observer rarely has veridical knowledge of the visual features that define their target. During categorical search, observers look for any instance of a categorically defined target (e.g., helping a family member look for their mobile phone). In these circumstances, people may not have information about noncritical features (e.g., the phone’s color), and must instead create a broad mental representation using the features that define (or are typical of) the category of objects they are seeking out (e.g., modern phones are typically rectangular and thin). In the current investigation (Experiment 1), using a categorical visual search task, we add to the body of evidence suggesting that categorical templates are effective enough to support efficient visual search. When color information was available (Experiment 1a), attentional guidance, attention restriction, and object identification were enhanced when participants looked for categories with consistent features (e.g., ambulances) relative to categories with more variable features (e.g., sedans). When color information was removed (Experiment 1b), the attention benefits disappeared, but object recognition was still better for feature-consistent target categories. In Experiment 2, we empirically validated the relative homogeneity of our societally important vehicle stimuli. Taken together, our results are in line with a category-consistent view of categorical target templates (Yu, Maxfield, & Zelinsky, Psychological Science, 2016, doi:10.1177/0956797616640237), and suggest that when the features of a category are consistent and predictable, searchers can create mental representations that allow for the efficient guidance and restriction of attention as well as swift object identification.

Keywords

Eye movements · Target templates · Categorical search

References

  1. Duncan, J., & Humphreys, G. W. (1989). Visual search and stimulus similarity. Psychological Review, 96(3), 433.
  2. Godwin, H. J., Walenchok, S., Houpt, J. W., Hout, M. C., & Goldinger, S. D. (2015). Faster than the speed of rejection: Object identification processes during visual search for multiple targets. Journal of Experimental Psychology: Human Perception & Performance, 41, 1007–1020. doi:10.1037/xhp0000036
  3. Hout, M. C., Godwin, H. J., Fitzsimmons, G., Robbins, A., Menneer, T., & Goldinger, S. D. (2015). Using multidimensional scaling to quantify similarity in visual search and beyond. Attention, Perception, & Psychophysics, 78, 3–20. doi:10.3758/s13414-015-1010-6
  4. Hout, M. C., & Goldinger, S. D. (2010). Learning in repeated visual search. Attention, Perception, & Psychophysics, 72, 1267–1282. doi:10.3758/APP.72.5.1267
  5. Hout, M. C., & Goldinger, S. D. (2012). Incidental learning speeds visual search by lowering response thresholds, not by improving efficiency: Evidence from eye movements. Journal of Experimental Psychology: Human Perception and Performance, 38, 90–112. doi:10.1037/a0023894
  6. Hout, M. C., & Goldinger, S. D. (2015). Target templates: The precision of mental representations affects attentional guidance and decision-making in visual search. Attention, Perception, & Psychophysics, 77(1), 128–149. doi:10.3758/s13414-014-0764-6
  7. Hout, M. C., Walenchok, S. C., Goldinger, S. D., & Wolfe, J. M. (2015). Failures of perception in the low-prevalence effect: Evidence from active and passive visual search. Journal of Experimental Psychology: Human Perception & Performance, 41, 977–994. doi:10.1037/xhp0000053
  8. Itti, L., & Koch, C. (2000). A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40(10), 1489–1506.
  9. Itti, L., & Koch, C. (2001). Computational modelling of visual attention. Nature Reviews Neuroscience, 2(3), 194–203.
  10. Maxfield, J. T., Stalder, W. D., & Zelinsky, G. J. (2014). Effects of target typicality on categorical search. Journal of Vision, 14, 1. doi:10.1167/14.12.1
  11. Peterson, M. S., Kramer, A. F., Wang, R. F., Irwin, D. E., & McCarley, J. S. (2001). Visual search has memory. Psychological Science, 12(4), 287–292.
  12. Psychology Software Tools. (2012). E-Prime 2.0 [Computer software]. Retrieved from http://www.pstnet.com
  13. Robbins, A., & Hout, M. C. (2015). Categorical templates: Typical category members are found and identified quickly during word-cued search (Summary published in Object Perception, Attention, and Memory (OPAM) 2015 Conference Report). Visual Cognition, 23(7), 817–821. doi:10.1080/13506285.2015.1093247
  14. Schmidt, J., & Zelinsky, G. J. (2009). Search guidance is proportional to the categorical specificity of a target cue. The Quarterly Journal of Experimental Psychology, 62, 1904–1914.
  15. Schmidt, J., & Zelinsky, G. J. (2017). Adding details to the attentional template offsets search difficulty: Evidence from contralateral delay activity. Journal of Experimental Psychology: Human Perception and Performance, 43(3), 429–437. doi:10.1037/xhp0000367
  16. Treisman, A., & Gelade, G. (1980). A feature-integration theory of attention. Cognitive Psychology, 12, 97–136. doi:10.1016/0010-0285(80)90005-5
  17. Walenchok, S. C., Hout, M. C., & Goldinger, S. D. (2016). Implicit object naming in visual search: Evidence from phonological competition. Attention, Perception, & Psychophysics, 78, 2633–2654. doi:10.3758/s13414-016-1184-6
  18. Wolfe, J. M., & Horowitz, T. S. (2017). Five factors that guide attention in visual search. Nature Human Behaviour, 1, 0058. doi:10.1038/s41562-017-0058
  19. Wolfe, J. M., Horowitz, T. S., Kenner, N., Hyle, M., & Vasan, N. (2004). How fast can you change your mind? The speed of top-down guidance in visual search. Vision Research, 44, 1411–1426.
  20. Yang, H., & Zelinsky, G. J. (2009). Visual search is guided to categorically-defined targets. Vision Research, 49, 2095–2103.
  21. Yu, C., Maxfield, J., & Zelinsky, G. (2016). Searching for category-consistent features: A computational approach to understanding visual category representation. Psychological Science. doi:10.1177/0956797616640237

Copyright information

© The Psychonomic Society, Inc. 2017

Authors and Affiliations

  • Michael C. Hout (1)
  • Arryn Robbins (1)
  • Hayward J. Godwin (2)
  • Gemma Fitzsimmons (2)
  • Collin Scarince (1)
  1. Department of Psychology, New Mexico State University, Las Cruces, USA
  2. Department of Psychology, University of Southampton, Southampton, UK
