Dwelling on simple stimuli in visual search

  • Gernot Horstmann
  • Stefanie I. Becker
  • Anna Grubert

40 Years of Feature Integration: Special Issue in Memory of Anne Treisman

Abstract

Research and theories on visual search often focus on visual guidance to explain differences in search. Guidance refers to the tuning of attention to target features; it facilitates search because distractors that do not share target features can be ignored more effectively (skipping). As a general rule, the better the guidance, the more efficient the search. Correspondingly, behavioral experiments have often interpreted differences in efficiency as reflecting varying degrees of attentional guidance. However, other factors, such as the time spent processing a distractor (dwelling) or multiple visits to the same stimulus in a search display (revisiting), also help determine search efficiency. While some research has shown that dwelling and revisiting modulate search times in addition to skipping, the corresponding studies used complex naturalistic and category-defined stimuli. The present study tests whether these results generalize to simpler stimuli, for which target-distractor similarity, a strong determinant of search performance, can be manipulated in a fine-grained manner. Accordingly, simple stimuli with varying degrees of target-distractor similarity were used to provide conclusive evidence for the contribution of dwelling and revisiting to search performance. The results have theoretical and methodological implications: They imply that visual search models should not treat dwelling and revisiting as constants across varying levels of search efficiency, and that behavioral search experiments are equivocal with respect to the processing mechanisms underlying more versus less efficient search. We also suggest that eye-tracking methods can be used to disentangle search components such as skipping, dwelling, and revisiting.
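
To illustrate the distinction, the three components can be quantified directly from a trial's fixation sequence. The following is a minimal Python sketch of such a decomposition; the data format (fixations already assigned to display items) and the function name are hypothetical illustrations, not the authors' actual analysis pipeline.

    from collections import Counter

    def search_components(fixations, display_items):
        """Decompose one trial's fixation sequence into skipping,
        dwelling, and revisiting measures.

        fixations:     list of (item_id, duration_ms) tuples; assumes a
                       preprocessing step has already assigned each
                       fixation to a display item (hypothetical format).
        display_items: ids of all items shown on the trial.
        """
        visited = [item for item, _ in fixations]

        # Skipping: proportion of display items never fixated on this trial.
        skipped = set(display_items) - set(visited)
        skip_rate = len(skipped) / len(display_items)

        # Dwelling: total fixation time accumulated on each fixated item.
        dwell_ms = {}
        for item, duration in fixations:
            dwell_ms[item] = dwell_ms.get(item, 0) + duration
        mean_dwell = sum(dwell_ms.values()) / len(dwell_ms) if dwell_ms else 0.0

        # Revisiting: runs of consecutive same-item fixations count as one
        # visit; an item with more than one visit was revisited.
        visits = Counter()
        previous = None
        for item in visited:
            if item != previous:
                visits[item] += 1
            previous = item
        revisit_rate = sum(1 for n in visits.values() if n > 1) / len(display_items)

        return skip_rate, mean_dwell, revisit_rate

    # Example: item 1 is left after two fixations and revisited later.
    fixations = [(1, 180), (1, 210), (3, 250), (1, 190)]
    print(search_components(fixations, display_items=[1, 2, 3, 4]))
    # -> (0.5, 415.0, 0.25)

Under such a decomposition, inefficient search can arise from low skip rates, long dwell times, frequent revisits, or any combination of these, which is why response times alone cannot identify the responsible mechanism.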

Keywords

Attention: Selective · Visual search · Eye movements and visual attention

Notes

Funding

This work was supported by the Cluster of Excellence – Cognitive Interaction Technology ‘CITEC’ (EXC 277) at Bielefeld University, which is funded by the German Research Foundation (DFG), and by DFG grant HO 3248/2-1 to Gernot Horstmann.

Copyright information

© The Psychonomic Society, Inc. 2019

Authors and Affiliations

  • Gernot Horstmann (1)
  • Stefanie I. Becker (2)
  • Anna Grubert (3)

  1. Department of Psychology, Bielefeld University, Bielefeld, Germany
  2. University of Queensland, St Lucia, Australia
  3. Durham University, Durham, UK
