Abstract
Presenting an auditory or tactile cue in temporal synchrony with a change in the color of a visual target can facilitate participants’ visual search performance. In the present study, we compared the magnitude of unimodal auditory, vibrotactile, and bimodal (i.e., multisensory) cuing benefits when the nonvisual cues were presented in temporal synchrony with the changing of the target’s color (Experiments 1 and 2). The target (a horizontal or vertical line segment) was presented among a number of distractors (tilted line segments) that also changed color at various times. In Experiments 3 and 4, the cues were also made spatially informative with regard to the location of the visual target. The unimodal and bimodal cues gave rise to an equivalent (significant) facilitation of participants’ visual search performance relative to a no-cue baseline condition. Making the unimodal auditory and vibrotactile cues spatially informative produced further performance improvements (on validly cued trials), as compared with spatially uninformative or spatially invalid cues. A final experiment was conducted in order to determine whether cue location (close to vs. far from the visual display) would influence participants’ visual search performance. Auditory cues presented close to the visual search display produced significantly better performance than cues presented over headphones. Taken together, these results have implications for the design of nonvisual and multisensory warning signals used in complex visual displays.
Additional information
The research presented here was supported in part by a Clarendon Fund Scholarship to the first author from Oxford University.
Cite this article
Ngo, M.K., Spence, C. Auditory, tactile, and multisensory cues facilitate search for dynamic visual stimuli. Attention, Perception, & Psychophysics 72, 1654–1665 (2010). https://doi.org/10.3758/APP.72.6.1654