Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance
In natural environments, human sensory systems work in a coordinated and integrated manner to perceive and respond to external events. Previous research has shown that the spatial and temporal relationships of sensory signals are paramount in determining how information is integrated across sensory modalities, but in ecologically plausible settings, these factors are not independent. In the current study, we provide a novel exploration of the impact of systematic manipulations of the spatial location and temporal synchrony of a visual-auditory stimulus pair on behavioral performance. Simple auditory and visual stimuli were presented across a range of spatial locations and stimulus onset asynchronies (SOAs), and participants performed both a spatial localization task and a simultaneity judgment task. Response times in localizing paired visual-auditory stimuli were slower in the periphery and at larger SOAs, but most importantly, an interaction was found between the two factors, in which the effect of SOA was greater in peripheral as opposed to central locations. Simultaneity judgments also revealed a novel interaction between space and time: individuals were more likely to judge stimuli as synchronous when they occurred in the periphery at large SOAs. The results of this study provide novel insights into (a) how the speed of spatial localization of an audiovisual stimulus is affected by location and temporal coincidence and the interaction between these two factors and (b) how the location of a multisensory stimulus impacts judgments concerning the temporal relationship of the paired stimuli. These findings provide strong evidence for a complex interdependency between spatial location and temporal structure in determining the ultimate behavioral and perceptual outcome associated with a paired multisensory (i.e., visual-auditory) stimulus.
Keywords: Audiovisual · Inverse effectiveness · Response time · Race model · Multisensory
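The race-model framework named in the keywords is the standard benchmark for multisensory response-time facilitation: if multisensory responses were produced by a simple race between independent unisensory channels, the cumulative RT distribution for the paired stimulus could not exceed the sum of the unisensory distributions (Miller's race-model inequality). The sketch below is purely illustrative and is not the authors' analysis code; the function names and the synthetic RT values are assumptions introduced for the example.

```python
import numpy as np

def ecdf(samples, t):
    """Empirical CDF of RT samples evaluated at times t."""
    samples = np.sort(np.asarray(samples))
    return np.searchsorted(samples, t, side="right") / len(samples)

def race_model_violation(rt_a, rt_v, rt_av, n_points=20):
    """Compare the multisensory RT distribution against the race-model bound:
    P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
    Returns the difference at each probe time; positive values mark
    violations, i.e., facilitation beyond what a parallel race predicts."""
    lo = min(map(np.min, (rt_a, rt_v, rt_av)))
    hi = max(map(np.max, (rt_a, rt_v, rt_av)))
    t = np.linspace(lo, hi, n_points)
    bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
    return ecdf(rt_av, t) - bound

# Synthetic RTs in ms (hypothetical values): multisensory responses
# are faster than either unisensory condition.
rng = np.random.default_rng(0)
rt_a = rng.normal(320, 40, 500)   # auditory-only
rt_v = rng.normal(300, 40, 500)   # visual-only
rt_av = rng.normal(240, 30, 500)  # audiovisual
violation = race_model_violation(rt_a, rt_v, rt_av)
print(violation.max() > 0)  # any positive value violates the race model
```

In practice, such a test would be run separately for each spatial location and SOA condition, which is how an interaction like the one reported here (larger SOA effects in the periphery) would show up in the RT distributions.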
This research was funded in part through a grant from NIDCD awarded to Mark Wallace and Stephen Camarata, NIH # R34 DC010927, as well as an NIDCD grant awarded to Ryan Stevenson, NIH 1F32 DC011993. We would also like to acknowledge the support of the Vanderbilt Kennedy Center and the Vanderbilt Brain Institute.