Unifying multisensory signals across time and space
The brain integrates information from multiple sensory modalities and, through this process, generates a coherent and apparently seamless percept of the external world. Although multisensory integration typically binds information derived from the same event, somewhat discordant multisensory cues can produce illusory percepts such as the "ventriloquism effect." These biases in stimulus localization are generally accompanied by the perceptual unification of the two stimuli. In the current study, we sought to further elucidate the relationship between localization biases, perceptual unification, and measures of a participant's uncertainty in target localization (i.e., variability). Participants performed an auditory localization task in which they were also asked to report whether they perceived the auditory and visual stimuli to be perceptually unified. The auditory and visual stimuli were delivered at a variety of spatial (0°, 5°, 10°, 15°) and temporal (200, 500, 800 ms) disparities. Localization bias and reports of perceptual unity occurred even at substantial spatial (i.e., 15°) and temporal (i.e., 800 ms) disparities. Trial-by-trial comparison of these measures revealed a striking correlation: regardless of their disparity, whenever the auditory and visual stimuli were perceived as unified, the sound was localized at or very near the light. In contrast, when the stimuli were perceived as not unified, auditory localization was often biased away from the visual stimulus. Furthermore, localization variability was significantly lower when the stimuli were perceived as unified. Intriguingly, on non-unity trials this variability increased with decreasing disparity. Together, these results suggest strong and potentially mechanistic links among the multiple facets of multisensory integration that contribute to our perceptual Gestalt.
Keywords: Cross-modal · Ventriloquism · Sensorimotor · Visual · Auditory
We thank Julie Edelson for editorial assistance. The study was supported in part by US National Institutes of Health grants MH63861, NS22543 and NS36916.