Research Article

Experimental Brain Research, Volume 158, Issue 2, pp 252–258

Unifying multisensory signals across time and space

  • M. T. Wallace, Department of Neurobiology and Anatomy, Wake Forest University School of Medicine (corresponding author)
  • G. E. Roberson, Psychology Department, Wake Forest University
  • W. D. Hairston, Department of Neurobiology and Anatomy, Wake Forest University School of Medicine
  • B. E. Stein, Department of Neurobiology and Anatomy, Wake Forest University School of Medicine
  • J. W. Vaughan, Department of Neurobiology and Anatomy, Wake Forest University School of Medicine
  • J. A. Schirillo, Psychology Department, Wake Forest University


Abstract

The brain integrates information from multiple sensory modalities and, through this process, generates a coherent and apparently seamless percept of the external world. Although multisensory integration typically binds information that is derived from the same event, when multisensory cues are somewhat discordant they can result in illusory percepts such as the “ventriloquism effect.” These biases in stimulus localization are generally accompanied by the perceptual unification of the two stimuli. In the current study, we sought to further elucidate the relationship between localization biases, perceptual unification and measures of a participant’s uncertainty in target localization (i.e., variability). Participants performed an auditory localization task in which they were also asked to report on whether they perceived the auditory and visual stimuli to be perceptually unified. The auditory and visual stimuli were delivered at a variety of spatial (0°, 5°, 10°, 15°) and temporal (200, 500, 800 ms) disparities. Localization bias and reports of perceptual unity occurred even with substantial spatial (i.e., 15°) and temporal (i.e., 800 ms) disparities. Trial-by-trial comparison of these measures revealed a striking correlation: regardless of their disparity, whenever the auditory and visual stimuli were perceived as unified, they were localized at or very near the light. In contrast, when the stimuli were perceived as not unified, auditory localization was often biased away from the visual stimulus. Furthermore, localization variability was significantly less when the stimuli were perceived as unified. Intriguingly, on non-unity trials such variability increased with decreasing disparity. Together, these results suggest strong and potentially mechanistic links between the multiple facets of multisensory integration that contribute to our perceptual Gestalt.

Keywords

Cross-modal · Ventriloquism · Sensorimotor · Visual · Auditory