It sounds real when you see it: realistic sound source simulation in multimodal virtual environments
Designing multimodal virtual environments promises revolutionary advances in how we interact with computers in the near future. In this paper, we report the results of an experimental investigation into the use of surround-sound systems to support visualization, drawing on growing knowledge of how spatial perception and attention work in the human brain. We designed two auditory-visual cross-modal experiments in which noise bursts and light blobs were presented synchronously but with spatial offsets. Sounds were presented in two ways: in free field and through a stereo speaker set. Participants were asked to localize the direction of the sound sources. In the first experiment, the visual stimuli were displaced vertically relative to the sounds; in the second, horizontally. In both experiments, sounds were mislocalized in the direction of the visual stimuli in every condition (the ventriloquism effect), and this effect was stronger for vertical than for horizontal displacements. Moreover, the ventriloquism effect was strongest for centrally presented sounds. The analyses also revealed differences between the two sound presentation modes. We interpret these results from the viewpoint of multimodal interface design. The findings highlight the importance of the cognitive features of multimodal perception when designing virtual environment setups, and may open new ways toward more realistic surround-based multimodal virtual reality simulations.
Keywords: Surround system · Ventriloquist illusion · Multisensory integration · Multilevel modeling · Spatial audio
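To make the measured effect concrete, the following is a minimal sketch of how a ventriloquism bias could be quantified from localization judgments. The data here are synthetic, and all names (`capture`, `ventriloquism_bias`, the offset values) are illustrative assumptions, not the paper's actual stimuli or analysis pipeline; the study itself used multilevel modeling rather than this simple aggregate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trials: true sound azimuth (deg) and the audio-visual offset
# applied on each trial. Values are illustrative, not the paper's design.
sound_az = rng.choice([-30.0, 0.0, 30.0], size=300)
visual_offset = rng.choice([-10.0, 0.0, 10.0], size=300)

# Hypothetical responses: localization judgments pulled partway toward the
# visual stimulus (the ventriloquism effect), plus perceptual noise.
capture = 0.4  # assumed fraction of the offset captured by vision
responses = sound_az + capture * visual_offset + rng.normal(0.0, 2.0, size=300)

def ventriloquism_bias(responses, sound_az, visual_offset):
    """Mean localization error, signed toward the visual offset direction."""
    error = responses - sound_az
    signed = np.sign(visual_offset)
    mask = signed != 0  # only trials with a nonzero audio-visual offset
    return float(np.mean(error[mask] * signed[mask]))

bias = ventriloquism_bias(responses, sound_az, visual_offset)
print(f"mean shift toward visual stimulus: {bias:.2f} deg")
```

A positive bias indicates that responses were displaced toward the visual stimulus; comparing this quantity across vertical versus horizontal offset conditions would mirror the contrast reported in the abstract.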
The research leading to these results has received funding from the European Community’s Research Infrastructure Action—grant agreement VISIONAIR 262044—under the 7th Framework Programme (FP7/2007-2013). Ágoston Török was supported by a Young Researcher Fellowship from the Hungarian Academy of Sciences. Thanks to Dénes Tóth for his help during the statistical analyses. Thanks to Orsolya Kolozsvári for her help in the preparation of the manuscript.