Multisensory aversive stimuli differentially modulate negative feelings in near and far space
Affect, space, and multisensory integration are closely linked processes. However, it is unclear whether the spatial location of emotional stimuli interacts with multisensory presentation to influence the emotional experience they induce in the perceiver. In this study, we exploited the unique advantages of virtual reality to present potentially aversive crowd stimuli embedded in a natural context and to control their display in terms of sensory and spatial presentation. Individuals high in crowd-phobic fear navigated an auditory–visual virtual environment in which they encountered virtual crowds presented through the visual channel, the auditory channel, or both. They reported the intensity of their negative emotional experience at a far distance and at a close distance from the crowd stimuli. Whereas auditory–visual presentation of close feared stimuli amplified negative feelings, auditory–visual presentation of distant feared stimuli did not. This suggests that spatial closeness allows multisensory processes to modulate the intensity of the emotional experience induced by aversive stimuli. Nevertheless, the specific role of auditory stimulation must be investigated further to better understand this interaction between multisensory, affective, and spatial representation processes. This phenomenon may serve the implementation of defensive behaviors in response to aversive stimuli that are positioned to threaten an individual’s feeling of security.
This research was supported by the EU FP7-ICT-2011-7 project VERVE (http://www.verveconsortium.eu/), Grant No. 288910. This work was performed within the Labex SMART (ANR-11-LABX-65), supported by French state funds managed by the ANR within the Investissements d’Avenir programme under reference ANR-11-IDEX-0004-02. The research leading to these results has also received funding from the programme “Investissements d’avenir” ANR-10-IAIHU-06. We thank Thibaut Carpentier and Kévin Perros for their work on the elaboration of the auditory component of the virtual environment. We thank Camille Frey and Cassandra Visconti, who contributed to the experiments. We thank Nathalie George, Philippe Fossati, and the SAN lab for their helpful comments during protocol elaboration.