Psychological Research, Volume 81, Issue 4, pp 764–776

Multisensory aversive stimuli differentially modulate negative feelings in near and far space

  • Marine Taffou
  • Jan Ondřej
  • Carol O’Sullivan
  • Olivier Warusfel
  • Stéphanie Dubal
  • Isabelle Viaud-Delmon
Original Article

Abstract

Affect, space, and multisensory integration are closely linked processes. However, it remains unclear whether the spatial location of emotional stimuli interacts with multisensory presentation to influence the emotional experience they induce in the perceiver. In this study, we used the unique advantages of virtual reality techniques to present potentially aversive crowd stimuli embedded in a natural context while controlling their sensory and spatial presentation. Individuals high in crowdphobic fear navigated in an auditory–visual virtual environment, in which they encountered virtual crowds presented through the visual channel, the auditory channel, or both. They reported the intensity of their negative emotional experience at a far distance and at a close distance from the crowd stimuli. Auditory–visual presentation amplified negative feelings toward feared stimuli at a close distance, but not at a far distance. This suggests that spatial closeness allows multisensory processes to modulate the intensity of the emotional experience induced by aversive stimuli. Nevertheless, the specific role of auditory stimulation must be investigated to better understand this interaction between multisensory, affective, and spatial representation processes. This phenomenon may serve the implementation of defensive behaviors in response to aversive stimuli that are close enough to threaten an individual’s feeling of security.

Acknowledgments

This research was supported by the EU FP7-ICT-2011-7 project VERVE (http://www.verveconsortium.eu/), Grant No. 288910. This work was performed within the Labex SMART (ANR-11-LABX-65) supported by French state funds managed by the ANR within the Investissements d’Avenir programme under reference ANR-11-IDEX-0004-02. The research leading to these results has also received funding from the program “Investissements d’avenir” ANR-10-IAIHU-06. We thank Thibaut Carpentier and Kévin Perros for their work on the elaboration of the auditory component of the virtual environment. We thank Camille Frey and Cassandra Visconti who contributed to the experimentation. We thank Nathalie George, Philippe Fossati and the SAN lab for their helpful comments during protocol elaboration.


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Marine Taffou (1, 2)
  • Jan Ondřej (3)
  • Carol O’Sullivan (3)
  • Olivier Warusfel (1)
  • Stéphanie Dubal (2)
  • Isabelle Viaud-Delmon (1)

  1. Sciences et Technologies de la Musique et du Son, CNRS UMR 9912, IRCAM, Sorbonne Universités, UPMC Univ Paris 06, Paris, France
  2. Social and Affective Neuroscience (SAN) Laboratory, Institut du Cerveau et de la Moelle épinière, ICM, Inserm U 1127, CNRS UMR 7225, Sorbonne Universités, UPMC Univ Paris 06 UMR S 1127, Paris, France
  3. School of Computer Science and Statistics, Trinity College Dublin, Dublin 2, Ireland