The COGs (context, object, and goals) in multisensory processing

Abstract

Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes, such as attention, memory, and expectations, that influence sensory processing via activity from higher-order brain areas. Because these two topics have traditionally been studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas other multisensory processes might occur independently of these goals. Consequently, forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and cognitive levels. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review, we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward several principles that emerge from this literature with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.

Acknowledgments

This research was supported by grants from the Ministerio de Economia y Competitividad (PSI2013-42626-P), AGAUR Generalitat de Catalunya (2014SGR856), and the European Research Council (StG-2010 263145) to S.S-F, and by the Swiss National Science Foundation (Grant #320030-149982, as well as the National Centre of Competence in Research project “SYNAPSY, The Synaptic Bases of Mental Disease” [Project 51AU40-125759]) and the Swiss Brain League (2014 Research Prize) to MMM. StO receives support from the Dutch Organisation for Scientific Research (Grant 406-11-068).

Author information

Corresponding author

Correspondence to Pawel J. Matusz.

About this article

Cite this article

ten Oever, S., Romei, V., van Atteveldt, N. et al. The COGs (context, object, and goals) in multisensory processing. Exp Brain Res 234, 1307–1323 (2016). https://doi.org/10.1007/s00221-016-4590-z

Keywords

  • Attention
  • Multisensory
  • Control
  • Object
  • Top-down
  • Bottom-up
  • Audio-visual
  • Brain mapping