
Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech

  • Original Paper
  • Published:
Brain Topography

Abstract

In recent years, it has become evident that neural responses previously considered unisensory can be modulated by sensory input from other modalities. In particular, visual neural activity elicited by viewing a face is strongly influenced by concurrent auditory information, especially speech. Here, we applied an additive-factors paradigm to quantify the impact that auditory speech has on visual event-related potentials (ERPs) elicited by visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified as signal-to-noise ratio, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, during a spoken-word recognition task, the amplitude of the visual P1-N1-P2 ERP complex increased monotonically with stimulus salience; component amplitudes varied directly with salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components as salience was reduced, whereas N1-P2 amplitude exhibited greater multisensory gain with reduced salience, consistent with the principle of inverse effectiveness. These amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels, as indexed by response times, suggesting that the change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.


Figs. 1–8 (figure images omitted)


References

  • Allison T, Ginter H, McCarthy G, Nobre AC, Puce A, Luby M, Spencer DD (1994a) Face recognition in human extrastriate cortex. J Neurophysiol 71(2):821–825

  • Allison T, McCarthy G, Nobre A, Puce A, Belger A (1994b) Human extrastriate visual cortex and the perception of faces, words, numbers, and colors. Cereb Cortex 4(5):544–554

  • Allison T, Puce A, Spencer DD, McCarthy G (1999) Electrophysiological studies of human face perception. I: potentials generated in occipitotemporal cortex by face and non-face stimuli. Cereb Cortex 9(5):415–430

  • Attwell D, Iadecola C (2002) The neural basis of functional brain imaging signals. Trends Neurosci 25(12):621–625

  • Barth DS, Goldberg N, Brett B, Di S (1995) The spatiotemporal organization of auditory, visual, and auditory-visual evoked potentials in rat cortex. Brain Res 678(1–2):177–190

  • Beauchamp MS, Argall BD, Bodurka J, Duyn JH, Martin A (2004a) Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat Neurosci 7(11):1190–1192

  • Beauchamp MS, Lee KE, Argall BD, Martin A (2004b) Integration of auditory and visual information about objects in superior temporal sulcus. Neuron 41(5):809–823

  • Beauchamp MS, Yasar NE, Frye RE, Ro T (2008) Touch, sound and vision in human superior temporal sulcus. NeuroImage 41(3):1011–1020. doi:10.1016/j.neuroimage.2008.03.015

  • Bentin S, Allison T, Puce A, Perez E, McCarthy G (1996) Electrophysiological studies of face perception in humans. J Cogn Neurosci 8(6):551–565

  • Berman AL (1961) Interaction of cortical responses to somatic and auditory stimuli in anterior ectosylvian gyrus of cat. J Neurophysiol 24:608–620

  • Besle J, Fort A, Delpuech C, Giard MH (2004a) Bimodal speech: early suppressive visual effects in human auditory cortex. Eur J Neurosci 20(8):2225–2234. doi:10.1111/j.1460-9568.2004.03670.x

  • Besle J, Fort A, Giard M (2004b) Interest and validity of the additive model in electrophysiological studies of multisensory interactions. Cogn Process 5:189–192

  • Besle J, Bertrand O, Giard MH (2009) Electrophysiological (EEG, sEEG, MEG) evidence for multiple audiovisual interactions in the human auditory cortex. Hear Res. doi:10.1016/j.heares.2009.06.016

  • Brainard DH (1997) The psychophysics toolbox. Spat Vis 10(4):433–436

  • Brefczynski-Lewis J, Lowitszch S, Parsons M, Lemieux S, Puce A (2009) Audiovisual non-verbal dynamic faces elicit converging fMRI and ERP responses. Brain Topogr 21(3–4):193–206. doi:10.1007/s10548-009-0093-6

  • Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SC, McGuire PK, Woodruff PW, Iversen SD, David AS (1997) Activation of auditory cortex during silent lipreading. Science 276(5312):593–596

  • Calvert GA, Campbell R, Brammer MJ (2000) Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 10(11):649–657

  • Calvert GA, Hansen PC, Iversen SD, Brammer MJ (2001) Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect. NeuroImage 14(2):427–438

  • Cappe C, Barone P (2005) Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. Eur J Neurosci 22(11):2886–2902. doi:10.1111/j.1460-9568.2005.04462.x

  • Cappe C, Rouiller EM, Barone P (2009) Multisensory anatomical pathways. Hear Res 258(1–2):28–36. doi:10.1016/j.heares.2009.04.017

  • Cappe C, Thut G, Romei V, Murray MM (2010) Auditory-visual multisensory interactions in humans: timing, topography, directionality, and sources. J Neurosci 30(38):12572–12580. doi:10.1523/JNEUROSCI.1099-10.2010

  • Clarke R, Morton J (1983) Cross modality facilitation in tachistoscope word recognition. Q J Exp Psychol 35A:79–96

  • Clavagnier S, Falchier A, Kennedy H (2004) Long-distance feedback projections to area V1: implications for multisensory integration, spatial awareness, and visual consciousness. Cogn Affect Behav Neurosci 4(2):117–126

  • Diederich A, Colonius H (2004) Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time. Percept Psychophys 66(8):1388–1404

  • Falchier A, Clavagnier S, Barone P, Kennedy H (2002) Anatomical evidence of multimodal integration in primate striate cortex. J Neurosci 22(13):5749–5759

  • Falchier A, Schroeder CE, Hackett TA, Lakatos P, Nascimento-Silva S, Ulbert I, Karmos G, Smiley JF (2010) Projection from visual areas V2 and prostriata to caudal auditory cortex in the monkey. Cereb Cortex 20(7):1529–1538. doi:10.1093/cercor/bhp213

  • Fort A, Delpuech C, Pernier J, Giard MH (2002a) Dynamics of cortico-subcortical cross-modal operations involved in audio-visual object detection in humans. Cereb Cortex 12(10):1031–1039

  • Fort A, Delpuech C, Pernier J, Giard MH (2002b) Early auditory–visual interactions in human cortex during nonredundant target identification. Brain Res Cogn Brain Res 14(1):20–30

  • Giard M, Besle J (2010) Methodological considerations: electrophysiology of multisensory interactions in humans. In: Kaiser J, Naumer MJ (eds) Multisensory object perception in the primate brain. Springer, New York

  • Giard MH, Peronnet F (1999) Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J Cogn Neurosci 11(5):473–490

  • Gondan M, Röder B (2006) A new method for detecting interactions between the senses in event-related potentials. Brain Res 1073–1074:389–397. doi:10.1016/j.brainres.2005.12.050

  • Grill-Spector K, Henson R, Martin A (2006) Repetition and the brain: neural models of stimulus-specific effects. Trends Cogn Sci 10(1):14–23. doi:10.1016/j.tics.2005.11.006

  • Haxby JV, Horwitz B, Ungerleider LG, Maisog JM, Pietrini P, Grady CL (1994) The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations. J Neurosci 14(11 Pt 1):6336–6353

  • Heeger DJ, Ress D (2002) What does fMRI tell us about neuronal activity? Nat Rev Neurosci 3(2):142–151. doi:10.1038/nrn730

  • Henson RN (2003) Neuroimaging studies of priming. Prog Neurobiol 70(1):53–81

  • Hershenson M (1962) Reaction time as a measure of intersensory facilitation. J Exp Psychol 63:289–293

  • James TW, Gauthier I (2006) Repetition-induced changes in BOLD response reflect accumulation of neural activity. Hum Brain Mapp 27(1):37–46

  • James TW, Stevenson RA (2012) The use of fMRI to assess multisensory integration. In: Wallace MH, Murray MM (eds) Frontiers in the neural basis of multisensory processes. Taylor & Francis, London

  • James TW, Stevenson RA, Kim S (2009) Assessing multisensory integration with additive factors and functional MRI. In: Proceedings of the International Society for Psychophysics, Dublin

  • James TW, Stevenson RA, Kim S (2012) Inverse effectiveness in multisensory processing. In: Stein BE (ed) The new handbook of multisensory processes. MIT Press, Cambridge

  • Joassin F, Maurage P, Bruyer R, Crommelinck M, Campanella S (2004) When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices. Neurosci Lett 369(2):132–137

  • Kawashima R, O’Sullivan BT, Roland PE (1995) Positron-emission tomography studies of cross-modality inhibition in selective attentional tasks: closing the “mind’s eye”. Proc Natl Acad Sci USA 92(13):5969–5972

  • Kim S, James TW (2010) Enhanced effectiveness in visuo-haptic object-selective brain regions with increasing stimulus salience. Hum Brain Mapp 31(5):678–693. doi:10.1002/hbm.20897

  • Kim S, Stevenson RA, James TW (2011) Visuo-haptic neuronal convergence demonstrated with an inversely effective pattern of BOLD activation. J Cogn Neurosci 24(4):830–842. doi:10.1162/jocn_a_00176

  • Klucharev V, Mottonen R, Sams M (2003) Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception. Brain Res Cogn Brain Res 18(1):65–75

  • Kriegeskorte N, Simmons WK, Bellgowan PS, Baker CI (2009) Circular analysis in systems neuroscience: the dangers of double dipping. Nat Neurosci 12(5):535–540. doi:10.1038/nn.2303

  • Lachs L, Hernandez LR (1998) Update: the Hoosier Audiovisual Multitalker Database. Research on spoken language processing. Speech Research Laboratory, Indiana University, Bloomington

  • Laming D, Laming J (1992) F. Hegelmaier: on memory for the length of a line. Psychol Res 54(4):233–239

  • Latinus M, VanRullen R, Taylor MJ (2010) Top-down and bottom-up modulation in processing bimodal face/voice stimuli. BMC Neurosci 11:36. doi:10.1186/1471-2202-11-36

  • Laurienti PJ, Burdette JH, Wallace MT, Yen YF, Field AS, Stein BE (2002) Deactivation of sensory-specific cortex by cross-modal stimuli. J Cogn Neurosci 14(3):420–429

  • Logothetis NK (2002) The neural basis of the blood-oxygen-level-dependent functional magnetic resonance imaging signal. Philos Trans R Soc London 357(1424):1003–1037

  • Logothetis NK (2003) The underpinnings of the BOLD functional magnetic resonance imaging signal. J Neurosci 23(10):3963–3971

  • Lovelace CT, Stein BE, Wallace MT (2003) An irrelevant light enhances auditory detection in humans: a psychophysical analysis of multisensory integration in stimulus detection. Brain Res Cogn Brain Res 17(2):447–453

  • Luce PA, Pisoni DB (1998) Recognizing spoken words: the neighborhood activation model. Ear Hear 19(1):1–36

  • Macaluso E, Frith CD, Driver J (2000) Modulation of human visual cortex by crossmodal spatial attention. Science 289(5482):1206–1208

  • Martuzzi R, Murray MM, Michel CM, Thiran JP, Maeder PP, Clarke S, Meuli RA (2007) Multisensory interactions within human primary cortices revealed by BOLD dynamics. Cereb Cortex 17(7):1672–1679. doi:10.1093/cercor/bhl077

  • McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 264(5588):746–748

  • Meredith MA, Stein BE (1983) Interactions among converging sensory inputs in the superior colliculus. Science 221(4608):389–391

  • Meredith MA, Stein BE (1986) Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J Neurophysiol 56(3):640–662

  • Miller J (1982) Divided attention: evidence for coactivation with redundant signals. Cogn Psychol 14(2):247–279

  • Molholm S, Ritter W, Murray MM, Javitt DC, Schroeder CE, Foxe JJ (2002) Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res Cogn Brain Res 14(1):115–128

  • Molholm S, Ritter W, Javitt DC, Foxe JJ (2004) Multisensory visual-auditory object recognition in humans: a high-density electrical mapping study. Cereb Cortex 14(4):452–465

  • Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, Javitt DC, Schroeder CE, Foxe JJ (2005) Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex 15(7):963–974. doi:10.1093/cercor/bhh197

  • Murray MM, Brunet D, Michel CM (2008) Topographic ERP analyses: a step-by-step tutorial review. Brain Topogr 20(4):249–264. doi:10.1007/s10548-008-0054-5

  • Musacchia G, Schroeder CE (2009) Neuronal mechanisms, response dynamics and perceptual functions of multisensory interactions in auditory cortex. Hear Res 258(1–2):72–79. doi:10.1016/j.heares.2009.06.018

  • Nelson WT, Hettinger LJ, Cunningham JA, Brickman BJ, Haas MW, McKinley RL (1998) Effects of localized auditory information on visual target detection performance using a helmet-mounted display. Hum Factors 40(3):452–460

  • Pelli DG (1997) The VideoToolbox software for visual psychophysics: transforming numbers into movies. Spat Vis 10(4):437–442

  • Perrault TJ Jr, Vaughan JW, Stein BE, Wallace MT (2003) Neuron-specific response characteristics predict the magnitude of multisensory integration. J Neurophysiol 90(6):4022–4026. doi:10.1152/jn.00494.2003

  • Ponton C, Eggermont JJ, Khosla D, Kwong B, Don M (2002) Maturation of human central auditory system activity: separating auditory evoked potentials by dipole source modeling. Clin Neurophysiol 113(3):407–420

  • Puce A, Epling JA, Thompson JC, Carrick OK (2007) Neural responses elicited to face motion and vocalization pairings. Neuropsychologia 45(1):93–106. doi:10.1016/j.neuropsychologia.2006.04.017

  • Raab DH (1962) Statistical facilitation of simple reaction times. Trans New York Acad Sci 24:574–590

  • Raichle ME, Mintun MA (2006) Brain work and brain imaging. Ann Rev Neurosci 29:449–476. doi:10.1146/annurev.neuro.29.051605.112819

  • Rockland KS, Ojima H (2003) Multisensory convergence in calcarine visual areas in macaque monkey. Int J Psychophysiol 50(1–2):19–26

  • Roediger HL III, McDermott KB (1993) Implicit memory in normal subjects. In: Boller F, Grafman J (eds) Handbook of neuropsychology. Elsevier, Amsterdam, pp 63–131

  • Romei V, Murray MM, Merabet LB, Thut G (2007) Occipital transcranial magnetic stimulation has opposing effects on visual and auditory stimulus detection: implications for multisensory interactions. J Neurosci 27(43):11465–11472. doi:10.1523/JNEUROSCI.2827-07.2007

  • Romei V, Murray MM, Cappe C, Thut G (2009) Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Curr Biol 19(21):1799–1805. doi:10.1016/j.cub.2009.09.027

  • Ross LA, Saint-Amour D, Leavitt VM, Javitt DC, Foxe JJ (2007) Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. Cereb Cortex 17(5):1147–1153. doi:10.1093/cercor/bhl024

  • Sartori G, Umilta C (2000) The additive factor method in brain imaging. Brain Cogn 42(1):68–71. doi:10.1006/brcg.1999.1164

  • Scarff CJ, Reynolds A, Goodyear BG, Ponton CW, Dort JC, Eggermont JJ (2004) Simultaneous 3-T fMRI and high-density recording of human auditory evoked potentials. NeuroImage 23(3):1129–1142. doi:10.1016/j.neuroimage.2004.07.035

  • Senkowski D, Saint-Amour D, Hofle M, Foxe JJ (2011) Multisensory interactions in early evoked brain activity follow the principle of inverse effectiveness. NeuroImage. doi:10.1016/j.neuroimage.2011.03.075

  • Sheffert SM, Lachs L, Hernandez LR (1996) The Hoosier Audiovisual Multitalker Database. Research on spoken language processing. Speech Research Laboratory, Indiana University, Bloomington

  • Smiley JF, Falchier A (2009) Multisensory connections of monkey auditory cerebral cortex. Hear Res 258(1–2):37–46. doi:10.1016/j.heares.2009.06.019

  • Soto-Faraco S, Navarra J, Alsius A (2004) Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task. Cognition 92(3):B13–B23

  • Stanford TR, Quessy S, Stein BE (2005) Evaluating the operations underlying multisensory integration in the cat superior colliculus. J Neurosci 25(28):6499–6508

  • Stein B, Meredith MA (1993) The merging of the senses. MIT Press, Boston

  • Stein BE, Wallace MT (1996) Comparisons of cross-modality integration in midbrain and cortex. Prog Brain Res 112:289–299

  • Stein BE, Huneycutt WS, Meredith MA (1988) Neurons and behavior: the same rules of multisensory integration apply. Brain Res 448(2):355–358

  • Stein BE, Stanford TR, Ramachandran R, Perrault TJ Jr, Rowland BA (2009) Challenges in quantifying multisensory integration: alternative criteria, models, and inverse effectiveness. Exp Brain Res 198(2–3):113–126. doi:10.1007/s00221-009-1880-8

  • Stekelenburg JJ, Vroomen J (2007) Neural correlates of multisensory integration of ecologically valid audiovisual events. J Cogn Neurosci 19(12):1964–1973. doi:10.1162/jocn.2007.19.12.1964

  • Sternberg S (1969) Memory-scanning: mental processes revealed by reaction-time experiments. Am Sci 57(4):421–457

  • Sternberg S (1998) Discovering mental processing stages: the method of additive factors. In: Scarborough D, Sternberg S (eds) An invitation to cognitive science: volume 4, methods, models, and conceptual issues, vol 4. MIT Press, Cambridge, pp 739–811

  • Sternberg S (2001) Separate modifiability, mental modules, and the use of pure and composite measures to reveal them. Acta Psychol 106:147–246

  • Stevenson RA, James TW (2009) Audiovisual integration in human superior temporal sulcus: inverse effectiveness and the neural processing of speech and object recognition. NeuroImage 44(3):1210–1223. doi:10.1016/j.neuroimage.2008.09.034

  • Stevenson RA, Geoghegan ML, James TW (2007) Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects. Exp Brain Res 179(1):85–95

  • Stevenson RA, Kim S, James TW (2009) An additive-factors design to disambiguate neuronal and areal convergence: measuring multisensory interactions between audio, visual, and haptic sensory streams using fMRI. Exp Brain Res 198(2–3):183–194. doi:10.1007/s00221-009-1783-8

  • Stevenson RA, Altieri NA, Kim S, Pisoni DB, James TW (2010) Neural processing of asynchronous audiovisual speech perception. NeuroImage 49(4):3308–3318. doi:10.1016/j.neuroimage.2009.12.001

  • Stevenson RA, VanDerKlok RM, Pisoni DB, James TW (2011) Discrete neural substrates underlie complementary audiovisual speech integration processes. NeuroImage 55(3):1339–1345. doi:10.1016/j.neuroimage.2010.12.063

  • Sumby WH, Pollack I (1954) Visual contribution to speech intelligibility in noise. J Acoust Soc Am 26:212–215

  • Teder-Salejarvi WA, McDonald JJ, Di Russo F, Hillyard SA (2002) An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. Brain Res Cogn Brain Res 14(1):106–114

  • van Wassenhove V, Grant KW, Poeppel D (2005) Visual speech speeds up the neural processing of auditory speech. Proc Natl Acad Sci USA 102(4):1181–1186. doi:10.1073/pnas.0408949102

  • Vroomen J, Stekelenburg JJ (2010) Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli. J Cogn Neurosci 22(7):1583–1596. doi:10.1162/jocn.2009.21308

  • Wallace MH, Murray MM (eds) (2011) Frontiers in the neural basis of multisensory processes. Taylor & Francis, London

  • Wallace MT, Meredith MA, Stein BE (1992) Integration of multiple sensory modalities in cat cortex. Exp Brain Res 91(3):484–488

  • Wallace MT, Meredith MA, Stein BE (1993) Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J Neurophysiol 69(6):1797–1809

  • Wallace MT, Wilkinson LK, Stein BE (1996) Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol 76(2):1246–1266

  • Wallace MT, Meredith MA, Stein BE (1998) Multisensory integration in the superior colliculus of the alert cat. J Neurophysiol 80(2):1006–1010

  • Watkins S, Shams L, Tanaka S, Haynes JD, Rees G (2006) Sound alters activity in human V1 in association with illusory visual perception. NeuroImage 31(3):1247–1256. doi:10.1016/j.neuroimage.2006.01.016

  • Watkins S, Shams L, Josephs O, Rees G (2007) Activity in human V1 follows multisensory perception. NeuroImage 37(2):572–578. doi:10.1016/j.neuroimage.2007.05.027

  • Werner S, Noppeney U (2009) Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cereb Cortex. doi:10.1093/cercor/bhp248

  • Werner S, Noppeney U (2010) Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. J Neurosci 30(7):2662–2675. doi:10.1523/JNEUROSCI.5091-09.2010

  • Wilkinson LK, Meredith MA, Stein BE (1996) The role of anterior ectosylvian cortex in cross-modality orientation and approach behavior. Exp Brain Res 112(1):1–10


Acknowledgments

This research was supported in part by a grant to T. W. James from Indiana University’s Faculty Research Support Program, administered by the Office of the Vice Provost for Research; an NIH NIDCD grant to R. A. Stevenson, 1F32 DC011993; an NIH NIDCD grant to M. T. Wallace, R34 DC010927; and NIH NIDCD training grant T32 DC000012, Training in Speech, Hearing, and Sensory Communication. Thanks to Laurel Stevenson, Karin Harman James, and Jeanne Wallace for their support, to Luis Hernandez for the stimuli, and to Vera Blau for help with the manuscript.

Author information


Corresponding author

Correspondence to Ryan A. Stevenson.

Appendix

The most commonly used metric of multisensory interactions in ERP research is the additive metric, or model (Barth et al. 1995; Berman 1961; Besle et al. 2004b). The additive model’s null hypothesis asserts that the multisensory response should equal the sum of the responses measured with each unisensory presentation in isolation, an assertion based on the law of superposition of electrical fields (Besle et al. 2004b, 2009; Giard and Besle 2010). While this metric is useful, there are a number of situations in which it may produce spurious superadditive or subadditive results. In particular, variations in attention across sensory modalities (including variations in difficulty across sensory modalities, or divided attention in bisensory conditions relative to unisensory conditions) and common activity create serious concerns about the use of the additive metric in multisensory paradigms (for an in-depth discussion, see Besle et al. 2004b; Giard and Besle 2010; Gondan and Röder 2006). The additive metric calculates interactions as:

$$ {\text{AV}} \ne {\text{A}} + {\text{V}}. $$

In terms of sensory-specific activations, it is more accurately written as:

$$ {\text{AV}} + {\text{CA}} \ne \left( {{\text{A}} + {\text{CA}}} \right) + \left( {{\text{V}} + {\text{CA}}} \right), $$

where CA refers to common activation, i.e., activation of processes that occur regardless of sensory input. In this case, the common activation is counted twice on the right side of the equation but only once on the left, producing spurious findings of multisensory interactions (Besle et al. 2004b; Giard and Besle 2010). Importantly, the additive-factors equation reduces the impact of the common activation by measuring a change within each sensory modality:

$$ {\text{AV}}_{\text{H}} - {\text{AV}}_{\text{L}} \ne \left( {{\text{A}}_{\text{H}} - {\text{A}}_{\text{L}} } \right) + \left( {{\text{V}}_{\text{H}} - {\text{V}}_{\text{L}} } \right). $$

In terms of sensory-specific activations, again, this equation is more accurately written as:

$$ \left( {{\text{AV}}_{\text{H}} + {\text{CA}}} \right) - \left( {{\text{AV}}_{\text{L}} + {\text{CA}}} \right) \ne \left[ {\left( {{\text{A}}_{\text{H}} + {\text{CA}}} \right) - \left( {{\text{A}}_{\text{L}} + {\text{CA}}} \right)} \right] + \left[ {\left( {{\text{V}}_{\text{H}} + {\text{CA}}} \right) - \left( {{\text{V}}_{\text{L}} + {\text{CA}}} \right)} \right]. $$

Here, it should be noted that the impact of common activation (CA) is diminished relative to the classic additive-metric equation. Each component of the additive-factors equation includes two common-activation terms that are subtracted from one another, leaving only the difference between the common activations associated with the two levels of the added factor.
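The cancellation described above can be checked numerically. The following is a minimal sketch with purely illustrative amplitude values (all numbers are hypothetical, chosen only to make the algebra visible): the classic additive metric reports a spurious interaction equal to −CA even when the underlying sensory responses combine perfectly additively, whereas the additive-factors contrast correctly reports zero.

```python
# Illustrative amplitudes (arbitrary units); all values are hypothetical.
CA = 2.0                     # common (task-general) activation
A_H, A_L = 5.0, 3.0          # auditory responses, high/low salience
V_H, V_L = 4.0, 2.0          # visual responses, high/low salience
AV_H, AV_L = 9.0, 5.0        # audiovisual responses (here: purely additive)

def rec(x):
    """What the electrodes record: sensory response plus common activation."""
    return x + CA

# Classic additive metric: CA enters twice on the summed side, once on the
# multisensory side, so a purely additive system looks "subadditive".
additive_interaction = rec(AV_H) - (rec(A_H) + rec(V_H))
print(additive_interaction)   # -2.0, i.e. -CA: a spurious interaction

# Additive-factors metric: CA cancels inside every difference.
af_interaction = (rec(AV_H) - rec(AV_L)) - (
    (rec(A_H) - rec(A_L)) + (rec(V_H) - rec(V_L))
)
print(af_interaction)         # 0.0: no interaction, correctly
```

Any nonzero value of `af_interaction` would then reflect a genuine salience-dependent interaction rather than double-counted common activity.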

Finally, it should also be noted that a number of other approaches have been used to circumvent the issues associated with the additive criterion. One example is the application of electrical neuroimaging analyses to ERPs, which assesses not only the amplitude and timing of responses but also response topography. In addition to bypassing the issues associated with the additive metric, this analysis allows the experimenter to differentiate effects caused by changes in response strength from a given set of generators from effects caused by changes in the configuration of those generators. Furthermore, the use of global field power can allow identification of the directionality of those interactions (Cappe et al. 2010; Murray et al. 2005, 2008).
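For reference, global field power (GFP) at each time point is simply the standard deviation of the potential across electrodes, indexing overall response strength independently of its topographic configuration. A minimal NumPy sketch on simulated data (the array shape, sampling, and values are hypothetical, not from this study):

```python
import numpy as np

# Simulated average-referenced ERP: 64 electrodes x 500 time samples (µV).
rng = np.random.default_rng(0)
erp = rng.standard_normal((64, 500))

# GFP(t): standard deviation across electrodes at each time point,
# i.e., the spatial variability of the scalp map at time t.
gfp = erp.std(axis=0)

print(gfp.shape)   # (500,) — one strength value per time point
```

Comparing conditions on GFP (strength) and separately on GFP-normalized maps (topography) is what lets these analyses dissociate generator strength from generator configuration.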

Rights and permissions

Reprints and permissions

About this article

Cite this article

Stevenson, R.A., Bushmakin, M., Kim, S. et al. Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech. Brain Topogr 25, 308–326 (2012). https://doi.org/10.1007/s10548-012-0220-7

