Abstract
In recent years, it has become evident that neural responses previously considered to be unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited by viewing a face is strongly influenced by concurrent incoming auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited by visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal-to-noise ratio, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase of the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task with increases in stimulus salience. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components with reduced salience, while N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. The amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels as measured by response times, suggesting that the change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.
References
Allison T, Ginter H, McCarthy G, Nobre AC, Puce A, Luby M, Spencer DD (1994a) Face recognition in human extrastriate cortex. J Neurophysiol 71(2):821–825
Allison T, McCarthy G, Nobre A, Puce A, Belger A (1994b) Human extrastriate visual cortex and the perception of faces, words, numbers, and colors. Cereb Cortex 4(5):544–554
Allison T, Puce A, Spencer DD, McCarthy G (1999) Electrophysiological studies of human face perception. I: potentials generated in occipitotemporal cortex by face and non-face stimuli. Cereb Cortex 9(5):415–430
Attwell D, Iadecola C (2002) The neural basis of functional brain imaging signals. Trends Neurosci 25(12):621–625
Barth DS, Goldberg N, Brett B, Di S (1995) The spatiotemporal organization of auditory, visual, and auditory-visual evoked potentials in rat cortex. Brain Res 678(1–2):177–190
Beauchamp MS, Argall BD, Bodurka J, Duyn JH, Martin A (2004a) Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat Neurosci 7(11):1190–1192
Beauchamp MS, Lee KE, Argall BD, Martin A (2004b) Integration of auditory and visual information about objects in superior temporal sulcus. Neuron 41(5):809–823
Beauchamp MS, Yasar NE, Frye RE, Ro T (2008) Touch, sound and vision in human superior temporal sulcus. NeuroImage 41(3):1011–1020. doi:10.1016/j.neuroimage.2008.03.015
Bentin S, Allison T, Puce A, Perez E, McCarthy G (1996) Electrophysiological studies of face perception in humans. J Cogn Neurosci 8(6):551–565
Berman AL (1961) Interaction of cortical responses to somatic and auditory stimuli in anterior ectosylvian gyrus of cat. J Neurophysiol 24:608–620
Besle J, Fort A, Delpuech C, Giard MH (2004a) Bimodal speech: early suppressive visual effects in human auditory cortex. Eur J Neurosci 20(8):2225–2234. doi:10.1111/j.1460-9568.2004.03670.x
Besle J, Fort A, Giard M (2004b) Interest and validity of the additive model in electrophysiological studies of multisensory interactions. Cogn Process 5:189–192
Besle J, Bertrand O, Giard MH (2009) Electrophysiological (EEG, sEEG, MEG) evidence for multiple audiovisual interactions in the human auditory cortex. Hear Res. doi:10.1016/j.heares.2009.06.016
Brainard DH (1997) The psychophysics toolbox. Spat Vis 10(4):433–436
Brefczynski-Lewis J, Lowitszch S, Parsons M, Lemieux S, Puce A (2009) Audiovisual non-verbal dynamic faces elicit converging fMRI and ERP responses. Brain Topogr 21(3–4):193–206. doi:10.1007/s10548-009-0093-6
Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SC, McGuire PK, Woodruff PW, Iversen SD, David AS (1997) Activation of auditory cortex during silent lipreading. Science 276(5312):593–596
Calvert GA, Campbell R, Brammer MJ (2000) Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 10(11):649–657
Calvert GA, Hansen PC, Iversen SD, Brammer MJ (2001) Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect. NeuroImage 14(2):427–438
Cappe C, Barone P (2005) Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. Eur J Neurosci 22(11):2886–2902. doi:10.1111/j.1460-9568.2005.04462.x
Cappe C, Rouiller EM, Barone P (2009) Multisensory anatomical pathways. Hear Res 258(1–2):28–36. doi:10.1016/j.heares.2009.04.017
Cappe C, Thut G, Romei V, Murray MM (2010) Auditory-visual multisensory interactions in humans: timing, topography, directionality, and sources. J Neurosci 30(38):12572–12580. doi:10.1523/JNEUROSCI.1099-10.2010
Clarke R, Morton J (1983) Cross modality facilitation in tachistoscope word recognition. Q J Exp Psychol 35A:79–96
Clavagnier S, Falchier A, Kennedy H (2004) Long-distance feedback projections to area V1: implications for multisensory integration, spatial awareness, and visual consciousness. Cogn Affect Behav Neurosci 4(2):117–126
Diederich A, Colonius H (2004) Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time. Percept Psychophys 66(8):1388–1404
Falchier A, Clavagnier S, Barone P, Kennedy H (2002) Anatomical evidence of multimodal integration in primate striate cortex. J Neurosci 22(13):5749–5759
Falchier A, Schroeder CE, Hackett TA, Lakatos P, Nascimento-Silva S, Ulbert I, Karmos G, Smiley JF (2010) Projection from visual areas V2 and prostriata to caudal auditory cortex in the monkey. Cereb Cortex 20(7):1529–1538. doi:10.1093/cercor/bhp213
Fort A, Delpuech C, Pernier J, Giard MH (2002a) Dynamics of cortico-subcortical cross-modal operations involved in audio-visual object detection in humans. Cereb Cortex 12(10):1031–1039
Fort A, Delpuech C, Pernier J, Giard MH (2002b) Early auditory–visual interactions in human cortex during nonredundant target identification. Brain Res Cogn Brain Res 14(1):20–30
Giard M, Besle J (2010) Methodological considerations: electrophysiology of multisensory interactions in humans. In: Kaiser J, Naumer MJ (eds) Multisensory object perception in the primate brain. Springer, New York
Giard MH, Peronnet F (1999) Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J Cogn Neurosci 11(5):473–490
Gondan M, Röder B (2006) A new method for detecting interactions between the senses in event-related potentials. Brain Res 1073–1074:389–397. doi:10.1016/j.brainres.2005.12.050
Grill-Spector K, Henson R, Martin A (2006) Repetition and the brain: neural models of stimulus-specific effects. Trends Cogn Sci 10(1):14–23. doi:10.1016/j.tics.2005.11.006
Haxby JV, Horwitz B, Ungerleider LG, Maisog JM, Pietrini P, Grady CL (1994) The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations. J Neurosci 14(11 Pt 1):6336–6353
Heeger DJ, Ress D (2002) What does fMRI tell us about neuronal activity? Nat Rev 3(2):142–151. doi:10.1038/nrn730
Henson RN (2003) Neuroimaging studies of priming. Prog Neurobiol 70(1):53–81
Hershenson M (1962) Reaction time as a measure of intersensory facilitation. J Exp Psychol 63:289–293
James TW, Gauthier I (2006) Repetition-induced changes in BOLD response reflect accumulation of neural activity. Hum Brain Mapp 27(1):37–46
James TW, Stevenson RA (2012) The use of fMRI to assess multisensory integration. In: Wallace MH, Murray MM (eds) Frontiers in the neural basis of multisensory processes. Taylor & Francis, London
James TW, Stevenson RA, Kim S (2009) Assessing multisensory integration with additive factors and functional MRI. In: Proceedings of the International Society for Psychophysics, Dublin
James TW, Stevenson RA, Kim S (2012) Inverse effectiveness in multisensory processing. In: Stein BE (ed) The new handbook of multisensory processes. MIT Press, Cambridge
Joassin F, Maurage P, Bruyer R, Crommelinck M, Campanella S (2004) When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices. Neurosci Lett 369(2):132–137
Kawashima R, O’Sullivan BT, Roland PE (1995) Positron-emission tomography studies of cross-modality inhibition in selective attentional tasks: closing the “mind’s eye”. Proc Natl Acad Sci USA 92(13):5969–5972
Kim S, James TW (2010) Enhanced effectiveness in visuo-haptic object-selective brain regions with increasing stimulus salience. Hum Brain Mapp 31(5):678–693. doi:10.1002/hbm.20897
Kim S, Stevenson RA, James TW (2011) Visuo-haptic neuronal convergence demonstrated with an inversely effective pattern of BOLD activation. J Cogn Neurosci 24(4):830–842. doi: 10.1162/jocn_a_00176
Klucharev V, Mottonen R, Sams M (2003) Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception. Brain Res Cogn Brain Res 18(1):65–75
Kriegeskorte N, Simmons WK, Bellgowan PS, Baker CI (2009) Circular analysis in systems neuroscience: the dangers of double dipping. Nat Neurosci 12(5):535–540. doi:10.1038/nn.2303
Lachs L, Hernandez LR (1998) Update: the Hoosier Audiovisual Multitalker Database. Research on spoken language processing. Speech Research Laboratory, Indiana University, Bloomington
Laming D, Laming J (1992) F. Hegelmaier: on memory for the length of a line. Psychol Res 54(4):233–239
Latinus M, VanRullen R, Taylor MJ (2010) Top-down and bottom-up modulation in processing bimodal face/voice stimuli. BMC Neurosci 11:36. doi:10.1186/1471-2202-11-36
Laurienti PJ, Burdette JH, Wallace MT, Yen YF, Field AS, Stein BE (2002) Deactivation of sensory-specific cortex by cross-modal stimuli. J Cogn Neurosci 14(3):420–429
Logothetis NK (2002) The neural basis of the blood-oxygen-level-dependent functional magnetic resonance imaging signal. Philos Trans R Soc London 357(1424):1003–1037
Logothetis NK (2003) The underpinnings of the BOLD functional magnetic resonance imaging signal. J Neurosci 23(10):3963–3971
Lovelace CT, Stein BE, Wallace MT (2003) An irrelevant light enhances auditory detection in humans: a psychophysical analysis of multisensory integration in stimulus detection. Brain Res Cogn Brain Res 17(2):447–453
Luce PA, Pisoni DB (1998) Recognizing spoken words: the neighborhood activation model. Ear Hear 19(1):1–36
Macaluso E, Frith CD, Driver J (2000) Modulation of human visual cortex by crossmodal spatial attention. Science 289(5482):1206–1208
Martuzzi R, Murray MM, Michel CM, Thiran JP, Maeder PP, Clarke S, Meuli RA (2007) Multisensory interactions within human primary cortices revealed by BOLD dynamics. Cereb Cortex 17(7):1672–1679. doi:10.1093/cercor/bhl077
McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 264(5588):746–748
Meredith MA, Stein BE (1983) Interactions among converging sensory inputs in the superior colliculus. Science 221(4608):389–391
Meredith MA, Stein BE (1986) Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J Neurophysiol 56(3):640–662
Miller J (1982) Divided attention: evidence for coactivation with redundant signals. Cogn Psychol 14(2):247–279
Molholm S, Ritter W, Murray MM, Javitt DC, Schroeder CE, Foxe JJ (2002) Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res Cogn Brain Res 14(1):115–128
Molholm S, Ritter W, Javitt DC, Foxe JJ (2004) Multisensory visual-auditory object recognition in humans: a high-density electrical mapping study. Cereb Cortex 14(4):452–465
Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, Javitt DC, Schroeder CE, Foxe JJ (2005) Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex 15(7):963–974. doi:10.1093/cercor/bhh197
Murray MM, Brunet D, Michel CM (2008) Topographic ERP analyses: a step-by-step tutorial review. Brain Topogr 20(4):249–264. doi:10.1007/s10548-008-0054-5
Musacchia G, Schroeder CE (2009) Neuronal mechanisms, response dynamics and perceptual functions of multisensory interactions in auditory cortex. Hear Res 258(1–2):72–79. doi:10.1016/j.heares.2009.06.018
Nelson WT, Hettinger LJ, Cunningham JA, Brickman BJ, Haas MW, McKinley RL (1998) Effects of localized auditory information on visual target detection performance using a helmet-mounted display. Hum Factors 40(3):452–460
Pelli DG (1997) The VideoToolbox software for visual psychophysics: transforming numbers into movies. Spat Vis 10(4):437–442
Perrault TJ Jr, Vaughan JW, Stein BE, Wallace MT (2003) Neuron-specific response characteristics predict the magnitude of multisensory integration. J Neurophysiol 90(6):4022–4026. doi:10.1152/jn.00494.2003
Ponton C, Eggermont JJ, Khosla D, Kwong B, Don M (2002) Maturation of human central auditory system activity: separating auditory evoked potentials by dipole source modeling. Clin Neurophysiol 113(3):407–420
Puce A, Epling JA, Thompson JC, Carrick OK (2007) Neural responses elicited to face motion and vocalization pairings. Neuropsychologia 45(1):93–106. doi:10.1016/j.neuropsychologia.2006.04.017
Raab DH (1962) Statistical facilitation of simple reaction times. Trans New York Acad Sci 24:574–590
Raichle ME, Mintun MA (2006) Brain work and brain imaging. Ann Rev Neurosci 29:449–476. doi:10.1146/annurev.neuro.29.051605.112819
Rockland KS, Ojima H (2003) Multisensory convergence in calcarine visual areas in macaque monkey. Int J Psychophysiol 50(1–2):19–26
Roediger HL III, McDermott KB (1993) Implicit memory in normal subjects. In: Boller F, Grafman J (eds) Handbook of neuropsychology. Elsevier, Amsterdam, pp 63–131
Romei V, Murray MM, Merabet LB, Thut G (2007) Occipital transcranial magnetic stimulation has opposing effects on visual and auditory stimulus detection: implications for multisensory interactions. J Neurosci 27(43):11465–11472. doi:10.1523/JNEUROSCI.2827-07.2007
Romei V, Murray MM, Cappe C, Thut G (2009) Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Curr Biol 19(21):1799–1805. doi:10.1016/j.cub.2009.09.027
Ross LA, Saint-Amour D, Leavitt VM, Javitt DC, Foxe JJ (2007) Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. Cereb Cortex 17(5):1147–1153. doi:10.1093/cercor/bhl024
Sartori G, Umilta C (2000) The additive factor method in brain imaging. Brain Cogn 42(1):68–71. doi:10.1006/brcg.1999.1164
Senkowski D, Saint-Amour D, Hofle M, Foxe JJ (2011) Multisensory interactions in early evoked brain activity follow the principle of inverse effectiveness. NeuroImage. doi:10.1016/j.neuroimage.2011.03.075
Scarff CJ, Reynolds A, Goodyear BG, Ponton CW, Dort JC, Eggermont JJ (2004) Simultaneous 3-T fMRI and high-density recording of human auditory evoked potentials. NeuroImage 23(3):1129–1142. doi:10.1016/j.neuroimage.2004.07.035
Sheffert SM, Lachs L, Hernandez LR (1996) The Hoosier audiovisual multitalker database. Research on spoken language processing. Speech Research Laboratory, Indiana University, Bloomington
Smiley JF, Falchier A (2009) Multisensory connections of monkey auditory cerebral cortex. Hear Res 258(1–2):37–46. doi:10.1016/j.heares.2009.06.019
Soto-Faraco S, Navarra J, Alsius A (2004) Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task. Cognition 92(3):B13–B23
Stanford TR, Quessy S, Stein BE (2005) Evaluating the operations underlying multisensory integration in the cat superior colliculus. J Neurosci 25(28):6499–6508
Stein B, Meredith MA (1993) The merging of the senses. MIT Press, Cambridge
Stein BE, Wallace MT (1996) Comparisons of cross-modality integration in midbrain and cortex. Prog Brain Res 112:289–299
Stein BE, Huneycutt WS, Meredith MA (1988) Neurons and behavior: the same rules of multisensory integration apply. Brain Res 448(2):355–358
Stein BE, Stanford TR, Ramachandran R, Perrault TJ Jr, Rowland BA (2009) Challenges in quantifying multisensory integration: alternative criteria, models, and inverse effectiveness. Exp Brain Res 198(2–3):113–126. doi:10.1007/s00221-009-1880-8
Stekelenburg JJ, Vroomen J (2007) Neural correlates of multisensory integration of ecologically valid audiovisual events. J Cogn Neurosci 19(12):1964–1973. doi:10.1162/jocn.2007.19.12.1964
Sternberg S (1969) Memory-scanning: mental processes revealed by reaction-time experiments. Am Sci 57(4):421–457
Sternberg S (1998) Discovering mental processing stages: the method of additive factors. In: Scarborough D, Sternberg S (eds) An invitation to cognitive science: volume 4, methods, models, and conceptual issues, vol 4. MIT Press, Cambridge, pp 739–811
Sternberg S (2001) Separate modifiability, mental modules, and the use of pure and composite measures to reveal them. Acta Psychol 106:147–246
Stevenson RA, James TW (2009) Audiovisual integration in human superior temporal sulcus: inverse effectiveness and the neural processing of speech and object recognition. NeuroImage 44(3):1210–1223. doi:10.1016/j.neuroimage.2008.09.034
Stevenson RA, Geoghegan ML, James TW (2007) Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects. Exp Brain Res 179(1):85–95
Stevenson RA, Kim S, James TW (2009) An additive-factors design to disambiguate neuronal and areal convergence: measuring multisensory interactions between audio, visual, and haptic sensory streams using fMRI. Exp Brain Res 198(2–3):183–194. doi:10.1007/s00221-009-1783-8
Stevenson RA, Altieri NA, Kim S, Pisoni DB, James TW (2010) Neural processing of asynchronous audiovisual speech perception. NeuroImage 49(4):3308–3318. doi:10.1016/j.neuroimage.2009.12.001
Stevenson RA, VanDerKlok RM, Pisoni DB, James TW (2011) Discrete neural substrates underlie complementary audiovisual speech integration processes. NeuroImage 55(3):1339–1345. doi:10.1016/j.neuroimage.2010.12.063
Sumby WH, Pollack I (1954) Visual contribution to speech intelligibility in noise. J Acoust Soc Am 26:212–215
Teder-Salejarvi WA, McDonald JJ, Di Russo F, Hillyard SA (2002) An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. Brain Res Cogn Brain Res 14(1):106–114
van Wassenhove V, Grant KW, Poeppel D (2005) Visual speech speeds up the neural processing of auditory speech. Proc Natl Acad Sci USA 102(4):1181–1186. doi:10.1073/pnas.0408949102
Vroomen J, Stekelenburg JJ (2010) Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli. J Cogn Neurosci 22(7):1583–1596. doi:10.1162/jocn.2009.21308
Wallace MH, Murray MM (eds) (2011) Frontiers in the neural basis of multisensory processes. Taylor & Francis, London
Wallace MT, Meredith MA, Stein BE (1992) Integration of multiple sensory modalities in cat cortex. Exp Brain Res 91(3):484–488
Wallace MT, Meredith MA, Stein BE (1993) Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J Neurophysiol 69(6):1797–1809
Wallace MT, Wilkinson LK, Stein BE (1996) Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol 76(2):1246–1266
Wallace MT, Meredith MA, Stein BE (1998) Multisensory integration in the superior colliculus of the alert cat. J Neurophysiol 80(2):1006–1010
Watkins S, Shams L, Tanaka S, Haynes JD, Rees G (2006) Sound alters activity in human V1 in association with illusory visual perception. NeuroImage 31(3):1247–1256. doi:10.1016/j.neuroimage.2006.01.016
Watkins S, Shams L, Josephs O, Rees G (2007) Activity in human V1 follows multisensory perception. NeuroImage 37(2):572–578. doi:10.1016/j.neuroimage.2007.05.027
Werner S, Noppeney U (2009) Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cereb Cortex. doi:10.1093/cercor/bhp248
Werner S, Noppeney U (2010) Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. J Neurosci 30(7):2662–2675. doi:10.1523/JNEUROSCI.5091-09.2010
Wilkinson LK, Meredith MA, Stein BE (1996) The role of anterior ectosylvian cortex in cross-modality orientation and approach behavior. Exp Brain Res 112(1):1–10
Acknowledgments
This research was supported in part by a grant to T. W. James from Indiana University’s Faculty Research Support Program administered by the office of the vice provost for research, an NIH NIDCD grant to R. A. Stevenson, 1F32 DC011993, an NIH NIDCD grant to M. T. Wallace, R34 DC010927, and NIH NIDCD training grant T32 DC000012 Training in Speech, Hearing, and Sensory Communication. Thanks to Laurel Stevenson, Karin Harman James, and Jeanne Wallace for their support, to Luis Hernandez for the stimuli, and to Vera Blau for help with the manuscript.
Appendix
The most commonly used metric of multisensory interactions in ERP research is the additive metric, or model (Barth et al. 1995; Berman 1961; Besle et al. 2004b). The additive model's null hypothesis asserts that the multisensory response should equal the sum of the responses measured with each unisensory presentation in isolation, an assertion based on the law of superposition of electrical fields (Besle et al. 2004b, 2009; Giard and Besle 2010). While this metric is useful, there are a number of situations in which it may produce spurious superadditive or subadditive results. In particular, variations in attention across sensory modality (including variations in difficulty across sensory modalities, or divided attention in bi-sensory conditions relative to unisensory conditions) and common activity create serious concerns about the use of the additive metric in multisensory paradigms (for an in-depth discussion, see Besle et al. 2004b; Giard and Besle 2010; Gondan and Röder 2006). The additive metric calculates interactions as:

Interaction = ERP(AV) − [ERP(A) + ERP(V)]
In terms of sensory-specific activations, this is more accurately written as:

Interaction = (AV + CA) − [(A + CA) + (V + CA)]
where CA refers to common activation, i.e., activation of processes that occur regardless of sensory input. In this case, the common activation is accounted for twice on the right side of the equation, but only once on the left, producing spurious findings of multisensory interactions (Besle et al. 2004b; Giard and Besle 2010). Importantly, the additive-factors equation reduces the impact of the common activation by measuring a change in each sensory modality across two levels (denoted 1 and 2) of the added factor:

Interaction = [ERP(AV)1 − ERP(AV)2] − {[ERP(A)1 − ERP(A)2] + [ERP(V)1 − ERP(V)2]}
In terms of sensory-specific activations, again, this equation is more accurately written as:

Interaction = [(AV1 + CA1) − (AV2 + CA2)] − {[(A1 + CA1) − (A2 + CA2)] + [(V1 + CA1) − (V2 + CA2)]}
Here, it should be noted that the impact of the common activation, CA, is diminished relative to the classic additive metric equation. Each component of the additive-factors equation includes two common activations that are subtracted from one another, leaving only the difference between the common activations associated with the two levels of the added factor.
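The contrast between the two metrics can be illustrated numerically. The sketch below uses hypothetical amplitude values (not data from the study) and models each measured ERP as a sensory-specific part plus a common activation (CA), with no genuine multisensory interaction built in, so any nonzero interaction is pure CA contamination:

```python
def classic_additive(av, a, v):
    """Classic additive metric: AV - (A + V)."""
    return av - (a + v)

def additive_factors(av_hi, av_lo, a_hi, a_lo, v_hi, v_lo):
    """Additive-factors metric: change in multisensory gain
    across two levels (hi, lo) of a factor such as salience."""
    return (av_hi - av_lo) - ((a_hi - a_lo) + (v_hi - v_lo))

# Hypothetical sensory-specific amplitudes (arbitrary units).
A_HI, A_LO = 4.0, 2.0
V_HI, V_LO = 3.0, 1.0
CA_HI, CA_LO = 2.0, 1.5   # common activation at each level

# Each measured ERP contains exactly one copy of the common activation;
# AV is constructed with NO true interaction (AV = A + V).
erp_a_hi, erp_a_lo = A_HI + CA_HI, A_LO + CA_LO
erp_v_hi, erp_v_lo = V_HI + CA_HI, V_LO + CA_LO
erp_av_hi = A_HI + V_HI + CA_HI
erp_av_lo = A_LO + V_LO + CA_LO

# Classic metric: CA enters twice on the right but once on the left,
# so a spurious "subadditive" interaction of -CA_HI appears.
print(classic_additive(erp_av_hi, erp_a_hi, erp_v_hi))  # -2.0

# Additive factors: only the CA *difference* across levels survives,
# -(CA_HI - CA_LO) = -0.5, a much smaller bias.
print(additive_factors(erp_av_hi, erp_av_lo,
                       erp_a_hi, erp_a_lo,
                       erp_v_hi, erp_v_lo))  # -0.5
```

With these made-up numbers, the classic metric reports an interaction equal to the full common activation, while the additive-factors metric reports only the difference in common activation between salience levels, matching the point made above.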
Finally, it should also be noted that a number of other approaches have been used to circumvent the issues associated with the additive criterion. One such example is the application of electrical neuroimaging analyses to ERPs, which assess not only the amplitude and timing of responses but also response topography. In addition to bypassing the issues associated with the additive metric, this analysis allows the experimenter to differentiate effects caused by changes in response strength within a given set of generators from effects caused by changes in the configuration of those generators. Furthermore, the use of global field power can allow for the identification of the directionality of those interactions (Cappe et al. 2010; Murray et al. 2005, 2008).
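As a point of reference, global field power at each time point is conventionally computed as the standard deviation of the voltage across all electrodes. A minimal sketch, assuming a simple electrodes-by-timepoints matrix of ERP voltages (the variable names and toy data are illustrative, not from the study):

```python
def global_field_power(erp):
    """erp: list of per-electrode voltage lists (electrodes x time).
    Returns one GFP value per time point: the standard deviation
    of the voltage across electrodes at that instant."""
    n_elec = len(erp)
    n_time = len(erp[0])
    gfp = []
    for t in range(n_time):
        volts = [erp[e][t] for e in range(n_elec)]
        mean = sum(volts) / n_elec
        var = sum((u - mean) ** 2 for u in volts) / n_elec
        gfp.append(var ** 0.5)
    return gfp

# Toy example: three electrodes, four time points. GFP is zero when
# all electrodes agree, and grows with the spread across electrodes.
erp = [
    [0.0, 1.0, 2.0, 1.0],
    [0.0, -1.0, -2.0, -1.0],
    [0.0, 0.0, 0.0, 0.0],
]
print(global_field_power(erp))
```

Because GFP is reference-free (it depends only on the spread across electrodes, not on their mean), it provides a single strength measure per time point that can be compared across conditions independently of topographic changes.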
Cite this article
Stevenson, R.A., Bushmakin, M., Kim, S. et al. Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech. Brain Topogr 25, 308–326 (2012). https://doi.org/10.1007/s10548-012-0220-7