Experimental Brain Research, Volume 235, Issue 9, pp 2867–2876

Electrophysiological evidence for a self-processing advantage during audiovisual speech integration

  • Avril Treille
  • Coriandre Vilain
  • Sonia Kandel
  • Marc Sato
Research Article

Abstract

Previous electrophysiological studies have provided strong evidence for early multisensory integrative mechanisms during audiovisual speech perception. One issue these studies leave unanswered is whether hearing our own voice and seeing our own articulatory gestures facilitate speech perception, possibly through better processing and integration of sensory inputs with our own sensory-motor knowledge. The present EEG study examined the impact of self-knowledge during the perception of auditory (A), visual (V) and audiovisual (AV) speech stimuli that had previously been recorded from the participant or from a speaker they had never met. Audiovisual interactions were estimated by comparing N1 and P2 auditory evoked potentials in the bimodal condition (AV) with the sum of those observed in the unimodal conditions (A + V). In line with previous EEG studies, our results revealed an amplitude decrease of the P2 auditory evoked potential in the AV compared to the A + V condition. Crucially, a temporal facilitation of N1 responses was observed during the visual perception of one's own speech movements compared to those of another speaker. This facilitation was negatively correlated with the saliency of the visual stimuli. These results provide evidence for a temporal facilitation of the integration of auditory and visual speech signals when the visual input involves our own speech gestures.
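To make the additive-model comparison concrete, the following minimal Python/NumPy sketch illustrates how an AV evoked response can be compared against the A + V sum for the N1 and P2 peaks. This is an illustrative sketch only, not the authors' analysis pipeline: the variable names (erp_a, erp_v, erp_av, times) and the peak-search windows are assumptions.

```python
import numpy as np

# Sketch of the additive-model comparison described in the abstract.
# Assumes erp_a, erp_v, erp_av are baseline-corrected condition averages
# (channels x time, in microvolts) and times is the time vector in seconds.
# All names and window bounds here are hypothetical.

def peak_in_window(erp, times, tmin, tmax, polarity):
    """Return (latency, amplitude) of the most extreme deflection of the
    given polarity (+1 or -1) within [tmin, tmax]."""
    mask = (times >= tmin) & (times <= tmax)
    trace = erp.mean(axis=0)[mask]   # average across the chosen channels
    idx = np.argmax(polarity * trace)
    return times[mask][idx], trace[idx]

def compare_av_vs_sum(erp_a, erp_v, erp_av, times):
    erp_sum = erp_a + erp_v          # additive model: A + V
    results = {}
    # Illustrative windows: N1 ~50-150 ms (negative), P2 ~150-250 ms (positive)
    for name, (tmin, tmax, pol) in {"N1": (0.05, 0.15, -1),
                                    "P2": (0.15, 0.25, +1)}.items():
        lat_av, amp_av = peak_in_window(erp_av, times, tmin, tmax, pol)
        lat_sum, amp_sum = peak_in_window(erp_sum, times, tmin, tmax, pol)
        results[name] = {"latency_shift_ms": 1000 * (lat_av - lat_sum),
                         "amplitude_diff_uV": amp_av - amp_sum}
    return results
```

Under this logic, a smaller AV amplitude than the A + V sum indicates cross-modal suppression, and an earlier AV peak indicates temporal facilitation, the N1 effect reported in the abstract for self speech movements.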

Keywords

Self recognition · Speech perception · Audiovisual integration · EEG

Notes

Acknowledgements

This study was supported by research funds from the European Research Council (FP7/2007-2013 Grant Agreement No. 339152).

Supplementary material

Supplementary material 1: 221_2017_5018_MOESM1_ESM.avi (AVI, 3.8 MB)
Supplementary material 2: 221_2017_5018_MOESM2_ESM.avi (AVI, 3.8 MB)
Supplementary material 3: 221_2017_5018_MOESM3_ESM.avi (AVI, 3.7 MB)

Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  • Avril Treille 1, 3
  • Coriandre Vilain 1
  • Sonia Kandel 1
  • Marc Sato 2

  1. GIPSA-lab, Département Parole & Cognition, CNRS & Grenoble Université, Grenoble, France
  2. Laboratoire Parole & Langage, CNRS & Aix-Marseille Université, Marseille, France
  3. GIPSA-lab, Université Stendhal, Grenoble Cedex 9, France