Experimental Brain Research, Volume 227, Issue 2, pp 275–288

Silent articulation modulates auditory and audiovisual speech perception

Authors

  • Marc Sato
    • GIPSA-LAB, UMR CNRS 5216, Département Parole et Cognition, Grenoble Université
    • Centre for Research on Brain, Language and Music, McGill University
  • Emilie Troille
    • GIPSA-LAB, UMR CNRS 5216, Département Parole et Cognition, Grenoble Université
    • Centre de Recherche sur l’Imaginaire, Université Stendhal
  • Lucie Ménard
    • Centre for Research on Brain, Language and Music, McGill University
    • Département de Linguistique, Université du Québec à Montréal
  • Marie-Agnès Cathiard
    • Centre de Recherche sur l’Imaginaire, Université Stendhal
  • Vincent Gracco
    • Centre for Research on Brain, Language and Music, McGill University
    • School of Communication Sciences and Disorders, McGill University
    • Haskins Laboratories
Research Article

DOI: 10.1007/s00221-013-3510-8

Cite this article as:
Sato, M., Troille, E., Ménard, L. et al. Exp Brain Res (2013) 227: 275. doi:10.1007/s00221-013-3510-8

Abstract

The concept of an internal forward model that simulates the sensory consequences of an action is a central idea in speech motor control. Consistent with this hypothesis, silent articulation has been shown to modulate activity of the auditory cortex and to improve the auditory identification of concordant speech sounds embedded in white noise. In the present study, we replicated and extended this behavioral finding by showing that silently articulating a syllable in synchrony with the presentation of a concordant auditory and/or visually ambiguous speech stimulus improves its identification. Our results further demonstrate that, even in the case of perfect perceptual identification, concurrent mouthing of a syllable speeds up the perceptual processing of a concordant speech stimulus. These results reflect multisensory-motor interactions during speech perception and provide new behavioral evidence for internally generated sensory predictions during silent speech production.

Keywords

Speech perception · Speech production · Silent speech · Audiovisual speech perception · Internal forward models · Sensory-motor interactions · Efference copy · McGurk effect

Copyright information

© Springer-Verlag Berlin Heidelberg 2013