Sensorimotor training modulates automatic imitation of visual speech
The observation-execution links underlying automatic-imitation processes are suggested to result from associative sensorimotor experience of performing and watching the same actions. Past research supporting the associative sequence learning (ASL) model has demonstrated that sensorimotor training modulates automatic imitation of perceptually transparent manual actions, but ASL has been criticized for being unable to account for opaque actions, such as orofacial movements that include visual speech. To investigate whether the observation-execution links underlying opaque actions are as flexible as those demonstrated for transparent actions, we tested whether sensorimotor training modulated the automatic imitation of visual speech. Automatic imitation was defined as faster response times for syllable articulation (ba or da) in the presence of a compatible visual-speech distractor than in the presence of an incompatible distractor. Participants received either mirror (say /ba/ when the speaker silently says /ba/, and likewise for /da/) or countermirror (say /da/ when the speaker silently says /ba/, and vice versa) training, and automatic imitation was measured before and after training. The automatic-imitation effect was enhanced following mirror training and reduced following countermirror training, suggesting that sensorimotor learning plays a critical role in linking speech perception and production, and that the links between these two systems remain flexible in adulthood. Additionally, compared with manual movements, automatic imitation of speech was susceptible to mirror training but relatively resilient to countermirror training. We propose that social factors and the multimodal nature of speech might account for this resilience of the sensorimotor associations of speech actions to countermirror training.
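The automatic-imitation measure described above reduces to a simple difference score: mean response time on incompatible-distractor trials minus mean response time on compatible-distractor trials. As a minimal illustration, the sketch below computes that score from hypothetical response times (the function name and all numbers are illustrative, not the authors' data or analysis code):

```python
from statistics import mean

def imitation_effect(rt_compatible, rt_incompatible):
    """Automatic-imitation effect (ms): RT facilitation on trials with a
    compatible visual-speech distractor relative to an incompatible one.
    A larger positive value indicates a stronger effect."""
    return mean(rt_incompatible) - mean(rt_compatible)

# Hypothetical syllable-articulation RTs (ms) for one participant.
pre_training  = imitation_effect([520, 530, 510], [545, 560, 545])
post_training = imitation_effect([500, 510, 505], [560, 570, 565])

# Under mirror training, the effect is predicted to grow (post > pre);
# under countermirror training, to shrink.
print(pre_training, post_training)  # → 30.0 60.0
```

In the study, this difference score was computed separately before and after training, so the training effect is the change in the score across sessions rather than the score itself.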
Keywords: Automatic imitation · Speech perception · Speech production · Sensorimotor learning
This work was supported by a grant from the China Scholarship Council to the first author and by the BIAL Foundation under Grant No. 267/14 to P.A.
Open practices statement
The data and materials for this experiment are available upon request, and no experiment was preregistered.