Action-Related Speech Modulates Beta Oscillations During Observation of Tool-Use Gestures
Language and action have long been thought of as closely related. Comprehending words or phrases that refer to actions commonly activates motor and premotor areas, and this comprehension process interacts with action preparation and/or execution. However, it remains unclear whether comprehending action-related language also interacts with action observation. In the current study, we examined whether the observation of tool-use gestures is subject to interaction with language. In an electroencephalography (EEG) study (n = 20), participants were presented with video clips of an actor performing tool-use (TU, e.g., hammering with a fist) and emblematic (EM, e.g., the thumbs-up sign for ‘good job’) gestures accompanied by either comprehensible German (G) or incomprehensible Russian (R) sentences. Participants performed a semantic judgment task, evaluating whether the co-speech gestures were object- or socially related. Behaviorally, responses were faster for TU than for EM gestures only in the German condition. For EEG, TU gestures elicited a beta power decrease (~ 20 Hz) compared to EM gestures; however, this effect was reduced when gestures were accompanied by German instead of Russian sentences. We conclude that the processing of action-related sentences might facilitate gesture observation, in the sense that the motor simulation required for TU gestures, as indexed by reduced beta power, was modulated when accompanied by comprehensible German speech. Our results corroborate the functional role of beta oscillations during the perception of hand gestures and provide novel evidence concerning language–motor interaction.
Keywords: Action observation · Beta oscillations · Tool-use gesture · Language–motor interaction · Embodied cognition
This research project was supported by a grant from the ‘Von Behring-Röntgen-Stiftung’ (Project Nos. 59-0002 and 64-0001) and by the ‘Deutsche Forschungsgemeinschaft’ (Project Nos. DFG STR 1146/9-1 and SFB/TRR135 project A3). M.S. was supported by the DFG (Project No. STR 1146/4-1). B.S. was supported by the DFG (Project No. STR 1146/8-1). Part of this work was presented at the Tilburg Gesture Research Meeting (TiGeR 2013).