Psychological Research, Volume 78, Issue 1, pp 55–69

Beyond words: evidence for automatic language–gesture integration of symbolic gestures but not dynamic landscapes

  • Dana Vainiger
  • Ludovica Labruna
  • Richard B. Ivry
  • Michal Lavidor
Original Article

Abstract

Understanding actions based on either language or action observation is presumed to involve the motor system, reflecting the engagement of an embodied conceptual network. We examined how linguistic and gestural information are integrated in a series of cross-domain priming studies. We varied the task demands across three experiments in which symbolic gestures served as primes for verbal targets. Primes were clips of symbolic gestures taken from a rich set of emblems. Participants responded by making a lexical decision on the target (Experiment 1), naming the target (Experiment 2), or performing a semantic relatedness judgment (Experiment 3). The magnitude of semantic priming was larger in the relatedness judgment and lexical decision tasks than in the naming task. Priming was also observed in a control task in which the primes were pictures of landscapes paired with conceptually related verbal targets. For these stimuli, however, the amount of priming was similar across the three tasks. We propose that action observation triggers an automatic, pre-lexical spread of activation, consistent with the idea that language–gesture integration occurs in an obligatory and automatic fashion.
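
The priming magnitude discussed above is conventionally computed as the reaction-time advantage for targets preceded by semantically related primes. The snippet below is a minimal sketch of that computation only; it is not the authors' analysis code, and the reaction times are invented placeholders used purely to illustrate the arithmetic (mean RT for unrelated prime–target pairs minus mean RT for related pairs, per task).

```python
# Minimal sketch (not the authors' analysis code): how a cross-domain
# priming effect is typically quantified. All reaction times (ms) below
# are invented placeholders, purely for illustration.
from statistics import mean

# Hypothetical mean RTs for related vs. unrelated gesture-word pairs.
trials = {
    "lexical_decision":     {"related": [612, 598, 605], "unrelated": [671, 660, 655]},
    "naming":               {"related": [540, 552, 547], "unrelated": [558, 561, 549]},
    "relatedness_judgment": {"related": [702, 690, 711], "unrelated": [781, 775, 790]},
}

for task, rts in trials.items():
    # Priming effect = mean RT(unrelated) - mean RT(related):
    # faster responses after a related prime yield a positive value.
    priming = mean(rts["unrelated"]) - mean(rts["related"])
    print(f"{task}: priming effect = {priming:.0f} ms")
```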

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Dana Vainiger (1)
  • Ludovica Labruna (2)
  • Richard B. Ivry (2)
  • Michal Lavidor (1, 3)

  1. Department of Psychology, Bar-Ilan University, Ramat Gan, Israel
  2. Department of Psychology, University of California, Berkeley, USA
  3. Department of Psychology, University of Hull, Cottingham, UK
