A Coding System with Independent Annotations of Gesture Forms and Functions During Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)


Abstract

Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature is that the coding of gesture forms and functions has not been clearly differentiated. This paper first describes a recently developed Database of Speech and GEsture (DoSaGE), built on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and then presents findings on how gesture use is related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when evaluating gesture employment among individuals with and without language impairment. Three speech tasks, namely a monologue about a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to annotate, on independent tiers, the linguistic information of each participant’s transcript, the form of each gesture used, and the function of each gesture. About one-third of the participants did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, functioning mainly to reinforce speech intonation or to control speech flow, the content-carrying ones were used to enhance speech content. Furthermore, participants who were younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance.



Acknowledgments

This study was supported by the grant “Towards a multi-modal and multi-level analysis of Chinese aphasic discourse” (1-R01-DC010398) funded by the National Institutes of Health to Kong, A.P.H. (PI), Law, S.P. (Co-I), and Lee, A. (Co-I). The authors would like to thank all subjects for their help and co-operation during data collection.

Author information

Corresponding author

Correspondence to Anthony Pak-Hin Kong.

About this article


Cite this article

Kong, A. P.-H., Law, S.-P., Kwan, C. C.-Y., et al. A Coding System with Independent Annotations of Gesture Forms and Functions During Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE). J Nonverbal Behav 39, 93–111 (2015). https://doi.org/10.1007/s10919-014-0200-6

