Movements and Holds in Fluent Sentence Production of American Sign Language: The Action-Based Approach
The importance of bodily movements in the production and perception of communicative actions has been shown for the spoken language modality and accounted for by a theory of communicative actions (Cogn. Process. 2010;11:187–205). In this study, the theory of communicative actions was adapted to the sign language modality. We tested the hypothesis that in the fluent production of short sign language sentences, strong-hand manual sign actions are continuously ongoing without holds, while co-manual oral expression actions (i.e. sign-related actions of the lips, jaw, and tip of the tongue) and co-manual facial expression actions (i.e. actions of the eyebrows, eyelids, etc.), as well as weak-hand actions, show considerable holds. An American Sign Language (ASL) corpus of 100 sentences was analyzed by visually inspecting each frame-to-frame difference (30 frames/s) to separate movement and hold phases for each manual, oral, and facial action. Excluding fingerspelling and signs in sentence-final position, no manual holds were found for the strong hand (0%; the weak hand was not considered), while oral holds occurred in 22% of all oral expression actions and facial holds occurred in all facial expression actions analyzed (100%). These results support the idea that in each language modality, the dominant articulatory system (vocal tract or manual system) determines the timing of actions. In signed languages, in which manual actions are dominant, holds occur mainly in co-manual oral and co-manual facial actions. Conversely, in spoken language, vocal tract actions (i.e. actions of the lips, tongue, jaw, velum, and vocal folds) are dominant, and holds occur primarily in co-verbal manual and co-verbal facial actions.
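In the study itself, movement and hold phases were separated by visual frame-by-frame inspection of the 30 frames/s video. For readers who want a rough automated first pass over comparable data, the sketch below thresholds the frame-to-frame displacement of a tracked articulator (e.g. the strong hand) to find hold spans; the function name, threshold values, and synthetic track are illustrative assumptions, not part of the reported procedure.

```python
import numpy as np

# Illustrative sketch only: the study segmented movements and holds by visual
# inspection; this automated thresholding is an assumption, not the authors'
# method. All names, thresholds, and data below are hypothetical.

FPS = 30                 # frame rate of the corpus videos
HOLD_THRESHOLD = 0.5     # max frame-to-frame displacement counted as "no movement" (arbitrary units)
MIN_HOLD_FRAMES = 3      # require ~100 ms of stillness before calling it a hold

def segment_holds(positions: np.ndarray) -> list[tuple[int, int]]:
    """Return (start_frame, end_frame) spans where one articulator is held still.

    `positions` is an (n_frames, 2) array of tracked x/y coordinates for one
    articulator across one sentence.
    """
    # Frame-to-frame displacement, analogous to comparing successive frames.
    displacement = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    still = displacement < HOLD_THRESHOLD

    holds, start = [], None
    for i, is_still in enumerate(still):
        if is_still and start is None:
            start = i
        elif not is_still and start is not None:
            if i - start >= MIN_HOLD_FRAMES:
                holds.append((start, i))
            start = None
    if start is not None and len(still) - start >= MIN_HOLD_FRAMES:
        holds.append((start, len(still)))
    return holds

# Hypothetical usage: a hand that moves for 20 frames, then rests for 15.
moving = np.cumsum(np.random.rand(20, 2) * 2.0, axis=0)
resting = np.repeat(moving[-1:], 15, axis=0) + np.random.rand(15, 2) * 0.1
track = np.vstack([moving, resting])

print(segment_holds(track))   # e.g. [(19, 34)] -> one hold near the end
```

Such a script only approximates the published counts, since the manual labeling also excluded fingerspelling and sentence-final signs and distinguished manual, oral, and facial articulators by inspection.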
Keywords: Sign language · Spoken language · Action theory · Manual actions · Oral actions · Facial actions · Co-verbal actions · Co-manual actions · Movements · Holds
This work was supported in part by German Research Council (DFG) grants Kr 1439/13-1 and Kr 1439/15-1. We thank the two anonymous reviewers for their helpful comments on earlier versions of this paper, and Cigdem Capaat for labeling the ASL corpus.
- 3. Boston-200-Sentences-ASL-Corpus of the National Center for Sign Languages and Gesture Resources at Boston University. 2000. http://www.bu.edu/asllrp/cslgr/.
- 7. Cohn JF, Ambadar Z, Ekman P. Observer-based measurement of facial expression with the facial action coding system. In: Coan JA, Allen JJB, editors. Handbook of emotion elicitation and assessment. New York: Oxford University Press; 2007. p. 203–21.
- 8. Dreuw P, Rybach D, Deselaers T, Zahedi M, Ney H. Speech recognition techniques for a sign language recognition system. In: Proceedings of Interspeech 2007, Antwerp, Belgium; 2007. p. 2513–6.
- 10. Ekman P, Friesen WV. Facial action coding system. Palo Alto, CA: Consulting Psychologists Press; 1978.
- 11. Emmorey K. Language, cognition, and the brain: insights from sign language research. Mahwah, NJ: Lawrence Erlbaum Associates; 2002.
- 13. Goldin-Meadow S. Hearing gesture. Cambridge, MA: Belknap Press of Harvard University Press; 2003.
- 17. Kendon A. Gesture: visible action as utterance. New York: Cambridge University Press; 2004.
- 18. Klima E, Bellugi U. The signs of language. Cambridge, MA: Harvard University Press; 1979.
- 23. Kröger BJ, Kopp S, Lowit A. A model for production, perception, and acquisition of actions in face-to-face communication. Cogn Process. 2010;11:187–205.
- 26. Liddell SK, Johnson RE. American Sign Language: the phonological base. Sign Lang Stud. 1989;64:195–277.
- 28. Liddell SK. Grammar, gesture and meaning in American Sign Language. New York: Cambridge University Press; 2003.
- 29. McNeill D. Hand and mind: what gestures reveal about thought. Chicago: University of Chicago Press; 1992.
- 30. McNeill D. Gesture and thought. Chicago: University of Chicago Press; 2005.
- 32. Perlmutter DM. Sonority and syllable structure in American Sign Language. Linguist Inq. 1992;23:407–42.
- 38. Stokoe WC. Sign language structure: an outline of the visual communication systems of the American Deaf. Studies in Linguistics, Occasional Paper 8. Buffalo: University of Buffalo; 1960.
- 40. Valli C, Lucas C. Linguistics of American Sign Language: an introduction. Washington, DC: Gallaudet University Press; 2000.
- 41. Vanger P, Hoenlinger R, Haken H. Computer aided generation of prototypical facial expressions of emotion. Methods of Psychological Research Online. 1998;3(1). http://www.dgps.de/fachgruppen/methoden/mpr-online.
- 42. Wilcox S, Morford JP. Empirical methods in signed language research. In: Gonzalez-Marquez M, Mittelberg I, Coulson S, Spivey MJ, editors. Methods in cognitive linguistics. Amsterdam/Philadelphia: John Benjamins; 2007. p. 171–200.