
Using a Signing Avatar as a Sign Language Research Tool

  • Delroy Nelson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5398)

Abstract

The first section of this paper introduces the HamNoSys/ViSiCAST signing avatar, which has been designed to produce individual signs and sentences in a number of different sign languages. The systems which underlie the avatar are described in detail. The second part of the paper describes a research project which aims to use the avatar’s properties to develop a toolkit for sign language research.
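Since the avatar described here is driven by HamNoSys transcriptions encoded in SiGML (Signing Gesture Markup Language), the sketch below illustrates, in rough outline, the kind of SiGML input such an avatar consumes. The gloss and the particular HamNoSys symbol elements are illustrative placeholders chosen for this sketch, not a transcription taken from the paper, and exact element names may differ between SiGML versions.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sketch of a SiGML document describing one hypothetical sign. -->
<sigml>
  <hns_sign gloss="EXAMPLE">
    <hamnosys_nonmanual>
      <!-- non-manual features, e.g. a mouth picture accompanying the sign -->
      <hnm_mouthpicture picture="example"/>
    </hamnosys_nonmanual>
    <hamnosys_manual>
      <!-- handshape, orientation, location and movement given as HamNoSys symbols -->
      <hamflathand/>
      <hamextfingeru/>
      <hampalml/>
      <hamchest/>
      <hammoveo/>
    </hamnosys_manual>
  </hns_sign>
</sigml>

In broad terms, an avatar player parses markup of this kind, maps the HamNoSys symbols onto animation parameters, and renders the sign; a sentence can then be represented as a sequence of such sign descriptions within one document.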

Keywords

avatar, sign language, SiGML, HamNoSys

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Delroy Nelson
  1. Deafness Cognition and Language Research Centre, UCL, London, UK
