Measuring the Perception of Facial Expressions in American Sign Language Animations with Eye Tracking

  • Hernisa Kacorri
  • Allen Harper
  • Matt Huenerfauth
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8516)

Abstract

Our lab has conducted experimental evaluations of ASL animations, which can increase the accessibility of information for signers with lower literacy in written languages. Participants watch animations and answer carefully engineered questions about the information content. Because this evaluation approach is labor-intensive, we seek techniques for measuring users’ reactions to animations via eye-tracking technology. In this paper, we analyze the relationship between several metrics of eye-movement behavior of native ASL signers as they watch three types of stimuli: videos of human signers, high-quality animations of ASL, and lower-quality animations of ASL. We found significant relationships between the quality of the stimulus and the proportional fixation time on the upper and lower portions of the signer’s face, the transitions between these portions of the face and the rest of the signer’s body, and the total length of the eye-fixation path. Our work provides guidance to researchers who wish to evaluate the quality of sign language animations, enabling more efficient evaluation of animation quality and supporting the development of technologies that synthesize high-quality ASL animations for deaf users.

Keywords

American Sign Language, accessibility technology for people who are deaf, eye tracking, animation, evaluation, user study

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Hernisa Kacorri (1)
  • Allen Harper (1)
  • Matt Huenerfauth (2)
  1. Doctoral Program in Computer Science, The Graduate Center, The City University of New York (CUNY), New York, USA
  2. Computer Science Department, CUNY Queens College; Computer Science and Linguistics Programs, CUNY Graduate Center, The City University of New York (CUNY), Flushing, USA
