Measuring the Perception of Facial Expressions in American Sign Language Animations with Eye Tracking
- Cite this paper as:
- Kacorri H., Harper A., Huenerfauth M. (2014) Measuring the Perception of Facial Expressions in American Sign Language Animations with Eye Tracking. In: Stephanidis C., Antona M. (eds) Universal Access in Human-Computer Interaction. Design for All and Accessibility Practice. UAHCI 2014. Lecture Notes in Computer Science, vol 8516. Springer, Cham
Our lab has conducted experimental evaluations of ASL animations, which can increase the accessibility of information for signers with lower literacy in written languages. Participants watch animations and answer carefully engineered questions about the information content. Because of the labor-intensive nature of our current evaluation approach, we seek techniques for measuring users’ reactions to animations via eye-tracking technology. In this paper, we analyze the relationship between various metrics of eye movement behavior of native ASL signers as they watch various types of stimuli: videos of human signers, high-quality animations of ASL, and lower-quality animations of ASL. We found significant relationships between the quality of the stimulus and the proportional fixation time on the upper and lower portions of the signer’s face, the transitions between these portions of the face and the rest of the signer’s body, and the total length of the eye fixation path. Our work provides guidance to researchers who wish to evaluate the quality of sign language animations, enabling more efficient evaluation of animation quality and supporting the development of technologies that synthesize high-quality ASL animations for deaf users.
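The three eye-movement metrics named in the abstract (proportional fixation time per region, transitions between areas of interest, and total fixation-path length) can be sketched from a stream of fixation records. The data layout, AOI labels, and function names below are illustrative assumptions, not the paper's actual analysis pipeline:

```python
import math

# Hypothetical fixation records: (x_px, y_px, duration_ms, aoi).
# The AOI labels "upper_face", "lower_face", "body" are assumed for
# illustration; the paper's region definitions may differ.
fixations = [
    (320, 120, 250, "upper_face"),
    (315, 180, 400, "lower_face"),
    (330, 420, 300, "body"),
    (318, 125, 350, "upper_face"),
]

def proportional_fixation_time(fixations):
    """Share of total fixation duration spent in each area of interest."""
    total = sum(d for _, _, d, _ in fixations)
    shares = {}
    for _, _, d, aoi in fixations:
        shares[aoi] = shares.get(aoi, 0.0) + d / total
    return shares

def aoi_transitions(fixations):
    """Count of consecutive fixation pairs that cross an AOI boundary."""
    return sum(1 for a, b in zip(fixations, fixations[1:]) if a[3] != b[3])

def scanpath_length(fixations):
    """Total length of the eye fixation path, in screen pixels."""
    return sum(math.dist(a[:2], b[:2]) for a, b in zip(fixations, fixations[1:]))
```

For the sample records above, `proportional_fixation_time` returns shares that sum to 1, and every change of AOI between consecutive fixations counts as one transition.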
Keywords: American Sign Language · accessibility technology for people who are deaf · eye tracking · animation · evaluation · user study