
Understanding, Evaluating and Analyzing Touch Screen Gestures for Visually Impaired Users in Mobile Environment

  • Vikas Luthra
  • Sanjay Ghosh
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)

Abstract

Smartphone usage among visually impaired users is growing in prominence, and mobile phone providers are continuously looking for solutions to make touch screen interfaces more accessible to them. Key accessibility features for vision-related impairment include assistive screen reading applications such as VoiceOver (https://www.apple.com/in/accessibility/ios/voiceover/) in iOS and TalkBack (https://support.google.com/accessibility/android/answer/6007100?hl=en) in Android, which support a variety of touch gestures for performing basic functions and commands. Our preliminary interactions with users from this community revealed that some of these existing gestures are ambiguous, difficult to perform, non-intuitive, and have accuracy and detection issues. Moreover, there is a lack of understanding regarding the usage of these accessibility features and existing gestures. In this paper, we address these challenges through a set of three experimental exercises, task-based comparative evaluation, gesture elicitation, and gesture performance, conducted with a group of 12 visually impaired users. Based on the experimental evidence, we pinpoint the exact problems with a few existing gestures. Additionally, this work contributes to identifying some characteristics of effective and easy gestures for the target segment. We also propose design solutions to resolve users' pain points and discuss some touch screen accessibility design guidelines, keeping in mind different types of visually impaired users: fully blind, extremely low vision, and low vision.

Keywords

Accessibility · Gestures · Assistive technologies · User-centered design · Mobile user interfaces


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Samsung Research and Development Institute, Bangalore, India
