No-Look Notes: Accessible Eyes-Free Multi-touch Text Entry

  • Matthew N. Bonner
  • Jeremy T. Brudvik
  • Gregory D. Abowd
  • W. Keith Edwards
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6030)

Abstract

Mobile devices with multi-touch capabilities are becoming increasingly common, largely due to the success of the Apple iPhone and iPod Touch. While there have been some advances in touchscreen accessibility for blind people, touchscreens remain inaccessible in many ways. Recent research has demonstrated that there is great potential in leveraging multi-touch capabilities to increase the accessibility of touchscreen applications for blind people. We have created No-Look Notes, an eyes-free text entry system that uses multi-touch input and audio output. No-Look Notes was implemented on Apple’s iPhone platform. We have performed a within-subjects (n = 10) user study of both No-Look Notes and the text entry component of Apple’s VoiceOver, the recently released official accessibility component on the iPhone. No-Look Notes significantly outperformed VoiceOver in terms of speed, accuracy and user preference.

Keywords

accessibility · mobile device · multi-touch · touchscreen · text entry · eyes-free



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Matthew N. Bonner¹
  • Jeremy T. Brudvik¹
  • Gregory D. Abowd¹
  • W. Keith Edwards¹

  1. GVU Center & School of Interactive Computing, Georgia Institute of Technology, Atlanta, USA