International Conference on Computers for Handicapped Persons

ICCHP 2014: Computers Helping People with Special Needs pp 383-390

Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices

  • Martin Lukas Dorigo
  • Bettina Harriehausen-Mühlbauer
  • Ingo Stengel
  • Paul Dowland
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8547)

Abstract

There are a large number of highly structured documents, for example newspaper articles and scientific, mathematical or technical literature. As the result of inductive research with 200 blind and visually impaired participants, a multi-modal user interface for the non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices such as smart phones, smart watches or smart tablets has been developed. It enables the user to gain a quick overview of the document structure and to efficiently skim and scan the document content by identifying the type, level, position, length, relationship and text content of each element, as well as to focus, select, activate, move, remove and insert structure elements or text. These interactions are presented in a non-visual way using earcons, tactons and speech synthesis, serving the aural and tactile human senses. Navigation and manipulation are provided via the multi-touch, motion (linear acceleration and rotation) or speech recognition input modalities. The result is a complete solution for reading, creating and editing structured documents in a non-visual way, with no special hardware required. For the development, testing and evaluation of the user interface, a flexible, platform-independent software architecture has been developed and implemented for iOS and Android. The user interface has been evaluated through a structured observation of 160 blind and visually impaired participants using the implemented software (app) over the Internet.
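
To make the element model concrete, the following minimal Kotlin sketch illustrates one possible way to represent a structure element with the attributes named above (type, level, position, length, text content) and to derive an earcon from it. This is a hypothetical illustration, not the authors' implementation; the class names and the parameter mappings (timbre per element type, pitch per structure level, duration per content length) are assumptions made for the example.

  // Hypothetical sketch, not the authors' implementation.
  enum class ElementType { HEADING, PARAGRAPH, LIST, LIST_ITEM, TABLE, FORMULA }

  data class StructureElement(
      val type: ElementType,                       // presented as earcon timbre
      val level: Int,                              // nesting depth, e.g. heading level
      val position: Int,                           // index among sibling elements
      val length: Int,                             // length of the content text
      val text: String,                            // read aloud via speech synthesis
      val children: List<StructureElement> = emptyList()
  )

  data class Earcon(val timbre: String, val pitchSemitones: Int, val durationMs: Int)

  fun earconFor(e: StructureElement): Earcon = Earcon(
      timbre = when (e.type) {                     // assumed mapping: type -> timbre
          ElementType.HEADING -> "brass"
          ElementType.PARAGRAPH -> "piano"
          ElementType.LIST, ElementType.LIST_ITEM -> "marimba"
          ElementType.TABLE -> "strings"
          ElementType.FORMULA -> "bell"
      },
      pitchSemitones = -2 * e.level,               // deeper levels sound lower
      durationMs = minOf(800, 200 + e.length / 4)  // longer content -> longer cue
  )

A traversal in document order could then walk each element's children list, playing earconFor(element) before speaking element.text, which is one way to support the skimming and scanning behaviour described above.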

Keywords

Assistive Technology · User Interface · Multi-Modal · Nonvisual Presentation · Navigation · Manipulation · Earcons · Tactons · Multitouch Gestures · Motion · Mobile Devices · Smart Phone · Smart Watch · Smart Tablet · Wearable Devices · Document Structure · Mathematics · Accessibility · Blind · Visual Impairment

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Martin Lukas Dorigo (1)
  • Bettina Harriehausen-Mühlbauer (2)
  • Ingo Stengel (1)
  • Paul Dowland (1)

  1. Plymouth University, Plymouth, United Kingdom
  2. University of Applied Sciences Darmstadt, Darmstadt, Germany
