Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices
Many documents are highly structured, for example newspaper articles and scientific, mathematical or technical literature. As a result of inductive research with 200 blind and visually impaired participants, a multi-modal user interface for non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices such as smart phones, smart watches or smart tablets has been developed. It enables the user to gain a quick overview of the document structure and to efficiently skim and scan the document content by identifying the type, level, position, length, relationship and content text of each element, as well as to focus, select, activate, move, remove and insert structure elements or text. These interactions are presented non-visually using earcons, tactons and speech synthesis, serving the aural and tactile human senses. Navigation and manipulation are provided through the multitouch, motion (linear acceleration and rotation) or speech recognition input modalities. The result is a complete solution for reading, creating and editing structured documents in a non-visual way, requiring no special hardware. For the development, testing and evaluation of the user interface, a flexible, platform-independent software architecture has been developed and implemented for iOS and Android. The user interface has been evaluated by a structured observation of 160 blind and visually impaired participants using the implemented app over the Internet.
Keywords: Assistive Technology · User Interface · Multi-Modal · Nonvisual Presentation · Navigation · Manipulation · Earcons · Tactons · Multitouch Gestures · Motion · Mobile Devices · Smart Phone · Smart Watch · Smart Tablet · Wearable Devices · Document Structure · Mathematics · Accessibility · Blind · Visual Impairment
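The element-wise interaction model described in the abstract (each structure element exposing its type, level, position, length and content text, and being rendered as an earcon plus speech) can be sketched as follows. All class, field and method names here are hypothetical illustrations under assumed semantics, not identifiers from the paper's implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of one structured-document element: it carries the
// attributes the abstract names (type, level, position, content text) and
// composes a non-visual presentation from an earcon cue plus speech output.
public class StructureElement {
    public enum Type { HEADING, PARAGRAPH, LIST, LIST_ITEM, TABLE, FORMULA }

    public final Type type;
    public final int level;     // nesting depth in the document tree
    public final int position;  // index among sibling elements
    public final String text;   // content text of the element
    public final List<StructureElement> children = new ArrayList<>();

    public StructureElement(Type type, int level, int position, String text) {
        this.type = type;
        this.level = level;
        this.position = position;
        this.text = text;
    }

    // Compose the non-visual presentation: an earcon identifying the element
    // type and level, followed by speech announcing position, length and text.
    public String present() {
        return String.format(
            "earcon:%s-L%d | speech:\"item %d, %d characters: %s\"",
            type.name().toLowerCase(), level, position + 1, text.length(), text);
    }
}
```

In a real implementation the earcon and tacton cues would be played through the platform's audio and vibration APIs rather than returned as a string; the string form here only makes the mapping from structure attributes to presentation explicit.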