AudioBrowser: a mobile browsable information access for the visually impaired

  • Xiaoyu Chen
  • Marilyn Tremaine
  • Robert Lutz
  • Jae-woo Chung
  • Patrick Lacsina
LONG PAPER

Abstract

Although a large amount of research has been conducted on building interfaces that allow visually impaired users to read web pages and to generate and access information on computers, little of this work addresses two gaps faced by blind users. First, sighted users can rapidly browse and select the information they find useful; second, sighted users can carry much of that information with them, thanks to the recent proliferation of personal digital assistants (PDAs). Neither possibility is currently available to blind users. This paper describes an interface, built on a standard PDA, that allows its user to browse the information stored on the device through a combination of screen touches coupled with auditory feedback. The system also supports the storage and management of personal information, so that addresses, music, directions, and other supportive information can be readily created and then accessed anytime and anywhere by the PDA user. The paper describes the system along with the related design choices and design rationale. A user study is also reported.
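The browsing model the abstract describes — a finger touch on the PDA screen answered by speech or a non-speech cue — can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' code: the region names, the 240×320 screen layout, and the `earcon`/`speech` distinction are assumptions chosen only to show the touch-to-audio mapping.

```python
# Hypothetical sketch of touch-plus-auditory-feedback browsing.
# A touch is resolved to an on-screen region; a hit is spoken,
# a miss triggers a non-speech "earcon" so the user can reorient.

from dataclasses import dataclass

@dataclass
class Region:
    name: str   # e.g. an address-book entry or menu item
    x0: int
    y0: int
    x1: int
    y1: int

def region_for_touch(regions, x, y):
    """Return the region under the touch point, or None for a miss."""
    for r in regions:
        if r.x0 <= x < r.x1 and r.y0 <= y < r.y1:
            return r
    return None

def audio_feedback(region):
    """Decide what the user hears: speech for a hit, an earcon for a miss."""
    if region is None:
        return ("earcon", "miss")
    return ("speech", region.name)

# A toy 240x320 PDA screen split into two stacked items.
screen = [
    Region("Addresses", 0, 0, 240, 160),
    Region("Directions", 0, 160, 240, 320),
]
print(audio_feedback(region_for_touch(screen, 120, 200)))  # → ('speech', 'Directions')
```

Because feedback is tied to where the finger currently rests rather than to a sequential read-out, the user can sweep across the screen and sample items quickly — the "rapid browsing" capability the abstract argues is missing for blind users.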

Keywords

Information accessibility · Mobile information browser · AudioBrowser · Non-visual interfaces for the blind and the visually impaired · Audio–tactile interface

Copyright information

© Springer-Verlag 2006

Authors and Affiliations

  • Xiaoyu Chen (1)
  • Marilyn Tremaine (1)
  • Robert Lutz (1)
  • Jae-woo Chung (2)
  • Patrick Lacsina (1)

  1. Information Systems Department, New Jersey Institute of Technology, Newark, USA
  2. Media Lab, Massachusetts Institute of Technology, Cambridge, USA
