Mobile Networks and Applications, Volume 18, Issue 3, pp 295–309

Spatialized Audio Environmental Awareness for Blind Users with a Smartphone

  • Jeffrey R. Blum
  • Mathieu Bouchard
  • Jeremy R. Cooperstock
Abstract

Numerous projects have investigated assistive navigation technologies for the blind community, tackling challenges ranging from interface design to sensory substitution. However, none of these have successfully integrated what we consider to be the three factors necessary for a widely deployable system that delivers a rich experience of one’s environment: implementation on a commodity device, use of a pre-existing worldwide point of interest (POI) database, and a means of rendering the environment that is superior to a naive playback of spoken text. Our “In Situ Audio Services” (ISAS) application responds to these needs, allowing users to explore an urban area without necessarily having a particular destination in mind. We describe the technical aspects of its implementation, user requirements, interface design, safety concerns, POI data source issues, and further requirements to make the system practical on a wider basis. Initial qualitative feedback from blind users is also discussed.
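To illustrate the kind of rendering the abstract contrasts with naive spoken playback, the sketch below places a POI in the stereo field based on the user's GPS position and heading. This is a minimal illustration under assumed conventions, not the ISAS implementation; the function names and the simple constant-power pan law are hypothetical choices for the example.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user (point 1) to the POI
    (point 2), in degrees [0, 360), measured clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def stereo_pan(user_heading_deg, poi_bearing_deg):
    """Return the POI's angle relative to the user's heading
    (-180..180, positive = to the right) and a constant-power
    stereo pan as (left_gain, right_gain)."""
    rel = (poi_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    # Map -90..+90 degrees onto the full pan range; sounds behind the
    # listener are clamped to the hard left/right for this sketch.
    pan = max(-1.0, min(1.0, rel / 90.0))
    theta = (pan + 1.0) * math.pi / 4.0  # 0 = full left, pi/2 = full right
    return rel, (math.cos(theta), math.sin(theta))
```

A real system would instead feed the relative angle to an HRTF-based spatialization engine, but the same bearing computation underlies either approach.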

Keywords

Spatialized audio · Blind navigation · GPS · Smartphone · Audio augmented reality

Copyright information

© Springer Science+Business Media New York 2012

Authors and Affiliations

  • Jeffrey R. Blum¹
  • Mathieu Bouchard¹
  • Jeremy R. Cooperstock¹

  1. McGill University, Montréal, Canada