Abstract
Numerous projects have investigated assistive navigation technologies for the blind community, tackling challenges ranging from interface design to sensory substitution. However, none of these have successfully integrated what we consider to be the three factors necessary for a widely deployable system that delivers a rich experience of one’s environment: implementation on a commodity device, use of a pre-existing worldwide point of interest (POI) database, and a means of rendering the environment that is superior to a naive playback of spoken text. Our “In Situ Audio Services” (ISAS) application responds to these needs, allowing users to explore an urban area without necessarily having a particular destination in mind. We describe the technical aspects of its implementation, user requirements, interface design, safety concerns, POI data source issues, and further requirements to make the system practical on a wider basis. Initial qualitative feedback from blind users is also discussed.
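As a loose illustration of the spatialized-rendering idea described above (not the actual ISAS implementation, which renders POIs through a full 3D audio engine), the sketch below computes the great-circle bearing from the user to a POI and maps its direction relative to the user's heading onto constant-power stereo gains. All function names here are hypothetical.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
    in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def stereo_gains(poi_bearing, user_heading):
    """Constant-power pan: map the POI's bearing relative to the user's
    heading to (left, right) channel gains."""
    # Wrap the relative angle into [-180, 180); 0 means straight ahead.
    rel = math.radians((poi_bearing - user_heading + 540.0) % 360.0 - 180.0)
    pan = math.sin(rel)                 # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] onto [0, pi/2]
    return math.cos(theta), math.sin(theta)
```

A POI due east of a north-facing user yields a bearing of 90 degrees and is panned fully to the right channel; a POI straight ahead gets equal left and right gains. A real system would use full binaural spatialization rather than simple panning.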
Notes
Two examples are http://talkingsigns.com and http://eo-guidage.com.
Earlier versions extended to 150 m.
References
Blattner M, Sumikawa D, Greenberg R (1989) Earcons and icons: their structure and common design principles. Hum-Comput Interact 4(1):11–44
Blum J, Bouchard M, Cooperstock JR (2011) What’s around me? Spatialized audio augmented reality for blind users with a smartphone. In: Mobile and ubiquitous systems (Mobiquitous). Springer, New York
Cheverst K, Mitchell K, Davies N (2002) Exploring context-aware information push. PUC 6(4):276–281
Denham J (2011) Oh Kapten! My Kapten! Where am I?: a review of the Kapten PLUS personal navigation device. AccessWorld 12(7). http://www.afb.org/afbpress/pub.asp?DocID=aw120707
El-Shimy D, Grond F, Olmos A, Cooperstock JR (2011) Eyes-free environmental awareness for navigation. J Multimodal User Interfaces, Special Issue on Interactive Sonification 5(3–4):131–141
Gaver W (1986) Auditory icons: using sound in computer interfaces. Hum-Comput Interact 2(2):167–177
Golledge RG, Klatzky RL, Loomis JM, Speigle J, Tietz J (1998) A geographical information system for a GPS based personal guidance system. Int J Geogr Inf Sci 12(7):727–749
Golledge RG, Loomis JM, Klatzky RL, Flury A, Yang XL (1991) Designing a personal guidance system to aid navigation without sight: progress on the GIS component. Int J Geogr Inf Sci 5(4):373–395
Helal AS, Moore SE, Ramachandran B (2001) Drishti: an integrated navigation system for visually impaired and disabled. In: Proceedings fifth international symposium on wearable computers. IEEE Computer Society, Los Alamitos, pp 149–156
Kammoun S, Macé M, Oriola B, Jouffrais C (2012) Towards a geographic information system facilitating navigation of visually impaired users. In: Computers helping people with special needs, vol 7383, pp 521–528
Katz BFG, Kammoun S, Parseihian G, Gutierrez O, Brilhault A, Auvray M, Truillet P, Denis M, Thorpe S, Jouffrais C (2006) NAVIG: augmented reality guidance system for the visually impaired. Virtual Real 16(3):1–17
Klatzky RL, Marston JR, Giudice NA, Golledge RG, Loomis JM (2006) Cognitive load of navigating without vision when guided by virtual sound versus spatial language. J Exp Psychol Appl 12(4):223–232
Linell M (2011) Comparison between two 3D-sound engines of the accuracy in determining the position of a source. Master’s thesis, Luleå University of Technology
Mariette N (2007) From backpack to handheld: the recent trajectory of personal location aware spatial audio. In: Proceedings of 2007 Perth digital arts conference (PerthDAC 2007), Perth, Australia, 15–18 September 2007. http://www.unsworks.unsw.edu.au/primo_library/libweb/action/dlDisplay.do?docId=unsworks_8024&vid=UNSWORKS
McGookin D, Gibbs M, Nivala AM, Brewster S (2007) Initial development of a PDA mobility aid for visually impaired people. In: Human-computer interaction INTERACT 2007, pp 665–668
Petrie H, Johnson V, Strothotte T, Raab A, Fritz S, Michel R (1996) MOBIC: designing a travel aid for blind and elderly people. J Navig 49(1):45–52
Petrie H, Johnson V, Strothotte T, Raab A, Michel R, Reichert L, Schalt A (1997) MoBIC: an aid to increase the independent mobility of blind travellers. Br J Vis Impair 15(2):63–66
Ran L, Helal S, Moore S (2004) Drishti: an integrated indoor/outdoor blind navigation system and service. In: Proceedings of the second IEEE annual conference on pervasive computing and communications, 2004, pp 23–30
Roentgen UR, Jan Gelderblom G, Soede M, de Witte LP (2008) Inventory of electronic mobility aids for persons with visual impairments: a literature review. JVIB 102(11):702–724
Stewart J, Bauman S, Escobar M, Hilden J, Bihani K, Newman MW (2008) Accessible contextual information for urban orientation. In: Proceedings of the 10th international conference on ubiquitous computing—UbiComp ’08. ACM, New York, p 332
Su J, Rosenzweig A, Goel A, de Lara E, Truong KN (2010) Timbremap: enabling the visually-impaired to use maps on touch-enabled devices. In: Proceedings of the 12th international conference on human computer interaction with mobile devices and services. ACM, New York, pp 17–26
Sundareswaran V, Wang K, Chen S, Behringer R, McGee J, Tam C, Zahorik P (2003) 3D audio augmented reality: implementation and experiments. In: Proceedings of the second IEEE and ACM international symposium on mixed and augmented reality, pp 296–297
Sutherland IE, Sproull RF, Schumacker RA (1974) A characterization of ten hidden-surface algorithms. ACM Comput Surv 6(1):1–55
Vazquez-Alvarez Y, Brewster S (2009) Investigating background & foreground interactions using spatial audio cues. In: Proceedings of the 27th international conference extended abstracts on human factors in computing systems—CHI EA ’09, p 3823
Völkel T, Kühn R, Weber G (2008) Mobility impaired pedestrians are not cars: requirements for the annotation of geographical data. In: Computers helping people with special needs, pp 1085–1092
Walker BN, Lindsay J et al (2005) Navigation performance in a virtual environment with bonephones. In: Proceedings of the international conference on auditory display (ICAD2005). Citeseer, pp 260–263
Walker BN, Nance A, Lindsay J (2006) Spearcons: speech-based earcons improve navigation performance in auditory menus. In: Proceedings of the international conference on auditory display. Citeseer, London, UK, pp 63–68
Walker BN, Stanley R (2005) Thresholds of audibility for bone-conduction headsets. In: International conference on auditory display, Limerick, Ireland, pp 218–222
Wilson J, Walker BN, Lindsay J, Cambias C, Dellaert F (2007) SWAN: system for wearable audio navigation. In: 2007 11th IEEE international symposium on wearable computers, pp 1–8
Yang R, Park S, Mishra SR, Hong Z, Newsom C, Joo H, Hofer E, Newman MW (2011) Supporting spatial awareness and independent wayfinding for pedestrians with visual impairments. In: Proceedings of the 13th international ACM SIGACCESS conference on computers and accessibility—ASSETS ’11. ACM Press, New York, p 27
Zandbergen PA, Barbeau SJ (2011) Positional accuracy of assisted GPS data from high-sensitivity GPS-enabled mobile phones. J Navig 64(3):381–399
Zhao H, Plaisant C, Shneiderman B, Duraiswami R (2003) Sonification of geo-referenced data for auditory information seeking: design principle and pilot study. In: University of Maryland Tech. Report HCIL 2004, vol 3, pp 1–8
Acknowledgements
The authors would like to thank Florian Grond, who implemented the original spatialization patches and consulted on audio issues throughout the project, Adriana Olmos, Dalia El-Shimy and Sabrina Panëels, who designed and carried out the user testing with blind participants, and Zack Settel and Mike Wozniewski. Stephane Doyon and Lucio D’Intino graciously provided feedback on the system while in development.
Additional information
This work was made possible thanks to the financial support of the Québec Secrétariat du Conseil du trésor through the Appui au passage à la société de l’information program, as well as additional funding from a Google Faculty Research Award. This paper is an expansion and update of the conference paper What’s around me? Spatialized audio augmented reality for blind users with a smartphone, presented at Mobiquitous 2011, Copenhagen, Denmark [2].
Cite this article
Blum, J.R., Bouchard, M. & Cooperstock, J.R. Spatialized Audio Environmental Awareness for Blind Users with a Smartphone. Mobile Netw Appl 18, 295–309 (2013). https://doi.org/10.1007/s11036-012-0425-8