Eyes-free environmental awareness for navigation

  • Dalia El-Shimy
  • Florian Grond
  • Adriana Olmos
  • Jeremy R. Cooperstock
Original Paper


We consider the challenge of delivering location-based information through rich audio representations of the environment, and the opportunities such an approach offers to support navigation tasks. This challenge is addressed by In-Situ Audio Services (ISAS), a system intended primarily for use by the blind and visually impaired communities. It employs spatialized audio rendering to convey relevant content, which may include information about the immediate surroundings, such as restaurants, cultural sites, public transportation stops, and other points of interest. Information is aggregated mostly from online data sources, converted using text-to-speech technology, and “displayed”, either as speech or as more abstract audio icons, through a location-aware mobile device or smartphone. This approach suits not only the specific constraints of the target population, but is equally useful for general mobile users whose visual attention is otherwise occupied with navigation. We designed and conducted an experiment to evaluate two techniques for delivering spatialized audio content via interactive auditory maps: the shockwave mode and the radar mode. Although neither mode was significantly better than the other, subjects proved competent at navigating the maps with both rendering strategies and reacted positively to the system, demonstrating that spatial audio can be an effective means of conveying location-based information. The results of this experiment and its implications for our project are described here.
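The core geometric step such a system needs is to turn a point of interest's coordinates into a direction and distance relative to the user, then spatialize the corresponding sound. The following is a minimal illustrative sketch of that step, not the authors' implementation: it uses an equirectangular distance approximation (adequate at pedestrian ranges) and a crude constant-power stereo pan with inverse-distance attenuation as a stand-in for a real HRTF-based spatializer; all function and parameter names are hypothetical.

```python
import math

def poi_azimuth_and_distance(user_lat, user_lon, heading_deg, poi_lat, poi_lon):
    """Bearing of a POI relative to the user's heading, and approximate
    ground distance in metres (equirectangular approximation, fine for
    the short ranges relevant to pedestrian navigation)."""
    R = 6371000.0  # mean Earth radius, metres
    dlat = math.radians(poi_lat - user_lat)
    dlon = math.radians(poi_lon - user_lon) * math.cos(math.radians(user_lat))
    distance = R * math.hypot(dlat, dlon)
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360.0   # 0 deg = north
    azimuth = (bearing - heading_deg + 180.0) % 360.0 - 180.0  # -180..180, 0 = dead ahead
    return azimuth, distance

def stereo_gains(azimuth_deg, distance_m, ref_dist=10.0):
    """Constant-power pan plus inverse-distance attenuation --
    a crude stand-in for HRTF-based rendering."""
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))  # map +/-90 deg to full left/right
    theta = (pan + 1.0) * math.pi / 4.0            # 0..pi/2
    atten = ref_dist / max(distance_m, ref_dist)   # no boost inside ref_dist
    return math.cos(theta) * atten, math.sin(theta) * atten  # (left, right)
```

A radar-style rendering would sweep through the POIs in angular order and trigger each one with these gains as the virtual beam passes over it; a shockwave-style rendering would instead trigger POIs in order of increasing distance, like a ring expanding outward from the user.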


Sound spatialization · Auditory maps · Mobile applications · Blind and visually impaired community · Location-based information





Copyright information

© OpenInterface Association 2011

Authors and Affiliations

  • Dalia El-Shimy (1)
  • Florian Grond (2)
  • Adriana Olmos (1)
  • Jeremy R. Cooperstock (1)

  1. Centre for Intelligent Machines, McGill University, Montréal, Canada
  2. Ambient Intelligence Group, Universität Bielefeld, Bielefeld, Germany
