Bone-Conduction Audio Interface to Guide People with Visual Impairments

  • Conference paper
  • In: Smart City and Informatization (iSCI 2019)
  • Part of the book series: Communications in Computer and Information Science (CCIS, volume 1122)

Abstract

The ActiVis project’s aim is to build a mobile guidance aid that helps people with limited vision find objects in an unknown environment. The system uses bone-conduction headphones to transmit audio signals to the user and therefore requires an effective non-visual interface. To this end, we propose a new audio-based interface that uses a spatialised signal to convey a target’s position on the horizontal plane. The vertical position on the median plane is conveyed by adjusting the tone’s pitch, overcoming the audio-localisation limitations of bone-conduction headphones. The interface is validated through a set of experiments with blindfolded and visually impaired participants.
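The abstract describes a two-channel mapping: the target's azimuth is rendered as a spatialised audio signal, while its elevation on the median plane modulates the tone's pitch. The sketch below is a minimal, hypothetical illustration of that idea in Python; the sample rate, base frequency, pitch gain and the simple constant-power panning are assumptions for illustration only and stand in for the spatialised rendering used on the actual bone-conduction headset.

```python
import numpy as np

SAMPLE_RATE = 44100     # Hz; assumed output rate
BASE_FREQ = 440.0       # Hz; assumed reference pitch for a target level with the camera
OCTAVES_PER_RAD = 1.0   # assumed gain of the elevation-to-pitch mapping

def target_cue(azimuth_rad, elevation_rad, duration_s=0.2):
    """Synthesise a short stereo cue: constant-power panning encodes the
    target's azimuth, and pitch encodes its elevation on the median plane.
    Illustrative stand-in, not the paper's implementation."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE

    # Elevation -> pitch: shift the base tone up or down by a fixed number
    # of octaves per radian of elevation.
    freq = BASE_FREQ * 2.0 ** (OCTAVES_PER_RAD * elevation_rad)
    tone = np.sin(2.0 * np.pi * freq * t)

    # Azimuth -> horizontal position: constant-power pan between the two
    # channels, mapping [-90 deg, +90 deg] onto [full left, full right].
    pan = np.clip(azimuth_rad / (np.pi / 2.0), -1.0, 1.0)
    theta = (pan + 1.0) * np.pi / 4.0          # 0 .. pi/2
    left = np.cos(theta) * tone
    right = np.sin(theta) * tone
    return np.stack([left, right], axis=1)

# Example: a target 30 degrees to the right and 10 degrees above the camera axis.
cue = target_cue(np.radians(30), np.radians(10))
```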

Notes

  1. https://lcas.github.io/ActiVis/.
  2. https://en.wikipedia.org/wiki/Tango_(platform).

Acknowledgements

This research is partly supported by a Google Faculty Research Award. We would like to thank the Voluntary Centre Services UK for their help in facilitating the experiments with people with limited vision.

Author information

Correspondence to Jacobus C. Lock.

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Lock, J.C., Gilchrist, I.D., Cielniak, G., Bellotto, N. (2019). Bone-Conduction Audio Interface to Guide People with Visual Impairments. In: Wang, G., El Saddik, A., Lai, X., Martinez Perez, G., Choo, KK. (eds) Smart City and Informatization. iSCI 2019. Communications in Computer and Information Science, vol 1122. Springer, Singapore. https://doi.org/10.1007/978-981-15-1301-5_43

  • DOI: https://doi.org/10.1007/978-981-15-1301-5_43

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-1300-8

  • Online ISBN: 978-981-15-1301-5

  • eBook Packages: Computer Science, Computer Science (R0)
