Abstract
The aim of the ActiVis project is to build a mobile guidance aid that helps people with limited vision find objects in an unknown environment. The system transmits audio signals to the user through bone-conduction headphones and therefore requires an effective non-visual interface. To this end, we propose a new audio-based interface that uses a spatialised signal to convey a target’s position on the horizontal plane. The vertical position on the median plane is conveyed by adjusting the tone’s pitch, which overcomes the audio localisation limitations of bone-conduction headphones. The interface is validated through a set of experiments with blindfolded and visually impaired participants.
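The mapping described above, azimuth rendered as a spatialised (lateralised) signal and elevation rendered as pitch, can be sketched as follows. This is a minimal illustration, not the paper’s implementation: the base frequency, the one-octave pitch span, and the ±45° elevation range are assumed parameters chosen for the example.

```python
import math

# Assumed parameters for illustration (not values from the paper).
BASE_FREQ_HZ = 440.0      # tone frequency for a target at eye level
OCTAVE_SPAN = 1.0         # pitch shifts up to one octave across the elevation range
MAX_ELEVATION_DEG = 45.0  # elevation range mapped onto the pitch scale

def azimuth_to_pan(azimuth_deg):
    """Map horizontal angle (-90..+90 degrees) to a stereo pan in [-1, 1].

    A sine law keeps the pan change smooth and steepest near the centre,
    where small angular errors matter most for guidance.
    """
    clamped = max(-90.0, min(90.0, azimuth_deg))
    return math.sin(math.radians(clamped))

def elevation_to_pitch(elevation_deg):
    """Map vertical angle to a tone frequency: higher target -> higher pitch."""
    clamped = max(-MAX_ELEVATION_DEG, min(MAX_ELEVATION_DEG, elevation_deg))
    octaves = OCTAVE_SPAN * clamped / MAX_ELEVATION_DEG
    return BASE_FREQ_HZ * (2.0 ** octaves)

# A target straight ahead at eye level: centred pan, base pitch.
print(azimuth_to_pan(0.0), elevation_to_pitch(0.0))  # 0.0 440.0
```

Splitting the two axes across different cues (lateralisation for azimuth, pitch for elevation) sidesteps the weak median-plane localisation of bone-conduction headphones, since elevation no longer depends on spectral cues the transducers cannot reproduce well.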
Acknowledgements
This research is partly supported by a Google Faculty Research Award. We would like to thank the Voluntary Centre Services UK for their help in facilitating the experiments with people with limited vision.
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Lock, J.C., Gilchrist, I.D., Cielniak, G., Bellotto, N. (2019). Bone-Conduction Audio Interface to Guide People with Visual Impairments. In: Wang, G., El Saddik, A., Lai, X., Martinez Perez, G., Choo, KK. (eds) Smart City and Informatization. iSCI 2019. Communications in Computer and Information Science, vol 1122. Springer, Singapore. https://doi.org/10.1007/978-981-15-1301-5_43
DOI: https://doi.org/10.1007/978-981-15-1301-5_43
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-1300-8
Online ISBN: 978-981-15-1301-5
eBook Packages: Computer Science, Computer Science (R0)