Abstract
A depth-data-based obstacle detection and avoidance application that assists visually impaired (VI) users in navigating independently through previously unmapped indoor environments is presented. The application is being developed for the recently introduced Google Project Tango Tablet Development Kit, which is equipped with a powerful processor (an NVIDIA Tegra K1 with 192 CUDA cores) as well as various sensors that allow it to track its motion and orientation in 3D space in real time. Depth data for the area in front of the user, obtained using the tablet's built-in infrared-based depth sensor, is analyzed to detect obstacles, and audio-based navigation instructions are provided accordingly. A visual display option is also offered for users with low vision. The aim is to develop a real-time, affordable, aesthetically acceptable, stand-alone mobile assistive application on a cutting-edge device, adopting a user-centered approach, that allows VI users to micro-navigate autonomously in possibly unfamiliar indoor surroundings.
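To make the abstract's pipeline concrete, the following is a minimal sketch of a depth-based obstacle check of the kind described: a depth frame is split into left, center, and right regions, and the nearest valid reading in each region is compared against a distance threshold to choose a spoken instruction. The region layout, the 1 m threshold, and the function name are illustrative assumptions, not the authors' actual algorithm.

```python
# Hedged sketch: map a depth frame to a simple audio instruction.
# All thresholds and region boundaries here are assumed values.
import numpy as np

def detect_obstacle(depth_m: np.ndarray, threshold_m: float = 1.0) -> str:
    """Split the depth frame into left/center/right thirds and return
    an instruction for the region holding the nearest point closer
    than threshold_m. depth_m is a 2-D array of per-pixel depths in
    meters, with 0 marking pixels that returned no depth reading."""
    h, w = depth_m.shape
    regions = {
        "veer right": depth_m[:, : w // 3],          # obstacle on the left
        "stop": depth_m[:, w // 3 : 2 * w // 3],     # obstacle straight ahead
        "veer left": depth_m[:, 2 * w // 3 :],       # obstacle on the right
    }
    instruction, nearest = "path clear", threshold_m
    for label, region in regions.items():
        valid = region[region > 0]  # ignore pixels with no reading
        if valid.size and valid.min() < nearest:
            instruction, nearest = label, valid.min()
    return instruction

# Example: a 4x6 frame that is clear except for one close point
# (0.6 m) falling in the center third of the image.
frame = np.full((4, 6), 3.0)
frame[2, 3] = 0.6
print(detect_obstacle(frame))  # -> "stop"
```

In a full application, the returned string would be passed to a text-to-speech engine, and the per-frame decision would be smoothed over time to avoid rapidly alternating instructions.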
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Jafri, R., Khan, M.M. (2016). Obstacle Detection and Avoidance for the Visually Impaired in Indoors Environments Using Google’s Project Tango Device. In: Miesenberger, K., Bühler, C., Penaz, P. (eds) Computers Helping People with Special Needs. ICCHP 2016. Lecture Notes in Computer Science(), vol 9759. Springer, Cham. https://doi.org/10.1007/978-3-319-41267-2_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-41266-5
Online ISBN: 978-3-319-41267-2