Obstacle Detection and Avoidance for the Visually Impaired in Indoors Environments Using Google’s Project Tango Device

  • Conference paper
  • First Online:
Computers Helping People with Special Needs (ICCHP 2016)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9759)

Abstract

A depth-data-based obstacle detection and avoidance application that assists visually impaired (VI) users in navigating independently in previously unmapped indoor environments is presented. The application is being developed for the recently introduced Google Project Tango Tablet Development Kit, which is equipped with a powerful processor (an NVIDIA Tegra K1 with 192 CUDA cores) as well as various sensors that allow it to track its motion and orientation in 3D space in real time. Depth data for the area in front of the user, obtained with the tablet’s built-in infrared-based depth sensor, is analyzed to detect obstacles, and audio-based navigation instructions are issued accordingly. A visual display option is also offered for users with low vision. The aim is to develop a real-time, affordable, aesthetically acceptable, stand-alone mobile assistive application on a cutting-edge device, following a user-centered approach, that allows VI users to micro-navigate autonomously in possibly unfamiliar indoor surroundings.
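
To illustrate the kind of depth analysis described in the abstract, below is a minimal sketch (not the authors’ implementation) of how a single depth frame could be split into left, centre and right regions, with the nearest in-range point in each region used to pick a spoken instruction. It assumes the point cloud arrives as a flat float array of (x, y, z, confidence) values in metres in the camera frame; the exact buffer layout of the Tango SDK may differ, and the class name, region split and distance thresholds are illustrative only.

// ObstacleDetector.java - hypothetical helper, Java (Android)
public class ObstacleDetector {

    private static final float MAX_RANGE_M = 4.0f;      // ignore points farther than this
    private static final float WARN_DISTANCE_M = 1.5f;  // warn when the path is blocked closer than this
    private static final float SIDE_SPLIT_M = 0.3f;     // |x| beyond this counts as left or right

    /** Returns a short navigation instruction for one depth frame. */
    public String analyze(float[] points) {
        float nearestLeft = Float.MAX_VALUE;
        float nearestCenter = Float.MAX_VALUE;
        float nearestRight = Float.MAX_VALUE;

        // Each point occupies four floats: x (right), y (up), z (forward), confidence.
        for (int i = 0; i + 3 < points.length; i += 4) {
            float x = points[i];
            float z = points[i + 2];
            float confidence = points[i + 3];
            if (confidence < 0.5f || z <= 0f || z > MAX_RANGE_M) {
                continue;  // skip low-confidence or out-of-range points
            }
            if (x < -SIDE_SPLIT_M) {
                nearestLeft = Math.min(nearestLeft, z);
            } else if (x > SIDE_SPLIT_M) {
                nearestRight = Math.min(nearestRight, z);
            } else {
                nearestCenter = Math.min(nearestCenter, z);
            }
        }

        // Keep going if the central corridor is clear; otherwise steer toward the more open side.
        if (nearestCenter > WARN_DISTANCE_M) {
            return "Path clear";
        }
        if (nearestLeft > nearestRight) {
            return String.format("Obstacle ahead at %.1f metres, move left", nearestCenter);
        }
        return String.format("Obstacle ahead at %.1f metres, move right", nearestCenter);
    }
}

The returned string could then be handed to Android’s TextToSpeech engine to produce the audio instructions, or rendered on screen for the low-vision display option mentioned in the abstract.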



Author information


Corresponding author

Correspondence to Rabia Jafri.



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Jafri, R., Khan, M.M. (2016). Obstacle Detection and Avoidance for the Visually Impaired in Indoors Environments Using Google’s Project Tango Device. In: Miesenberger, K., Bühler, C., Penaz, P. (eds.) Computers Helping People with Special Needs. ICCHP 2016. Lecture Notes in Computer Science, vol. 9759. Springer, Cham. https://doi.org/10.1007/978-3-319-41267-2_24


  • DOI: https://doi.org/10.1007/978-3-319-41267-2_24

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-41266-5

  • Online ISBN: 978-3-319-41267-2

  • eBook Packages: Computer Science (R0)
