
Wearable RGBD Indoor Navigation System for the Blind

  • Young Hoon Lee
  • Gérard Medioni
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8927)

Abstract

In this paper, we present a novel wearable RGBD camera based navigation system for the visually impaired. The system is composed of a smartphone user interface, a glass-mounted RGBD camera, a real-time navigation algorithm, and a haptic feedback system. The smartphone interface provides an effective way to communicate with the system using audio and haptic feedback. To estimate the orientation of the blind user, the navigation algorithm performs real-time 6-DOF feature-based visual odometry using the glass-mounted RGBD camera as the input device. The navigation algorithm also builds a 3D voxel map of the environment and analyzes 3D traversability. A path planner integrates the egomotion estimates with the map to generate a safe and efficient path to a waypoint, which is delivered to the haptic feedback system. The haptic feedback system, consisting of four micro-vibration motors, is designed to guide the visually impaired user along the computed path while minimizing cognitive load. The proposed system achieves real-time performance at \(28.4\) Hz on average on a laptop, and helps visually impaired users extend the range of their activities and improve their mobility in cluttered environments. Experimental results show that navigating indoor environments with the proposed system successfully avoids collisions and improves the user's mobility performance compared to conventional and state-of-the-art mobility aids.
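To make the described pipeline concrete, the following minimal Python sketch organizes the per-frame loop the abstract outlines: visual odometry, voxel mapping, traversability analysis, and haptic cueing. Everything in it (the VoxelMap class, estimate_egomotion, haptic_cue, the grid dimensions, and the 0.3 rad turn threshold) is a hypothetical illustration under assumed parameters, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of the per-frame navigation loop described in the
# abstract; names, data structures, and parameters are illustrative only.

class VoxelMap:
    """Boolean occupancy grid: 5 cm voxels over a 10 m x 10 m x 2.5 m volume."""
    def __init__(self, shape=(200, 200, 50), resolution=0.05):
        self.resolution = resolution                 # metres per voxel
        self.occupied = np.zeros(shape, dtype=bool)

    def integrate(self, points_world):
        """Mark voxels hit by an (N, 3) point cloud in world coordinates."""
        idx = np.floor(points_world / self.resolution).astype(int)
        ok = np.all((idx >= 0) & (idx < self.occupied.shape), axis=1)
        self.occupied[tuple(idx[ok].T)] = True

    def traversable(self, clearance=1.8):
        """Project to a 2D free-space grid: a ground cell is traversable only
        if no occupied voxel lies in the vertical column the user sweeps."""
        h = int(clearance / self.resolution)
        return ~self.occupied[:, :, :h].any(axis=2)

def estimate_egomotion(prev_rgbd, rgbd):
    """Stand-in for 6-DOF feature-based RGBD visual odometry: match features
    between frames, back-project with depth, and solve a rigid transform
    (e.g. a closed-form fit inside RANSAC). Returns identity here so the
    sketch stays self-contained."""
    return np.eye(4)

def haptic_cue(pose, path, turn_threshold=0.3):
    """Reduce the planned path to one of four vibration cues, matching a
    feedback system with four micro-vibration motors."""
    if path is None or len(path) < 2:
        return "stop"
    step = path[1] - path[0]
    heading = np.arctan2(step[1], step[0])
    yaw = np.arctan2(pose[1, 0], pose[0, 0])   # user yaw from rotation matrix
    err = (heading - yaw + np.pi) % (2 * np.pi) - np.pi
    if err > turn_threshold:
        return "left"
    if err < -turn_threshold:
        return "right"
    return "forward"

# Toy usage with fabricated data and an identity pose:
vmap = VoxelMap()
vmap.integrate(np.random.rand(1000, 3) * np.array([10.0, 10.0, 2.5]))
free = vmap.traversable()
path = np.array([[0.0, 0.0], [1.0, 0.5]])      # stand-in for a planner output
print(haptic_cue(np.eye(4), path), free.sum(), "free cells")
```

In a real system the planner (the paper's waypoint path planner) would run on the 2D traversability grid, and the four discrete cues would keep the interface simple enough to avoid overloading the user.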

Keywords

Navigation System · Mobility Performance · Blind Subject · Visual Odometry · Navigation Algorithm



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Institute for Robotics and Intelligent Systems, University of Southern California, Los Angeles, USA
