Accurate Map-Based RGB-D SLAM for Mobile Robots

  • Dominik Belter
  • Michał Nowicki
  • Piotr Skrzypczyński
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 418)


In this paper we present and evaluate a map-based RGB-D SLAM (Simultaneous Localization and Mapping) system employing a novel idea of combining efficient visual odometry with a persistent map of 3D point features used to jointly optimize the sensor (robot) poses and the feature positions. The optimization problem is represented as a factor graph. The SLAM system consists of a front-end that tracks the sensor frame-by-frame, extracts point features, and associates them with the map, and a back-end that manages and optimizes the map. We propose a robust approach to data association, which combines efficient selection of candidate features from the map, matching of visual descriptors guided by the sensor pose prediction from visual odometry, and verification of the associations in both the image plane and 3D space. The improved accuracy and robustness are demonstrated on publicly available data sets.
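The data-association approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: all function names, thresholds, and data layouts are assumptions. Candidate map features are projected into the current image using the pose predicted by visual odometry, matched by descriptor distance within an image-plane gate, and each match is then verified in 3D by comparing the measured point against the prediction.

```python
import numpy as np

def project(pose, fx, fy, cx, cy, p_world):
    """Project a 3D map point into the image via a 4x4 camera-from-world pose."""
    p_cam = pose[:3, :3] @ p_world + pose[:3, 3]
    if p_cam[2] <= 0:  # behind the camera: not visible
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u, v]), p_cam

def associate(map_feats, frame_feats, pose_pred, intr,
              max_desc=0.3, max_px=8.0, max_m=0.05):
    """Return (map_id, frame_id) pairs passing descriptor, 2D, and 3D checks.

    map_feats:   {id: (3D point in world frame, descriptor)}
    frame_feats: {id: (pixel position, 3D point in camera frame, descriptor)}
    Thresholds are illustrative placeholders, not values from the paper.
    """
    fx, fy, cx, cy = intr
    matches = []
    for mid, (p_w, d_map) in map_feats.items():
        proj = project(pose_pred, fx, fy, cx, cy, p_w)
        if proj is None:
            continue
        uv_pred, p_cam_pred = proj
        best, best_dist = None, max_desc
        for fid, (uv, p_cam, d_img) in frame_feats.items():
            # image-plane gate: consider only detections near the prediction
            if np.linalg.norm(uv - uv_pred) > max_px:
                continue
            dist = np.linalg.norm(d_map - d_img)  # descriptor distance
            if dist < best_dist:
                best, best_dist = fid, dist
        if best is not None:
            # 3D verification: measured vs. predicted point in the camera frame
            if np.linalg.norm(frame_feats[best][1] - p_cam_pred) <= max_m:
                matches.append((mid, best))
    return matches
```

The two-stage check mirrors the verification "in both the image plane and 3D space" mentioned in the abstract: the pixel gate cheaply prunes candidates before descriptor comparison, and the metric check rejects matches that agree visually but not geometrically.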


Keywords: SLAM · Point features · Tracking · Factor graph · RGB-D data





Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Dominik Belter¹
  • Michał Nowicki¹
  • Piotr Skrzypczyński¹

  1. Institute of Control and Information Engineering, Poznań University of Technology, Poznań, Poland
