Lightweight RGB-D SLAM System for Search and Rescue Robots

  • Dominik Belter
  • Michał Nowicki
  • Piotr Skrzypczyński
  • Krzysztof Walas
  • Jan Wietrzykowski
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 351)


Search and rescue robots ought to be autonomous, as autonomy keeps human personnel out of dangerous areas. To achieve the desired level of autonomy, both environment mapping and reliable self-localization have to be implemented. In this paper we analyse the application of a fast, lightweight RGB-D Simultaneous Localization and Mapping (SLAM) system for robots involved in indoor/outdoor search and rescue missions. We demonstrate that, under some conditions, RGB-D sensors provide data reliable enough even for outdoor, real-time SLAM. Experiments are performed on a legged robot and a wheeled robot, using two representative RGB-D sensors: the Asus Xtion Pro Live and the recently introduced Microsoft Kinect ver. 2.


Keywords: Pose-based SLAM · RGB-D sensors · USAR · evaluation
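The "pose-based SLAM" keyword refers to the graph formulation in which camera poses are nodes and visual-odometry or loop-closure measurements are edges, jointly optimized by least squares (the paper's reference list includes g2o, a standard solver for this). As a hypothetical, minimal illustration of that idea — not the authors' implementation — the sketch below solves a toy 1D pose graph with NumPy: four noisy odometry edges plus one loop closure, with the first pose anchored at the origin to remove the gauge freedom.

```python
import numpy as np

def optimize_pose_graph(edges, n_poses):
    """Linear least-squares pose graph on a 1D trajectory.

    edges: list of (i, j, z) meaning "measured x_j - x_i = z".
    x_0 is fixed at the origin (gauge constraint); x_1..x_{n-1} are free.
    """
    n_free = n_poses - 1
    A = np.zeros((len(edges), n_free))
    b = np.zeros(len(edges))
    for row, (i, j, z) in enumerate(edges):
        if i > 0:
            A[row, i - 1] = -1.0  # coefficient of x_i
        if j > 0:
            A[row, j - 1] = 1.0   # coefficient of x_j
        b[row] = z
    x_free, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.concatenate(([0.0], x_free))

# Toy data (made up for this sketch): the robot walks out ~3 m and back.
edges = [
    (0, 1, 1.10),   # noisy odometry between consecutive poses
    (1, 2, 0.90),
    (2, 3, 1.05),
    (3, 4, -3.20),
    (0, 4, 0.00),   # loop closure: the robot is back at the start
]
x = optimize_pose_graph(edges, 5)
print(np.round(x, 3))
```

The raw odometry chain drifts by 0.15 m; the loop-closure edge forces the optimizer to spread that discrepancy evenly over the cycle, which is exactly the error-distribution behaviour pose-graph SLAM relies on at larger scale (with SE(3) poses and nonlinear iterations instead of this one linear solve).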





Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Dominik Belter 1
  • Michał Nowicki 1
  • Piotr Skrzypczyński 1
  • Krzysztof Walas 1
  • Jan Wietrzykowski 1
  1. Institute of Control and Information Engineering, Poznań University of Technology, Poznań, Poland
