
Metrics for Real-Time Mono-VSLAM Evaluation Including IMU Induced Drift with Application to UAV Flight

  • Alexander Hardt-Stremayr
  • Matthias Schörghuber
  • Stephan Weiss
  • Martin Humenberger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11130)

Abstract

Vision-based algorithms have become popular for state estimation and subsequent (local) control of mobile robots. A large variety of such algorithms currently exists, and their performance is often characterized by their drift relative to the total trajectory traveled. However, this metric has relatively low relevance for local vehicle control and stabilization. In this paper, we propose a set of metrics that allows evaluating a vision-based algorithm with respect to its usability for state estimation and subsequent (local) control of highly dynamic autonomous mobile platforms such as multirotor UAVs. Since such platforms usually use inertial measurements to mitigate the relatively low update rate of the visual algorithm, we particularly focus on a new metric that takes into account the expected IMU-induced drift between visual readings, based on the probabilistic properties of the sensor. We demonstrate this set of metrics by comparing ORB-SLAM, LSD-SLAM and DSO on different datasets.
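The idea of IMU-induced drift between visual readings can be illustrated with a standard noise-propagation model: double-integrating white accelerometer noise of density σ_a over a vision-free gap of length t yields a 1σ position uncertainty of σ_a·√(t³/3), so the drift grows with t^(3/2) and thus with the inverse of the visual update rate. The sketch below is a minimal illustration of this principle only; the paper's actual metric is defined in the full text, and the noise density and update rates used here are illustrative assumptions, not values from the paper.

```python
import math

def expected_imu_position_drift(accel_noise_density: float, gap_seconds: float) -> float:
    """1-sigma position drift from double-integrating white accelerometer
    noise (density in m/s^2/sqrt(Hz)) over a gap with no visual update.
    Double integration of white noise gives variance sigma_a^2 * t^3 / 3.
    """
    return accel_noise_density * math.sqrt(gap_seconds ** 3 / 3.0)

# Illustrative values: a MEMS IMU with sigma_a = 2e-3 m/s^2/sqrt(Hz)
# bridging the gaps left by visual front-ends running at 20 Hz vs. 5 Hz.
drift_20hz = expected_imu_position_drift(2e-3, 1.0 / 20.0)
drift_5hz = expected_imu_position_drift(2e-3, 1.0 / 5.0)

# Drift grows with t^(3/2): quadrupling the gap multiplies it by 8,
# which is why the visual update rate matters for local control.
print(f"20 Hz gap: {drift_20hz:.2e} m, 5 Hz gap: {drift_5hz:.2e} m")
```

This t^(3/2) growth is what makes a per-reading drift metric more informative for local stabilization than end-to-end trajectory drift: two algorithms with identical total drift can leave very different gaps for the IMU to bridge.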

References

  1. Burri, M., et al.: The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 35(10), 1157–1163 (2016). https://doi.org/10.1177/0278364915620033
  2. Cadena, C., et al.: Past, present, and future of simultaneous localization and mapping: toward the robust-perception age. IEEE Trans. Robot. 32(6), 1309–1332 (2016)
  3. Engel, J., Koltun, V., Cremers, D.: Direct sparse odometry. IEEE Trans. Pattern Anal. Mach. Intell. 40(3), 611–625 (2018). https://doi.org/10.1109/TPAMI.2017.2658577
  4. Engel, J., Schöps, T., Cremers, D.: LSD-SLAM: large-scale direct monocular SLAM. In: Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T. (eds.) ECCV 2014. LNCS, vol. 8690, pp. 834–849. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-10605-2_54
  5. Fuentes-Pacheco, J., Ruiz-Ascencio, J., Rendón-Mancha, J.M.: Visual simultaneous localization and mapping: a survey. Artif. Intell. Rev. 43(1), 55–81 (2015)
  6. Huletski, A., Kartashov, D., Krinkin, K.: Evaluation of the modern visual SLAM methods. In: 2015 Artificial Intelligence and Natural Language and Information Extraction, Social Media and Web Search FRUCT Conference (AINL-ISMW FRUCT), pp. 19–25, November 2015. https://doi.org/10.1109/AINL-ISMW-FRUCT.2015.7382963
  7. Delmerico, J., Scaramuzza, D.: A benchmark comparison of monocular visual-inertial odometry algorithms for flying robots. In: IEEE International Conference on Robotics and Automation (ICRA). IEEE (2018)
  8. Klein, G., Murray, D.: Parallel tracking and mapping for small AR workspaces. In: IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 225–234 (2007). https://doi.org/10.1109/ISMAR.2007.4538852
  9. Kümmerle, R., et al.: On measuring the accuracy of SLAM algorithms. Auton. Robot. 27(4), 387–407 (2009)
  10. Mur-Artal, R., Montiel, J.M.M., Tardós, J.D.: ORB-SLAM: a versatile and accurate monocular SLAM system. IEEE Trans. Robot. 31(5), 1147–1163 (2015). https://doi.org/10.1109/TRO.2015.2463671
  11. Nardi, L., et al.: Introducing SLAMBench, a performance and accuracy benchmarking methodology for SLAM. In: IEEE International Conference on Robotics and Automation, pp. 5783–5790, May 2015. https://doi.org/10.1109/ICRA.2015.7140009
  12. Newcombe, R.A., et al.: KinectFusion: real-time dense surface mapping and tracking. In: Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011, pp. 127–136. IEEE Computer Society, Washington, DC (2011). https://doi.org/10.1109/ISMAR.2011.6092378
  13. Platinsky, L., Davison, A.J., Leutenegger, S.: Monocular visual odometry: sparse joint optimisation or dense alternation? In: IEEE International Conference on Robotics and Automation, pp. 5126–5133 (2017). https://doi.org/10.1109/ICRA.2017.7989599
  14. Sturm, J., Engelhard, N., Endres, F., Burgard, W., Cremers, D.: A benchmark for the evaluation of RGB-D SLAM systems. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 573–580 (2012). https://doi.org/10.1109/IROS.2012.6385773
  15. Younes, G., Asmar, D.C., Shammas, E.A.: A survey on non-filter-based monocular visual SLAM systems. CoRR abs/1607.00470 (2016). http://arxiv.org/abs/1607.00470
  16. Zia, M.Z., et al.: Comparative design space exploration of dense and semi-dense SLAM. In: 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 1292–1299. IEEE (2016)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Alpen-Adria Universität Klagenfurt, Klagenfurt, Austria
  2. Austrian Institute of Technology, Vienna, Austria
  3. NAVER LABS Europe, Meylan, France