
Autonomous Robots, Volume 41, Issue 2, pp 401–416

Low-drift and real-time lidar odometry and mapping

  • Ji Zhang
  • Sanjiv Singh

Abstract

Here we propose a real-time method for low-drift odometry and mapping using range measurements from a 3D laser scanner moving in 6-DOF. The problem is hard because the range measurements are received at different times, and errors in motion estimation (especially without an external reference such as GPS) cause mis-registration of the resulting point cloud. To date, coherent 3D maps have been built by offline batch methods, often using loop closure to correct for drift over time. Our method achieves both low drift in motion estimation and low computational complexity. The key idea that makes this level of performance possible is the division of the complex problem of Simultaneous Localization and Mapping, which seeks to optimize a large number of variables simultaneously, into two algorithms. One algorithm performs odometry at a high frequency but at low fidelity to estimate the velocity of the laser scanner. Although not necessary, if an IMU is available, it can provide a motion prior and compensate for gross, high-frequency motion. A second algorithm runs at an order of magnitude lower frequency for fine matching and registration of the point cloud. The combination of the two algorithms allows map creation in real time. Our method has been evaluated in indoor and outdoor experiments as well as on the KITTI odometry benchmark. The results indicate that the proposed method can achieve accuracy comparable to state-of-the-art offline batch methods.
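The two-frequency decomposition described above can be sketched schematically as follows. This is an illustrative toy, not the authors' implementation: the function names, the 1-D "pose", and the stand-in refinement step are all assumptions made purely to show how a fast coarse odometry loop and a slower fine mapping loop interleave.

```python
# Schematic sketch of the two-algorithm architecture: a high-frequency,
# low-fidelity odometry loop runs every lidar sweep, while a fine-matching
# mapping loop runs an order of magnitude less often. All names and the
# toy refinement are illustrative assumptions, not the paper's code.

MAPPING_DECIMATION = 10  # mapping runs ~10x slower than odometry

def run_pipeline(sweep_increments):
    """Process a sequence of per-sweep motion increments (toy 1-D poses)."""
    coarse_pose = 0.0    # high-frequency, low-fidelity estimate
    refined_pose = 0.0   # low-frequency, fine-registration estimate
    odometry_log, mapping_log = [], []

    for i, increment in enumerate(sweep_increments):
        # --- Algorithm 1: lidar odometry (every sweep, coarse) ---
        coarse_pose += increment
        odometry_log.append(coarse_pose)

        # --- Algorithm 2: mapping (every Nth sweep, fine registration) ---
        if (i + 1) % MAPPING_DECIMATION == 0:
            # In the real method this re-registers the sweep against the map;
            # here we simply adopt the coarse estimate as a stand-in.
            refined_pose = coarse_pose
            mapping_log.append(refined_pose)

    return odometry_log, mapping_log

if __name__ == "__main__":
    odo, mapped = run_pipeline([0.1] * 20)
    print(len(odo), len(mapped))  # odometry updates 20 times, mapping twice
```

The point of the split is that the map stays coherent even though the fast loop is inaccurate: the slow loop corrects the accumulated coarse motion at a rate it can afford, so the combined output is both real-time and low-drift.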

Keywords

Ego-motion estimation · Mapping · Continuous-time · Lidar

Acknowledgments

This paper is based upon work supported by the National Science Foundation under Grant No. IIS-1328930. Special thanks to D. Huber, S. Scherer, M. Bergerman, M. Kaess, L. Yoder, and S. Maeta for their insightful input and invaluable help.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Robotics Institute, Carnegie Mellon University, Pittsburgh, USA
