
Multi-Sensor SLAM with Online Self-Calibration and Change Detection

  • Fernando Nobre
  • Christoffer R. Heckman
  • Gabe T. Sibley
Conference paper
Part of the Springer Proceedings in Advanced Robotics book series (SPAR, volume 1)

Abstract

We present a solution for constant-time self-calibration and change detection of multiple sensor intrinsic and extrinsic calibration parameters, without any prior knowledge of the initial system state and without the need for a calibration target or a special initialization sequence. The system continuously self-calibrates multiple sensors online while simultaneously solving the SLAM problem in real time. We focus on camera-IMU extrinsic calibration, which is essential for accurate long-term vision-aided inertial navigation. We present an initialization strategy together with a method for continuously estimating the maximum-likelihood camera-IMU transform and detecting changes to it. A conditioning approach is used to avoid the problems associated with early linearization. Experimental results are presented to evaluate the proposed system and to compare it against the artifact-based offline calibration developed by our group.
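
As an illustration of the kind of estimation problem involved (a hedged sketch, not the authors' implementation), the following Python snippet refines a camera-IMU extrinsic transform by minimizing reprojection residuals as a nonlinear least-squares problem on synthetic data. All names (project, rvec_ci, t_ci) and simplifications (known landmarks, identity IMU orientations, normalized image coordinates) are assumptions made for this example only.

    # Hedged sketch: recover a camera-IMU extrinsic transform (rotation vector
    # + translation) from reprojection residuals via nonlinear least squares.
    # Synthetic data; not the paper's implementation, which solves this jointly
    # with SLAM and change detection.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation as R

    rng = np.random.default_rng(0)

    # Ground-truth camera-IMU extrinsics (unknown to the estimator).
    rvec_ci_true = np.array([0.02, -0.01, 0.03])
    t_ci_true = np.array([0.05, 0.00, 0.10])

    # Synthetic IMU positions (identity orientation) and world landmarks.
    imu_positions = rng.uniform(-1.0, 1.0, size=(8, 3))
    landmarks = rng.uniform(2.0, 4.0, size=(20, 3))

    def project(rvec_ci, t_ci):
        """Project every landmark from every IMU pose through the extrinsics."""
        R_ci = R.from_rotvec(rvec_ci).as_matrix()
        obs = []
        for p_wi in imu_positions:
            p_c = (landmarks - p_wi) @ R_ci.T + t_ci   # landmark in camera frame
            obs.append(p_c[:, :2] / p_c[:, 2:3])       # normalized image coords
        return np.concatenate(obs).ravel()

    # Measurements generated with the true extrinsics plus small noise.
    z = project(rvec_ci_true, t_ci_true) + 1e-3 * rng.standard_normal(8 * 20 * 2)

    def residual(x):
        return project(x[:3], x[3:]) - z

    x0 = np.zeros(6)                   # no prior knowledge of the extrinsics
    sol = least_squares(residual, x0)  # trust-region nonlinear least squares
    print("estimated rotation vector:", sol.x[:3])
    print("estimated translation:   ", sol.x[3:])

In the paper's setting this refinement runs continuously and jointly with pose and landmark estimation, with conditioning used to avoid committing to an early linearization point; the sketch isolates only the maximum-likelihood extrinsic estimation step.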

Keywords

Self-calibration · SLAM · Constant-time · Change detection

Notes

Acknowledgments

This work is generously supported by Toyota Motor Corporation.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Fernando Nobre (1)
  • Christoffer R. Heckman (1)
  • Gabe T. Sibley (1)
  1. Department of Computer Science, University of Colorado, Boulder, USA
