Modeling Varying Camera-IMU Time Offset in Optimization-Based Visual-Inertial Odometry

  • Yonggen Ling
  • Linchao Bao
  • Zequn Jie
  • Fengming Zhu
  • Ziyang Li
  • Shanmin Tang
  • Yongsheng Liu
  • Wei Liu
  • Tong Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11213)

Abstract

Combining cameras and inertial measurement units (IMUs) has proven effective for motion tracking, as the two sensing modalities offer complementary characteristics suitable for fusion. While most works focus on global-shutter cameras and synchronized sensor measurements, consumer-grade devices are mostly equipped with rolling-shutter cameras and suffer from imperfect sensor synchronization. In this work, we propose a nonlinear optimization-based monocular visual-inertial odometry (VIO) in which the varying camera-IMU time offset is modeled as an unknown variable. Our approach handles rolling-shutter effects and imperfect sensor synchronization in a unified way. Additionally, we introduce an efficient algorithm based on dynamic programming and a red-black tree to speed up IMU integration over variable-length time intervals during the optimization. An uncertainty-aware initialization is also presented to launch the VIO robustly. Comparisons with state-of-the-art methods on the EuRoC dataset and on mobile phone data validate the effectiveness of our approach.
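
As context for the second contribution, here is a minimal sketch (an illustration under stated assumptions, not the paper's implementation) of why a red-black tree plus a dynamic-programming-style cache helps. Once the time offset t_d is an optimization variable, every optimizer iteration shifts the IMU integration interval attached to each image (in such formulations a feature at image row r is commonly stamped as t_cam + t_d + (r/h)·t_r, with t_r the rolling-shutter readout time, which is what couples rolling shutter and synchronization into one offset model), so integrals over arbitrary intervals [a, b] must be cheap. Caching cumulative (prefix) integrals keyed by timestamp in an ordered map, a red-black tree in common C++ standard-library implementations, turns each interval query into two O(log n) lookups and a difference. The 1-D zero-order-hold toy below stands in for the paper's actual on-manifold IMU preintegration; names such as PrefixIntegrator and deltaVel are illustrative only.

```cpp
// Minimal sketch (assumptions, not the paper's code): prefix IMU integrals
// cached in a std::map -- a red-black tree in common standard-library
// implementations -- so that the integral over any interval [a, b] is a
// difference of two cached values plus two O(log n) lookups, instead of a
// fresh re-integration every time the estimated time offset t_d shifts
// the interval. A 1-D zero-order-hold model replaces real on-manifold
// IMU preintegration.
#include <cstdio>
#include <iterator>
#include <map>

struct Prefix {
  double vel = 0.0;  // cumulative integral of acceleration since the start
  double pos = 0.0;  // cumulative double integral of acceleration
};

class PrefixIntegrator {
 public:
  // Feed IMU samples in time order; acceleration is held constant
  // (zero-order hold) between consecutive samples.
  void addSample(double t, double accel) {
    if (prefixes_.empty()) {
      prefixes_[t] = Prefix{};
    } else {
      const auto& last = *prefixes_.rbegin();
      const double dt = t - last.first;
      Prefix p = last.second;  // dynamic programming: extend previous prefix
      p.pos += p.vel * dt + 0.5 * accel_ * dt * dt;
      p.vel += accel_ * dt;
      prefixes_[t] = p;
    }
    accel_ = accel;
  }

  // Velocity change over [a, b]: two tree lookups, no re-integration.
  double deltaVel(double a, double b) const { return velAt(b) - velAt(a); }

 private:
  // Cumulative velocity at an arbitrary query time, linearly interpolated
  // between the two neighbouring cached prefixes. Assumes addSample() has
  // been called at least once.
  double velAt(double t) const {
    auto hi = prefixes_.lower_bound(t);  // O(log n) red-black tree search
    if (hi == prefixes_.end()) return prefixes_.rbegin()->second.vel;
    if (hi == prefixes_.begin()) return hi->second.vel;
    auto lo = std::prev(hi);
    const double w = (t - lo->first) / (hi->first - lo->first);
    return (1.0 - w) * lo->second.vel + w * hi->second.vel;
  }

  std::map<double, Prefix> prefixes_;  // timestamp -> cached prefix integrals
  double accel_ = 0.0;                 // most recent sample (zero-order hold)
};

int main() {
  PrefixIntegrator integ;
  for (int k = 0; k <= 200; ++k)        // one second of a 200 Hz toy stream
    integ.addSample(k * 0.005, 9.81);   // constant acceleration
  // When the optimizer perturbs t_d, the query interval shifts, but each
  // evaluation still costs only two lookups (both print ~4.9050):
  std::printf("dv over [0.100, 0.600]: %.4f\n", integ.deltaVel(0.100, 0.600));
  std::printf("dv over [0.103, 0.603]: %.4f\n", integ.deltaVel(0.103, 0.603));
  return 0;
}
```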

Keywords

Visual-inertial odometry · Online temporal camera-IMU calibration · Rolling-shutter cameras

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Yonggen Ling (1)
  • Linchao Bao (1)
  • Zequn Jie (1)
  • Fengming Zhu (1)
  • Ziyang Li (1)
  • Shanmin Tang (1)
  • Yongsheng Liu (1)
  • Wei Liu (1)
  • Tong Zhang (1)
  1. Tencent AI Lab, Shenzhen, China