A Binocular MSCKF-Based Visual Inertial Odometry System Using LK Optical Flow

  • Published in: Journal of Intelligent & Robotic Systems

Abstract

Odometry is an essential component of intelligent mobile robots for positioning and navigation. Mainstream visual odometry systems localize using only the visual information obtained from camera sensors, so under insufficient lighting, missing texture, or camera jitter they struggle to localize accurately. To address this problem, we propose a binocular MSCKF-based visual inertial odometry system using Lucas-Kanade (LK) optical flow. First, an Inertial Measurement Unit (IMU) is introduced to compensate for these visual failure cases. The LK optical flow algorithm is then used to process the visual information obtained by the binocular camera, and the Multi-State Constraint Kalman Filter (MSCKF) algorithm fuses the visual and inertial information, improving both the positioning accuracy and the efficiency of the system. Finally, the proposed method is evaluated on the European Robotics Challenge (EuRoC) dataset in the Robot Operating System (ROS) and compared with two other state-of-the-art visual inertial odometry systems, ROVIO and MSCKF-mono. Extensive simulations verify that the proposed method achieves accurate pose estimation and outperforms both existing systems.
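As described above, the front end detects image features and tracks them with LK optical flow, both frame to frame and across the stereo pair. The sketch below illustrates this idea using OpenCV's FAST detector and pyramidal LK tracker; it is a minimal illustration under those assumptions, not the authors' implementation, and the function names are ours.

```python
# Minimal sketch of an LK-based stereo front end (illustrative, not the
# authors' code). FAST corners are detected in the left image, tracked
# frame to frame with pyramidal LK optical flow, and then matched into
# the right image with a second LK pass to obtain stereo correspondences.
import cv2
import numpy as np

fast = cv2.FastFeatureDetector_create(threshold=20)
lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

def detect(img):
    """Detect FAST corners, returned as an Nx1x2 float32 array for LK."""
    kps = fast.detect(img, None)
    return np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)

def track(prev_img, next_img, pts):
    """Track points with pyramidal LK flow; keep only successfully tracked pairs."""
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_img, next_img, pts, None, **lk_params)
    good = status.ravel() == 1
    return pts[good], nxt[good]

def process_stereo_frame(prev_left, left, right, prev_pts):
    """Temporal tracking in the left camera, then left-to-right stereo matching."""
    _, cur_pts = track(prev_left, left, prev_pts)       # frame-to-frame tracking
    left_pts, right_pts = track(left, right, cur_pts)   # stereo matching via LK
    return left_pts, right_pts                          # replenish via detect() as needed
```

On the back end, the MSCKF keeps the current IMU state plus a sliding window of cloned camera poses in the filter state; landmarks never enter the state, which keeps the update cheap. Each new image triggers a state augmentation that clones the current pose and expands the error-state covariance. The sketch below shows that bookkeeping in simplified form, assuming the pose block sits first in the IMU error state so the clone is a plain copy (the full Jacobian also involves the IMU-camera extrinsics); the dimensions and names are illustrative.

```python
# Simplified MSCKF covariance bookkeeping (illustrative, not the authors' code).
import numpy as np

IMU_DIM = 15   # IMU error state: pose (6) assumed first, then velocity and biases
POSE_DIM = 6   # error state of one cloned camera pose (orientation + position)

def augment_state(P, num_clones):
    """Clone the current pose into the state and expand the covariance.

    P is the (n, n) error-state covariance, n = IMU_DIM + POSE_DIM * num_clones.
    Simplification: the new clone copies the first POSE_DIM components of the
    IMU error state, so the Jacobian J is a plain selection matrix.
    """
    n = IMU_DIM + POSE_DIM * num_clones
    assert P.shape == (n, n)
    J = np.zeros((POSE_DIM, n))
    J[:, :POSE_DIM] = np.eye(POSE_DIM)
    return np.block([[P,     P @ J.T],
                     [J @ P, J @ P @ J.T]])

def marginalize_oldest(P):
    """Drop the oldest clone (assumed stored right after the IMU block)."""
    keep = np.r_[0:IMU_DIM, IMU_DIM + POSE_DIM:P.shape[0]]
    return P[np.ix_(keep, keep)]
```

In the full filter, IMU propagation precedes each augmentation, and the measurement update consumes completed feature tracks from the front end after projecting out the feature positions, as in the original MSCKF [16]; those steps are omitted here for brevity.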

References

  1. Suzuki, T., Kubo, N.: Precise point positioning for mobile robots using software GNSS receiver and QZSS LEX signal. In: 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 369–375. IEEE (2013)

  2. Nistér, D., Naroditsky, O., Bergen, J.: Visual odometry for ground vehicle applications. J. Field Robot. 23(1), 3–20 (2006)

  3. Mhiri, R., Ouerghi, S., Boutteau, R., Vasseur, P., Mousset, S., Bensrhair, A.: Asynchronous structure from motion at scale. J. Intell. Robot. Syst. 96(2), 159–177 (2019)

  4. Nong, X., Cheng, L., Dai, Y., Rui, P., Chen, Y., Wu, H.: Research on indoor navigation of mobile robot based on INS and ultrasound. In: 2017 12th IEEE Conference on Industrial Electronics and Applications (ICIEA), pp. 231–235. IEEE (2017)

  5. Xu, H., Yu, L., Hou, J., Fei, S.: Automatic reconstruction method for large scene based on multi-site point cloud stitching. Measurement. 131, 590–596 (2019)

  6. Cheng, L., Dai, Y., Peng, R., Nong, X.: Positioning and navigation of mobile robot with asynchronous fusion of binocular vision system and inertial navigation system. Int. J. Adv. Robot. Syst. 14(6), 1729881417745607 (2017)

  7. Li, C., Yu, L., Fei, S.: Real-time 3D motion tracking and reconstruction system using camera and IMU sensors. IEEE Sensors J. 19(15), 6460–6466 (2019)

  8. Asadi, E., Bottasso, C.L.: Tightly-coupled stereo vision-aided inertial navigation using feature-based motion sensors. Adv. Robot. 28(11), 717–729 (2014)

  9. Kaiser, J., Martinelli, A., Fontana, F., Scaramuzza, D.: Simultaneous state initialization and gyroscope bias calibration in visual inertial aided navigation. IEEE Robot. Autom. Lett. 2(1), 18–25 (2016)

  10. Li, C., Yu, L., Fei, S.: Large-scale, real-time 3D scene reconstruction using visual and IMU sensors. IEEE Sensors J. 20, 5597–5605 (2020). https://doi.org/10.1109/JSEN.2020.2971521

  11. Weiss, S., Siegwart, R.: Real-time metric state estimation for modular vision-inertial systems. In: 2011 IEEE International Conference on Robotics and Automation, pp. 4531–4537. IEEE (2011)

  12. Lynen, S., Achtelik, M.W., Weiss, S., Chli, M., Siegwart, R.: A robust and modular multi-sensor fusion approach applied to MAV navigation. In: 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3923–3929. IEEE (2013)

  13. Zhu, A.Z., Atanasov, N., Daniilidis, K.: Event-based visual inertial odometry. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 5816–5824. IEEE (2017)

  14. Leutenegger, S., Lynen, S., Bosse, M., Siegwart, R., Furgale, P.: Keyframe-based visual-inertial odometry using nonlinear optimization. Int. J. Robot. Res. 34(3), 314–334 (2015)

  15. Qin, T., Li, P., Shen, S.: VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 34(4), 1004–1020 (2018)

  16. Mourikis, A.I., Roumeliotis, S.I.: A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. In: Proceedings 2007 IEEE International Conference on Robotics and Automation, pp. 3565–3572. IEEE (2007)

  17. Li, M., Mourikis, A.I.: Improving the accuracy of EKF-based visual-inertial odometry. In: 2012 IEEE International Conference on Robotics and Automation, pp. 828–835. IEEE (2012)

  18. Yu, L., Li, C., Fei, S.: Any-wall touch control system with switching filter based on 3D sensor. IEEE Sensors J. 18(11), 4697–4703 (2018)

  19. Sun, K., Mohta, K., Pfrommer, B., Watterson, M., Liu, S., Mulgaonkar, Y., Kumar, V.: Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robot. Autom. Lett. 3(2), 965–972 (2018)

  20. Yating, D., Lei, C., Rui, P., Xiaoqi, N., Wenxia, X., Yang, C., Qiuxuan, W.: Asynchronous fusion in positioning of mobile robot based on vision-aided inertial navigation. In: 2017 36th Chinese Control Conference (CCC), pp. 6685–6690. IEEE (2017)

  21. Xu, H., Hou, J., Yu, L., Fei, S.: 3D reconstruction system for collaborative scanning based on multiple RGB-D cameras. Pattern Recogn. Lett. 128, 505–512 (2019)

  22. De Croce, M., Pire, T., Bergero, F.: DS-PTAM: distributed stereo parallel tracking and mapping SLAM system. J. Intell. Robot. Syst. 95(2), 365–377 (2019)

  23. Huang, G.P., Mourikis, A.I., Roumeliotis, S.I.: Observability-based rules for designing consistent EKF SLAM estimators. Int. J. Robot. Res. 29(5), 502–528 (2010)

  24. Huang, G.P., Trawny, N., Mourikis, A.I., Roumeliotis, S.I.: Observability-based consistent EKF estimators for multi-robot cooperative localization. Auton. Robot. 30(1), 99–122 (2011)

  25. Huang, J., Ma, X., Che, H., Han, Z.: Further result on interval observer design for discrete-time switched systems and application to circuit systems. IEEE Trans. Circuits Syst. II Express Briefs (2019). https://doi.org/10.1109/TCSII.2019.2957945

  26. Castellanos, J.A., Martinez-Cantin, R., Tardós, J.D., Neira, J.: Robocentric map joining: improving the consistency of EKF-SLAM. Robot. Auton. Syst. 55(1), 21–29 (2007)

  27. Yu, L., Huang, J., Fei, S.: Robust switching control of the direct-drive servo control systems based on disturbance observer for switching gain reduction. IEEE Trans. Circuits Syst. II Express Briefs. 66(8), 1366–1370 (2019)

  28. Karami, E., Prasad, S., Shehata, M.: Image matching using SIFT, SURF, BRIEF and ORB: performance comparison for distorted images. arXiv:1710.02726 (2017)

  29. Rublee, E., Rabaud, V., Konolige, K., Bradski, G.R.: ORB: An efficient alternative to SIFT or SURF. In: 2011 International Conference on Computer Vision (ICCV), pp. 2564–2571. IEEE (2011)

  30. Lowe, D.G.: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60(2), 91–110 (2004)

  31. Bay, H., Ess, A., Tuytelaars, T., Van Gool, L.: Speeded-up robust features (SURF). Comput. Vis. Image Und. 110(3), 346–359 (2008)

  32. Lucas, B.D., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI), pp. 674–679 (1981)

  33. Sun, D., Roth, S., Black, M.J.: Secrets of optical flow estimation and their principles. In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 2432–2439. IEEE (2010)

  34. Xu, H., Yu, L., Fei, S.: Hand-held 3D reconstruction of large-scale scene with Kinect sensors based on surfel and video sequences. IEEE Geosci. Remote Sens. Lett. 15(12), 1842–1846 (2018)

  35. Fortun, D., Bouthemy, P., Kervrann, C.: Optical flow modeling and computation: a survey. Comput. Vis. Image Und. 134, 1–21 (2015)

  36. Rosten, E., Drummond, T.: Machine learning for high-speed corner detection. In: European Conference on Computer Vision, pp. 430–443. Springer (2006)

  37. Rosten, E., Porter, R., Drummond, T.: Faster and better: a machine learning approach to corner detection. IEEE Trans. Pattern Anal. Mach. Intell. 32(1), 105–119 (2010)

  38. Saha, O., Dasgupta, P.: Experience learning from basic patterns for efficient robot navigation in indoor environments. J. Intell. Robot. Syst. 92(3–4), 545–564 (2018)

  39. Burri, M., Nikolic, J., Gohl, P., Schneider, T., Rehder, J., Omari, S., Siegwart, R.: The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 35(10), 1157–1163 (2016)

  40. Bloesch, M., Omari, S., Hutter, M., Siegwart, R.: Robust visual inertial odometry using a direct EKF-based approach. In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 298–304. IEEE (2015)

  41. Delmerico, J., Scaramuzza, D.: A benchmark comparison of monocular visual-inertial odometry algorithms for flying robots. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 2502–2509. IEEE (2018)

  42. Oskiper, T., Zhu, Z., Samarasekera, S., Kumar, R.: Visual odometry system using multiple stereo cameras and inertial measurement unit. In: 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–8. IEEE (2007)

Acknowledgements

This work is supported by the National Natural Science Foundation of China (No. 61873176), the Natural Science Foundation of Jiangsu Province, China (BK20181433), the open fund of the Jiangsu Smart Factory Engineering Research Center, and the Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX19_1927). The authors would like to thank the referees for their constructive comments.

Author information

Corresponding author

Correspondence to Lei Yu.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Li, G., Yu, L. & Fei, S. A Binocular MSCKF-Based Visual Inertial Odometry System Using LK Optical Flow. J Intell Robot Syst 100, 1179–1194 (2020). https://doi.org/10.1007/s10846-020-01222-z

Keywords

Navigation