Abstract
In the emerging era of robust perception, Visual-Inertial Odometry (VIO), which tightly couples a camera with an Inertial Measurement Unit (IMU), can obtain high-precision local pose estimates in unknown environments, and its low cost and small size have attracted widespread attention. However, owing to the limitations of its measurement principle, error still accumulates over long-term operation, and large-scale outdoor environments remain a major challenge for VIO. The Global Navigation Satellite System (GNSS) can provide accurate global estimates for VIO in open outdoor environments and correct the drift caused by long-term operation; conversely, VIO continues to function where GNSS is denied, which makes seamless indoor and outdoor navigation possible. This paper therefore proposes a GNSS-assisted visual-inertial SLAM algorithm. With an optimized tightly coupled VIO as the main body, the pose information obtained from GNSS is fused with the VIO solution to improve global positioning while preserving local pose accuracy. To evaluate the approach, a simulation experiment based on the KITTI dataset was carried out. The results show that the GNSS-aided VIO system achieves a mean error of 1.687 m, a standard deviation of 1.176 m, and a root mean square error of 2.056 m, an improvement of nearly 80% over the unaided system. The system also remains functional in environments where GNSS is denied, and its robustness is enhanced.
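The mean error, standard deviation, and RMSE quoted above are standard per-frame position-error statistics for an estimated trajectory against ground truth. A minimal sketch of how such statistics are typically computed (the function name and toy trajectories below are illustrative, not from the paper):

```python
import numpy as np

def ate_stats(est_xyz, gt_xyz):
    """Mean, standard deviation, and RMSE of per-frame position error.

    est_xyz, gt_xyz: (N, 3) arrays of estimated and ground-truth positions,
    assumed already time-aligned and expressed in the same frame.
    """
    err = np.linalg.norm(np.asarray(est_xyz) - np.asarray(gt_xyz), axis=1)
    return err.mean(), err.std(), np.sqrt(np.mean(err ** 2))

# Toy 3-frame trajectory (hypothetical values, metres)
est = [[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [2.0, 0.3, 0.0]]
gt  = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
mean_e, std_e, rmse = ate_stats(est, gt)
```

Note that RMSE and standard deviation differ because the mean error is nonzero; in practice the estimated trajectory is first aligned to ground truth (e.g. by a rigid-body fit) before these statistics are taken.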
Acknowledgements
This research was jointly funded by the National Key Research and Development Program (No. 2021YFB3901300), the National Natural Science Foundation of China (Nos. 61773132, 61633008, 61803115, 62003109), the 145 High-tech Ship Innovation Project sponsored by the Chinese Ministry of Industry and Information Technology, the Heilongjiang Province Research Science Fund for Excellent Young Scholars (No. YQ2020F009), and the Fundamental Research Funds for Central Universities (Nos. 3072019CF0401, 3072020CFT0403).
Copyright information
© 2022 Aerospace Information Research Institute
Cite this paper
Zhao, L., Wang, X., Zheng, X., Jia, C. (2022). Research on Visual-Inertial SLAM Technology with GNSS Assistance. In: Yang, C., Xie, J. (eds) China Satellite Navigation Conference (CSNC 2022) Proceedings. CSNC 2022. Lecture Notes in Electrical Engineering, vol 909. Springer, Singapore. https://doi.org/10.1007/978-981-19-2580-1_36
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-2579-5
Online ISBN: 978-981-19-2580-1
eBook Packages: Engineering (R0)