Abstract
Conventional visual-inertial odometry (VIO)-based localization techniques perform well in environments where stable features are guaranteed. However, their performance degrades when feature quality and quantity are poor. As a solution, this paper proposes U-VIO, a tightly coupled UWB visual-inertial odometry algorithm. In the front-end, the distance measured between a static UWB anchor and each keyframe is used as a residual for pose estimation. In the back-end, after a loop closure between the current keyframe and previous keyframes is detected from visual information, UWB loop constraints are added. The proposed algorithm is evaluated on data collected with an unmanned ground vehicle (UGV). The experimental analysis compares the case where the UWB factor is added only to the front-end with the case where the back-end is additionally considered. The proposed algorithm, which applies UWB factors throughout the entire graph structure, shows the most robust pose estimation performance in these evaluations.
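The front-end factor described above can be illustrated with a minimal sketch of a UWB range residual: the difference between the keyframe-to-anchor distance predicted by the current pose estimate and the measured UWB range. The function name and interface below are hypothetical, not the paper's implementation; in practice such a residual would be one term in a nonlinear least-squares graph optimization.

```python
import math

def uwb_range_residual(keyframe_pos, anchor_pos, measured_range):
    """Illustrative UWB range residual (not the paper's code).

    keyframe_pos, anchor_pos: 3D positions as (x, y, z) tuples.
    measured_range: distance reported by the UWB anchor-tag pair, in meters.
    Returns predicted distance minus measured distance; an optimizer would
    drive this toward zero alongside the visual and inertial residuals.
    """
    predicted = math.dist(keyframe_pos, anchor_pos)
    return predicted - measured_range

# Keyframe at (3, 4, 0) m, anchor at the origin: true distance is 5 m,
# so a 4.9 m UWB reading leaves a 0.1 m residual.
r = uwb_range_residual((3.0, 4.0, 0.0), (0.0, 0.0, 0.0), 4.9)
```

Because a single range constrains only a sphere around the anchor, such factors are useful as a complement to visual-inertial residuals rather than as a standalone position fix.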
Acknowledgement
This work has been supported by the Unmanned Swarm CPS Research Laboratory program of the Defense Acquisition Program Administration and the Agency for Defense Development (UD190029ED).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Jung, K., Shin, S., Myung, H. (2022). U-VIO: Tightly Coupled UWB Visual Inertial Odometry for Robust Localization. In: Kim, J., et al. Robot Intelligence Technology and Applications 6. RiTA 2021. Lecture Notes in Networks and Systems, vol 429. Springer, Cham. https://doi.org/10.1007/978-3-030-97672-9_24
Print ISBN: 978-3-030-97671-2
Online ISBN: 978-3-030-97672-9
eBook Packages: Intelligent Technologies and Robotics (R0)