
U-VIO: Tightly Coupled UWB Visual Inertial Odometry for Robust Localization

  • Conference paper
  • Published in: Robot Intelligence Technology and Applications 6 (RiTA 2021)

Abstract

Conventional visual-inertial odometry (VIO)-based localization techniques perform well in environments where stable features are guaranteed, but their performance degrades when the quality and quantity of features are poor. As a solution, this paper proposes U-VIO, a tightly coupled UWB visual-inertial odometry algorithm. In the front-end, the distance measurement between a static UWB anchor and a keyframe is treated as a residual for pose estimation. In the back-end, after a loop-closure relationship between the current keyframe and previous keyframes is found from visual information, UWB loop constraints are added. The proposed algorithm was evaluated on data collected with an unmanned ground vehicle (UGV). The experimental analysis compares the case where the UWB factor is added only to the front-end against the case where the back-end is additionally considered. Across these evaluations, the proposed algorithm, which tightly incorporates UWB factors throughout the entire graph structure, showed the most robust pose estimation performance.
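The tightly coupled formulation described above amounts to adding the anchor-to-keyframe range as one more residual in the pose-graph optimization. The following is a minimal sketch of such a range residual, not the authors' implementation; the function name, noise model, and positions are illustrative assumptions:

```python
import numpy as np

def uwb_range_residual(keyframe_pos, anchor_pos, measured_range, sigma=0.1):
    """Whitened residual between a UWB range measurement and the
    predicted Euclidean distance from a static anchor to a keyframe.

    keyframe_pos, anchor_pos : 3D positions as numpy arrays
    measured_range           : UWB distance measurement in meters
    sigma                    : assumed range-noise standard deviation (m)
    """
    predicted = np.linalg.norm(keyframe_pos - anchor_pos)
    return (measured_range - predicted) / sigma

# Toy example: anchor at the origin, keyframe at (3, 4, 0),
# so the true range is 5 m; the sensor reads 5.2 m.
r = uwb_range_residual(np.array([3.0, 4.0, 0.0]),
                       np.array([0.0, 0.0, 0.0]),
                       measured_range=5.2)
# r is the 0.2 m error scaled by the 0.1 m noise, i.e. about 2.0
```

In a factor-graph back-end, terms of this form would be stacked alongside the visual reprojection and IMU preintegration residuals for the front-end case, and attached between loop-closure keyframe pairs for the back-end case.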



Acknowledgement

This work has been supported by the Unmanned Swarm CPS Research Laboratory program of the Defense Acquisition Program Administration and the Agency for Defense Development (UD190029ED).

Author information

Correspondence to Hyun Myung.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG


Cite this paper

Jung, K., Shin, S., Myung, H. (2022). U-VIO: Tightly Coupled UWB Visual Inertial Odometry for Robust Localization. In: Kim, J., et al. Robot Intelligence Technology and Applications 6. RiTA 2021. Lecture Notes in Networks and Systems, vol 429. Springer, Cham. https://doi.org/10.1007/978-3-030-97672-9_24
