
Machine Vision and Applications, Volume 30, Issue 7–8, pp 1145–1155

Visual odometry with a single-camera stereo omnidirectional system

  • Carlos Jaramillo
  • Liang Yang
  • J. Pablo Muñoz
  • Yuichi Taguchi
  • Jizhong Xiao
Original Paper

Abstract

This paper presents the advantages of a single-camera stereo omnidirectional system (SOS) for estimating egomotion in real-world environments. The challenge of achieving omnidirectional stereo vision with a single camera is what separates our work from others. In practice, dynamic environments, deficient illumination, and poorly textured surfaces leave few features to track in the observable scene, which degrades the pose estimates of visual odometry systems regardless of their field of view. We compare the tracking accuracy and stability of the single-camera SOS against an RGB-D device under various real circumstances. Our quantitative evaluation is performed with respect to 3D ground-truth data obtained from a motion capture system. The datasets and experimental results we provide are unique owing to the nature of our catadioptric omnistereo rig and the situations in which we captured these motion sequences. We have implemented a tracking system with simple rules that applies to both synthetic and real scenes. Our implementation makes no motion model assumptions and maintains a fixed configuration among the compared sensors. Our experimental outcomes confirm the robustness in metric 3D visual odometry estimation that the single-camera SOS can achieve under normal and special conditions in which narrow-view perspective systems such as RGB-D cameras would fail.
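
To make the quantitative evaluation concrete, the sketch below shows one standard way to compare an estimated trajectory against motion-capture ground truth: rigidly align the two position tracks with the closed-form Umeyama/Horn solution and report the absolute trajectory error as an RMSE. This is a minimal illustration, not the authors' code; the function names (align_umeyama, ate_rmse) and the NumPy-only implementation are assumptions for this example.

    import numpy as np

    def align_umeyama(gt, est):
        # Closed-form rigid alignment (Umeyama/Horn, no scale): find the
        # rotation R and translation t minimizing ||gt - (R @ est + t)||.
        # gt, est: (3, N) arrays of time-synchronized 3D positions.
        mu_gt = gt.mean(axis=1, keepdims=True)
        mu_est = est.mean(axis=1, keepdims=True)
        cov = (gt - mu_gt) @ (est - mu_est).T / gt.shape[1]
        U, _, Vt = np.linalg.svd(cov)
        # Guard against a reflection being returned instead of a rotation.
        S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
        R = U @ S @ Vt
        t = mu_gt - R @ mu_est
        return R, t

    def ate_rmse(gt, est):
        # Root-mean-square absolute trajectory error after rigid alignment,
        # in the same metric units as the ground truth (e.g., meters).
        R, t = align_umeyama(gt, est)
        residuals = gt - (R @ est + t)
        return float(np.sqrt((residuals ** 2).sum(axis=0).mean()))

Usage would be ate_rmse(gt_xyz, vo_xyz) with two 3 x N position tracks sampled at matched timestamps; lower values indicate a visual odometry estimate closer to the motion-capture ground truth.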

Notes

Acknowledgements

We thank the MTA Metro-North Railroad for letting us collect video sequences at the main lobby of the Grand Central Terminal in NYC.

Supplementary material

Supplementary material 1 (MP4, 41,453 KB)


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Computer Science Department, The Graduate Center of The City University of New York (CUNY), New York, USA
  2. Electrical Engineering Department, The City College, CUNY, New York, USA
  3. Intel Corporation, Santa Clara, USA
  4. Mitsubishi Electric Research Laboratories, Cambridge, USA
