
Patch-based Stereo Direct Visual Odometry Robust to Illumination Changes

  • Jae Hyung Jung
  • Sejong Heo
  • Chan Gook Park
Article

Abstract

In this paper, we present a patch-based direct visual odometry (DVO) method that is robust to illumination changes in a sequence of stereo images. Illumination change violates the photo-consistency assumption and degrades the performance of DVO; it must therefore be handled carefully when minimizing the photometric error. Our approach divides each incoming image into several buckets, and the patches within a bucket share a single affine illumination parameter that accounts for local illumination changes which a global affine model fails to capture; the small patches are then aligned across temporally adjacent images. We do not assign an affine parameter to every individual patch, since doing so would incur a prohibitive computational load. Furthermore, we propose a prior weight that is a function of the previous pose under a constant-velocity model, reflecting the assumption that the faster the camera moves, the more likely it is to maintain constant velocity. Finally, we verify that the proposed algorithm outperforms the global affine illumination model on publicly available micro aerial vehicle and planetary rover datasets, which exhibit irregular and partial illumination changes caused by the camera's automatic exposure and strong outdoor sunlight, respectively.
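
To make the cost structure concrete, one plausible formulation in our own notation (a sketch under stated assumptions, not the paper's exact equations) is the following: each bucket b carries affine parameters (\alpha_b, \beta_b), and a patch pixel p_i with stereo depth d_i belonging to bucket b contributes the photometric residual

r_{i,b}(\xi) = I_k\big( \pi( T(\xi)\, \pi^{-1}(p_i, d_i) ) \big) - \big( e^{\alpha_b} I_{k-1}(p_i) + \beta_b \big),

where T(\xi) is the relative camera pose and \pi the camera projection. The pose and the per-bucket affine parameters would then be estimated jointly by minimizing

E(\xi, \{\alpha_b, \beta_b\}) = \sum_{b} \sum_{i \in b} \rho\big( r_{i,b}(\xi) \big) + \lambda\big( \| v_{k-1} \| \big)\, \big\| \xi - \hat{\xi}_{\mathrm{cv}} \big\|^2,

where \rho is a robust loss, \hat{\xi}_{\mathrm{cv}} is the pose predicted by the constant-velocity model, and \lambda is a prior weight that grows with the magnitude of the previous motion v_{k-1}, so that faster motion pulls the estimate more strongly toward the constant-velocity prediction.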

Keywords

Affine illumination model, direct visual odometry, micro aerial vehicle, nonlinear optimization, rover navigation



Copyright information

© Institute of Control, Robotics and Systems and The Korean Institute of Electrical Engineers and Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Mechanical and Aerospace Engineering / Automation and System Research Institute, Seoul National University, Daehak-dong, Gwanak-gu, Seoul, Korea
