
Patch-based Stereo Direct Visual Odometry Robust to Illumination Changes

International Journal of Control, Automation and Systems

Abstract

In this paper, we present a patch-based direct visual odometry (DVO) that is robust to illumination changes in a sequence of stereo images. Illumination change violates the photo-consistency assumption and degrades the performance of DVO, so it must be handled carefully when minimizing the photometric error. Our approach divides an incoming image into several buckets, and the patches within each bucket share their own affine illumination parameters, which account for the local illumination changes that a global affine model fails to capture; the method then aligns small patches across temporally consecutive images. We do not assign separate affine parameters to every patch, since doing so would incur a heavy computational load. Furthermore, we propose a prior weight as a function of the previous pose under a constant-velocity model, reflecting the assumption that the faster a camera moves, the more likely it is to maintain constant velocity. Lastly, we verify that the proposed algorithm outperforms the global affine illumination model on publicly available micro aerial vehicle and planetary rover datasets, which exhibit irregular and partial illumination changes due to the automatic exposure control of the camera and strong outdoor sunlight, respectively.
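To make the bucketed illumination model concrete, the following Python sketch illustrates a photometric residual with one affine pair (gain and offset) per bucket, plus a simple velocity-dependent prior weight. This is only an illustration of the idea described in the abstract, not the authors' implementation: the function names, the per-bucket parameter layout, the warp interface, and the specific form of the prior weight are all assumptions made for this sketch.

```python
import numpy as np

def photometric_residuals(I_ref, I_cur, patch_coords, patch_bucket,
                          bucket_affine, warp):
    """Stack photometric residuals over all patches.

    Each patch k belongs to bucket b = patch_bucket[k], and every bucket
    carries one affine illumination pair (a_b, c_b), so the model is
        I_cur(warp(x)) ~ a_b * I_ref(x) + c_b   for pixels x in the patch.
    """
    residuals = []
    for k, coords in enumerate(patch_coords):
        a, c = bucket_affine[patch_bucket[k]]      # affine pair of this patch's bucket
        for u, v in coords:                        # (u, v) pixel coordinates in the reference image
            u_w, v_w = warp(u, v)                  # warp into the current image (nearest pixel here)
            r = I_cur[int(round(v_w)), int(round(u_w))] - (a * I_ref[v, u] + c)
            residuals.append(r)
    return np.asarray(residuals)

def constant_velocity_prior_weight(prev_twist, gain=1.0):
    """Illustrative prior weight that grows with the previous motion:
    the faster the camera moved, the more the constant-velocity prior is trusted."""
    return gain * np.linalg.norm(prev_twist)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    I_ref = rng.random((48, 64))
    I_cur = 0.9 * I_ref + 0.05                     # synthetic global brightness change
    patches = [np.array([[10, 10], [11, 10], [10, 11]]),   # patch in bucket 0
               np.array([[40, 30], [41, 30], [40, 31]])]   # patch in bucket 1
    bucket_affine = [(0.9, 0.05),                  # bucket 0 compensates the change
                     (1.0, 0.0)]                   # bucket 1 ignores it
    r = photometric_residuals(I_ref, I_cur, patches, [0, 1],
                              bucket_affine, lambda u, v: (u, v))
    print("residual RMS:", np.sqrt(np.mean(r ** 2)))
    print("prior weight:", constant_velocity_prior_weight(np.array([0.2, 0.0, 0.1])))
```

In the demo, only bucket 0 compensates the synthetic brightness change, so its patch contributes near-zero residuals while the uncompensated bucket does not.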



Author information

Corresponding author

Correspondence to Chan Gook Park.

Additional information

Recommended by Associate Editor Kang-Hyun Jo under the direction of Editor Euntai Kim. This work was supported by the Ministry of Science and ICT of the Republic of Korea through the Space Core Technology Development Program under Project NRF-2018M1A3A3A02065722.

Jae Hyung Jung is an M.S. student in the Department of Mechanical and Aerospace Engineering of Seoul National University, Korea. He received the B.S. degree in the Department of Aerospace Engineering from Pusan National University, Korea in 2017. His research interests include visual odometry and vision-aided inertial navigation for mobile robots.

Sejong Heo is a Ph.D. student in the Department of Mechanical and Aerospace Engineering of Seoul National University, Korea. He received the B.S. and M.S. degrees in the Department of Mechanical and Aerospace Engineering from Seoul National University, Korea, in 2008 and 2010, respectively. He worked for Doosan DST in Korea, a manufacturer of high-precision INS. His current research topics include high-precision inertial navigation, Bayesian filtering, nonlinear optimization, and vision-aided inertial navigation for land vehicles and mobile robots.

Chan Gook Park received the B.S., M.S., and Ph.D. degrees in control and instrumentation engineering from Seoul National University, Seoul, Korea, in 1985, 1987, and 1993, respectively. He worked with Prof. Jason L. Speyer on peak-seeking control for formation flight at the University of California, Los Angeles (UCLA) as a postdoctoral fellow in 1998. From 1994 to 2003, he was with Kwangwoon University, Seoul, Korea, as an associate professor. In 2003, he joined the faculty of the School of Mechanical and Aerospace Engineering at Seoul National University, Korea, where he is currently a professor. From 2009 to 2010, he was a visiting scholar with the Department of Aerospace Engineering at the Georgia Institute of Technology, Atlanta, GA. He served as a chair of the IEEE AES Korea Chapter until 2009. His current research topics include advanced filtering techniques, high-precision INS, GPS/INS integration, MEMS-based pedestrian dead reckoning, and visual-inertial navigation.

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Jung, J.H., Heo, S. & Park, C.G. Patch-based Stereo Direct Visual Odometry Robust to Illumination Changes. Int. J. Control Autom. Syst. 17, 743–751 (2019). https://doi.org/10.1007/s12555-018-0199-2


Keywords

Navigation