Autonomous Robots, Volume 43, Issue 6, pp. 1605–1622

Autonomous flight with robust visual odometry under dynamic lighting conditions

  • Pyojin Kim
  • Hyeonbeom Lee
  • H. Jin Kim


Sensitivity to lighting conditions poses a challenge when using visual odometry (VO) for autonomous navigation of small aerial vehicles. We present an illumination-robust direct visual odometry for stable autonomous flight of an aerial robot under unpredictable lighting conditions. The proposed stereo VO achieves robustness to light-changing environments by employing a patch-based affine illumination model to compensate for abrupt, irregular illumination changes during direct motion estimation. We furthermore incorporate a motion prior from feature-based stereo visual odometry into the optimization, yielding more accurate and more stable motion estimates. Thorough analyses of the convergence rate and linearity index of the feature-based and direct VO methods support the effectiveness of using this motion prior. We extensively evaluate the proposed algorithm on synthetic and real micro aerial vehicle datasets with ground truth. Autonomous flight experiments with an aerial robot show that the proposed method successfully estimates the 6-DoF pose under significant illumination changes.
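The core idea of a patch-based affine illumination model can be illustrated with a minimal sketch: for each image patch, a gain `a` and bias `b` are fitted so that `a * I_target + b` best matches the reference patch in a least-squares sense, and the photometric residual is computed after this compensation. This is an illustrative simplification, not the authors' exact formulation (in the paper the affine parameters are estimated jointly with the 6-DoF motion); the function names are hypothetical.

```python
import numpy as np

def fit_affine_illumination(patch_ref, patch_tgt):
    """Least-squares fit of a * I_tgt + b ~= I_ref over one patch."""
    x = patch_tgt.ravel().astype(float)
    y = patch_ref.ravel().astype(float)
    A = np.stack([x, np.ones_like(x)], axis=1)   # design matrix [I_tgt, 1]
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b

def compensated_residual(patch_ref, patch_tgt):
    """Photometric residual after per-patch affine compensation."""
    a, b = fit_affine_illumination(patch_ref, patch_tgt)
    return a * patch_tgt.astype(float) + b - patch_ref.astype(float)

# A patch that underwent a pure affine brightness change
# (ref = 0.5 * tgt + 10) yields a near-zero residual after compensation.
ref = np.arange(16, dtype=float).reshape(4, 4)
tgt = (ref - 10.0) / 0.5
r = compensated_residual(ref, tgt)
print(np.max(np.abs(r)))  # ~0
```

Fitting the two parameters per patch, rather than one global pair per image, is what allows the model to absorb spatially irregular illumination changes (e.g. a lamp switching on in one corner of the frame).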


Keywords: Aerial robotics, Stereo visual odometry, Robustness, Illumination changes



This research was supported by the Ministry of Science and ICT under the Information Technology Research Center (ITRC) program (IITP-2018-2017-0-01637) supervised by the Institute for Information & communications Technology Promotion, by the Automation and Systems Research Institute (ASRI), and by Samsung Research, Samsung Electronics Co., Ltd.

Supplementary material

Supplementary material 1 (MP4, 63,455 KB)



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Mechanical and Aerospace Engineering, Seoul National University, Seoul, South Korea
  2. Department of Electronics Engineering, Kyungpook National University, Daegu, South Korea
