Fusion of Monocular Visual-Inertial Measurements for Three Dimensional Pose Estimation

  • Gonzalo Perez-Paina (corresponding author)
  • Claudio Paz
  • Miroslav Kulich
  • Martin Saska
  • Gastón Araguás
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9991)


This work describes a novel fusion scheme to estimate the pose of a UAV using inertial sensors and a monocular camera. The visual motion algorithm is based on the plane-induced homography computed from so-called spectral features. The algorithm can operate on images containing few corner-like features, which makes the state estimation more robust. The key contribution of the paper is the use of this visual algorithm in a fusion scheme with inertial sensors, exploiting the complementary properties of the two sensor types. Results are presented for a simulated six-degree-of-freedom motion that satisfies the dynamic constraints of a quadcopter; virtual views are generated along this trajectory by cropping a real floor image. The simulation results show that the presented algorithm is precise enough to be used on board for controlling the UAV in hovering operations.
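The "spectral features" mentioned in the abstract belong to the family of frequency-domain image registration methods. As an illustrative sketch only (not the paper's actual feature pipeline), phase correlation recovers the translation between two overlapping views of the floor from the normalized cross-power spectrum; the function name and tolerance below are my own choices:

```python
import numpy as np

def phase_correlation(img_a, img_b):
    """Estimate the (dy, dx) translation mapping img_a onto img_b.

    Works on equally sized grayscale arrays: the peak of the inverse
    FFT of the normalized cross-power spectrum marks the shift.
    """
    fa = np.fft.fft2(img_a)
    fb = np.fft.fft2(img_b)
    cross = np.conj(fa) * fb
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    size = np.array(corr.shape, dtype=float)
    shift[shift > size / 2] -= size[shift > size / 2]  # wrap to signed offsets
    return shift                            # (dy, dx)
```

For example, comparing a random texture with a circularly shifted copy of itself recovers the applied shift exactly:

```python
rng = np.random.default_rng(0)
floor = rng.random((64, 64))
shifted = np.roll(floor, (5, -3), axis=(0, 1))
phase_correlation(floor, shifted)  # recovers [5., -3.]
```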


Keywords: Sensor fusion · Visual odometry · Inertial sensors · Pose estimation · UAV · Kalman filter
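The complementary character of the two sensors lends itself to Kalman filtering: the IMU drives a high-rate prediction, while the slower visual pose estimate supplies the correction. The sketch below is a deliberately simplified single-axis linear filter, not the paper's actual estimator; `dt`, `q_acc`, and `r_vis` are assumed tuning parameters:

```python
import numpy as np

class ImuVisionKF:
    """Single-axis position/velocity Kalman filter: accelerometer
    readings drive the prediction, visual position fixes correct it.
    Illustrative sketch only; the paper's estimator is more elaborate.
    """

    def __init__(self, dt=0.01, q_acc=0.5, r_vis=0.02):
        self.x = np.zeros(2)                        # [position, velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.B = np.array([0.5 * dt**2, dt])        # acceleration input map
        self.Q = q_acc**2 * np.outer(self.B, self.B)
        self.H = np.array([[1.0, 0.0]])             # camera measures position
        self.R = np.array([[r_vis**2]])

    def predict(self, accel):
        """High-rate IMU step."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, visual_pos):
        """Low-rate visual correction."""
        y = visual_pos - self.H @ self.x            # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

Run with a constant 1 m/s² acceleration and noise-free visual fixes every tenth step; after one second the estimate converges to the true kinematic state (position 0.5 m, velocity 1.0 m/s):

```python
kf = ImuVisionKF()
for k in range(100):
    kf.predict(1.0)                           # IMU at 100 Hz
    if (k + 1) % 10 == 0:                     # camera at 10 Hz
        t = (k + 1) * 0.01
        kf.update(np.array([0.5 * t**2]))
# kf.x is now approximately [0.5, 1.0]
```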



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Gonzalo Perez-Paina (1, corresponding author)
  • Claudio Paz (1)
  • Miroslav Kulich (2)
  • Martin Saska (3)
  • Gastón Araguás (1)
  1. Center for IT Research, National Technological University, Córdoba, Argentina
  2. Czech Institute of Informatics, Robotics, and Cybernetics, Czech Technical University in Prague, Prague, Czech Republic
  3. Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czech Republic
