Autonomous Robots, Volume 41, Issue 4, pp 903–917

Vision-based and IMU-aided scale factor-free linear velocity estimator

  • Rafik Mebarki
  • Vincenzo Lippiello
  • Bruno Siciliano


This paper presents a new linear velocity estimator that is based on the unscented Kalman filter and fuses image information with inertial measurements. The proposed technique is independent of the scale factor in the case of a planar observed scene and requires no a priori knowledge of the scene. Image moments of virtual objects, i.e., sets of classical image features such as corners collected online, are employed as the sole correcting information fed back to the estimator. Experiments performed with a quadrotor equipped with a fisheye camera highlight the potential of the proposed approach.
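The abstract describes feeding image moments of "virtual objects" (sets of tracked corner features) back to the filter as its correction term. The paper's exact moment formulation is not reproduced on this page, but the following minimal sketch illustrates the underlying idea of discrete image moments computed from a point set; the function name and the example coordinates are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def image_moments(points, p_max=1, q_max=1):
    """Raw discrete image moments m_pq = sum_i x_i^p * y_i^q computed
    over a set of feature points (a 'virtual object').

    points: (N, 2) array of normalized image coordinates (x, y).
    Returns a dict mapping (p, q) to the moment value.
    """
    x, y = points[:, 0], points[:, 1]
    return {
        (p, q): float(np.sum(x**p * y**q))
        for p in range(p_max + 1)
        for q in range(q_max + 1)
    }

# Example: three corner features collected online.
pts = np.array([[0.1, 0.2],
                [0.3, 0.2],
                [0.2, 0.4]])
m = image_moments(pts)

# m[(0,0)] is the number of points; the first-order moments give the
# centroid, a classic moment-based feature for visual servoing/estimation.
cx = m[(1, 0)] / m[(0, 0)]
cy = m[(0, 1)] / m[(0, 0)]
```

In a filter such as the one described here, scalar features derived from these moments would serve as the measurement vector in the unscented Kalman filter's correction step, with the inertial measurements driving the prediction step.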


Keywords: UAV quadrotors · Velocity estimation · Computer vision · Data fusion · Kalman filter



The research leading to these results has been supported by the ARCAS and SHERPA collaborative projects, which have received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreements ICT-287617 and ICT-600958, respectively. The authors are solely responsible for the content of this paper, which does not represent the opinion of the European Community; the Community is not responsible for any use that might be made of the information contained therein.



Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Rafik Mebarki (1)
  • Vincenzo Lippiello (1)
  • Bruno Siciliano (1)

  1. PRISMA Lab, Dipartimento di Ingegneria Elettrica e Tecnologie dell’Informazione, Università degli Studi di Napoli Federico II, Naples, Italy
