
Inertial-Aided Metric States and Surface Normal Estimation using a Monocular Camera

Journal of Intelligent & Robotic Systems

Abstract

In this paper, a visual-inertial fusion framework is proposed for estimating the metric states of a Micro Aerial Vehicle (MAV) using optic flow (OF) and a homography model. Aided by the attitude estimate from the on-board Inertial Measurement Unit (IMU), the computed homography matrix is reshaped into a vector and fed directly into an Extended Kalman Filter (EKF). The sensor fusion method is able to recover the metric distance, speed and acceleration bias, as well as the surface normal of the observed plane. We further consider reducing the size of the filter by using only part of the homography matrix as the system observation. Simulation results show that these smaller filters have reduced observability compared with the filter using the complete homography matrix; however, it is still possible to estimate the metric states as long as one of the axes is linearly excited. Experiments using real sensory data show that our method is superior to the homography decomposition method for state and slope estimation. The proposed method is also validated in closed-loop flight tests of a quadrotor.
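To illustrate the core idea of feeding the vectorised homography to an EKF, the following NumPy sketch runs one measurement update. It is not the paper's filter: the two-dimensional state [distance, velocity], the pure-descent geometry (R = I, translation along the optical axis, level plane, so the standard planar homography H = R − t·nᵀ/d reduces to a change in a single entry), and all numerical values are simplifying assumptions chosen for illustration only.

```python
import numpy as np

def ekf_update(x, P, z, h, H_jac, R):
    """Generic EKF measurement update step."""
    y = z - h                              # innovation
    S = H_jac @ P @ H_jac.T + R            # innovation covariance
    K = P @ H_jac.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H_jac) @ P
    return x, P

def homography_obs(d, v, dt):
    """Vectorised planar homography H = R - t n^T / d for pure descent:
    R = I, t = [0, 0, v*dt], n = [0, 0, 1], so only H[2,2] = 1 - v*dt/d."""
    H = np.eye(3)
    H[2, 2] = 1.0 - v * dt / d
    return H.reshape(9)

dt = 0.1
x = np.array([2.0, 0.0])            # initial guess: distance 2 m, velocity 0
P = np.eye(2)
z = homography_obs(1.0, -0.5, dt)   # "measured" H from true d = 1 m, v = -0.5 m/s

h = homography_obs(x[0], x[1], dt)  # predicted observation at the current state
# Jacobian of the single informative entry H[2,2] = 1 - v*dt/d
J = np.zeros((9, 2))
J[8] = [x[1] * dt / x[0] ** 2, -dt / x[0]]
R_meas = np.eye(9) * 1e-4

x, P = ekf_update(x, P, z, h, J, R_meas)
print(x)  # velocity estimate is pulled negative, consistent with a descent
```

Under general motion the off-diagonal entries of H also carry information, which is why passing all nine vectorised entries to the filter (rather than decomposing H first) retains more observability, as argued in the abstract.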




Correspondence to Ping Li.


Cite this article

Li, P., Garratt, M., Lambert, A. et al. Inertial-Aided Metric States and Surface Normal Estimation using a Monocular Camera. J Intell Robot Syst 87, 439–454 (2017). https://doi.org/10.1007/s10846-017-0506-9


Keywords

Navigation