
A method for ego-motion estimation in micro-hovering platforms flying in very cluttered environments


Abstract

We aim to develop autonomous miniature hovering robots capable of navigating unstructured, GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors that allow such platforms to fly by themselves. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors that runs in real time on a microcontroller and enables autonomous flight. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors, or assumptions about the environment. We introduce the translational optic-flow direction constraint, which uses the direction of the optic flow, but not its scale, to correct for inertial sensor drift during changes of direction. This solution requires considerably simpler electronics and sensors and works in environments of any geometry. Here we describe the implementation and performance of the method on a hovering robot equipped with eight 0.65 g optic-flow sensors, and show that it can be used for closed-loop control of various motions.




Acknowledgments

The authors thank the Parc Scientifique office of Logitech at EPFL for providing the bare mouse chips. The authors also thank Przemyslaw Kornatowski for helping to design and manufacture the flying platform. We also thank Ramon Pericet-Camara, Felix Schill and Julien Lecoeur for their help. We thank Auke Ijspeert for giving us access to motion capture equipment. Finally, we thank the anonymous reviewers for their contributions to improving the manuscript. The method described in this paper has been submitted for patenting (European patent filing number EP12191669.6). This research was supported by the Swiss National Science Foundation through the National Centre of Competence in Research (NCCR) Robotics.

Author information


Correspondence to Adrien Briod.

Electronic supplementary material

Below are the links to the electronic supplementary material.

Supplementary material 1 (mpeg 12218 KB)

Supplementary material 2 (mpeg 13074 KB)

Supplementary material 3 (mpeg 12512 KB)


Cite this article

Briod, A., Zufferey, JC. & Floreano, D. A method for ego-motion estimation in micro-hovering platforms flying in very cluttered environments. Auton Robot 40, 789–803 (2016). https://doi.org/10.1007/s10514-015-9494-4


Keywords

Navigation