Autonomous Robots, Volume 40, Issue 5, pp 789–803

A method for ego-motion estimation in micro-hovering platforms flying in very cluttered environments

  • Adrien Briod
  • Jean-Christophe Zufferey
  • Dario Floreano
Article

Abstract

We aim to develop autonomous miniature hovering flying robots capable of navigating in unstructured, GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors that allow such platforms to fly by themselves. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors; it runs in real time on a microcontroller and enables autonomous flight. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors, or assumptions about the environment. The method introduces the translational optic-flow direction constraint, which uses the direction of the optic flow, but not its scale, to correct for inertial sensor drift during changes of direction. This solution requires considerably simpler electronics and sensors than such vision-based approaches and works in environments of any geometry. We describe the implementation and performance of the method on a hovering robot equipped with eight 0.65 g optic-flow sensors, and show that it can be used for closed-loop control of various motions.
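
To make the translational optic-flow direction constraint more concrete, the following minimal Python sketch illustrates the underlying geometry for a single optic-flow sensor with viewing direction d: the direction of the translational optic flow depends only on the direction of the robot's velocity, not on the unknown distance to the scene, so it can be used to correct the direction of a drifting velocity estimate without any scale information. This sketch is not the filter described in the paper (which fuses such constraints from several sensors with inertial measurements); the function names and the simple gain-based correction are illustrative assumptions.

    import numpy as np

    def predicted_flow_direction(v, d):
        """Direction of the translational optic flow seen by a sensor looking
        along unit vector d while translating with velocity v. The flow
        magnitude scales with the unknown inverse distance to the scene,
        but its direction does not, which is what the constraint exploits."""
        flow = -(v - np.dot(v, d) * d)        # component of -v perpendicular to d
        norm = np.linalg.norm(flow)
        return flow / norm if norm > 1e-9 else np.zeros(3)

    def direction_update(v_est, d, measured_dir, gain=0.2):
        """Hypothetical, simplified correction step: nudge the inertially
        predicted velocity v_est so that its predicted flow direction moves
        toward the measured one. The optic-flow scale is never used."""
        error = measured_dir - predicted_flow_direction(v_est, d)
        error -= np.dot(error, d) * d         # keep only the observable part,
                                              # in the plane perpendicular to d
        # Step size proportional to the current speed estimate, so the update
        # mainly rotates v_est rather than rescaling it.
        return v_est - gain * np.linalg.norm(v_est) * error

    if __name__ == "__main__":
        v_true = np.array([1.0, 0.2, 0.0])    # true velocity (unknown to the filter)
        v_est = np.array([0.8, -0.3, 0.1])    # drifted inertial estimate
        d = np.array([0.0, 0.0, -1.0])        # sensor looking straight down
        measured = predicted_flow_direction(v_true, d)  # ideal, noise-free direction
        for _ in range(50):
            v_est = direction_update(v_est, d, measured)
        print(v_est)  # its direction in the plane perpendicular to d approaches v_true's

Note that a single direction measurement leaves the speed (and the velocity component along d) unobservable; in the paper this ambiguity is handled by combining several sensors with the inertial prediction, which the sketch above does not attempt to reproduce.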

Keywords

Aerial robotics · Sensor fusion · Ego-motion estimation · Optic-flow

Notes

Acknowledgments

The authors thank the Parc Scientifique office of Logitech at EPFL for providing the bare mouse chips. The authors also thank Przemyslaw Kornatowski for his help in designing and manufacturing the flying platform. We also thank Ramon Pericet-Camara, Felix Schill and Julien Lecoeur for their help, and Auke Ijspeert for giving us access to motion capture equipment. Finally, we thank the anonymous reviewers for their contributions to improving the manuscript. The method described in this paper has been submitted for patenting (European patent filing number EP12191669.6). This research was supported by the Swiss National Science Foundation through the National Centre of Competence in Research (NCCR) Robotics.

Supplementary material

Supplementary material 1 (mpeg 12218 KB)

Supplementary material 2 (mpeg 13074 KB)

Supplementary material 3 (mpeg 12512 KB)

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Adrien Briod (1)
  • Jean-Christophe Zufferey (2)
  • Dario Floreano (1)

  1. The Laboratory of Intelligent Systems (LIS), Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
  2. SenseFly Ltd, Cheseaux-sur-Lausanne, Switzerland
