Autonomous Robots, 27:201

Vision-based control of near-obstacle flight

  • Antoine Beyeler
  • Jean-Christophe Zufferey
  • Dario Floreano

Abstract

This paper presents a novel control strategy, which we call optiPilot, for autonomous flight in the vicinity of obstacles. Most existing autopilots rely on a complete 6-degree-of-freedom state estimation using a GPS and an Inertial Measurement Unit (IMU) and are unable to detect and avoid obstacles. This is a limitation for missions such as surveillance and environment monitoring that may require near-obstacle flight in urban areas or mountainous environments. OptiPilot instead uses optic flow to estimate proximity of obstacles and avoid them.

Our approach takes advantage of the fact that, for most platforms in translational flight (as opposed to near-hover flight), the translational motion is essentially aligned with the aircraft's main axis. This property allows optic flow measurements to be interpreted directly as indications of obstacle proximity. Taking inspiration from the neural and behavioural strategies of flying insects, we propose a simple mapping of optic flow measurements into control signals that requires only a lightweight, power-efficient sensor suite and minimal processing power.
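To make the proximity interpretation concrete, the translational optic-flow relation p = (v/D)·sin θ (flow magnitude p, airspeed v, distance D, viewing angle θ off the motion axis) can be inverted to recover proximity, and a weighted sum over several viewing directions yields a control signal. The sketch below is purely illustrative under those assumptions; the function names, sensor geometry and weights are hypothetical and do not reproduce the published optiPilot implementation.

```python
import math

def proximity(flow_mag, airspeed, eccentricity):
    """For pure translation, optic flow is p = (v / D) * sin(theta),
    where theta is the angle between the viewing direction and the
    motion axis. Inverting gives proximity mu = 1/D (in 1/m)."""
    sin_t = math.sin(eccentricity)
    if sin_t <= 0.0:
        raise ValueError("viewing direction must be off the motion axis")
    return flow_mag / (airspeed * sin_t)

def control_signal(flows, airspeed, eccentricities, weights):
    """Weighted sum of proximities across a ring of optic-flow sensors.
    The weight vector (one per control axis, e.g. pitch or roll) is an
    assumed, hand-tuned gain set, not the paper's actual parameters."""
    mus = [proximity(p, airspeed, t) for p, t in zip(flows, eccentricities)]
    return sum(w * mu for w, mu in zip(weights, mus))
```

For example, at 10 m/s airspeed, a sensor looking 45° off the flight axis that measures about 1.41 rad/s of optic flow indicates an obstacle roughly 5 m away (mu = 0.2 m⁻¹).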

In this paper, we first describe results obtained in simulation before presenting the implementation of optiPilot on a real flying platform equipped only with lightweight, inexpensive optical computer-mouse sensors, MEMS rate gyroscopes and a pressure-based airspeed sensor. We show that the proposed control strategy not only allows collision-free flight in the vicinity of obstacles but also stabilises both attitude and altitude over flat terrain. These results shed new light on flight control by suggesting that the complex sensors and processing required for 6-degree-of-freedom state estimation may not be necessary for autonomous flight, and they pave the way toward the integration of autonomy into current and upcoming gram-scale flying platforms.

Keywords

Vision-based control · Optic-flow-based control · Obstacle avoidance · Near-obstacle flight · Autonomous unmanned aerial vehicle (UAV) · Micro-air vehicle (MAV)

Supplementary material

10514_2009_9139_MOESM1_ESM.zip (54.1 MB)
Zip archive containing a video of the flight stabilisation experiments of Section 5.3 (optiPilot_AR_flight_stabilisation.mp4) and a video of the obstacle avoidance experiments (optiPilot_AR_obstacle_avoidance.mp4).


Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  • Antoine Beyeler 1
  • Jean-Christophe Zufferey 1
  • Dario Floreano 1
  1. Ecole Polytechnique Fédérale de Lausanne (EPFL), Laboratory of Intelligent Systems (LIS), Lausanne, Switzerland