
Autonomous Robots, Volume 29, Issue 3–4, pp 381–399

A terrain-following control approach for a VTOL Unmanned Aerial Vehicle using average optical flow

  • Bruno Hérissé
  • Tarek Hamel
  • Robert Mahony
  • François-Xavier Russotto

Abstract

This paper presents a nonlinear controller for terrain following of a vertical take-off and landing (VTOL) vehicle. The VTOL vehicle is assumed to be a rigid body, equipped with a minimum sensor suite (camera, IMU and barometric altimeter), maneuvering over a textured rough terrain made of a concatenation of planar surfaces. Assuming that the forward velocity is separately regulated to a desired value, the proposed control approach ensures terrain following and guarantees that the vehicle does not collide with the ground during the task. The controller uses optical flow acquired from multiple spatially separated observation points, typically obtained via multiple cameras or non-collinear viewing directions of a single camera. The proposed control algorithm has been tested extensively in simulation and then implemented on a quadrotor UAV to demonstrate the performance of the closed-loop system.
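
The abstract does not give implementation details, but the core idea it describes can be illustrated with a rough Python sketch (all function names, gains and numbers below are hypothetical, not taken from the paper): translational optical flow measured along a ground-facing direction scales roughly as forward speed divided by height above the terrain, so averaging the flow over several viewing directions and regulating that average to a setpoint keeps the height proportional to the separately regulated forward speed.

    import numpy as np

    def average_optical_flow(flows):
        # Average translational optical-flow magnitudes (rad/s) obtained from
        # several spatially separated observation points / viewing directions.
        return float(np.mean(flows))

    def vertical_thrust_command(avg_flow, flow_setpoint, k_p, hover_thrust):
        # Proportional feedback on the averaged flow (a simplification of the
        # nonlinear controller in the paper). For ground-facing measurements,
        # flow ~ forward_speed / height: if the vehicle is too high the flow is
        # below the setpoint, the error is negative, and thrust drops below
        # hover so the vehicle descends; if too low, it climbs.
        error = avg_flow - flow_setpoint
        return hover_thrust + k_p * error

    # Hypothetical usage with made-up numbers (three viewing directions):
    flows = [0.80, 0.92, 0.85]                      # measured flow, rad/s
    u = vertical_thrust_command(average_optical_flow(flows),
                                flow_setpoint=1.0,  # desired flow, rad/s
                                k_p=2.0,            # proportional gain
                                hover_thrust=9.81)  # thrust/mass at hover, m/s^2
    print(u)

The paper's actual control law is nonlinear and comes with a guarantee that the vehicle never collides with the ground; the proportional law above only conveys the sign and scaling of the flow-regulation feedback.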

Keywords

Optical flow · Terrain following · Obstacle avoidance · Nonlinear control · Aerial robotic vehicle

Supplementary material

(MP4 4.79 MB)

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Bruno Hérissé (1)
  • Tarek Hamel (2)
  • Robert Mahony (3)
  • François-Xavier Russotto (1)
  1. CEA, LIST, Interactive Robotics Laboratory, Fontenay aux Roses, France
  2. I3S, UNSA-CNRS, Sophia Antipolis Cedex, France
  3. Department of Engineering, The Australian National University, Canberra, Australia