Autonomous Robots, Volume 34, Issue 1–2, pp 19–34

Design of a 3D snapshot based visual flight control system using a single camera in hover

  • Matthew A. Garratt
  • Andrew J. Lambert
  • Hamid Teimoori

Abstract

The problem of developing a reliable system for sensing and controlling the hover of a Micro Air Vehicle (MAV) using visual snapshots is considered. This problem is part of a larger project to develop an autonomous MAV controlled by visual information alone. A new algorithm is proposed that uses a stored image of the ground, a snapshot taken directly beneath the MAV, as a visual anchor point. The absolute translation and velocity of the aircraft are then computed by comparing subsequent frames with the stored image and are fed into the position controller. To improve performance, several issues, such as the effect of scale uncertainty on the closed-loop stability of the platform, are investigated. For controller design and testing, we analytically derive a complete model of a small helicopter with no stabilizing bar (flybar). Simulation results for 2D and 3D snapshots confirm the effectiveness of the proposed algorithm.
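As a rough illustration of the snapshot idea summarised above, the following Python sketch registers each incoming frame against a stored anchor snapshot, converts the pixel shift into a metric position and velocity estimate, and feeds these into a simple position control law. It is only a sketch under stated assumptions: the registration method (OpenCV phase correlation), the focal length, frame rate, and controller gains are placeholders and not the authors' implementation.

```python
# Minimal sketch of snapshot-based hover position estimation.
# Assumptions (not from the paper): phase correlation as the registration
# step, a pinhole camera with known focal length, grayscale frames, a
# separately measured height above ground, and illustrative PD gains.
import cv2
import numpy as np

FOCAL_PX = 400.0   # assumed focal length in pixels
DT = 0.02          # assumed frame period (50 Hz)

class SnapshotHoverEstimator:
    def __init__(self, anchor_frame, height_m):
        # Store the snapshot of the ground directly under the MAV as the anchor.
        self.anchor = np.float32(anchor_frame)
        self.height = height_m            # height above ground (e.g. from sonar)
        self.prev_pos = np.zeros(2)

    def update(self, frame):
        # Pixel shift of the current frame relative to the stored snapshot.
        (dx_px, dy_px), _ = cv2.phaseCorrelate(self.anchor, np.float32(frame))
        # Pinhole model: lateral displacement = height * pixel shift / focal length.
        pos = np.array([dx_px, dy_px]) * self.height / FOCAL_PX
        vel = (pos - self.prev_pos) / DT  # finite-difference velocity estimate
        self.prev_pos = pos
        return pos, vel

def position_controller(pos, vel, kp=0.4, kd=0.6):
    # Illustrative PD law mapping drift and drift rate to corrective commands.
    return -kp * pos - kd * vel
```

In use, the estimator would be constructed once from the snapshot taken at the start of the hover, and `update` called on every subsequent frame; the returned position and velocity play the role of the visual measurements driving the position controller described in the paper.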

Keywords

Micro Air Vehicle (MAV) · Visual guidance · Optic flow

Acknowledgements

This work was carried out with funding provided by the Australian Defence Science and Technology Organisation.


Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  • Matthew A. Garratt (1)
  • Andrew J. Lambert (1)
  • Hamid Teimoori (1)

  1. School of Engineering and Information Technology, University of New South Wales at the Australian Defence Force Academy, Canberra, Australia