
Autonomous Robots, Volume 29, Issue 1, pp 17–34

Unmanned aerial vehicles (UAVs) attitude, height, motion estimation and control using visual systems

  • Iván F. Mondragón
  • Miguel A. Olivares-Méndez
  • Pascual Campoy
  • Carol Martínez
  • Luís Mejias
Article

Abstract

This paper presents an aircraft pose and motion estimator that uses visual systems as the principal sensor for controlling an Unmanned Aerial Vehicle (UAV), or as a redundant system for the Inertial Measurement Unit (IMU) and gyroscopes. First, we apply the unified theory for central catadioptric cameras to attitude and heading estimation, explaining how the skyline is projected onto the catadioptric image and how it is segmented and used to calculate the UAV’s attitude. We then use appearance images to obtain a visual compass and calculate the relative rotation and heading of the aerial vehicle. Additionally, we show how a stereo system is used to estimate the aircraft’s height and to measure the UAV’s motion. Finally, we present a visual tracking system based on fuzzy controllers acting on both the UAV and a camera pan-and-tilt platform. Every component is tested on the COLIBRI UAV platform; validation includes comparing the estimated data with inertial values measured onboard the helicopter and evaluating the tracking schemes on real flights.
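
Two of the measurements summarized above, the appearance-based visual compass and the stereo height estimate, can be sketched in a few lines of code. The following Python/NumPy fragment is only a minimal illustration under stated assumptions, not the implementation described in the paper: it assumes the catadioptric image has already been unwrapped into a cylindrical panorama whose width spans a full 360°, and a calibrated, rectified stereo pair for the height computation; all function names, parameters, and the brute-force column search are illustrative.

```python
import numpy as np

def visual_compass_yaw(prev_panorama, curr_panorama):
    """Relative yaw (degrees) between two unwrapped 360-degree panoramas,
    found as the horizontal column shift that best aligns them.
    The sign convention depends on how the catadioptric image was unwrapped."""
    prev = prev_panorama.astype(np.float64)
    curr = curr_panorama.astype(np.float64)
    assert prev.shape == curr.shape
    _, w = prev.shape
    best_shift, best_cost = 0, np.inf
    for shift in range(w):
        # Columns wrap around because the panorama covers the full circle.
        cost = np.sum((np.roll(prev, shift, axis=1) - curr) ** 2)
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    yaw = best_shift * 360.0 / w          # one column = 360/w degrees
    return yaw - 360.0 if yaw > 180.0 else yaw

def stereo_height(disparity_px, focal_px, baseline_m):
    """Height above an approximately planar scene seen by a downward-looking,
    rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(0)
pano = rng.random((64, 360))
rotated = np.roll(pano, 25, axis=1)            # simulate a 25-column (25 deg) yaw
print(visual_compass_yaw(pano, rotated))       # approximately 25.0
print(stereo_height(disparity_px=12.0, focal_px=700.0, baseline_m=0.3))  # 17.5 m
```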

Keywords

Omnidirectional images · Catadioptric systems · Stereo vision · Unmanned aerial vehicles (UAV) · Motion estimation · Fuzzy control

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Iván F. Mondragón (1)
  • Miguel A. Olivares-Méndez (1)
  • Pascual Campoy (1)
  • Carol Martínez (1)
  • Luís Mejias (2)

  1. Computer Vision Group, Universidad Politécnica de Madrid, Madrid, Spain
  2. Australian Research Centre for Aerospace Automation (ARCAA), School of Engineering Systems, Queensland University of Technology, Brisbane, Australia
