Control Theory and Technology, Volume 16, Issue 2, pp 145–159

Experimental evaluation of a real-time GPU-based pose estimation system for autonomous landing of rotary-wing UAVs

  • Alessandro Benini
  • Matthew J. Rutherford
  • Kimon P. Valavanis
Article

Abstract

This paper proposes a real-time system for pose estimation of an unmanned aerial vehicle (UAV) using parallel image processing and a fiducial marker. The system exploits the capabilities of a high-performance CPU/GPU embedded system to provide on-board, high-frequency pose estimation, enabling autonomous takeoff and landing. The system is evaluated extensively in laboratory and field tests using a custom quadrotor, and autonomous landing with the proposed algorithm is demonstrated experimentally. The results show that the system provides precise pose estimation at a frame rate of at least 30 fps with an image resolution of 640×480 pixels. The main advantage of the proposed approach is its use of the GPU for image filtering and marker detection: the GPU provides an upper bound on the required computation time regardless of image complexity, thereby allowing robust marker detection even in cluttered environments.
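The keywords list a Kalman filter alongside the vision pipeline, suggesting that the raw per-frame pose measurements are smoothed before being used for landing control. As a minimal sketch of that smoothing stage (not the authors' implementation; the state model, noise values, and class name are illustrative assumptions), a constant-velocity Kalman filter for one axis of position, fed at the paper's 30 fps rate, might look like:

```python
import numpy as np

class Kalman1D:
    """Constant-velocity Kalman filter for one position axis.

    State is [position, velocity]; measurements are noisy positions
    from a vision pipeline arriving at 30 fps (dt = 1/30 s).
    Noise magnitudes q and r are illustrative, not from the paper.
    """

    def __init__(self, dt=1.0 / 30.0, q=1e-2, r=1e-1):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        self.H = np.array([[1.0, 0.0]])              # observe position only
        self.Q = q * np.eye(2)                       # process noise covariance
        self.R = np.array([[r]])                     # measurement noise covariance
        self.x = np.zeros((2, 1))                    # state estimate
        self.P = np.eye(2)                           # estimate covariance

    def step(self, z):
        # Predict: propagate state and covariance one frame ahead.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update: correct with the new position measurement z.
        y = np.array([[z]]) - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return float(self.x[0, 0])

# Example: smooth 100 noisy measurements of a marker held at 1.0 m.
kf = Kalman1D()
rng = np.random.default_rng(0)
estimates = [kf.step(1.0 + 0.1 * rng.standard_normal()) for _ in range(100)]
```

In a full 6-DOF pipeline each pose component would get its own (or a joint) filter, and the prediction step would bridge the frames in which the GPU detector temporarily loses the marker.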

Keywords

UAV · Vision · GPU · Kalman filter



Copyright information

© South China University of Technology, Academy of Mathematics and Systems Science, Chinese Academy of Sciences and Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  • Alessandro Benini (1)
  • Matthew J. Rutherford (1)
  • Kimon P. Valavanis (1)

  1. DU Unmanned Systems Research Institute (DU2SRI), University of Denver (DU), Denver, U.S.A.
