Journal of Intelligent & Robotic Systems, Volume 87, Issue 1, pp 141–168

Survey on Computer Vision for UAVs: Current Developments and Trends

  • Christoforos Kanellakis
  • George Nikolakopoulos
Open Access Article

Abstract

During the last decade, scientific research on Unmanned Aerial Vehicles (UAVs) has increased spectacularly and has led to the design of multiple types of aerial platforms. The major challenge today is the development of autonomously operating aerial agents capable of completing missions independently of human interaction. To this end, visual sensing techniques have been integrated into the control pipeline of UAVs in order to enhance their navigation and guidance skills. The aim of this article is to present a comprehensive literature review on vision-based applications for UAVs, focusing mainly on current developments and trends. These applications are sorted into categories according to the research topics pursued by various research groups. More specifically, vision-based position-attitude control, pose estimation and mapping, obstacle detection, and target tracking are the components identified as steps towards autonomous agents. Aerial platforms could reach a greater level of autonomy by integrating all of these technologies onboard. Additionally, throughout this article the concept of fusing multiple sensors is highlighted, and an overview of the challenges addressed and of future trends in autonomous agent development is also provided.
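
The visual-inertial fusion highlighted above typically follows a predict/correct pattern: high-rate IMU integration keeps the state estimate current between camera frames, and low-rate visual fixes remove the accumulated drift. The following Python sketch illustrates this pattern in one dimension with a simple constant-gain observer; it is not taken from any of the surveyed systems, and all signals, rates, and gains are hypothetical.

    import numpy as np

    def fuse_imu_vision(imu_accel, visual_pos, dt, vision_stride, k_p=0.25, k_v=0.5):
        """Constant-gain observer fusing 1-D IMU acceleration with visual position fixes.

        imu_accel     : high-rate accelerations along one axis [m/s^2]
        visual_pos    : low-rate visual position fixes along the same axis [m]
        dt            : IMU sample period [s]
        vision_stride : number of IMU samples between consecutive visual fixes
        k_p, k_v      : correction gains for the position and velocity states
        """
        pos, vel = 0.0, 0.0
        estimate = np.empty(len(imu_accel))
        v_idx = 0
        for k, acc in enumerate(imu_accel):
            # Predict: integrate the IMU (fast, but noise and bias cause drift).
            vel += acc * dt
            pos += vel * dt
            # Correct: whenever a visual fix arrives (slow, but drift-free).
            if k % vision_stride == 0 and v_idx < len(visual_pos):
                innovation = visual_pos[v_idx] - pos
                pos += k_p * innovation
                vel += k_v * innovation
                v_idx += 1
            estimate[k] = pos
        return estimate

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        dt = 0.005                                    # hypothetical 200 Hz IMU
        t = np.arange(0.0, 10.0, dt)
        true_pos = np.sin(0.5 * t)                    # synthetic 1-D trajectory
        true_acc = -0.25 * np.sin(0.5 * t)
        imu = true_acc + rng.normal(0.0, 0.05, t.size) + 0.02           # noise + bias
        vision = true_pos[::40] + rng.normal(0.0, 0.03, t[::40].size)   # ~5 Hz fixes
        est = fuse_imu_vision(imu, vision, dt, vision_stride=40)
        print("RMS position error [m]: %.3f" % np.sqrt(np.mean((est - true_pos) ** 2)))

In practice the systems surveyed in the article replace this scalar observer with full visual-inertial odometry or EKF-based fusion over the complete 6-DoF state, but the division of labour between fast inertial prediction and slower visual correction is the same.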

Keywords

UAVs · SLAM · Visual servoing · Obstacle avoidance · Target tracking

Copyright information

© The Author(s) 2017

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Christoforos Kanellakis (1)
  • George Nikolakopoulos (1)
  1. Luleå University of Technology, Luleå, Sweden
