
Autonomous UAV Trail Navigation with Obstacle Avoidance Using Deep Neural Networks

Journal of Intelligent & Robotic Systems

Abstract

This paper proposes a vision-based bike-trail following approach with obstacle avoidance for an unmanned aerial vehicle (UAV) using convolutional neural networks (CNNs). A CNN controls the UAV to follow a given trail while keeping its position near the trail center. To return to the original path when the UAV leaves it, or when the camera loses sight of the trail due to disturbances such as wind, the control commands produced by the CNN are stored for a certain duration and used to recover from such disturbances. To avoid obstacles during trail navigation, optical flow computed by a second CNN is used to determine a safe maneuver. By combining these methods of i) trail following, ii) disturbance recovery, and iii) obstacle avoidance, the UAV handles the various situations encountered while traveling along the trail. The feasibility and performance of the proposed approach are verified through realistic simulations and flight experiments in real-world environments.
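The three behaviors described above can be sketched as a single controller. This is a minimal illustrative sketch, not the authors' implementation: the class name `TrailNavigator`, the proportional gain, the buffer length, and the sign conventions are all assumptions made here for illustration.

```python
from collections import deque
import numpy as np

class TrailNavigator:
    """Hypothetical sketch of the three behaviors in the abstract:
    trail following, disturbance recovery, and obstacle avoidance."""

    def __init__(self, buffer_seconds=3.0, rate_hz=10):
        # Ring buffer of recent steering commands, kept for recovery.
        self.history = deque(maxlen=int(buffer_seconds * rate_hz))

    def follow_trail(self, trail_offset):
        # i) Trail following: steer proportionally back toward the
        # trail center (gain of 0.5 is an illustrative choice).
        cmd = -0.5 * trail_offset
        self.history.append(cmd)
        return cmd

    def recover(self):
        # ii) Disturbance recovery: replay the stored commands in
        # reverse order with opposite sign to retrace the path back
        # onto the trail after the camera loses it.
        return [-c for c in reversed(self.history)]

    def avoid_obstacle(self, flow_left, flow_right, threshold=1.0):
        # iii) Obstacle avoidance via optical-flow balancing: large
        # flow magnitude on one side indicates a nearby obstacle, so
        # yaw toward the side with smaller average flow.
        left = float(np.mean(np.abs(flow_left)))
        right = float(np.mean(np.abs(flow_right)))
        if max(left, right) < threshold:
            return 0.0          # no nearby obstacle: keep heading
        return -1.0 if left > right else 1.0
```

The flow-balancing rule in `avoid_obstacle` follows the classic observation that closer objects produce larger apparent motion; the paper instead derives the flow from a dedicated CNN, which this sketch abstracts away behind the `flow_left`/`flow_right` inputs.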



Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2020R1A6A1A03040570) and by the Unmanned Vehicles Core Technology Research and Development Program through the National Research Foundation of Korea (NRF) and the Unmanned Vehicle Advanced Research Center (UVARC) funded by the Ministry of Science and ICT, the Republic of Korea (2020M3C1C1A01082375).

Author information

Corresponding author

Correspondence to Hyondong Oh.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Back, S., Cho, G., Oh, J. et al. Autonomous UAV Trail Navigation with Obstacle Avoidance Using Deep Neural Networks. J Intell Robot Syst 100, 1195–1211 (2020). https://doi.org/10.1007/s10846-020-01254-5


Keywords

Navigation