Abstract
This paper proposes a vision-based bike-trail-following approach with obstacle avoidance for an unmanned aerial vehicle (UAV) using convolutional neural networks (CNNs). The UAV is controlled to follow a given trail while a CNN keeps its position near the center of the trail. To return to the original path when the UAV drifts off the trail or the camera loses sight of it due to disturbances such as wind, the control commands produced by the CNN are stored for a fixed duration and replayed to recover from such disturbances. To avoid obstacles during trail navigation, optical flow computed by a second CNN is used to determine a safe maneuver. By combining these three methods, i) trail following, ii) disturbance recovery, and iii) obstacle avoidance, the UAV handles the various situations encountered while traveling along the trail. The feasibility and performance of the proposed approach are verified through realistic simulations and flight experiments in real-world environments.
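The disturbance-recovery idea described above, buffering recent control commands and replaying them to backtrack onto the trail, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the buffer length, control rate, and the sign-inversion replay strategy are all assumptions made for the example.

```python
from collections import deque

class DisturbanceRecovery:
    """Hypothetical sketch: store recent control commands so the UAV
    can backtrack when the camera loses sight of the trail."""

    def __init__(self, buffer_seconds=3.0, rate_hz=10):
        # Keep only the most recent buffer_seconds worth of commands.
        self.buffer = deque(maxlen=int(buffer_seconds * rate_hz))

    def record(self, forward_vel, yaw_rate):
        # Called every control cycle while the trail is visible.
        self.buffer.append((forward_vel, yaw_rate))

    def recovery_command(self):
        # Replay stored commands newest-first with inverted signs,
        # approximately retracing the flight path back to the trail.
        if not self.buffer:
            return None  # buffer exhausted; recovery failed
        forward_vel, yaw_rate = self.buffer.pop()
        return (-forward_vel, -yaw_rate)

rec = DisturbanceRecovery(buffer_seconds=1.0, rate_hz=2)
rec.record(1.0, 0.2)
rec.record(1.0, -0.1)
print(rec.recovery_command())  # → (-1.0, 0.1)
```

In practice such a controller would switch from `record` to `recovery_command` when the trail detector's confidence drops below a threshold, and back again once the trail is reacquired.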
Acknowledgments
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2020R1A6A1A03040570), and by the Unmanned Vehicles Core Technology Research and Development Program through the National Research Foundation of Korea (NRF), Unmanned Vehicle Advanced Research Center (UVARC), funded by the Ministry of Science and ICT, the Republic of Korea (2020M3C1C1A01082375).
Cite this article
Back, S., Cho, G., Oh, J. et al. Autonomous UAV Trail Navigation with Obstacle Avoidance Using Deep Neural Networks. J Intell Robot Syst 100, 1195–1211 (2020). https://doi.org/10.1007/s10846-020-01254-5