Ground Plane Estimation for Obstacle Avoidance During Fixed-Wing UAV Landing

  • Conference paper
  • In: Intelligent Information and Database Systems (ACIIDS 2021)

Abstract

The automatic crash-landing of a fixed-wing UAV is challenging due to the high velocity of the approaching aircraft, its limited manoeuvrability, its inability to hover, and the low quality of textural features at common landing sites. Available algorithms for ground plane estimation are designed either for autonomous cars, robots, and other land-bound platforms, where textural features can be easily compared thanks to the low altitude, or for rotary-wing UAVs that can hover. Their usefulness is limited when a quick manoeuvre is needed to avoid collision with obstacles. Thanks to developments in parallelised, mobile computational architectures, an approach based on dense disparity estimation becomes feasible, provided that proper constraints are imposed on the phase space of ground plane transformations. We propose an algorithm that utilises such constraints to reduce the often prohibitively time-consuming tasks of disparity calculation and plane estimation, combining a pyramid-based approach with random sample consensus in order to discard pixels belonging to obstacles. We use the Inertial Navigation System, commonly available in fixed-wing UAVs, as a source of such constraints and refine our estimate in subsequent frames, allowing for a stable flight trajectory as well as detection of obstacles protruding from the ground plane.
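The plane-fitting step outlined in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a rectified stereo pair whose dense disparity map has already been computed, models the ground as an affine disparity plane d(u, v) = a·u + b·v + c (which holds for a world plane under rectified stereo), and uses random sample consensus to reject pixels belonging to protruding obstacles. The function name, thresholds, and iteration count are illustrative.

```python
import numpy as np

def fit_ground_plane_ransac(disparity, n_iters=200, inlier_thresh=1.0, rng=None):
    """Fit a planar disparity model d(u, v) = a*u + b*v + c with RANSAC.

    For a rectified stereo rig, the disparity of a world plane is itself an
    affine function of image coordinates, so the ground plane can be estimated
    directly in disparity space without reprojecting pixels to 3-D.
    Returns the refined (a, b, c) and a boolean inlier mask over valid pixels.
    """
    rng = np.random.default_rng(rng)
    h, w = disparity.shape
    vs, us = np.mgrid[0:h, 0:w]
    pts = np.column_stack([us.ravel(), vs.ravel(), np.ones(h * w)])
    d = disparity.ravel()
    valid = np.isfinite(d) & (d > 0)          # keep only pixels with a disparity
    pts, d = pts[valid], d[valid]

    best_inliers, best_model = None, None
    for _ in range(n_iters):
        idx = rng.choice(len(d), size=3, replace=False)
        try:
            model = np.linalg.solve(pts[idx], d[idx])   # exact plane through 3 samples
        except np.linalg.LinAlgError:
            continue                                    # degenerate (collinear) sample
        inliers = np.abs(pts @ model - d) < inlier_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, model

    # Refine by least squares on the consensus set; obstacle pixels, which
    # protrude from the plane, end up outside the inlier set and are ignored.
    model, *_ = np.linalg.lstsq(pts[best_inliers], d[best_inliers], rcond=None)
    return model, best_inliers
```

Once the plane is estimated, pixels whose disparity exceeds the plane prediction by more than the threshold can be flagged as obstacles; in a pyramid-based scheme the same fit would be run on a coarse level first and the result used to constrain the search at finer levels.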

References

  1. Rosner, J., et al.: A system for automatic detection of potential landing sites for horizontally landing unmanned aerial vehicles. AIP Conf. Proc. 1978(1), 110006 (2018). https://doi.org/10.1063/1.5043764

  2. Hoffmann, J., Jüngel, M., Lötzsch, M.: A vision based system for goal-directed obstacle avoidance. In: Nardi, D., Riedmiller, M., Sammut, C., Santos-Victor, J. (eds.) RoboCup 2004. LNCS (LNAI), vol. 3276, pp. 418–425. Springer, Heidelberg (2005). https://doi.org/10.1007/978-3-540-32256-6_35

  3. Lenser, S., Veloso, M.: Visual sonar: fast obstacle avoidance using monocular vision. In: IEEE/RSJ IROS 2003 Proceedings, pp. 886–891. IEEE, Las Vegas (2003). https://doi.org/10.1109/IROS.2003.1250741

  4. Lorigo, L.M., Brooks, R.A., Grimson, W.E.L.: Visually-guided obstacle avoidance in unstructured environments. In: Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems. Innovative Robotics for Real-World Applications. IROS 1997, pp. 373–379. IEEE, Grenoble (1997). https://doi.org/10.1109/IROS.1997.649086

  5. Kim, Y., Kim, H.: Layered ground floor detection for vision-based mobile robot navigation. In: IEEE International Conference on Robotics and Automation, 2004, Proceedings. ICRA 2004, vol. 1, pp. 13–18. IEEE, New Orleans (2004). https://doi.org/10.1109/ROBOT.2004.1307122

  6. Pęszor, D., Wojciechowska, M., Wojciechowski, K., Szender, M.: Fast moving UAV collision avoidance using optical flow and stereovision. In: Nguyen, N.T., Tojo, S., Nguyen, L.M., Trawiński, B. (eds.) ACIIDS 2017. LNCS (LNAI), Part II, vol. 10192, pp. 572–581. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-54430-4_55

  7. Pȩszor, D., Paszkuta, M., Wojciechowska, M., Wojciechowski, K.: Optical flow for collision avoidance in autonomous cars. In: Nguyen, N.T., Hoang, D.H., Hong, T.-P., Pham, H., Trawiński, B. (eds.) ACIIDS 2018, Part II. LNCS (LNAI), vol. 10752, pp. 482–491. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-75420-8_46

  8. Zhou, J., Li, B.: Robust ground plane detection with normalized homography in monocular sequences from a robot platform. In: 2006 International Conference on Image Processing, pp. 3017–3020. IEEE, Atlanta (2006). https://doi.org/10.1109/ICIP.2006.312972

  9. Zhou, J., Li, B.: Homography-based ground detection for a mobile robot platform using a single camera. In: Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006, ICRA 2006, pp. 4100–4105. IEEE, Orlando (2006). https://doi.org/10.1109/ROBOT.2006.1642332

  10. Zhou, H., Wallace, A.M., Green, P.R.: A multistage filtering technique to detect hazards on the ground plane. Pattern Recogn. Lett. 24(9–10), 1453–1461 (2003). https://doi.org/10.1016/S0167-8655(02)00385-9

  11. Conrad, D., DeSouza, G.N.: Homography-based ground plane detection for mobile robot navigation using a modified EM algorithm. In: 2010 IEEE International Conference on Robotics and Automation, pp. 910–915. IEEE, Anchorage (2010). https://doi.org/10.1109/ROBOT.2010.5509457

  12. Yamaguchi, K., Watanabe, A., Naito, T.: Road region estimation using a sequence of monocular images. In: 2008 19th International Conference on Pattern Recognition, pp. 1–4. IEEE, Tampa (2008). https://doi.org/10.1109/ICPR.2008.4761571

  13. Roumeliotis, S.I., Johnson, A.E., Montgomery, J.F.: Augmenting inertial navigation with image-based motion estimation. In: Proceedings 2002 IEEE International Conference on Robotics and Automation, vol. 4, pp. 4326–4333. IEEE, Washington (2002). https://doi.org/10.1109/ROBOT.2002.1014441

  14. Johnson, A.E., Matthies, L.H.: Precise image-based motion estimation for autonomous small body exploration. In: Artificial Intelligence, Robotics and Automation in Space, Proceedings of the Fifth International Symposium, ISAIRAS 1999, vol. 440, pp. 627–634 (2000)

  15. Panahandeh, G., Jansson, M.: Vision-aided inertial navigation based on ground plane feature detection. IEEE/ASME Trans. Mechatron. 19(4), 1206–1215 (2014). https://doi.org/10.1109/TMECH.2013.2276404

  16. Sabe, K., Fukuchi, M., Gutmann, J.-S., Ohashi, T., Kawamoto, K., Yoshigahara, T.: Obstacle avoidance and path planning for humanoid robots using stereo vision. In: Proceedings on International Conference on Robotics and Automation, pp. 592–597. IEEE, New Orleans (2004). https://doi.org/10.1109/ROBOT.2004.1307213

  17. Mandelbaum, R., McDowell, L., Bogoni, L., Beich, B., Hansen, M.: Real-time stereo processing, obstacle detection, and terrain estimation from vehicle-mounted stereo cameras. In: Proceedings on 4th IEEE Workshop on Applications of Computer Vision, pp. 288–289. IEEE, Princeton (1998). https://doi.org/10.1109/ACV.1998.732909

  18. Chumerin, N., Van Hulle, M.M.: Ground plane estimation based on dense stereo disparity. In: Proceedings of the 5th International Conference on Neural Networks and Artificial Intelligence, pp. 209–213, Minsk, Belarus (2008)

  19. Holland, P.W., Welsch, R.E.: Robust regression using iteratively reweighted least-squares. Commun. Stat. Theory Methods 6(9), 813–827 (1977). https://doi.org/10.1080/03610927708827533

  20. Crow, F.: Summed-area tables for texture mapping. In: SIGGRAPH 1984, pp. 207–212. Association for Computing Machinery, Minneapolis (1984). https://doi.org/10.1145/800031.808600

  21. Burt, P.J., Adelson, E.H.: The Laplacian pyramid as a compact image code. IEEE Trans. Commun. 31(4), 671–679 (1983). https://doi.org/10.1109/TCOM.1983.1095851

  22. Fischler, M.A., Bolles, R.C.: Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 24(6), 381–395 (1981). https://doi.org/10.1145/358669.358692

  23. Frigo, M., Johnson, S.G.: The design and implementation of FFTW3. Proc. IEEE 93(2), 216–231 (2005). https://doi.org/10.1109/JPROC.2004.840301


Acknowledgements

This work has been supported by the National Centre for Research and Development, Poland, within the framework of project POIR.01.02.00-00-0009/2015 “System of the autonomous landing of a UAV in unknown terrain conditions on the basis of visual data”.

Author information

Corresponding author

Correspondence to Damian Pęszor.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Pęszor, D., Wojciechowski, K., Szender, M., Wojciechowska, M., Paszkuta, M., Nowacki, J.P. (2021). Ground Plane Estimation for Obstacle Avoidance During Fixed-Wing UAV Landing. In: Nguyen, N.T., Chittayasothorn, S., Niyato, D., Trawiński, B. (eds) Intelligent Information and Database Systems. ACIIDS 2021. Lecture Notes in Computer Science, vol. 12672. Springer, Cham. https://doi.org/10.1007/978-3-030-73280-6_36

  • DOI: https://doi.org/10.1007/978-3-030-73280-6_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-73279-0

  • Online ISBN: 978-3-030-73280-6

  • eBook Packages: Computer Science (R0)
