
Vision-based heading estimation for navigation of a micro-aerial vehicle in GNSS-denied staircase environment using vanishing point

  • Original Paper
  • Published in: Aerospace Systems

Abstract

Micro-aerial vehicles (MAVs) find it extremely difficult to navigate in indoor staircase environments where Global Navigation Satellite System (GNSS) signals are obstructed. To avoid both static and moving obstacles, the MAV must estimate its position and heading within the staircase scene. In this work, five input colour-space representations of each staircase frame are used to detect vanishing points and estimate heading for MAV navigation: a grayscale conversion of the RGB image, the hyper-opponent colour-space channels O1, O2 and O3, and a Sobel-filtered R-channel image. To determine the position and direction of the MAV, the Hough transform and the K-means clustering algorithm are combined for line and vanishing-point detection in the staircase image frames. The location of the detected vanishing point indicates the position of the MAV (centre, left or right) on the staircase. The heading of the MAV is then computed from the Euclidean distances between the image centre, the mid-pixel coordinates of the image's last row, and the detected vanishing-point pixel coordinates in successive staircase frames. These position and heading measurements can be used to send a suitable control signal to the MAV and realign it with the centre of the staircase whenever it deviates. The integrated Hough transform and K-means clustering-based vanishing-point detection is suitable for real-time MAV heading measurement using the O2-channel staircase image frames, achieving an accuracy of ± 0.15° compared with ± 1.5° for the state-of-the-art grid-based vanishing-point detection method.
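The pipeline outlined above (opponent colour channel, Hough line detection, K-means clustering of line intersections, heading from the vanishing point) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the O2 definition (R + G − 2B)/√6, the Canny and Hough thresholds, the choice of three clusters, the left/centre/right thirds rule and the heading-angle convention are all assumptions, and the helper names (opponent_o2, vanishing_point, position_and_heading) are hypothetical.

```python
# Hedged sketch (not the authors' code): vanishing-point detection on one
# staircase frame from the O2 opponent-colour channel, using a Hough line
# transform and K-means clustering of pairwise line intersections, followed
# by a simple heading-angle estimate. Parameter values are assumptions.
import cv2
import numpy as np


def opponent_o2(bgr):
    """O2 channel of the opponent colour space: (R + G - 2B) / sqrt(6)."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    o2 = (r + g - 2.0 * b) / np.sqrt(6.0)
    return cv2.normalize(o2, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)


def line_intersections(lines):
    """Pairwise intersections of lines given in (rho, theta) form."""
    pts = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            r1, t1 = lines[i]
            r2, t2 = lines[j]
            a = np.array([[np.cos(t1), np.sin(t1)],
                          [np.cos(t2), np.sin(t2)]], dtype=np.float32)
            b = np.array([r1, r2], dtype=np.float32)
            if abs(np.linalg.det(a)) > 1e-6:          # skip near-parallel pairs
                pts.append(np.linalg.solve(a, b))
    return np.array(pts, dtype=np.float32)


def vanishing_point(frame_bgr, k=3):
    o2 = opponent_o2(frame_bgr)
    edges = cv2.Canny(o2, 50, 150)                    # assumed thresholds
    hough = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    if hough is None:
        return None
    lines = hough[:30, 0, :]                          # cap lines to keep O(N^2) cheap
    pts = line_intersections(lines)
    if len(pts) < k:
        return None
    # Cluster intersections; the centre of the largest cluster is taken as the VP.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 0.5)
    _, labels, centres = cv2.kmeans(pts, k, None, criteria, 10,
                                    cv2.KMEANS_RANDOM_CENTERS)
    biggest = np.argmax(np.bincount(labels.ravel()))
    return centres[biggest]                           # (x, y) in pixels


def position_and_heading(frame_bgr):
    vp = vanishing_point(frame_bgr)
    if vp is None:
        return None
    h, w = frame_bgr.shape[:2]
    # Position from the VP's horizontal location (assumed thirds rule).
    position = ("left" if vp[0] < w / 3
                else "right" if vp[0] > 2 * w / 3
                else "centre")
    # Heading: angle between the vertical through the bottom-row mid-pixel and
    # the ray from that point to the vanishing point (assumed convention).
    mid_bottom = np.array([w / 2.0, h - 1.0])
    heading_deg = np.degrees(np.arctan2(vp[0] - mid_bottom[0],
                                        mid_bottom[1] - vp[1]))
    return position, heading_deg
```

In use, such a routine would run on every incoming camera frame, and the sign of the estimated heading angle would drive the yaw correction that recentres the MAV on the staircase.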


Availability of data and materials

The data and code used for this study are available from the corresponding author upon request.

Abbreviations

MAV: Micro-aerial vehicle
GPS: Global positioning system
VSLAM: Visual simultaneous localisation and mapping
MSLAM: Monocular simultaneous localisation and mapping
EKF: Extended Kalman filter
VTOL: Vertical take-off and landing
VFH: Vector field histogram
IMU: Inertial measurement unit
RMSE: Root-mean-squared error
MAE: Mean absolute error


Acknowledgements

Not applicable.

Funding

The author confirms that no financial resources, grants or other support were received during the preparation of this manuscript.

Author information


Contributions

Anbarasu B conducted the experiments and drafted the manuscript. The author read and approved the final manuscript.

Corresponding author

Correspondence to B. Anbarasu.

Ethics declarations

Conflict of interest

The author has no significant financial or non-financial interests to disclose.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Anbarasu, B. Vision-based heading estimation for navigation of a micro-aerial vehicle in GNSS-denied staircase environment using vanishing point. AS 7, 395–418 (2024). https://doi.org/10.1007/s42401-024-00282-5


Keywords

Navigation