
A Real-time Positioning Model for UAV’s Patrolling Images Based on Airborne LiDAR Point Cloud Fusion

  • Surveying and Geo-Spatial Engineering
  • Published in: KSCE Journal of Civil Engineering

Abstract

The problem of precisely positioning oblique aerial images has been widely studied in recent years, but existing methods still fall short in highly time-sensitive engineering applications. For the real-time positioning of oblique images captured during Unmanned Aerial Vehicle (UAV) patrols, existing photogrammetry methods cannot meet the real-time positioning requirement, binocular vision methods cannot meet the dynamic and precise positioning requirements, optical flow methods cannot meet the absolute positioning requirement, and multi-source feature matching methods cannot meet the robust positioning requirement. To satisfy the real-time, dynamic, precise, absolute, and robust positioning requirements of UAV patrolling images simultaneously, a real-time positioning model based on airborne LiDAR point cloud fusion is proposed. First, a precise Digital Surface Model (DSM) is generated by rasterizing and imaging the raw airborne LiDAR point cloud, such that each pixel's grayscale equals the elevation of the local area covered by that pixel. Second, the generated DSM and the UAV patrolling image are fused under specific geometric constraints, realizing real-time, pixel-by-pixel positioning of the patrolling image. Finally, more precise positioning of selected key points on the patrolling image is achieved by performing Principal Component Analysis (PCA) on the raw airborne LiDAR points surrounding those key points. These methods are analyzed and verified in three groups of practical experiments; the results indicate that the proposed model positions a single UAV patrolling image (4000 × 6000 pixels) with an accuracy of 0.5 m within 0.38 s in arbitrary areas, and further positions any selected key point on the image with an accuracy of 0.2 m within 0.001 s.
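The two point-cloud steps named in the abstract — rasterizing raw LiDAR points into a DSM whose pixel grayscale equals the local elevation, and running PCA on the raw points around a selected key point — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the use of per-cell maximum elevation, and the choice of the smallest-variance eigenvector as a local surface normal are all assumptions.

```python
import numpy as np

def rasterize_dsm(points, cell_size):
    """Rasterize a LiDAR point cloud (N x 3: x, y, z) into a DSM grid.

    Each cell stores the maximum elevation of the points it covers, so a
    pixel's grayscale directly encodes the local surface height.
    """
    xy_min = points[:, :2].min(axis=0)
    cols = np.floor((points[:, 0] - xy_min[0]) / cell_size).astype(int)
    rows = np.floor((points[:, 1] - xy_min[1]) / cell_size).astype(int)
    dsm = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, z in zip(rows, cols, points[:, 2]):
        if np.isnan(dsm[r, c]) or z > dsm[r, c]:
            dsm[r, c] = z
    return dsm

def keypoint_normal_pca(points, key_xy, radius):
    """Characterize the local surface around a key point by PCA on the raw
    points within `radius` of its horizontal position.

    Returns the neighborhood centroid and the eigenvector of smallest
    variance (the local surface normal); np.linalg.eigh sorts eigenvalues
    in ascending order, so column 0 is the smallest-variance direction.
    """
    mask = np.linalg.norm(points[:, :2] - key_xy, axis=1) <= radius
    local = points[mask]
    centroid = local.mean(axis=0)
    cov = np.cov((local - centroid).T)
    _, eigvecs = np.linalg.eigh(cov)
    return centroid, eigvecs[:, 0]
```

For a flat patch of points at constant elevation, the DSM cells equal that elevation and the PCA normal points vertically, which is a quick sanity check on both routines.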



Acknowledgments

We thank the editors for reviewing the manuscript, and the anonymous reviewers for providing suggestions that greatly improved the quality of the work.

This research was supported by the China Postdoctoral Science Foundation (2021M701373).

Author information


Corresponding author

Correspondence to Haoyang Pei.


About this article


Cite this article

Fan, W., Liu, H., Pei, H. et al. A Real-time Positioning Model for UAV’s Patrolling Images Based on Airborne LiDAR Point Cloud Fusion. KSCE J Civ Eng (2024). https://doi.org/10.1007/s12205-024-2254-2


Keywords

Navigation