Camera Selection and Flight Planning for Post Processing 3D Reconstruction Automatization

  • Conference paper
  • In: Advances in Service and Industrial Robotics (RAAD 2017)

Part of the book series: Mechanisms and Machine Science (volume 49)

Abstract

Currently available 3D reconstruction and aerial mapping solutions based on Structure-from-Motion (SfM) and bundle adjustment require careful equipment selection and configuration (camera, optics and imaging parameter settings), accurate flight planning (number of images and their overlap percentage) and often user intervention during post-processing. This paper studies the various factors affecting the quality of the final 3D reconstruction, together with their dependencies and trade-offs, with the aim of pushing 3D reconstruction and aerial mapping towards a more automated scheme with minimal user intervention. We address the design trade-offs between the camera and the flight-plan parameters and their effect on image quality and, consequently, on the number and quality of matched features. We then consider camera calibration as a fundamental stage of the SfM framework, since it affects the subsequent depth estimation and point cloud densification steps. An automated photogrammetric calibration solution, based on the Caltech Camera Calibration Toolbox for Matlab and Andreas Geiger's corner detector, is implemented and tested, allowing a precise camera calibration process free of user intervention. Finally, a professional SfM-based photogrammetry software package (Pix4Dmapper) is used to assess the effect of additional data sets, such as the commonly used GPS data, on the overall quality of the resulting 3D reconstruction in terms of model richness and accuracy. Two geo-localization data sets (the standard GPS used by the UAV autopilot and an L1/L2 GPS corrected with post-processed kinematics, PPK) were processed with the same image data set under various stated uncertainty bounds. The comparison of the resulting 3D reconstructed models highlights the decisive importance of identifying the actual uncertainty bounds, which combine the equipment's nominal uncertainty with system-introduced factors such as camera capture/GPS synchronization, rather than using relaxed (larger) or tighter bounds.
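
As a hedged illustration of the camera/flight-plan trade-off discussed above (not code from the paper), the sketch below computes the ground sampling distance (GSD), the ground footprint of a single image and a rough image count for a rectangular survey area from assumed sensor, lens, altitude and overlap values; all numbers are placeholders, not the parameters used in the study.

```python
import math

def ground_sampling_distance(sensor_width_mm, image_width_px, focal_mm, altitude_m):
    """GSD in metres per pixel for a nadir-pointing camera."""
    return sensor_width_mm * altitude_m / (focal_mm * image_width_px)

def footprint_m(sensor_dim_mm, focal_mm, altitude_m):
    """Ground distance (m) covered along one sensor dimension."""
    return sensor_dim_mm * altitude_m / focal_mm

def image_count(area_len_m, area_wid_m, foot_along_m, foot_across_m,
                front_overlap, side_overlap):
    """Rough number of images needed to cover a rectangular survey area."""
    spacing_along = foot_along_m * (1.0 - front_overlap)    # distance between exposures
    spacing_across = foot_across_m * (1.0 - side_overlap)   # distance between flight lines
    per_line = math.ceil(area_len_m / spacing_along) + 1
    lines = math.ceil(area_wid_m / spacing_across) + 1
    return per_line * lines

# Assumed example values: 13.2 x 8.8 mm sensor, 5472 px image width, 8.8 mm lens, 80 m AGL
gsd = ground_sampling_distance(13.2, 5472, 8.8, 80.0)   # ~0.022 m/px
foot_across = footprint_m(13.2, 8.8, 80.0)              # ~120 m across track
foot_along = footprint_m(8.8, 8.8, 80.0)                # ~80 m along track
n = image_count(500.0, 300.0, foot_along, foot_across,
                front_overlap=0.80, side_overlap=0.70)
print(f"GSD = {100 * gsd:.1f} cm/px, images needed = about {n}")
```

Raising the flight altitude or shortening the focal length enlarges the footprint and reduces the number of images, but degrades the GSD and hence the quality of the matched features, which is precisely the trade-off the paper examines.

The automated calibration stage described in the abstract is built on the Caltech toolbox and Geiger's corner detector; the following sketch only approximates such a user-free checkerboard calibration with OpenCV's built-in corner detection and calibration, so the pattern size, square size and image folder are stand-in assumptions rather than the paper's implementation.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)      # inner-corner grid of the assumed checkerboard
SQUARE_MM = 25.0      # assumed square size in millimetres

# 3D coordinates of the board corners in the board's own reference frame
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calibration_images/*.jpg"):   # assumed image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue                                     # skip frames without the full board
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)
    size = gray.shape[::-1]

# Intrinsic matrix K and distortion coefficients, recovered without user intervention
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("reprojection RMS (px):", rms)
print("intrinsics K:\n", K)
```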

References

  1. The Association for Unmanned Vehicle Systems International (AUVSI): the benefits of unmanned aircraft systems: saving time, saving money, saving lives. https://epic.org/events/UAS-Uses-Saving-Time-Saving-Money-Saving-Lives.pdf

  2. Snavely N, Seitz S, Szeliski R (2008) Modeling the world from internet photo collections. Int J Comput Vis 80(2):189–210

  3. Bouguet JY: Camera Calibration Toolbox for Matlab. https://www.vision.caltech.edu/bouguetj/calib_doc/

  4. Geiger A, Moosmann F, Car Ö, Schuster B (2012) Automatic camera and range sensor calibration using a single shot. In: Proceedings of the IEEE international conference on robotics and automation (ICRA)

  5. Harris C, Stephens M (1988) A combined corner and edge detector. Plessey Research Roke Manor, UK, The Plessey Company plc

  6. Zhang Z (1998) A flexible new technique for camera calibration. Technical Report MSR-TR-98-71, Microsoft Research. http://research.microsoft.com/~zhang/Calib/

  7. Toma A, Chiaberge M, Silvagni M, Dara G (2014) The vision-based terrain navigation facility: a technological overview. In: Hodicky J (ed) Modelling and simulation for autonomous systems. Springer, pp 56–66

Author information

Corresponding author

Correspondence to Mario Silvagni.

Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Silvagni, M., Chiaberge, M., Osman, A. (2018). Camera Selection and Flight Planning for Post Processing 3D Reconstruction Automatization. In: Ferraresi, C., Quaglia, G. (eds) Advances in Service and Industrial Robotics. RAAD 2017. Mechanisms and Machine Science, vol 49. Springer, Cham. https://doi.org/10.1007/978-3-319-61276-8_54

  • DOI: https://doi.org/10.1007/978-3-319-61276-8_54

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-61275-1

  • Online ISBN: 978-3-319-61276-8

  • eBook Packages: Engineering, Engineering (R0)
