
Towards a system for analyzing accidents of unmanned aerial vehicles

  • Original Article
  • Published:
Artificial Life and Robotics

Abstract

Our goal was to analyze accidents of unmanned aerial vehicles (UAVs) by replicating their circumstances from data obtained by the sensors and flight recorder installed on the UAVs. In this paper, we investigated the performance of three 3-D mapping tools for reproducing the environment along the UAV's flight path, and found that maps built from LIDAR data are more accurate and cover a wider area than maps built from monocular or stereo camera images. We then applied an optical flow method to images taken by a rotating monocular camera and found that imaging at more than 120 fps is appropriate for accurate motion tracking of a spinning, falling UAV. Finally, we developed a visualization system that replays a UAV's flight and its surrounding environment on a computer screen.
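To illustrate the kind of optical flow computation described above, the following minimal sketch estimates dense flow between consecutive frames of a high-frame-rate monocular recording using OpenCV's Farneback method. This is not the authors' implementation; the video filename and all parameter values are illustrative assumptions.

```python
# Hedged sketch: dense optical flow on consecutive frames from a monocular
# camera recorded at >= 120 fps. Filename and parameters are assumptions.
import cv2

cap = cv2.VideoCapture("uav_fall_120fps.mp4")  # hypothetical recording
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Farneback dense flow: one 2-D displacement vector per pixel.
    # Positional args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Magnitude of the flow gives the apparent per-frame motion, which is
    # what a higher frame rate keeps small enough to track reliably.
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    print(f"mean displacement: {mag.mean():.2f} px/frame")

    prev_gray = gray

cap.release()
```

At 120 fps the inter-frame displacement of a spinning airframe stays small, which is why dense flow methods of this kind remain tractable; at lower frame rates the per-frame motion grows and tracking degrades.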



Acknowledgements

This study was supported by funding from the New Energy and Industrial Technology Development Organization (NEDO). We thank Dr. Ichiryu, Mr. Arai, and Dr. Wasantha of Kikuchi Seisakusho Co., Ltd. for their research support.

Author information

Corresponding author

Correspondence to Ryuhei Yamada.

About this article


Cite this article

Yamada, R., Yaguchi, Y., Yoshida, M. et al. Towards a system for analyzing accidents of unmanned aerial vehicles. Artif Life Robotics 24, 94–99 (2019). https://doi.org/10.1007/s10015-018-0460-z

