
Landslide detection from bitemporal satellite imagery using attention-based deep neural networks

  • Technical Note
  • Published in: Landslides

A Correction to this article was published on 23 June 2022

This article has been updated

Abstract

Torrential rainfall predisposes hills to catastrophic landslides, resulting in serious damage to life and property. Landslide inventory maps are therefore essential for rapid response and for developing disaster mitigation strategies. Manual mapping techniques are laborious and time-consuming, and thus not ideal in rapid-response situations. Automated landslide mapping from optical satellite imagery using deep neural networks (DNNs) is becoming popular. However, distinguishing landslides from other changed objects in optical imagery using backbone DNNs alone is difficult. Attention modules have recently been introduced into DNN architectures to address this problem by improving the networks' discriminative ability and suppressing noisy backgrounds. This study compares two state-of-the-art attention-boosted deep Siamese neural networks for mapping rainfall-induced landslides in the mountainous Himalayan region of Nepal using PlanetScope (PS) satellite imagery. Our findings confirm that attention modules improve the performance of DNNs by extracting more discriminative features. The Siamese Nested U-Net (SNUNet) produced the best and most coherent landslide inventory map of the compared methods in the test area, achieving an F1-score of 0.73, which is comparable to similar studies. These results demonstrate the potential of attention-based DNNs for rapid landslide mapping and disaster mitigation, not only for rainfall-triggered but also for earthquake-triggered landslides.
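The architecture the abstract describes, a weight-shared (Siamese) encoder over the pre- and post-event images with an attention module reweighting the fused features, can be sketched in a few lines. The PyTorch toy model below is an illustrative assumption, not the authors' SNUNet implementation: the layer sizes, the squeeze-and-excitation style channel attention, and the four-band input (matching PlanetScope's blue/green/red/NIR bands) are all placeholders chosen for clarity.

```python
# Minimal sketch of a Siamese change-detection network with channel attention.
# NOT the SNUNet implementation from the paper; an illustrative toy model only.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (illustrative)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                   # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.fc(x)                               # reweight feature channels

class SiameseChangeNet(nn.Module):
    def __init__(self, in_channels=4):                      # 4 bands: B, G, R, NIR
        super().__init__()
        # Shared encoder: the SAME weights process both bitemporal images,
        # so their features are directly comparable.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.attention = ChannelAttention(128)              # 64 + 64 fused channels
        self.head = nn.Conv2d(128, 1, 1)                    # per-pixel landslide logit

    def forward(self, pre_img, post_img):
        f_pre = self.encoder(pre_img)
        f_post = self.encoder(post_img)
        fused = torch.cat([f_pre, f_post], dim=1)           # stack bitemporal features
        fused = self.attention(fused)                       # suppress noisy background
        return self.head(fused)                             # B x 1 x H x W logits

# Toy usage: two 4-band 128 x 128 bitemporal tiles.
pre = torch.randn(1, 4, 128, 128)
post = torch.randn(1, 4, 128, 128)
logits = SiameseChangeNet()(pre, post)
print(logits.shape)  # torch.Size([1, 1, 128, 128])
```

For reference, the F1-score reported above is the harmonic mean of per-pixel precision P and recall R, F1 = 2PR / (P + R), computed by comparing the predicted landslide mask against the manually mapped inventory.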


Acknowledgements

All authors are grateful to the reviewers and editor for their valuable and constructive comments.

Funding

This research was funded by the National Key Research and Development Program of China (2019YFC1510203), the National Natural Science Foundation of China (41875094, 61872189), and the Sino-German Cooperation Group Project (GZ1447).

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization: Solomon Obiri Yeboah Amankwah, Guojie Wang; methodology: Solomon Obiri Yeboah Amankwah; resources: Guojie Wang, Kaushal Gnyawali; software: Solomon Obiri Yeboah Amankwah, Dong Zhen; visualization: Solomon Obiri Yeboah Amankwah; writing, review, and editing: Solomon Obiri Yeboah Amankwah, Guojie Wang, Kaushal Gnyawali, Daniel Fiifi Tawiah Hagan, Isaac Sarfo, Isaac Kwesi Nooni, Waheed Khan, Zheng Duan; supervision: Guojie Wang; funding acquisition: Guojie Wang. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Guojie Wang.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

The original online version of this article was revised: the authors regret that the name of Dr. Zheng Duan was given in reverse order in the original article. It has been corrected in this erratum.

The original article has been corrected.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (DOCX 732 KB)


About this article


Cite this article

Amankwah, S.O.Y., Wang, G., Gnyawali, K. et al. Landslide detection from bitemporal satellite imagery using attention-based deep neural networks. Landslides 19, 2459–2471 (2022). https://doi.org/10.1007/s10346-022-01915-6

