What is this article about? Generative summarization with the BERT model in the geosciences domain

Abstract

In recent years, a large amount of textual data has accumulated in geological journals and report literature. These documents contain a wealth of information, yet they have not been fully exploited or mined. Automatic information extraction offers an effective way to make new discoveries and pursue further analysis, and it is of great value in supporting the work of users, researchers, and decision makers. In this paper, we fine-tune the bidirectional encoder representations from transformers (BERT) model and apply it to automatically generate the title of a given input summary, based on a collection of published literature samples. The framework consists of an encoder module, a decoder module, and a training module. The core stages of summary generation combine the encoder and decoder modules, and a multi-stage function then connects the modules, giving the text summarization model a multi-task learning architecture. Compared with other baseline models, the proposed model obtains the best results on the constructed dataset. Based on the proposed model, we therefore developed an automatic geological briefing generation platform, an online service that supports the mining of key areas and the visual presentation and analysis of the literature.
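The article does not include its implementation, but the fine-tuned-BERT encoder-decoder design it describes can be sketched with off-the-shelf tools. The following minimal example (an illustration, not the authors' code) warm-starts a BERT2BERT encoder-decoder with the Hugging Face Transformers library and runs one fine-tuning step on an abstract-title pair; the checkpoint name bert-base-chinese, the sequence lengths, and the beam size are all assumptions.

```python
# A minimal sketch of BERT-based abstractive title generation, NOT the
# authors' implementation. Checkpoint names and hyperparameters are
# illustrative assumptions.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")

# Warm-start both the encoder and the decoder from the same BERT
# checkpoint; the cross-attention weights are initialized randomly and
# learned during fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-chinese", "bert-base-chinese"
)

# BERT has no dedicated decoder-side special tokens, so reuse [CLS]/[SEP].
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

# One (abstract, title) training pair; in practice these would come from
# the collected corpus of published geoscience literature.
abstract = "..."  # source text: a paper abstract
title = "..."     # target summary: its title

inputs = tokenizer(abstract, return_tensors="pt",
                   truncation=True, max_length=512)
labels = tokenizer(title, return_tensors="pt",
                   truncation=True, max_length=32).input_ids

# Teacher-forced fine-tuning step: the model shifts `labels` internally
# to build decoder inputs and returns a cross-entropy loss.
loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss
loss.backward()  # wrap in an optimizer loop (e.g. Adam) for real training

# Inference: beam-search decoding of a title for a new abstract.
generated = model.generate(inputs.input_ids, max_length=32, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Warm-starting both sides from BERT is a common way to reuse an encoder-only pre-trained model for generation; the multi-stage, multi-task training described in the abstract would be layered on top of such a backbone.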



Acknowledgements

We would like to thank the anonymous reviewers for carefully reading this paper and for their very useful comments. This study was financially supported by the National Natural Science Foundation of China (42050101, U1711267, 41871311, 41871305), the National Key Research and Development Program (2018YFB0505500, 2018YFB0505504), and the Fundamental Research Funds for the Central Universities, China University of Geosciences (Wuhan) (No. CUG2106116).

Author information


Corresponding author

Correspondence to Qinjun Qiu.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Communicated by: H. Babaie


About this article


Cite this article

Ma, K., Tian, M., Tan, Y. et al. What is this article about? Generative summarization with the BERT model in the geosciences domain. Earth Sci Inform (2021). https://doi.org/10.1007/s12145-021-00695-2


Keywords

  • Geological domain
  • Fine-tuned BERT model
  • Automatic text summarization
  • Briefing generation framework