Text Summarization with Different Encoders for Pointer Generator Network

  • Minakshi Tomer
  • Manoj Kumar
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1087)


The ever-growing volume of documents has made automatic text summarization a necessity. Deep learning models have achieved state-of-the-art results for text summarization. In this paper, two encoders for the pointer-generator network are compared: a bi-directional GRU encoder and a bi-directional LSTM encoder. Both are paired with a unidirectional LSTM decoder. The results are evaluated using ROUGE scores. The experiments show that the bi-directional LSTM encoder yields better results than the bi-directional GRU encoder.
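The abstract notes that results are evaluated using ROUGE. As a minimal sketch of the idea behind that metric, the pure-Python function below computes ROUGE-1 F1 (clipped unigram overlap between a reference and a candidate summary); it is an illustrative simplification, not the official ROUGE toolkit used in the paper.

```python
from collections import Counter

def rouge_1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between a reference and a candidate summary."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each candidate unigram counts at most as often
    # as it appears in the reference.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

# Example: the candidate recovers 3 of the 6 reference unigrams with
# perfect precision, so F1 = 2 * (1.0 * 0.5) / 1.5 = 0.667.
print(round(rouge_1_f1("the cat sat on the mat", "the cat sat"), 3))
```

Published ROUGE scores additionally report ROUGE-2 and ROUGE-L and typically apply stemming, but the precision/recall/F1 structure is the same.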


Keywords: Text summarization · Abstractive · Deep learning · RNN



Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. USICT, GGSIPU, Delhi, India
  2. Dept. of IT, MSIT, Delhi, India
  3. AIACTR, GGSIPU, Delhi, India
