Main Point Generator: Summarizing with a Focus

  • Tong Lee Chung
  • Bin Xu
  • Yongbin Liu
  • Chunping Ouyang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10827)

Abstract

Text summarization is attracting increasing attention as deep neural networks achieve success across many NLP tasks. One problem with such models is their inability to focus on the essentials of a document, so they may generate summaries that miss the important content, especially in multi-sentence summarization. In this paper, we propose the Main Pointer Generator (MPG) to address this problem: at each decoder step, the whole document is taken into consideration when calculating the probability of the next generated token. We experiment on the CNN/Daily Mail corpus, and the results show that the summaries MPG generates follow the main theme of the document while outperforming the original pointer-generator network by about 0.5 ROUGE points.
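
To make the mechanism concrete, below is a minimal PyTorch sketch of one pointer-generator decoding step in which a pooled whole-document vector conditions both the attention and the copy/generate gate, so the whole document influences the probability of the next token. The names (doc_repr, W_gen, W_vocab), the mean-pooled document representation, and the simple additive attention are illustrative assumptions, not the paper's published architecture.

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    h, src_len, vocab_size = 8, 5, 20

    # Toy inputs: one decoder state, the encoder states of the source, and a
    # pooled whole-document vector standing in for the document's main point.
    dec_state = torch.randn(h)
    enc_states = torch.randn(src_len, h)
    doc_repr = enc_states.mean(dim=0)          # assumed document representation
    src_ids = torch.tensor([3, 7, 7, 12, 19])  # vocabulary ids of source tokens

    # Attention conditioned on both the decoder state and the document vector,
    # so every step weighs the source against the document as a whole.
    scores = enc_states @ torch.tanh(dec_state + doc_repr)   # (src_len,)
    attn = F.softmax(scores, dim=0)
    context = attn @ enc_states                              # (h,)

    # Soft switch between generating from the vocabulary and copying a source
    # token, gated here by [dec_state; context; doc_repr].
    W_gen = torch.randn(3 * h)
    p_gen = torch.sigmoid(W_gen @ torch.cat([dec_state, context, doc_repr]))

    # Final distribution: p_gen * P_vocab + (1 - p_gen) * copy distribution,
    # where attention mass is scattered onto the source-token ids.
    W_vocab = torch.randn(vocab_size, 2 * h)
    p_vocab = F.softmax(W_vocab @ torch.cat([dec_state, context]), dim=0)
    p_final = (p_gen * p_vocab).scatter_add(0, src_ids, (1 - p_gen) * attn)
    assert torch.isclose(p_final.sum(), torch.tensor(1.0))

In a pointer-generator of this kind the two terms always mix to a proper distribution (p_gen + (1 - p_gen) = 1), which the final assertion checks on the toy example.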

Keywords

Text summarization · Sequence-to-sequence · Pointer · Coverage

Acknowledgments

This work is supported by the China National High-Tech Project (863) under grant No. 2015AA015401, and by the Beijing Key Lab of Networked Multimedia. It is also supported by the State Key Program of the National Natural Science Foundation of China (No. 61533018), the National Natural Science Foundation of China (No. 61402220), and the Philosophy and Social Science Foundation of Hunan Province (No. 16YBA323).

References

  1. Nenkova, A., McKeown, K.: Automatic summarization. Found. Trends Inf. Retrieval 5(2–3), 103–233 (2011)
  2. Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks. In: Proceedings of the 27th International Conference on Neural Information Processing Systems, NIPS 2014 (2014)
  3. Fan, A., Grangier, D., Auli, M.: Controllable abstractive summarization. CoRR abs/1711.05217 (2017)
  4. See, A., Liu, P.J., Manning, C.D.: Get to the point: summarization with pointer-generator networks. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (2017)
  5. Nallapati, R., Zhou, B., dos Santos, C.N., Gülçehre, Ç., Xiang, B.: Abstractive text summarization using sequence-to-sequence RNNs and beyond. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, CoNLL 2016, Berlin, Germany (2016)
  6. Chopra, S., Auli, M., Rush, A.M.: Abstractive sentence summarization with attentive recurrent neural networks. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2016, San Diego, California, USA (2016)
  7. Rush, A.M., Chopra, S., Weston, J.: A neural attention model for abstractive sentence summarization. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, EMNLP 2015, Lisbon, Portugal (2015)
  8. Paulus, R., Xiong, C., Socher, R.: A deep reinforced model for abstractive summarization. CoRR abs/1705.04304 (2017)
  9. Saggion, H., Poibeau, T.: Automatic text summarization: past, present and future. In: Poibeau, T., Saggion, H., Piskorski, J., Yangarber, R. (eds.) Multi-source, Multilingual Information Extraction and Summarization, pp. 3–21. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-28569-1_1
  10. Shen, S., Zhao, Y., Liu, Z., Sun, M.: Neural headline generation with sentence-wise optimization. arXiv preprint (2016)
  11. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: International Conference on Learning Representations, ICLR 2015 (2015)
  12. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I.: Attention is all you need. In: Advances in Neural Information Processing Systems, vol. 30, pp. 6000–6010. Curran Associates Inc. (2017)
  13. Vinyals, O., Fortunato, M., Jaitly, N.: Pointer networks. In: Advances in Neural Information Processing Systems, vol. 28, pp. 2692–2700. Curran Associates Inc. (2015)
  14. Mi, H., Sankaran, B., Wang, Z., Ittycheriah, A.: Coverage embedding models for neural machine translation. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016 (2016)
  15. Kikuchi, Y., Neubig, G., Sasano, R., Takamura, H., Okumura, M.: Controlling output length in neural encoder-decoders. CoRR abs/1609.09552 (2016)
  16. Lin, C.-Y.: ROUGE: a package for automatic evaluation of summaries. In: Text Summarization Branches Out: Proceedings of the ACL-04 Workshop (2004)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Computer Science and Technology, Tsinghua University, Beijing, China
  2. Beijing National Research Center for Information Science and Technology (BNRist), Beijing, China
  3. College of Computing, University of South China, Hengyang, China
