
T2S: An Encoder-Decoder Model for Topic-Based Natural Language Generation

  • Wenjie Ou
  • Chaotao Chen
  • Jiangtao Ren
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10859)

Abstract

Natural language generation (NLG) plays a critical role in various natural language processing (NLP) applications, and topics provide a powerful tool for understanding natural language. We propose a novel topic-based NLG model that can generate topic-coherent sentences given a single topic or a combination of topics. The model extends the recurrent encoder-decoder framework by introducing a global topic embedding matrix. Experimental results show that our model can not only transform a source sentence into a representative topic distribution, which gives a better interpretation of the source sentence, but also generate topic-coherent and diversified sentences from different topic distributions without any text-level input.
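As a rough illustration of the architecture the abstract describes, the sketch below implements a recurrent encoder-decoder with a global topic embedding matrix in PyTorch. All names here (TopicSeq2Seq, num_topics, the choice of GRU cells, and initializing the decoder from a topic mixture) are our own assumptions for illustration; the paper's exact architecture, objective, and hyperparameters may differ.

import torch
import torch.nn as nn

class TopicSeq2Seq(nn.Module):
    """Hypothetical sketch of a topic-conditioned encoder-decoder."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, num_topics=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder GRU reads the source sentence.
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Map the final encoder state to a distribution over topics.
        self.to_topics = nn.Linear(hidden_dim, num_topics)
        # Global topic embedding matrix: one learned vector per topic.
        self.topic_emb = nn.Parameter(torch.randn(num_topics, hidden_dim))
        # Decoder GRU is initialized from the topic representation alone,
        # so generation can also be driven by a hand-specified topic mixture.
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, src_ids):
        # src_ids: (B, T) token ids -> (B, num_topics) topic distribution.
        _, h = self.encoder(self.embed(src_ids))
        return torch.softmax(self.to_topics(h[-1]), dim=-1)

    def decode(self, topic_dist, tgt_ids):
        # Mix topic vectors by the distribution to form the decoder's
        # initial hidden state: (B, num_topics) @ (num_topics, H) -> (B, H).
        h0 = (topic_dist @ self.topic_emb).unsqueeze(0)
        out, _ = self.decoder(self.embed(tgt_ids), h0)
        return self.out(out)  # (B, T, vocab_size) logits

Under these assumptions, passing a one-hot topic_dist to decode generates from a single topic, while a weighted mixture generates from a combination of topics, matching the abstract's claim that generation requires no text-level input.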

Keywords

Natural language generation · Topic · Encoder-decoder


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Data and Computer Science, Sun Yat-sen University, Guangzhou, China
