
Dynamic Multi-level Attention Models for Dialogue Response Generation

  • Conference paper
  • In: Distributed Computing and Artificial Intelligence, Special Sessions, 17th International Conference (DCAI 2020)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1242)


Abstract

One of the key challenges in building a successful chatbot is finding an effective way to learn from human-human conversation data. Recently, several neural-network-based dialogue models, including the RNN language model (RNNLM) and the hierarchical recurrent encoder-decoder (HRED) model, have shown promising results on dialogue response generation. However, a critical challenge remains: the responses these models generate tend toward generic chit-chat rather than informative content. In this paper, we empirically investigate this problem and propose multi-level attention models that extend HRED, in the hope that the attention mechanism can capture more informative content. Experimental studies on two multi-turn dialogue datasets demonstrate the model's potential.
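
To make the general architecture concrete, the following is a minimal, illustrative PyTorch sketch of the HRED-plus-attention idea the abstract describes: a word-level encoder summarizes each utterance, a turn-level encoder reads the sequence of utterance vectors, and the decoder attends over the turn-level states at every step. This is a hedged toy, not the paper's dynamic multi-level attention model; the class name, hyperparameters, and single-level additive scoring are all assumptions made for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HREDWithAttention(nn.Module):
        """Toy HRED variant with attention over turn-level states (illustrative only)."""
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Utterance-level encoder: reads the tokens of one utterance.
            self.utt_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
            # Context-level encoder: reads the sequence of utterance vectors.
            self.ctx_enc = nn.GRU(hid_dim, hid_dim, batch_first=True)
            # Decoder conditioned on an attention-weighted context summary.
            self.dec = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
            self.attn = nn.Linear(hid_dim * 2, 1)   # additive-style scoring
            self.out = nn.Linear(hid_dim, vocab_size)

        def forward(self, context_utts, response_in):
            # context_utts: (batch, n_turns, turn_len) token ids
            # response_in:  (batch, resp_len) decoder input token ids
            b, n, t = context_utts.shape
            # Encode each utterance independently; keep its final hidden state.
            utt_emb = self.embed(context_utts.reshape(b * n, t))
            _, utt_h = self.utt_enc(utt_emb)               # (1, b*n, hid)
            utt_vecs = utt_h.squeeze(0).view(b, n, -1)     # (b, n, hid)
            # Encode the dialogue context at the turn level.
            ctx_states, dec_h = self.ctx_enc(utt_vecs)     # (b, n, hid)
            logits = []
            for i in range(response_in.size(1)):
                tok = self.embed(response_in[:, i:i + 1])  # (b, 1, emb)
                # Score each turn-level state against the decoder state.
                q = dec_h.squeeze(0).unsqueeze(1).expand_as(ctx_states)
                scores = self.attn(torch.cat([ctx_states, q], dim=-1))
                weights = F.softmax(scores, dim=1)         # (b, n, 1)
                summary = (weights * ctx_states).sum(dim=1, keepdim=True)
                step_in = torch.cat([tok, summary], dim=-1)
                out, dec_h = self.dec(step_in, dec_h)
                logits.append(self.out(out))
            return torch.cat(logits, dim=1)                # (b, resp_len, vocab)

A quick shape check with invented toy sizes: HREDWithAttention(vocab_size=1000) applied to a (2, 3, 10) context batch and a (2, 8) response batch yields logits of shape (2, 8, 1000), one distribution per response token.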


Notes

  1. http://tcci.ccf.org.cn/conference/2018/dldoc/taskgline05.pdf.


Acknowledgments

This work was partially supported by the National Natural Science Foundation of China (No. 61977002).

Author information


Corresponding author

Correspondence to Wenge Rong.



Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Wang, Y., Rong, W., Zhou, S., Ouyang, Y., Xiong, Z. (2021). Dynamic Multi-level Attention Models for Dialogue Response Generation. In: Rodríguez González, S., et al. Distributed Computing and Artificial Intelligence, Special Sessions, 17th International Conference. DCAI 2020. Advances in Intelligent Systems and Computing, vol 1242. Springer, Cham. https://doi.org/10.1007/978-3-030-53829-3_6
