Memory-Based Model with Multiple Attentions for Multi-turn Response Selection

  • Xingwu Lu
  • Man Lan
  • Yuanbin Wu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11302)

Abstract

In this paper, we study multi-turn response selection in retrieval-based dialogue systems. Previous approaches focus on matching the response with utterances in the context to distill important matching information, and on modeling the sequential relationship among utterances. These approaches do not take into account the positional relationship and inner semantic relevance between the utterances and the query (i.e., the last utterance). We propose a memory-based network (MBN) that builds an effective memory integrating the positional relationship and inner semantic relevance between the utterances and the query. We then apply multiple attentions over this memory to learn context representations at multiple levels, mimicking the human behavior of thinking repeatedly before responding. Experimental results on a public dataset for multi-turn response selection show the effectiveness of our MBN model.
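The idea of attending over an utterance memory several times can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual equations): utterances are pre-encoded vectors, positional relationship is approximated by a simple distance-based decay toward the query, and each attention "hop" refines the query representation with the weighted memory read. The function and parameter names (`multi_hop_attention`, `position_decay`) are assumptions for this sketch.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def multi_hop_attention(memory, query, hops=3, position_decay=0.8):
    """Repeatedly attend over the utterance memory, refining the query.

    `memory` is a list of utterance vectors, ordered oldest to newest;
    `position_decay` down-weights utterances far from the query, a
    stand-in for the positional relationship described in the abstract.
    """
    n = len(memory)
    v = list(query)
    for _ in range(hops):
        # Position-aware relevance: semantic score times a distance decay.
        scores = [dot(m, v) * (position_decay ** (n - 1 - i))
                  for i, m in enumerate(memory)]
        weights = softmax(scores)
        # Attention read: weighted sum over memory slots.
        read = [sum(w * m[d] for w, m in zip(weights, memory))
                for d in range(len(v))]
        # Refine the query representation with the read vector
        # (one "level" of context representation per hop).
        v = [a + b for a, b in zip(v, read)]
    return v
```

Each hop plays the role of one attention pass; stacking several hops lets later passes re-weight the memory conditioned on what earlier passes extracted, which is the "repeatedly think before responding" intuition.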

Keywords

Multi-turn conversation · Response selection · Neural networks · Memory network

Acknowledgements

This work is supported by the Science and Technology Commission of Shanghai Municipality Grant (No. 15ZR1410700) and the open project of Shanghai Key Laboratory of Trustworthy Computing (No. 07dz22304201604).


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. School of Computer Science and Software Engineering, East China Normal University, Shanghai, People's Republic of China
  2. Shanghai Key Laboratory of Multidimensional Information Processing, Shanghai, China
