Reformulating natural language queries using sequence-to-sequence models

  • Xiaoyu Liu
  • Shunda Pan
  • Qi Zhang (corresponding author)
  • Yu-Gang Jiang
  • Xuanjing Huang



This work was partially supported by the National Key Research and Development Plan (Grant No. 2017YFB1002104), the National Natural Science Foundation of China (Grant Nos. 61532011, 61751201, 61473092, 61472088), and STCSM (Grant Nos. 16JC1420401, 17JC1420200). The authors would like to thank the anonymous reviewers for their helpful comments.


References

  1. Riezler S, Liu Y. Query rewriting using monolingual statistical machine translation. Comput Linguist, 2010, 36: 569–582
  2. Jones R, Rey B, Madani O, et al. Generating query substitutions. In: Proceedings of the 15th International Conference on World Wide Web, Edinburgh, 2006. 387–396
  3. Gao J F, He X D, Xie S S, et al. Learning lexicon models from search logs for query expansion. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Jeju Island, 2012. 666–676
  4. Song H J, Kim A, Park S B. Translation of natural language query into keyword query using an RNN encoder-decoder. In: Proceedings of the International ACM SIGIR Conference, 2017. 965–968
  5. Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. 2015. ArXiv: 1409.0473
  6. Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation. 2015. ArXiv: 1508.04025
  7. Gu J, Lu Z, Li H, et al. Incorporating copying mechanism in sequence-to-sequence learning. 2016. ArXiv: 1603.06393
  8. Riezler S, Liu Y, Vasserman A. Translating queries into snippets for improved query expansion. In: Proceedings of the International Conference on Computational Linguistics, Manchester, 2008. 737–744
  9. Rush A M, Chopra S, Weston J. A neural attention model for abstractive sentence summarization. 2015. ArXiv: 1509.00685

Copyright information

© Science China Press and Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  • Xiaoyu Liu (1)
  • Shunda Pan (1)
  • Qi Zhang (1) (corresponding author)
  • Yu-Gang Jiang (1)
  • Xuanjing Huang (1)

  1. School of Computer Science, Fudan University, Shanghai, China
