Abstract
This study investigates the use of query expansion (QE) methods in sentence retrieval for non-factoid queries to address the query-document term mismatch problem. Two alternative QE approaches are explored: (i) pseudo relevance feedback (PRF) using Robertson term selection, and (ii) word embeddings (WE) of the query words. Experiments are carried out on the WebAP data set developed using the TREC GOV2 collection. Experimental results using P@10, NDCG@10 and MRR show that QE using PRF achieves a statistically significant improvement over baseline retrieval models; WE also improves over the baseline, but the improvement is not statistically significant. A method combining PRF and WE expansion consistently outperforms PRF alone.
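As a minimal sketch (not the authors' implementation) of the PRF step described above, the snippet below scores candidate expansion terms with the Robertson offer weight computed over the top-ranked sentences of an initial retrieval run. The whitespace tokenisation, the toy in-memory "collection", and the parameter defaults (R pseudo-relevant sentences, k expansion terms) are illustrative assumptions, and the ranked list itself is treated as the whole collection for the term statistics.

```python
# Sketch of PRF term selection with the Robertson offer weight (assumptions noted above).
import math
from collections import Counter

def robertson_offer_weight(r_t, n_t, R, N):
    """Offer weight OW(t) = r_t * w(t), where w(t) is the Robertson/Sparck Jones
    relevance weight estimated from pseudo-relevant counts.
    r_t: feedback sentences containing t, n_t: all sentences containing t,
    R: size of the feedback set, N: total number of sentences (assumed >> R)."""
    w_t = math.log(((r_t + 0.5) * (N - n_t - R + r_t + 0.5)) /
                   ((n_t - r_t + 0.5) * (R - r_t + 0.5)))
    return r_t * w_t

def expand_query(query_terms, ranked_sentences, R=5, k=3):
    """Pick k expansion terms from the top-R sentences of an initial ranking."""
    N = len(ranked_sentences)
    pseudo_rel = ranked_sentences[:R]
    # sentence frequencies over the ranked list and over the feedback set
    n = Counter(t for s in ranked_sentences for t in set(s.split()))
    r = Counter(t for s in pseudo_rel for t in set(s.split()))
    candidates = [t for t in r if t not in query_terms]
    scored = sorted(candidates,
                    key=lambda t: robertson_offer_weight(r[t], n[t], R, N),
                    reverse=True)
    return scored[:k]
```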
Notes
We learnt different embeddings by varying the training method, dimension size, window size, and number of iterations in internal development experiments, but the results obtained showed little variation in performance.
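As an illustration of the word-embedding side and of the note above, the sketch below trains word2vec embeddings and expands a query with its nearest neighbours in embedding space. It assumes gensim 4.x as the toolkit; the toy corpus, hyperparameter values, and function names are placeholders rather than the settings used in the reported experiments.

```python
# Sketch of word-embedding based query expansion (gensim 4.x assumed; values illustrative).
from gensim.models import Word2Vec

# Toy tokenised "collection"; in practice embeddings would be trained on the
# retrieval corpus (e.g. sentences from the TREC GOV2 / WebAP data).
sentences = [
    ["query", "expansion", "improves", "sentence", "retrieval"],
    ["pseudo", "relevance", "feedback", "selects", "expansion", "terms"],
    ["word", "embeddings", "capture", "term", "similarity"],
]

model = Word2Vec(
    sentences,
    vector_size=100,   # embedding dimension
    window=5,          # context window size
    sg=1,              # skip-gram training
    epochs=10,         # number of training iterations
    min_count=1,
)

def embedding_expansion(query_terms, model, k=3):
    """Return up to k terms most similar to the query terms in embedding space."""
    in_vocab = [t for t in query_terms if t in model.wv]
    if not in_vocab:
        return []
    return [t for t, _ in model.wv.most_similar(positive=in_vocab, topn=k)]

print(embedding_expansion(["query", "expansion"], model))
```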
References
Allan, J., Wade, C., Bolivar, A.: Retrieval and novelty detection at the sentence level. In: Proceedings of SIGIR 2003, pp. 314–321 (2003)
Diaz, F., Mitra, B., Craswell, N.: Query expansion with locally-trained word embeddings (2016). arXiv preprint arXiv:1605.07891
Keikha, M., Park, J.H., Croft, W.B., Sanderson, M.: Retrieving passages and finding answers. In: Proceedings of the 2014 Australasian Document Computing Symposium, p. 81 (2014)
Kuzi, S., Shtok, A., Kurland, O.: Query expansion using word embeddings. In: Proceedings of CIKM 2016, pp. 1929–1932 (2016)
Metzler, D., Kanungo, T.: Machine learned sentence selection strategies for query-biased summarization. In: SIGIR Learning to Rank Workshop, pp. 40–47 (2008)
Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. CoRR, abs/1301.3781 (2013)
Robertson, S.E.: On term selection for query expansion. J. Documentation 46(4), 359–364 (1990)
Yang, L., et al.: Beyond factoid QA: effective methods for non-factoid answer sentence retrieval. In: Ferro, N., et al. (eds.) ECIR 2016. LNCS, vol. 9626, pp. 115–128. Springer, Cham (2016). doi:10.1007/978-3-319-30671-1_9
Roy, D., Ganguly, D., Mitra, M., Jones, G.J.F.: Word vector compositionality based relevance feedback using kernel density estimation. In: Proceedings of CIKM 2016, pp. 1281–1290 (2016)
Ponte, J.M., Croft, W.B.: A language modeling approach to information retrieval. In: Proceedings of SIGIR 1998, pp. 275–281 (1998)
Robertson, S., Zaragoza, H.: The probabilistic relevance framework: BM25 and beyond. Found. Trends Inf. Retrieval 3(4), 333–389 (2009)
Acknowledgments
We thank the reviewers for their feedback and comments. This research is supported by Science Foundation Ireland (SFI) as a part of the ADAPT Centre at Dublin City University (Grant No: 12/CE/I2267).
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Arora, P., Foster, J., Jones, G.J.F. (2017). Query Expansion for Sentence Retrieval Using Pseudo Relevance Feedback and Word Embedding. In: Jones, G., et al. Experimental IR Meets Multilinguality, Multimodality, and Interaction. CLEF 2017. Lecture Notes in Computer Science(), vol 10456. Springer, Cham. https://doi.org/10.1007/978-3-319-65813-1_8
DOI: https://doi.org/10.1007/978-3-319-65813-1_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-65812-4
Online ISBN: 978-3-319-65813-1
eBook Packages: Computer Science, Computer Science (R0)