Abstract
In document retrieval with pseudo relevance feedback, a fixed number of top-ranked documents from the initial ranking are selected as feedback to build an expanded query model. However, little attention has been paid to the intuitive but critical fact that retrieval performance for different queries is sensitive to the number of feedback documents selected. In this paper, we explore two approaches that incorporate query-specific feedback document selection automatically. The first determines the "optimal" number of feedback documents for a given query by adopting the clarity score and cumulative gain. The second, instead of seeking the optimal number, aims to weaken the effect of the number of feedback documents, i.e., to improve the robustness of the pseudo relevance feedback process, via a mixture model. Our experimental results show that both approaches improve overall retrieval performance.
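As a rough illustration of the first approach (not the paper's actual implementation), one could score each candidate feedback set by its clarity, i.e., the KL divergence between a feedback language model and the collection model, and keep the cutoff that maximizes it. The function names, the candidate cutoffs, and the tokenized inputs below are hypothetical placeholders, and the cumulative-gain and mixture-model components are not shown.

# Hypothetical sketch: choose the number of feedback documents per query
# by maximizing the clarity score of the induced feedback language model.
import math
from collections import Counter

def language_model(texts):
    """Maximum-likelihood unigram model over a list of tokenized documents."""
    counts = Counter(token for text in texts for token in text)
    total = sum(counts.values())
    return {term: freq / total for term, freq in counts.items()}

def clarity(feedback_model, collection_model, floor=1e-10):
    """KL divergence between the feedback model and the collection model."""
    return sum(
        p * math.log(p / max(collection_model.get(term, 0.0), floor))
        for term, p in feedback_model.items()
    )

def select_feedback_count(ranked_docs, collection_model, candidates=(5, 10, 20, 50)):
    """Return the candidate cutoff whose top-ranked documents yield the clearest model."""
    best_k, best_score = candidates[0], float("-inf")
    for k in candidates:
        score = clarity(language_model(ranked_docs[:k]), collection_model)
        if score > best_score:
            best_k, best_score = k, score
    return best_k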
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Huang, Q., Song, D., Rüger, S. (2008). Robust Query-Specific Pseudo Feedback Document Selection for Query Expansion. In: Macdonald, C., Ounis, I., Plachouras, V., Ruthven, I., White, R.W. (eds) Advances in Information Retrieval. ECIR 2008. Lecture Notes in Computer Science, vol 4956. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-78646-7_54
DOI: https://doi.org/10.1007/978-3-540-78646-7_54
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-78645-0
Online ISBN: 978-3-540-78646-7
eBook Packages: Computer Science, Computer Science (R0)