Using Relevance Feedback in Expert Search
In enterprise settings, expert search is considered an important task. In this search task, the user has a need for expertise - for instance, they require assistance from someone knowledgeable about a topic of interest. An expert search system addresses this "expertise need" by suggesting people with expertise relevant to the topic of interest. In this work, we apply an expert search approach that does not explicitly rank candidates in response to a query, but instead ranks candidates implicitly, by taking into account a ranking of documents with respect to the query topic. Pseudo-relevance feedback, also known as query expansion, has been shown to improve retrieval performance in ad hoc search tasks. In this work, we investigate the extent to which query expansion can be applied in an expert search task to improve the accuracy of the generated ranking of candidates. We define two approaches for query expansion: the first is based on the initial ranking of documents for the query topic; the second is based on the final ranking of candidates. The aims of this paper are two-fold. Firstly, to determine whether query expansion can be successfully applied in the expert search task, and secondly, to ascertain whether either of the two forms of query expansion can provide robust, improved retrieval performance. We perform a thorough evaluation contrasting the two query expansion approaches in the context of the TREC 2005 and 2006 Enterprise tracks.
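The pseudo-relevance feedback idea underlying both approaches can be illustrated with a minimal sketch: treat the top-ranked items from an initial retrieval as if they were relevant, and add their most informative terms to the query before re-retrieving. The sketch below is a simplification for illustration only - it weights candidate expansion terms by raw frequency, whereas the paper's setting would use a term-weighting model (e.g. a Divergence from Randomness model) and could draw feedback evidence from either the document ranking or the candidate ranking. All function and variable names here are hypothetical, not taken from the paper.

```python
from collections import Counter

def expand_query(query_terms, ranked_docs, fb_docs=3, fb_terms=5):
    """Pseudo-relevance feedback (query expansion), simplified sketch.

    query_terms: list of original query terms.
    ranked_docs: token lists for the retrieved items, ordered by the
        initial retrieval score (documents or, analogously, the profiles
        of ranked candidates).
    fb_docs:  how many top-ranked items to treat as pseudo-relevant.
    fb_terms: how many new terms to append to the query.

    NOTE: frequency weighting is an assumption made for brevity; a real
    system would score terms by informativeness, not raw counts.
    """
    counts = Counter()
    for doc in ranked_docs[:fb_docs]:
        counts.update(doc)
    # Keep the highest-weighted terms that are not already in the query.
    expansion = [t for t, _ in counts.most_common()
                 if t not in query_terms][:fb_terms]
    return list(query_terms) + expansion

# Toy usage: expand a two-term query using the top 3 "documents".
docs = [["expert", "search", "enterprise", "expert"],
        ["expert", "retrieval", "enterprise"],
        ["candidate", "ranking", "expert"]]
expanded = expand_query(["expert", "search"], docs, fb_docs=3, fb_terms=2)
```

The expanded query is then rerun against the index; in the candidate-based variant, the feedback evidence would instead come from the profiles of the top-ranked candidates.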