Abstract
In enterprise settings, expert search is an important task. In this search task, the user has a need for expertise - for instance, they require assistance from someone knowledgeable about a topic of interest. An expert search system assists users with their "expertise need" by suggesting people with expertise relevant to the topic of interest. In this work, we apply an expert search approach that does not explicitly rank candidates in response to a query, but instead implicitly ranks candidates by taking into account a ranking of documents with respect to the query topic. Pseudo-relevance feedback, also known as query expansion, has been shown to improve retrieval performance in ad-hoc search tasks. In this work, we investigate to what extent query expansion can be applied in an expert search task to improve the accuracy of the generated ranking of candidates. We define two approaches for query expansion: the first is based on the initial ranking of documents for the query topic; the second is based on the final ranking of candidates. The aims of this paper are two-fold: firstly, to determine if query expansion can be successfully applied in the expert search task, and secondly, to ascertain if either of the two forms of query expansion can provide robust, improved retrieval performance. We perform a thorough evaluation contrasting the two query expansion approaches in the context of the TREC 2005 and 2006 Enterprise tracks.
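To illustrate the general idea of pseudo-relevance feedback, the following is a minimal sketch of document-based query expansion: the query is expanded with informative terms drawn from the top-ranked documents of the initial retrieval. This is a simplification for illustration only - the term scoring here is plain frequency, whereas the paper's actual experiments use the Terrier platform and Divergence from Randomness term-weighting models; the function and parameter names are hypothetical.

```python
from collections import Counter

def expand_query(query_terms, ranked_docs, k=3, n_expansion_terms=5):
    """Expand a query with frequent terms from the top-k pseudo-relevant documents.

    query_terms: list of query term strings.
    ranked_docs: documents (as whitespace-tokenisable strings) in rank order.
    k: number of top documents assumed relevant (pseudo-relevance).
    n_expansion_terms: number of new terms appended to the query.
    """
    # Pool term occurrences from the top-k documents of the initial ranking.
    pool = Counter()
    for doc in ranked_docs[:k]:
        pool.update(doc.split())
    # Drop terms already in the query; keep the most frequent of the rest
    # (ties broken alphabetically for determinism).
    candidates = [(t, c) for t, c in pool.items() if t not in query_terms]
    candidates.sort(key=lambda tc: (-tc[1], tc[0]))
    return list(query_terms) + [t for t, _ in candidates[:n_expansion_terms]]

docs = [
    "expert search ranks candidate experts using document evidence",
    "document ranking provides expertise evidence for candidates",
    "enterprise search users need relevant experts",
]
# Expands "expert search" with terms from the top 2 documents.
print(expand_query(["expert", "search"], docs, k=2, n_expansion_terms=3))
# → ['expert', 'search', 'document', 'evidence', 'candidate']
```

The paper's second approach differs only in where the pseudo-relevant evidence comes from: instead of the top-ranked documents, expansion terms would be drawn from documents associated with the top-ranked candidates in the final candidate ranking.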
© 2007 Springer Berlin Heidelberg
Macdonald, C., Ounis, I. (2007). Using Relevance Feedback in Expert Search. In: Amati, G., Carpineto, C., Romano, G. (eds) Advances in Information Retrieval. ECIR 2007. Lecture Notes in Computer Science, vol 4425. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-71496-5_39
Print ISBN: 978-3-540-71494-1
Online ISBN: 978-3-540-71496-5