Overview of the INEX 2010 Focused Relevance Feedback Track
The INEX 2010 Focused Relevance Feedback track offered a refined approach to the evaluation of focused relevance feedback algorithms through simulated exhaustive user feedback. As in traditional approaches, we simulated a user-in-the-loop by re-using ad-hoc retrieval assessments obtained from real users who assessed focused ad-hoc retrieval submissions.
The evaluation was extended in several ways: the use of exhaustive relevance feedback over entire runs; the evaluation of focused retrieval where both the retrieval results and the feedback are focused; evaluation over a closed set of documents with complete focused assessments; evaluation of executable implementations of relevance feedback algorithms; and, finally, a fully reusable evaluation platform.
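The simulated user-in-the-loop protocol described above can be sketched as a replay loop: each time the algorithm returns a result, the stored assessment for that result is revealed as feedback before the next result is chosen. The sketch below is illustrative only; the function and parameter names (`simulate_feedback_run`, `choose_next`, `qrels`) are hypothetical and not the track's actual API.

```python
def simulate_feedback_run(ranked_docs, qrels, choose_next):
    """Replay a run under simulated exhaustive feedback: after each
    returned document, its stored ad-hoc assessment is revealed to the
    feedback algorithm (choose_next) before it picks the next result.
    All names are illustrative, not the track's real interface."""
    feedback = []                 # (doc_id, is_relevant) pairs seen so far
    returned = []                 # the re-ranked output of the feedback run
    remaining = list(ranked_docs)
    while remaining:
        doc = choose_next(remaining, feedback)   # algorithm's next choice
        remaining.remove(doc)
        returned.append(doc)
        feedback.append((doc, qrels.get(doc, False)))  # exhaustive feedback
    return returned

# Trivial baseline that ignores feedback and keeps the original order.
baseline = lambda remaining, feedback: remaining[0]

qrels = {"d1": True, "d2": False, "d3": True}
run = simulate_feedback_run(["d2", "d1", "d3"], qrels, baseline)
```

A real feedback algorithm would use the accumulated `feedback` pairs to re-rank `remaining` at each step (e.g. Rocchio-style query expansion), whereas the baseline here simply reproduces the input ranking.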
We present the evaluation methodology, its implementation, and experimental results obtained for nine submissions from three participating organisations.
Keywords: Average Precision; Relevance Feedback; Indian Statistical Institute; Relevant Passage; Focus Result