Abstract
Informativeness measures have been used in interactive information retrieval and in automatic summarization evaluation. Unlike adhoc retrieval, these two tasks cannot rely on the Cranfield evaluation paradigm, in which retrieved documents are compared against static lists of documents judged relevant to each query. In this paper, we explore the use of informativeness measures to evaluate the adhoc task. The advantage of the proposed evaluation framework is that it does not rely on an exhaustive reference and can be used in a changing environment in which new documents appear and for which relevance has not been assessed. We show that the correlation between the official system ranking and the informativeness measure is especially high for most of the TREC adhoc tracks.
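To make the idea concrete, the sketch below scores a system's output by how much of a textual reference's n-gram mass it covers, without any per-document relevance judgments. This is an illustrative simplification in the spirit of the informativeness measures used at INEX, not the exact measure of the paper; the function names and the coverage formula are assumptions for this example.

```python
from collections import Counter


def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) of a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def informativeness(system_text, reference_text, n=2):
    """Fraction of the reference's n-gram occurrences that also appear
    in the system output (illustrative coverage score, not the paper's
    exact measure)."""
    sys_grams = set(ngrams(system_text.lower().split(), n))
    ref_counts = Counter(ngrams(reference_text.lower().split(), n))
    total = sum(ref_counts.values())
    if total == 0:
        return 0.0
    covered = sum(c for g, c in ref_counts.items() if g in sys_grams)
    return covered / total
```

Because the score compares word statistics of the returned text against a pooled textual reference rather than a list of judged documents, it can still rank systems when new, unassessed documents enter the collection.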
References
Cleverdon, C.: The Cranfield tests on index language devices. In: Aslib Proceedings, vol. 19, no. 6. MCB UP Ltd (1967)
Hauff, C.: Predicting the effectiveness of queries and retrieval systems. Ph.D. thesis, Enschede, SIKS Dissertation Series No. 2010-05, January 2010
Nuray, R., Can, F.: Automatic ranking of information retrieval systems using data fusion. Inf. Process. Manage. 42(3), 595–614 (2006)
Pavlu, V., Rajput, S., Golbus, P.B., Aslam, J.A.: IR system evaluation using nugget-based test collections. In: Proceedings of WSDM (2012)
SanJuan, E., Moriceau, V., Tannier, X., Bellot, P., Mothe, J.: Overview of the INEX 2011 question answering track (QA@INEX). In: Geva, S., Kamps, J., Schenkel, R. (eds.) INEX 2011. LNCS, vol. 7424, pp. 188–206. Springer, Heidelberg (2012)
SanJuan, E., Moriceau, V., Tannier, X., Bellot, P., Mothe, J.: Overview of the INEX 2012 tweet contextualization track. In: CLEF (2012)
Spink, A., Greisdorf, H., Bateman, J.: From highly relevant to not relevant: examining different regions of relevance. Inf. Process. Manage. 34(5), 599–621 (1998)
Tague-Sutcliffe, J.: Measuring the informativeness of a retrieval process. In: Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 23–36 (1992)
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Deveaud, R., Moriceau, V., Mothe, J., SanJuan, E. (2016). Informativeness for Adhoc IR Evaluation: A Measure that Prevents Assessing Individual Documents. In: Ferro, N., et al. (eds.) Advances in Information Retrieval. ECIR 2016. Lecture Notes in Computer Science, vol. 9626. Springer, Cham. https://doi.org/10.1007/978-3-319-30671-1_73
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-30670-4
Online ISBN: 978-3-319-30671-1