
Informativeness for Adhoc IR Evaluation: A Measure that Prevents Assessing Individual Documents

  • Conference paper
Advances in Information Retrieval (ECIR 2016)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9626)


Abstract

Informativeness measures have been used to evaluate interactive information retrieval and automatic summarization. Indeed, unlike adhoc retrieval, these two tasks cannot rely on the Cranfield evaluation paradigm, in which retrieved documents are compared against static, per-query lists of relevance-assessed documents. In this paper, we explore the use of informativeness measures to evaluate the adhoc task. The advantage of the proposed evaluation framework is that it does not rely on an exhaustive reference: it can be used in a changing environment in which new documents appear whose relevance has not been assessed. We show that the correlation between the official system ranking and the ranking induced by the informativeness measure is high for most of the TREC adhoc tracks.
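The evaluation idea the abstract describes can be made concrete with a small sketch, under assumptions of ours rather than the paper's actual definitions: score each system's retrieved text against a textual reference via n-gram coverage (a stand-in for the informativeness measures used in the paper), rank systems by that score, and compare the induced ranking to the official one with Kendall's tau. All function names, the scoring formula, and the toy data below are illustrative.

```python
# Illustrative sketch only: n-gram coverage as a stand-in informativeness
# score, plus Kendall's tau between the induced and official system rankings.
from collections import Counter
from itertools import combinations

def ngrams(text, n=2):
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def informativeness(run_text, reference_text, n=2):
    """Fraction of reference n-grams covered by the run text (assumed formula)."""
    ref, run = ngrams(reference_text, n), ngrams(run_text, n)
    if not ref:
        return 0.0
    covered = sum(min(count, run[gram]) for gram, count in ref.items())
    return covered / sum(ref.values())

def kendall_tau(rank_a, rank_b):
    """Kendall's tau between two rankings given as {system: rank} dicts."""
    concordant = discordant = 0
    for s, t in combinations(rank_a, 2):
        sign = (rank_a[s] - rank_a[t]) * (rank_b[s] - rank_b[t])
        concordant += sign > 0
        discordant += sign < 0
    n_pairs = len(rank_a) * (len(rank_a) - 1) // 2
    return (concordant - discordant) / n_pairs

# Toy data: one topic, a textual reference instead of relevance judgments.
reference = "informativeness measures evaluate retrieval without assessing individual documents"
runs = {
    "sysA": "informativeness measures evaluate retrieval systems automatically",
    "sysB": "assessing individual documents is costly and never exhaustive",
    "sysC": "retrieval without assessing individual documents via informativeness measures",
}
scores = {s: informativeness(text, reference) for s, text in runs.items()}
induced = {s: r for r, s in enumerate(sorted(scores, key=scores.get, reverse=True), 1)}
official = {"sysA": 1, "sysC": 2, "sysB": 3}  # assumed official ordering
print(scores)
print("tau =", kendall_tau(induced, official))
```

The structural point of the sketch is that no per-document relevance judgment appears anywhere: only a textual reference is needed, so new, unassessed documents do not invalidate the evaluation.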


Notes

  1. http://snowball.tartarus.org/algorithms/english/stemmer.html.
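The footnoted algorithm is the Snowball English (Porter2) stemmer. As a minimal sketch of what it does, here it is via NLTK's implementation; the choice of NLTK is our assumption, not the authors' stated tooling.

```python
# Minimal sketch: Snowball English (Porter2) stemming via NLTK.
# NLTK is an assumed convenience here; the footnote only names the algorithm.
from nltk.stem.snowball import SnowballStemmer

stemmer = SnowballStemmer("english")
print([stemmer.stem(w) for w in ["retrieved", "assessing", "documents"]])
# ['retriev', 'assess', 'document']
```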


Author information


Corresponding author

Correspondence to Eric SanJuan.



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Deveaud, R., Moriceau, V., Mothe, J., SanJuan, E. (2016). Informativeness for Adhoc IR Evaluation: A Measure that Prevents Assessing Individual Documents. In: Ferro, N., et al. Advances in Information Retrieval. ECIR 2016. Lecture Notes in Computer Science, vol. 9626. Springer, Cham. https://doi.org/10.1007/978-3-319-30671-1_73


  • DOI: https://doi.org/10.1007/978-3-319-30671-1_73

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30670-4

  • Online ISBN: 978-3-319-30671-1

  • eBook Packages: Computer Science (R0)
