
On the Early History of Evaluation in IR


Part of the book series: The Kluwer International Series on Information Retrieval ((INRE,volume 16))





Copyright information

© 2005 Springer

About this chapter

Cite this chapter

Robertson, S. (2005). On the Early History of Evaluation in IR. In: Tait, J.I. (eds) Charting a New Course: Natural Language Processing and Information Retrieval. The Kluwer International Series on Information Retrieval, vol 16. Springer, Dordrecht. https://doi.org/10.1007/1-4020-3467-9_2


  • DOI: https://doi.org/10.1007/1-4020-3467-9_2

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-1-4020-3343-8

  • Online ISBN: 978-1-4020-3467-1

  • eBook Packages: Computer Science, Computer Science (R0)
