CLEF 2002 Methodology and Metrics

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2785)

Abstract

We give a detailed presentation of the organization of the CLEF 2002 evaluation campaign, focusing mainly on the core tracks. This includes a discussion of the evaluation approach adopted, explanations of the tracks and tasks and their underlying motivations, a description of the test collections, and an outline of the guidelines for the participants. The paper concludes with an overview of the techniques used for results calculation and analysis.






Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Braschler, M., Peters, C. (2003). CLEF 2002 Methodology and Metrics. In: Peters, C., Braschler, M., Gonzalo, J., Kluck, M. (eds) Advances in Cross-Language Information Retrieval. CLEF 2002. Lecture Notes in Computer Science, vol 2785. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45237-9_44

  • DOI: https://doi.org/10.1007/978-3-540-45237-9_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40830-7

  • Online ISBN: 978-3-540-45237-9

  • eBook Packages: Springer Book Archive
