CLEF 2006: Ad Hoc Track Overview

  • Giorgio M. Di Nunzio
  • Nicola Ferro
  • Thomas Mandl
  • Carol Peters
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4730)

Abstract

We describe the objectives and organization of the CLEF 2006 ad hoc track and discuss the main characteristics of the tasks offered to test monolingual, bilingual, and multilingual textual document retrieval systems. The track was divided into two streams. The main stream offered mono- and bilingual tasks using the same collections as CLEF 2005: Bulgarian, English, French, Hungarian, and Portuguese. The second stream, designed for more experienced participants, offered the so-called "robust task", which used test collections from previous years in six languages (Dutch, English, French, German, Italian, and Spanish), with the objective of privileging experiments that achieve good, stable performance over all queries rather than high average performance. The performance achieved for each task is presented and the results are discussed. The document collections used were taken from the CLEF multilingual comparable corpus of news documents.


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Giorgio M. Di Nunzio (1)
  • Nicola Ferro (1)
  • Thomas Mandl (2)
  • Carol Peters (3)
  1. Department of Information Engineering, University of Padua, Italy
  2. Information Science, University of Hildesheim, Germany
  3. ISTI-CNR, Area di Ricerca, 56124 Pisa, Italy
