CLEF 2008 Ad-Hoc Track: Comparing and Combining Different IR Approaches

  • Jens Kürsten
  • Thomas Wilhelm
  • Maximilian Eibl
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5706)

Abstract

This article describes post-workshop experiments conducted after our first participation in the TEL@CLEF task. We used the Xtrieval framework [5], [4] to prepare and execute the experiments. We ran 69 experiments in the setting of the CLEF 2008 task, of which 39 were monolingual and 30 were cross-lingual. We investigated the capabilities of the current version of Xtrieval, which now supports both the Lucene and the Lemur retrieval core. Our main goal was to compare and combine the results from these two retrieval engines. The topics for the cross-lingual experiments were translated with a plug-in that accesses the Google AJAX language API. Our monolingual experiments outperformed the best runs we had submitted during the evaluation campaign. Our cross-lingual experiments performed very well for all target collections, achieving between 87% and 100% of the monolingual retrieval effectiveness. Combining the results of the Lucene and Lemur retrieval cores yielded very consistent performance.
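Note: the abstract refers to combining result lists from the Lucene and Lemur retrieval cores. As a purely illustrative aside, the sketch below shows one common data-fusion scheme (CombSUM over min-max normalized scores); the function names and toy runs are hypothetical and do not reproduce the Xtrieval implementation described in the paper.

    # Illustrative sketch: CombSUM fusion of two retrieval runs with
    # min-max score normalization. Document IDs and scores are toy values.

    def min_max_normalize(run):
        """Scale a {doc_id: score} run into the range [0, 1]."""
        lo, hi = min(run.values()), max(run.values())
        if hi == lo:
            return {doc: 1.0 for doc in run}
        return {doc: (score - lo) / (hi - lo) for doc, score in run.items()}

    def combsum(run_a, run_b):
        """Sum normalized scores; a document missing from one run contributes 0."""
        norm_a, norm_b = min_max_normalize(run_a), min_max_normalize(run_b)
        docs = set(norm_a) | set(norm_b)
        fused = {doc: norm_a.get(doc, 0.0) + norm_b.get(doc, 0.0) for doc in docs}
        return sorted(fused.items(), key=lambda item: item[1], reverse=True)

    # Example: fuse two toy result lists (e.g., one per retrieval core).
    lucene_run = {"doc1": 12.3, "doc2": 9.8, "doc3": 4.1}
    lemur_run = {"doc2": 0.81, "doc4": 0.77, "doc1": 0.35}
    print(combsum(lucene_run, lemur_run))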

Keywords

Evaluation · Experimentation · Data Fusion · Cross-Language Information Retrieval

References

  1. Agirre, E., Di Nunzio, G.M., Ferro, N., Mandl, T., Peters, C.: CLEF 2008: Ad Hoc Track Overview. In: Working Notes for the CLEF 2008 Workshop, September 17-19, Aarhus, Denmark (October 2008)
  2. Krovetz, R.: Viewing Morphology as an Inference Process. In: SIGIR 1993: Proceedings of the 16th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 191–202. ACM, New York (1993)
  3. Kürsten, J., Wilhelm, T., Eibl, M.: CLEF 2008 Ad-Hoc Track: On-line Processing Experiments with Xtrieval. In: Working Notes for the CLEF 2008 Workshop, September 17-19, Aarhus, Denmark (October 2008)
  4. Kürsten, J., Wilhelm, T., Eibl, M.: Extensible Retrieval and Evaluation Framework: Xtrieval. In: LWA 2008: Lernen - Wissen - Adaption, Workshop Proceedings - FGIR, Würzburg, October 2008, pp. 107–110 (2008)
  5. Kürsten, J., Wilhelm, T., Eibl, M.: The Xtrieval Framework at CLEF 2007: Domain-Specific Track. In: Peters, C., Jijkoun, V., Mandl, T., Müller, H., Oard, D.W., Peñas, A., Petras, V., Santos, D. (eds.) CLEF 2007. LNCS, vol. 5152, pp. 174–181. Springer, Heidelberg (2008)
  6. Savoy, J.: A stemming procedure and stopword list for general French corpora. Journal of the American Society for Information Science, 944–982 (1999)
  7. Savoy, J.: Data Fusion for Effective European Monolingual Information Retrieval. In: Peters, C., Clough, P., Gonzalo, J., Jones, G.J.F., Kluck, M., Magnini, B. (eds.) CLEF 2004. LNCS, vol. 3491, pp. 233–244. Springer, Heidelberg (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Jens Kürsten (1)
  • Thomas Wilhelm (1)
  • Maximilian Eibl (1)

  1. Faculty of Computer Science, Chair Computer Science and Media, Chemnitz University of Technology, Chemnitz, Germany