Dublin City University at CLEF 2004: Experiments in Monolingual, Bilingual and Multilingual Retrieval

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3491)


The Dublin City University group participated in the monolingual, bilingual and multilingual retrieval tasks. The main focus of our investigation for CLEF 2004 was extending our information retrieval system to document languages other than English, and completing the multilingual task comprising four languages: English, French, Russian and Finnish. Our retrieval system is based on the City University Okapi BM25 system with document preprocessing using the Snowball stemming software and stopword lists. Our French monolingual experiments compare retrieval using French documents and topics, and documents and topics translated into English. Our results indicate that working directly in French is more effective for retrieval than adopting document and topic translation. A breakdown of our multilingual retrieval results by the individual languages shows that similar overall average precision can be achieved when there is significant underlying variation in performance for individual languages.
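The abstract states that the retrieval system is based on Okapi BM25 with Snowball stemming and stopword removal. As a rough illustration of that weighting scheme, the sketch below implements standard Okapi BM25 scoring in Python. It is not the authors' implementation: the toy suffix stripper stands in for the Snowball stemmer, the stopword list is a tiny placeholder, and the parameter values `k1` and `b` are common defaults rather than values reported in the paper.

```python
import math
from collections import Counter

# Placeholder stopword list; the paper uses full Snowball stopword lists.
STOPWORDS = {"the", "a", "an", "of", "and", "in", "for", "is", "to", "with"}

def preprocess(text):
    """Lowercase, drop stopwords, and apply a toy suffix stripper
    (a stand-in for the Snowball stemmer used in the paper)."""
    tokens = [t for t in text.lower().split() if t not in STOPWORDS]
    return [t[:-1] if t.endswith("s") else t for t in tokens]

def bm25_scores(query, documents, k1=1.2, b=0.75):
    """Score each document against the query with Okapi BM25."""
    docs = [preprocess(d) for d in documents]
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    # Document frequency of each term across the collection.
    df = Counter()
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        score = 0.0
        for term in preprocess(query):
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            score += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(score)
    return scores
```

For a multilingual setup like the one described, the same scoring function would be applied per language, with the preprocessing step swapped for the appropriate Snowball stemmer and stopword list.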


Keywords: Relevant Document, Data Fusion, Average Precision, Pseudo Relevance Feedback, English Topic





Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  1. School of Computing, Dublin City University, Dublin 9, Ireland
