Dublin City University at CLEF 2005: Cross-Language Speech Retrieval (CL-SR) Experiments

  • Adenike M. Lam-Adesina
  • Gareth J. F. Jones
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4022)


The Dublin City University participation in the CLEF 2005 CL-SR task concentrated on exploring the application of our existing information retrieval methods, based on the Okapi model, to the conversational speech data set. This required an approach to determining approximate sentence boundaries within the free-flowing automatic transcription provided, to enable us to use our summary-based pseudo-relevance feedback (PRF). We also performed exploratory experiments on the use of the metadata provided with the document transcriptions for indexing and relevance feedback. Topics were translated into English using Systran V3.0 machine translation. In most cases, topic statements using only the Title field performed better than combined Title and Description topics. PRF using our adapted method is shown to be effective, and absolute performance is improved by combining the automatic document transcriptions with additional metadata fields.
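The retrieval methods referred to above are based on the Okapi model, whose core is the BM25 term-weighting formula. The following is a minimal illustrative sketch of BM25 document scoring, not the authors' actual system; the function name and the parameter defaults `k1=1.2`, `b=0.75` are common conventions assumed here, not values taken from the paper:

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, doc_freqs, num_docs, avg_doc_len,
               k1=1.2, b=0.75):
    """Score one document against a query with the Okapi BM25 formula.

    doc_freqs maps each term to the number of documents containing it.
    k1 and b are the usual BM25 tuning constants (assumed defaults).
    """
    tf = Counter(doc_terms)
    dl = len(doc_terms)
    score = 0.0
    for term in query_terms:
        if term not in tf:
            continue
        df = doc_freqs.get(term, 0)
        # Robertson-Sparck Jones style inverse document frequency
        idf = math.log((num_docs - df + 0.5) / (df + 0.5) + 1.0)
        # Term-frequency component with document-length normalisation
        score += idf * tf[term] * (k1 + 1) / (
            tf[term] + k1 * (1 - b + b * dl / avg_doc_len))
    return score
```

In a PRF setting such as the one described above, expansion terms would be selected from the top-ranked documents (here, from query-biased summaries of them) and the expanded query re-scored with the same function.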


Keywords: Machine Translation · Automatic Speech Recognition · Query Expansion · Mean Average Precision · Expansion Term





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Adenike M. Lam-Adesina¹
  • Gareth J. F. Jones¹
  1. School of Computing, Dublin City University, Ireland
