Overview of the CLEF 2005 Multilingual Question Answering Track

  • Alessandro Vallin
  • Bernardo Magnini
  • Danilo Giampiccolo
  • Lili Aunimo
  • Christelle Ayache
  • Petya Osenova
  • Anselmo Peñas
  • Maarten de Rijke
  • Bogdan Sacaleanu
  • Diana Santos
  • Richard Sutcliffe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4022)

Abstract

The general aim of the third CLEF Multilingual Question Answering Track was to set up a common and replicable evaluation framework to test both monolingual and cross-language Question Answering (QA) systems that process queries and documents in several European languages. Nine target languages and ten source languages were combined to define 8 monolingual and 73 cross-language tasks, and twenty-four groups participated in the exercise. Overall results showed a general increase in performance over the previous year. The best performing monolingual system, irrespective of target language, answered 64.5% of the questions correctly (in the monolingual Portuguese task), while the average of the best performances for each target language was 42.6%. The cross-language tasks, by contrast, entailed a considerable drop in performance. In addition to accuracy, the organisers also measured the relation between the correctness of an answer and a system’s stated confidence in it, showing that the best systems did not always provide the most reliable confidence scores. We provide an overview of the 2005 QA track, detail the procedure followed to build the test sets, and present a general analysis of the results.
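The two measures mentioned above can be made concrete with a small sketch. The Python fragment below is illustrative only: the data layout and the particular confidence-weighted score (answers ranked by decreasing self-reported confidence, averaging the precision at each rank) are assumptions in the spirit of contemporary QA evaluations, not the official CLEF 2005 scoring code.

    # Illustrative sketch only: the data layout and the confidence-weighted
    # score below are assumptions, not the official CLEF 2005 evaluation code.

    def accuracy(judgements):
        """Fraction of answers judged correct."""
        return sum(judgements) / len(judgements)

    def confidence_weighted_score(answers):
        """Rank answers by stated confidence (highest first) and average the
        precision at each rank; confident wrong answers drag the score down.
        answers: list of (confidence, is_correct) pairs."""
        ranked = sorted(answers, key=lambda a: a[0], reverse=True)
        correct_so_far, total = 0, 0.0
        for rank, (_, is_correct) in enumerate(ranked, start=1):
            correct_so_far += is_correct
            total += correct_so_far / rank
        return total / len(ranked)

    # Two hypothetical systems with identical accuracy (0.5): the one whose
    # confidence tracks correctness scores markedly higher.
    reliable   = [(0.9, True), (0.8, True), (0.3, False), (0.2, False)]
    unreliable = [(0.9, False), (0.8, True), (0.3, False), (0.2, True)]
    print(accuracy([c for _, c in reliable]))        # 0.5
    print(confidence_weighted_score(reliable))       # ~0.79
    print(confidence_weighted_score(unreliable))     # ~0.33

This illustrates the finding reported above: two systems with the same accuracy can differ sharply once the reliability of their confidence scores is taken into account.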

Keywords

Target Language, Question Answering, Source Language, Head Noun, Answer Type
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Alessandro Vallin (1)
  • Bernardo Magnini (2)
  • Danilo Giampiccolo (1)
  • Lili Aunimo (3)
  • Christelle Ayache (4)
  • Petya Osenova (5)
  • Anselmo Peñas (6)
  • Maarten de Rijke (7)
  • Bogdan Sacaleanu (8)
  • Diana Santos (9)
  • Richard Sutcliffe (10)
  1. CELCT, Italy
  2. ITC-Irst, Italy
  3. University of Helsinki, Finland
  4. ELDA/ELRA, France
  5. BTB, Bulgaria
  6. UNED, Spain
  7. University of Amsterdam, The Netherlands
  8. DFKI, Germany
  9. SINTEF, Norway
  10. University of Limerick, Ireland
