Overview of the Answer Validation Exercise 2006

  • Anselmo Peñas
  • Álvaro Rodrigo
  • Valentín Sama
  • Felisa Verdejo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4730)

Abstract

The first Answer Validation Exercise (AVE) was launched at the Cross-Language Evaluation Forum (CLEF) 2006. The task aims at developing systems able to decide whether the answer given by a Question Answering (QA) system is correct or not. The exercise is described here together with the evaluation methodology and the systems' results. The starting point for AVE 2006 was the reformulation of Answer Validation as a Recognizing Textual Entailment problem, under the assumption that hypotheses can be generated automatically by instantiating hypothesis patterns with the QA systems' answers. Eleven groups participated with 38 runs in 7 different languages. Systems that reported the use of logic obtained the best results in their respective subtasks.
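The reformulation described above can be sketched in a few lines: a hypothesis is built by instantiating a question-specific pattern with a candidate answer, and an entailment judgement between the supporting text and that hypothesis validates the answer. The `<ANSWER>` placeholder syntax and the word-overlap entailment proxy below are illustrative assumptions, not the exercise's actual method:

```python
import re

def build_hypothesis(pattern: str, answer: str) -> str:
    """Instantiate a hypothesis pattern with a QA system's candidate answer."""
    return pattern.replace("<ANSWER>", answer)

def validates(text: str, hypothesis: str, threshold: float = 1.0) -> bool:
    """Toy entailment proxy: the fraction of hypothesis words that also
    appear in the supporting text must reach the threshold (here: all)."""
    text_words = set(re.findall(r"\w+", text.lower()))
    hyp_words = re.findall(r"\w+", hypothesis.lower())
    overlap = sum(w in text_words for w in hyp_words) / len(hyp_words)
    return overlap >= threshold

pattern = "The capital of France is <ANSWER>."
text = "Paris is the capital of France and its largest city."
print(validates(text, build_hypothesis(pattern, "Paris")))   # True
print(validates(text, build_hypothesis(pattern, "Berlin")))  # False
```

Real AVE systems replace the overlap check with a full textual-entailment component; the sketch only shows how the validation decision is decoupled from the original question via the generated hypothesis.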

Keywords

Question Answering · Test Collection · Answer Validation · Question Answering System · Candidate Answer

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Anselmo Peñas (1)
  • Álvaro Rodrigo (1)
  • Valentín Sama (1)
  • Felisa Verdejo (1)

  1. Dpto. Lenguajes y Sistemas Informáticos, UNED
