A PROMISE for Experimental Evaluation

  • Martin Braschler
  • Khalid Choukri
  • Nicola Ferro
  • Allan Hanbury
  • Jussi Karlgren
  • Henning Müller
  • Vivien Petras
  • Emanuele Pianta
  • Maarten de Rijke
  • Giuseppe Santucci
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6360)

Abstract

PROMISE (Participative Research labOratory for Multimedia and Multilingual Information Systems Evaluation) is a Network of Excellence, launched in conjunction with this first independent CLEF 2010 conference. It is designed to support and advance the evaluation of multilingual and multimedia information access systems, largely by building on the activities taking place in the Cross-Language Evaluation Forum (CLEF) today and taking them forward in important new ways.

PROMISE is coordinated by the University of Padua and comprises 10 partners; in addition to the coordinator, these are the Swedish Institute of Computer Science, the University of Amsterdam, Sapienza University of Rome, the University of Applied Sciences Western Switzerland, the Information Retrieval Facility, the Zurich University of Applied Sciences, the Humboldt-Universität zu Berlin, the Evaluations and Language resources Distribution Agency, and the Centre for the Evaluation of Language Communication Technologies.

The single most important step forward for multilingual and multimedia information access that PROMISE will pursue is an open evaluation infrastructure supporting automation and collaboration in the evaluation process.



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Martin Braschler, Zurich University of Applied Sciences, Switzerland
  • Khalid Choukri, Evaluations and Language resources Distribution Agency, France
  • Nicola Ferro, University of Padua, Italy
  • Allan Hanbury, Information Retrieval Facility, Austria
  • Jussi Karlgren, Swedish Institute of Computer Science, Sweden
  • Henning Müller, University of Applied Sciences Western Switzerland, Switzerland
  • Vivien Petras, Humboldt-Universität zu Berlin, Germany
  • Emanuele Pianta, Centre for the Evaluation of Language Communication Technologies, Italy
  • Maarten de Rijke, University of Amsterdam, The Netherlands
  • Giuseppe Santucci, Sapienza University of Rome, Italy
