Visual Analytics for Information Retrieval Evaluation (VAIRË 2015)

  • Marco Angelini
  • Nicola Ferro
  • Giuseppe Santucci
  • Gianmaria Silvello
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9022)


Measurement is key to scientific progress. This is particularly true for research concerning complex systems, whether natural or human-built. The tutorial introduced basic and intermediate concepts of lab-based evaluation of information retrieval systems, together with its pitfalls and shortcomings, and complemented them with a recent and innovative angle on evaluation: the application of methodologies and tools from the Visual Analytics (VA) domain for better interacting with, understanding, and exploring experimental results and Information Retrieval (IR) system behaviour.


Keywords: Information Retrieval Failure Analysis · Information Retrieval System · Information Retrieval Evaluation · Intermediate Concept





Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Marco Angelini (1)
  • Nicola Ferro (2)
  • Giuseppe Santucci (1)
  • Gianmaria Silvello (2)
  1. “La Sapienza” University of Rome, Italy
  2. University of Padua, Italy
