Abstract
The development of multilingual and multimedia information access systems calls for proper evaluation methodologies to ensure that they meet users' requirements and deliver the desired effectiveness. IR research offers a strong evaluation methodology and a range of evaluation metrics, such as mean average precision (MAP) and (normalized) discounted cumulated gain ((n)DCG). In this paper, we propose a new metric for ranking evaluation, the cumulated relative position (CRP). We start from the observation that a document of a given degree of relevance may be ranked too early or too late relative to the ideal ranking of documents for a query. Its relative position is negative if the document is ranked too early, zero if it is ranked correctly, and positive if it is ranked too late. By cumulating these relative positions we obtain, at each ranked position, the net effect of document displacements: the CRP. We first define the metric formally and then discuss its properties, its relationship to prior metrics, and its visualization. Finally, we demonstrate the metric's behavior through different visualizations of CRP on a test collection.
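The abstract's idea can be sketched in code. The following is a minimal, illustrative Python implementation, not the paper's reference implementation: it assumes that a document's relative position is zero whenever it falls inside the interval of ranks that documents of its relevance grade occupy in the ideal (grade-descending) ranking, negative by the distance to that interval when ranked too early, and positive when ranked too late; the function names are our own.

```python
def relative_positions(run_grades):
    """Relative position of each document in a ranked run of relevance grades."""
    # Ideal ranking: grades sorted in descending order.
    ideal = sorted(run_grades, reverse=True)
    # 1-based interval [lo, hi] occupied by each grade in the ideal ranking.
    intervals = {}
    for pos, g in enumerate(ideal, start=1):
        lo, hi = intervals.get(g, (pos, pos))
        intervals[g] = (min(lo, pos), max(hi, pos))
    rps = []
    for pos, g in enumerate(run_grades, start=1):
        lo, hi = intervals[g]
        if pos < lo:
            rps.append(pos - lo)   # ranked too early -> negative
        elif pos > hi:
            rps.append(pos - hi)   # ranked too late -> positive
        else:
            rps.append(0)          # within its ideal interval -> correct
    return rps

def cumulated_relative_position(run_grades):
    """CRP at each rank: running sum of the relative positions."""
    crp, out = 0, []
    for rp in relative_positions(run_grades):
        crp += rp
        out.append(crp)
    return out

# A perfectly ordered run has relative position 0 at every rank,
# while a displaced run shows negative CRP until late-ranked
# documents pull it back toward zero.
print(relative_positions([3, 2, 1]))          # [0, 0, 0]
print(cumulated_relative_position([1, 3, 2])) # [-2, -1, 0]
```

Note that, under these assumptions, early and late displacements cancel over a complete ranking, so the CRP curve's shape along the ranks, rather than only its final value, carries the diagnostic information.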
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Angelini, M. et al. (2012). Cumulated Relative Position: A Metric for Ranking Evaluation. In: Catarci, T., Forner, P., Hiemstra, D., Peñas, A., Santucci, G. (eds) Information Access Evaluation. Multilinguality, Multimodality, and Visual Analytics. CLEF 2012. Lecture Notes in Computer Science, vol 7488. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33247-0_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-33246-3
Online ISBN: 978-3-642-33247-0