Information Retrieval, Volume 16, Issue 4, pp 504–529

Diversified search evaluation: lessons from the NTCIR-9 INTENT task

Search Intents and Diversification

DOI: 10.1007/s10791-012-9208-x

Cite this article as:
Sakai, T. & Song, R. Inf Retrieval (2013) 16: 504. doi:10.1007/s10791-012-9208-x


Abstract

The evaluation of diversified web search results is a relatively new research topic and is not as well understood as the time-honoured evaluation methodology of traditional IR based on precision and recall. In diversity evaluation, one topic may have more than one intent, and systems are expected to balance relevance and diversity. The recent NTCIR-9 evaluation workshop launched a new task called INTENT, which included a diversified web search subtask that differs from the TREC web diversity task in several aspects: the choice of evaluation metrics, the use of intent popularity and per-intent graded relevance, and the use of topic sets that are twice as large as those of TREC. The objective of this study is to examine whether these differences are useful, using the actual data recently obtained from the NTCIR-9 INTENT task. Our main experimental findings are: (1) the D♯ evaluation framework used at NTCIR provides more “intuitive” and statistically reliable results than Intent-Aware Expected Reciprocal Rank; (2) utilising both intent popularity and per-intent graded relevance, as is done at NTCIR, tends to improve discriminative power, particularly for D♯-nDCG; and (3) reducing the topic set size, even by just 10 topics, can affect not only significance testing but also the entire system ranking; when 50 topics are used (as in TREC) instead of 100 (as in NTCIR), the system ranking can be substantially different from the original ranking and the discriminative power can be halved. These results suggest that the directions being explored at NTCIR are valuable.
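To make the D♯ framework concrete, the sketch below computes D♯-nDCG for a toy ranking. It follows the published formulation (D♯-nDCG = γ·I-rec + (1−γ)·D-nDCG, where D-nDCG discounts a global gain that weights each document's per-intent graded gain by intent popularity, and I-rec is the fraction of intents covered in the top l documents). The function name, data layout, and log2 discount are illustrative assumptions, not the official NTCIREVAL implementation.

```python
import math

def d_sharp_ndcg(ranked, intent_prob, per_intent_gain, cutoff=10, gamma=0.5):
    """Illustrative sketch of D#-nDCG (not the official NTCIREVAL code).

    ranked: list of document ids in the system's output order.
    intent_prob: {intent: P(i|q)}, the intent popularity distribution.
    per_intent_gain: {doc: {intent: graded gain}}, per-intent graded relevance.
    """
    def global_gain(doc):
        # Global gain: per-intent gains weighted by intent popularity.
        gains = per_intent_gain.get(doc, {})
        return sum(intent_prob[i] * gains.get(i, 0.0) for i in intent_prob)

    top = ranked[:cutoff]
    # D-nDCG numerator: discounted cumulative global gain of the system run.
    dcg = sum(global_gain(d) / math.log2(r + 2) for r, d in enumerate(top))
    # Normalise by an ideal list obtained by sorting documents by global gain.
    ideal = sorted(per_intent_gain, key=global_gain, reverse=True)[:cutoff]
    idcg = sum(global_gain(d) / math.log2(r + 2) for r, d in enumerate(ideal))
    d_ndcg = dcg / idcg if idcg > 0 else 0.0
    # I-rec: fraction of intents covered by at least one relevant top document.
    covered = {i for d in top
               for i, g in per_intent_gain.get(d, {}).items() if g > 0}
    i_rec = len(covered) / len(intent_prob)
    # D#-nDCG linearly combines diversity (I-rec) and relevance (D-nDCG).
    return gamma * i_rec + (1 - gamma) * d_ndcg
```

With two intents of popularity 0.7 and 0.3, a run that orders documents by descending global gain and covers both intents scores 1.0, while demoting the highest-gain document lowers D-nDCG and hence the combined score.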


Keywords: Diversity · Evaluation · Intents · NTCIR · Search result diversification · Test collections · TREC · Web search

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Microsoft Research Asia, Beijing, People’s Republic of China