Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence
Many different measures are used to assess academic research excellence, and these are subject to ongoing discussion and debate within the scientometric, university-management and policy-making communities internationally. One topic of continued importance is the extent to which citation-based indicators agree with peer-review-based evaluation. Here we analyse the correlations between values of a particular citation-based impact indicator and peer-review scores in several academic disciplines, from natural to social sciences and humanities. We perform the comparison for research groups rather than for individuals, and on two levels. At an absolute level, we compare the total impact and the overall strength of the group as a whole. At a specific level, we compare academic impact and quality normalised by the size of the group. We find very high correlations at the former level for some disciplines and poor correlations at the latter level for all disciplines. This means that, although citation-based scores can help to describe research-group strength, in particular for the so-called hard sciences, they should not be used as a proxy for ranking or comparing research groups. Moreover, the correlation between peer-evaluated and citation-based scores is weaker for the soft sciences.
Keywords: Peer review · Citations · Research assessment exercise (RAE) · Research excellence framework (REF)
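The abstract's two comparison levels can be illustrated with a minimal sketch. The data below are entirely invented for demonstration, and the plain Pearson correlation stands in for whatever statistics the full paper uses; the point is only that totals for both indicators scale with group size, so the absolute-level correlation can be high even when the size-normalised (specific-level) correlation is weak.

```python
# Illustration of absolute vs specific comparison levels.
# All group data below are invented for demonstration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical research groups: (staff count, total citation impact, total peer score)
groups = [
    (5, 120, 14), (12, 300, 35), (8, 150, 20),
    (20, 520, 55), (3, 40, 9), (15, 310, 41),
]
sizes  = [g[0] for g in groups]
impact = [g[1] for g in groups]
peer   = [g[2] for g in groups]

# Absolute level: group totals. Both quantities grow with group size,
# which alone can drive the correlation up.
r_absolute = pearson(impact, peer)

# Specific level: per-head values remove the size effect,
# and the correlation can drop sharply.
r_specific = pearson(
    [i / n for i, n in zip(impact, sizes)],
    [p / n for p, n in zip(peer, sizes)],
)

print(f"absolute r = {r_absolute:.2f}, specific r = {r_specific:.2f}")
```

With these invented numbers the absolute correlation is near 1 while the specific correlation is close to 0, mirroring the qualitative pattern the abstract reports.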
- Evidence. (2010). The future of the UK university research base. Evidence (a Thomson Reuters business) report, July 2010.
- Evidence. (2011). Funding research excellence: Research group size, critical mass and performance. A University Alliance report, July 2011.
- Evidence. (2012). Bibliometric evaluation and international benchmarking of the UK's physics research. Summary report prepared for the Institute of Physics by Evidence, Thomson Reuters.
- Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159), 108–111.
- Garfield, E. (1973). Citation frequency as a measure of research activity and performance. In Essays of an information scientist, Current Contents, 1, 406–408.
- Ioannidis, J. P. A., et al. (2007). International ranking systems for universities and institutions: A critical appraisal. BMC Medicine, 5, 30.
- Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.
- Nature. (2010). Metrics special (Editorial). Nature, 465, 845. Retrieved April 2012, from http://www.nature.com/metrics.
- Oppenheim, C., & Summers, M. A. C. (2008). Citation counts and the Research Assessment Exercise, part VI: Unit of assessment 67 (music). Information Research, 13(2).
- RAE. (2008). The panel criteria and working methods, Panel E (2006). Retrieved October 19, 2012, from http://www.rae.ac.uk/pubs/2006/01/docs/eall.pdf.
- The official web-page of the RAE. (2008). Retrieved October 18, 2012, from http://www.rae.ac.uk/.
- The official web-page of the Higher Education Funding Council for England. Funding for universities and colleges in 2009–10 (2009). Electronic Publication 01/2009 in the ADMIN-HEFCE Archives. Retrieved October 19, 2012.
- The official web-page of Academic Ranking of World Universities (ARWU). (2012). Retrieved October 19, 2012, from http://www.shanghairanking.com.
- The official web-page of Evidence Thomson Reuters. (2012). Retrieved October 18, 2012, from http://www.evidence.co.uk.
- The official web-page of the REF. (2012). Retrieved October 19, 2012, from http://www.ref.ac.uk/.
- Williams, R., de Rassenfosse, G., Jensen, P., & Marginson, S. (2012). U21 ranking of national higher education systems. Report of the project sponsored by Universitas 21, University of Melbourne.