
Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence

Published in: Scientometrics

Abstract

Many different measures are used to assess academic research excellence, and these remain subject to ongoing discussion and debate within the scientometric, university-management and policy-making communities internationally. One question of continued importance is the extent to which citation-based indicators agree with peer-review-based evaluation. Here we analyse the correlations between values of a particular citation-based impact indicator and peer-review scores in several academic disciplines, ranging from the natural to the social sciences and humanities. We perform the comparison for research groups rather than for individuals, and we do so on two levels. At the absolute level, we compare the total impact and overall strength of each group as a whole. At the specific level, we compare academic impact and quality normalised by the size of the group. We find very high correlations at the former level for some disciplines and poor correlations at the latter level for all disciplines. This means that, although citation-based scores can help to describe research-group strength, in particular for the so-called hard sciences, they should not be used as a proxy for ranking or comparing research groups. Moreover, the correlation between peer-evaluated and citation-based scores is weaker for the soft sciences.
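The divergence between absolute and specific correlations is a classic size-confounding effect: when two per-head quantities are each multiplied by group size, the products correlate even when the per-head quantities themselves do not. A minimal sketch with entirely synthetic data (the group sizes, per-head qualities and per-head impacts below are invented for illustration, not the paper's data or its indicator):

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Hypothetical research groups: size n, with per-head quality and per-head
# impact drawn independently, i.e. no intrinsic quality-impact relation.
sizes = [random.randint(2, 100) for _ in range(200)]
quality_per_head = [random.gauss(3, 1) for _ in sizes]
impact_per_head = [random.gauss(3, 1) for _ in sizes]

# Absolute measures: overall strength and total impact scale with group size.
strength = [n * q for n, q in zip(sizes, quality_per_head)]
total_impact = [n * i for n, i in zip(sizes, impact_per_head)]

# The shared size factor inflates the absolute-level correlation,
# while the specific (per-head) correlation stays near zero.
r_absolute = pearson(strength, total_impact)
r_specific = pearson(quality_per_head, impact_per_head)
print(f"absolute: {r_absolute:.2f}, specific: {r_specific:.2f}")
```

On this synthetic data the absolute-level correlation comes out strongly positive purely because both totals share the size factor, which is why a high absolute correlation alone cannot justify using one measure as a proxy for the other when ranking groups.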



Acknowledgments

This work was supported in part by the EU 7th Framework Programme IRSES projects No. 269139, “Dynamics and cooperative phenomena in complex physical and biological environments”, and No. 295302, “Statistical physics in diverse realizations”. The authors thank Jonathan Adams of Thomson Reuters Research Analytics for the data and Ihor Mryglod for fruitful discussions.

Author information

Corresponding author

Correspondence to O. Mryglod.


Cite this article

Mryglod, O., Kenna, R., Holovatch, Y. et al. Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence. Scientometrics 97, 767–777 (2013). https://doi.org/10.1007/s11192-013-1058-9
