Absolute and specific measures of research group excellence


A desirable goal of scientific management is to introduce, if one exists, a simple and reliable way to measure the scientific excellence of publicly funded research institutions and universities, to serve as a basis for their ranking and funding. While citation-based indicators and metrics are easily accessible, they are far from universally accepted as a way to automate or inform evaluation processes, or to replace evaluations based on peer review. Here we consider absolute measures of research excellence at an amalgamated, institutional level and specific measures of research excellence as performance per head. Using biology research institutions in the UK as a test case, we examine the correlations between peer-review-based and citation-based measures of research excellence on these two scales. We find that citation-based indicators are very highly correlated with peer-evaluated measures of group strength, but poorly correlated with group quality. Thus, almost paradoxically, our analysis indicates that citation counts could form a basis for deciding how to fund research institutions, but they should not be used as a basis for ranking them by quality.
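To make the distinction concrete, here is a minimal, hypothetical sketch (in Python, with invented numbers rather than the paper's RAE data) of how a citation-style indicator that scales with total output can correlate strongly with absolute group strength (quality × size) while correlating only weakly with specific, per-head quality:

```python
# Hypothetical sketch: a citation indicator proportional to total output
# tracks absolute strength (quality x size) far better than per-head
# quality. All numbers below are invented for illustration only.
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(1)
sizes = [random.randint(5, 100) for _ in range(50)]      # group sizes
quality = [random.uniform(2.0, 4.0) for _ in range(50)]  # per-head quality
strength = [q * n for q, n in zip(quality, sizes)]       # absolute measure

# An invented citation count, roughly proportional to total output:
citations = [s * random.uniform(0.8, 1.2) for s in strength]

print("corr(citations, strength):", round(pearson(citations, strength), 2))
print("corr(citations, quality): ", round(pearson(citations, quality), 2))
```

Under these invented distributions the first correlation comes out close to 1 while the second is much weaker, mirroring the qualitative pattern described in the abstract; the exact figures depend entirely on the assumed distributions, not on any real evaluation data.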




  1.

    Here and in what follows, we use the terms “quality” and “strength” following the notation of Kenna and Berche (2010, 2011a).

  2.

    The Ringelmann effect describes the tendency for average productivity to decrease as group size increases, whereas in Kenna and Berche (2010, 2011a) it is a reduction in the rate of change of quality with quantity that is observed.


  1. De Bellis, N. (2009). Bibliometrics and citation analysis: From the Science Citation Index to cybermetrics. Lanham, MD: The Scarecrow Press.

  2. Editorial (2010). Metrics special. Nature, 465, 845. http://www.nature.com/metrics. Accessed April 2012.

  3. Evidence report (2010). The future of the UK university research base.

  4. Vinkler, P. (2001). An attempt for defining some basic categories of scientometrics and classifying the indicators of evaluative scientometrics. Scientometrics, 50(3), 539–544.


  5. Vinkler, P. (2003). Relations of relative scientometric indicators. Scientometrics, 58(3), 687–694.


  6. Kenna, R., Berche, B. (2010). Critical mass and the dependency of research quality on group size. Scientometrics, 86(2), 527–540.


  7. Kenna, R., Berche, B. (2011a). Critical masses for academic research groups and consequences for higher education research policy and management. Higher Education Management and Policy, 23(3), 1–21.


  8. The official web-page of the RAE. (2008a). http://www.rae.ac.uk/. Accessed 18 October 2012.

  9. The panel criteria and working methods, available on the official web-page of RAE. (2008b). http://www.rae.ac.uk/pubs/2006/01/docs/dall.pdf. Accessed 18 October 2012.

  10. The official web-page of the RAE. (2009). Biological Sciences. http://www.rae.ac.uk/pubs/2009/pro/uoas/uoa%2014%20-%20all%20submissions.pdf. Accessed 18 October 2012.

  11. The official web-page of the Higher Education Funding Council for England. (2009). Funding for universities and colleges in 2009–10. Electronic Publication 01/2009 in the ADMINHEFCE Archives. Accessed 18 October 2012.

  12. Oppenheim, C. (1996). Do citations count? Citation indexing and the research assessment exercise (RAE). Serials: The Journal for the Serials Community, 9(2), 155–161.


  13. Norris, M., Oppenheim, C. (2003). Citation counts and the research assessment exercise V: Archaeology and the 2001 RAE. Journal of Documentation, 59(6), 709–730.


  14. Holmes, A., Oppenheim, C. (2001). Use of citation analysis to predict the outcome of the 2001 research assessment exercise for unit of assessment (UoA) 61: library and information management. Information Research, 6(2). http://informationr.net/ir/6-2/paper103.html.

  15. MacRoberts, M. H., MacRoberts, B. R. (1989). Problems of citation analysis: A critical review. Journal of the American Society for Information Science, 40(5), 342–349.


  16. The official web-page of Evidence Thomson Reuters. (2012). http://www.evidence.co.uk. Accessed 18 October 2012.

  17. Evidence report (2011). Funding research excellence: research group size, critical mass & performance. University Alliance report. http://www.unialliance.ac.uk/wpcontent/uploads/2011/07/University-Alliance-Funding-Research-Excellence-July-2011.pdf. Accessed 18 October 2012.

  18. Schubert, A., Braun, T. (1996). Cross-field normalization of scientometric indicators. Scientometrics, 36(3), 311–324.


  19. Kenna, R., Berche, B. (2011b). Normalization of research evaluation results across academic disciplines. Research Evaluation, 20, 107–116.


  20. The overall quality profile, and sub-profiles for research outputs, research environment and esteem indicators for each submission for RAE. (2008c). http://www.rae.ac.uk/pubs/2009/pro/Quality_profiles.xls.

Acknowledgements


This work was supported in part by the 7th FP, IRSES project No. 269139 “Dynamics and cooperative phenomena in complex physical and biological environments” and IRSES project No. 295302 “Statistical physics in diverse realizations”. The authors thank Jonathan Adams from Evidence for the data and Ihor Mryglod for fruitful discussions.

Author information



Corresponding author

Correspondence to O. Mryglod.


About this article

Cite this article

Mryglod, O., Kenna, R., Holovatch, Y. et al. Absolute and specific measures of research group excellence. Scientometrics 95, 115–127 (2013). https://doi.org/10.1007/s11192-012-0874-7



  • Scientometrics
  • Scientific evaluation
  • Higher education