
Evaluating University Research Performance Using Metrics


Abstract

Evaluations of research quality in universities are now widely used in the advanced economies. The UK's Research Assessment Exercise (RAE), which began in 1986, is the most highly developed of these research evaluations. Based on peer review and involving some sixty-nine panels evaluating the research work of more than 50,000 academic staff, the exercise is expensive and time-consuming. In this article, we examine the possibility that a quantitative, metrics-based approach can provide a low-cost alternative to expensive, qualitative peer review. To do this, we build on our previous work on political science by extending a metrics-based model to chemistry, using the results of the 2001 RAE. Our results show that no single model will apply across science and non-science disciplines. Any metrics approach to performance evaluation has to use a discipline-specific suite of indicators.


Notes

  1. Our earlier paper was part of a symposium on research evaluation in political science, and our contribution was the subject of comment and criticism from, among others, Albert Weale, Andrew Russell and Claire Donovan. We do not seek to address or report on all of the points made in these debates as they can be found in the January 2009 issue of Political Studies Review.

  2. Parts of the discussion and analysis relating to political science draw on Butler and McAllister (2009).

  3. In 2001 the subject areas were reduced by one, to sixty-eight.

  4. Research outputs are usually, but not always, publications. Other types of output – including films, exhibitions or contributions to television programmes – are increasingly common (UK Parliament, 2002).

  5. The units were: King's College London (War Studies and Defence Studies); Nottingham Trent University (International Relations, and Politics and International Studies); and the University of Sussex (SPRU – Science and Technology Policy Research, and Politics and International Studies).

  6. For one unit of assessment, from the North East Wales Institute of Higher Education, it proved impossible to identify data within the Web of Science for use in some of our analyses, and this unit was omitted.

  7. For example, in 1992 no funds were allocated to departments ranked lowest in five grades; in 1996 this was extended to the lowest two of seven grades; and in 2001, to the lowest three of seven grades.

  8. Other ways of measuring citations include: total citations, relative citation impact, and the Hirsch index (or h-index as it is now commonly known, see Hirsch, 2005). The measure used here – mean citations per submitted work, across the sixty-nine departments – is the most straightforward (an illustrative calculation follows these notes). Early analyses controlled for the year in which publication of the submitted work took place, since the period of time that has elapsed might influence the opportunities for the work to be cited. Only works published between 1996 and 2000 could be submitted to the 2001 RAE. This control made no substantive difference to the results and has been omitted in the interests of presenting as parsimonious a model as possible.

  9. Namely: Buzan et al (1998), Mouffe (2000).

  10. Our earlier paper omitted Thompson from the database because of the problem of having to search by first author for publications indexed in the ISI. That has been corrected, and makes no substantive difference to the results of our analysis. The only changes are slight variations in the size of the coefficients reported for political science in Table 3, and a slight variation in the ordering of the political science departments reported in Table 4.

  11. A valuable measure would of course be the total number of academic staff in a department, but that was unfortunately not available.

  12. The estimates were also made using logistic regression (to take into account the skewed distribution of the dependent variable). Since the results were substantially the same as those obtained using the OLS model, we have relied on the latter here, as the results are more easily interpretable (a sketch of this kind of comparison follows these notes).

  13. A further possible explanation, put forward by one of the reviewers for this journal, was the concentration of chemistry within the older universities by 2001, so that tacit knowledge of the RAE rules and procedures was evenly spread across chemistry departments. By contrast, political science had a larger number of departments spread across the old and new universities, whereas the panel membership was mainly drawn from the old universities. Tacit knowledge was therefore not as evenly spread across the political science discipline, making panel membership a more important resource for understanding the RAE process.

References

  • Arts and Humanities Research Council and the Higher Education Funding Council for England (AHRC and HEFCE). (2006) ‘Use of research metrics in the arts and humanities’, available at http://www.hefce.ac.uk/research/ref/group/, accessed 3 December 2008.

  • Bence, V. and Oppenheim, C. (2005) ‘The evolution of the UK's research assessment exercise: publications, performance and perceptions’, Journal of Educational Administration and History 37: 137–155.


  • Butler, L. and McAllister, I. (2009) ‘Metrics or peer review? Evaluating the 2001 UK research assessment exercise in political science’, Political Studies Review 7: 3–17.


  • Buzan, B., Wæver, O. and de Wilde, J. (1998) Security: A New Framework for Analysis, Boulder, CO/London: Lynne Rienner.


  • Council for the Humanities, Arts and Social Sciences (CHASS). (2005) ‘Measures of quality and impact of publicly-funded research in the humanities, arts and social sciences’, available at http://www.chass.org.au/papers/PAP20051101JP.php, accessed 3 December 2008.

  • Dixon, H. and Suckling, J. (1996) ‘Measuring Outcomes in Higher Education’, in P. Smith (ed.) Measuring Outcome in the Public Sector, London: Taylor and Francis, pp. 47–56.


  • Department for Education and Skills (DfES). (2006) ‘DfES consultation on the reform of higher education research assessment and funding: summary of responses’, available at: http://www.dfes.gov.uk/consultations/conResults.cfm?consultationId=1404, accessed 3 December 2008.

  • Heinrich, C.J. (2002) ‘Outcomes-based performance management in the public sector: implications for government accountability and effectiveness’, Public Administration Review 62: 712–725.


  • Hirsch, J.E. (2005) ‘An index to quantify an individual's scientific research output’, Proceedings of the National Academy of Sciences 102 (46): 16569–16572.


  • HM Treasury, Department of Trade and Industry, Department for Education and Skills, and Department of Health. (2006) ‘Science and innovation investment framework 2004–2014: next steps’, available at http://www.berr.gov.uk/dius/science/science-funding/framework/next_steps/page28988.html, accessed 3 December 2008.

  • Jackman, R.W. and Siverson, R.M. (1996) ‘Rating the rating: an analysis of the national research council's appraisal of political science Ph.D. programs’, PS: Political Science and Politics 29: 155–160.


  • Moed, H.F. (2005) Citation Analysis in Research Evaluation, Heidelberg: Springer.


  • Mouffe, C. (2000) The Democratic Paradox, London: Verso.


  • Propper, C. and Wilson, D. (2003) ‘The use and usefulness of performance measures in the public sector’, Oxford Review of Economic Policy 19: 250–267.


  • Research Assessment Exercise (RAE). (2001a) ‘Panels’ criteria and working methods. 3.12 chemistry UoA 18’, available at http://www.rae.ac.uk/2001/Pubs/5_99/ByUoA/crit18.htm, accessed 3 December 2008.

  • Research Assessment Exercise (RAE). (2001b) ‘Panels’ criteria and working methods. 3.31 politics and international studies, UoA 39’, available at http://www.rae.ac.uk/2001/Pubs/5_99/ByUoA/crit39.htm, accessed 3 December 2008.

  • Research Assessment Exercise (RAE). (2008) ‘Research assessment exercise 2008: The outcome’, available at http://www.rae.ac.uk/pubs/2008/01/, accessed 25 December 2008.

  • UK Parliament. (2002) ‘Memorandum submitted by the higher education funding council for England (HEFCE)’, available at http://www.parliament.the-stationery-office.co.uk/pa/cm200102/cmselect/cmsctech/507/2012302.htm#n7, accessed 16 January 2009.


Author information


Correspondence to Linda Butler.



Cite this article

Butler, L., McAllister, I. Evaluating University Research Performance Using Metrics. Eur Polit Sci 10, 44–58 (2011). https://doi.org/10.1057/eps.2010.13
