Evaluating research: from informed peer review to bibliometrics

Abstract

National research assessment exercises are becoming regular events in an increasing number of countries. The present work contrasts the peer-review and bibliometric approaches to conducting these exercises. The comparison is made in terms of the essential parameters of any measurement system: accuracy, robustness, validity, functionality, time and cost. Empirical evidence shows that, for the natural and formal sciences, the bibliometric methodology is by far preferable to peer review. Setting up national databases of publications by individual authors, derived from the Web of Science or Scopus databases, would allow much better, cheaper and more frequent national research assessments.
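The proposal in the final sentence turns on reliably attributing each Web of Science or Scopus publication to a specific individual on a national roster of researchers, i.e. on author-name disambiguation (the problem studied by Aksnes 2008 and D’Angelo et al. 2011 in the references below). The following Python fragment is a deliberately minimal sketch of the kind of matching rule involved; the data model and field names are assumptions of this example, not the authors’ actual algorithm.

```python
# Toy author-name disambiguation for building a national publication
# database from Web of Science / Scopus records. Purely illustrative:
# the data model and the matching rule are assumptions of this example,
# not the heuristic of D'Angelo et al. (2011).
from dataclasses import dataclass


@dataclass
class Researcher:
    researcher_id: str   # identifier on the national roster
    last_name: str
    initials: str        # e.g. "MA" for Maria Anna
    institution: str     # normalized institution name


def match_author(byline: str, affiliation: str,
                 roster: list[Researcher]) -> list[str]:
    """Return IDs of roster researchers compatible with a publication byline.

    `byline` is a "Lastname, I.I." author string; `affiliation` is the
    address string recorded on the publication.
    """
    last, _, initials = byline.partition(",")
    last = last.strip().lower()
    initials = initials.replace(".", "").replace(" ", "").upper()

    hits = []
    for r in roster:
        if r.last_name.lower() != last:
            continue
        # Initials must agree as far as both are given ("M" matches "MA").
        if not (r.initials.startswith(initials)
                or initials.startswith(r.initials)):
            continue
        # Require the roster institution to appear in the address string.
        if r.institution.lower() in affiliation.lower():
            hits.append(r.researcher_id)
    return hits  # more than one hit = homonym needing further rules


roster = [Researcher("IT-0001", "Rossi", "M", "university of rome tor vergata")]
print(match_author("Rossi, M.",
                   "Dept of Physics, University of Rome Tor Vergata, Italy",
                   roster))   # ['IT-0001']
```

Real systems combine further signals (co-author networks, subject categories, address parsing) to resolve the homonym cases this rule leaves ambiguous.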

Notes

  1. In some countries, such as the USA and The Netherlands, the results of these exercises are not used to inform selective funding allocations.

  2. Mathematics and computer sciences, Physics, Chemistry, Earth sciences, Biological sciences, Medical sciences, Agriculture and veterinary sciences, Industrial and information engineering.

  3. VQR (Quinquennial Research Evaluation). http://civr.miur.it/en/index.html. Accessed 21 Jan 2011.

  4. ERA (Excellence in Research for Australia). http://www.arc.gov.au/era/default.htm. Accessed 21 Jan 2011.

  5. Further information can be retrieved from “Second consultation on the assessment and funding of research” of September 2009, downloadable at http://www.hefce.ac.uk/pubs/hefce/2009/09_38/#exec. The launch of the REF is planned for 2012. Not all details are definitive and some may be subject to adjustment.

  6. The peer-review approach is used for the social sciences, arts and humanities. The list of the disciplines evaluated solely by bibliometrics can be found at: http://www.arc.gov.au/era/key_docs10.htm.

  7. If a submitted research output is published in a journal that is not indexed by Scopus but appears on the ERA journal list, it is included in the ‘ranked outlet’ analysis but not used in the ‘citation analysis’.

  8. We refer to individual-level research performance assessment through citation indicators. An example of such a methodology is presented in Abramo and D’Angelo (2011).

  9. VTR (Italian Triennial Research Evaluation Framework). http://vtr2006.cineca.it/index_EN.html. Accessed 21 Jan 2011.

  10. From REF 2009 “Second consultation on the assessment and funding of research” downloadable at http://www.hefce.ac.uk/pubs/hefce/2009/09_38/#exec.

  11. Research Excellence Framework, page 34, downloadable at http://www.hefce.ac.uk/pubs/hefce/2009/09_38/.

  12. Audits of more recent exercises reported much lower levels of error, with the latest rate being under 10%, probably due to Australian universities learning how to better collect data on publications.

  13. More details on the ORP can be found in Abramo et al. (2008).

  14. A pertinent example is that of J. Hirsch, father of the bibliometric indicator that bears his name, a physicist who publishes in both the physics and the scientometrics categories (a minimal computation of his index is sketched after these notes).

  15. A number of Italian universities (e.g., the universities of Rome Tor Vergata, Milan, Pavia, Cagliari and Udine) have already used the ORP system for comparative evaluation of research.
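For concreteness regarding note 14: a researcher has h-index h if h of his or her publications have each received at least h citations, and the remaining publications have no more than h each. A minimal sketch in Python (illustrative only; the function name is an assumption of this example, not something from the paper):

```python
def h_index(citations: list[int]) -> int:
    """Hirsch's h-index: the largest h such that h papers have >= h citations."""
    ranked = sorted(citations, reverse=True)        # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):  # rank is 1-based
        if cites >= rank:
            h = rank   # at least `rank` papers have >= `rank` citations each
        else:
            break
    return h


# Five papers cited [10, 8, 5, 4, 3] times: four papers have >= 4 citations,
# but there are not five papers with >= 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))   # 4
```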

References

  • Abramo, G., & D’Angelo, C. A. (2011). National-scale research performance assessment at the individual level. Scientometrics, 86(2), 347–364.

  • Abramo, G., D’Angelo, C. A., & Pugini, F. (2008). The measurement of Italian Universities’ research productivity by a non parametric-bibliometric methodology. Scientometrics, 76(2), 225–244.

  • Abramo, G., D’Angelo, C. A., & Caprasecca, A. (2009). Allocative efficiency in public research funding: can bibliometrics help? Research Policy, 38(1), 206–215.

  • Abramo, G., D’Angelo, C. A., & Viel, F. (2010). Peer review research assessment: a sensitivity analysis of performance rankings to the share of research product evaluated. Scientometrics, 85(3), 705–720.

  • Adams, J., & Griliches, Z. (1998). Research productivity in a system of universities. Annales d’économie et de statistique, 49/50, 127–162.

  • Aksnes, D. W. (2008). When different persons have an identical author name. How frequent are homonyms? Journal of the American Society for Information Science and Technology, 59(5), 838–841.

  • Aksnes, D. W., & Taxt, R. E. (2004). Peer reviews and bibliometric indicators: a comparative study at a Norwegian university. Research Evaluation, 13(1), 33–41.

  • Butler, L., & McAllister, I. (2007). Metrics or peer review? Evaluating the 2001 UK Research Assessment Exercise in political science. Political Studies Review, 7, 3–17.

  • D’Angelo, C. A., Giuffrida, C., & Abramo, G. (2011). A heuristic approach to author name disambiguation in large-scale bibliometric databases. Journal of the American Society for Information Science and Technology, 62(2), 257–269.

  • Garfield, E. (1980). Premature discovery or delayed recognition—Why? Current Contents, 21, 5–10.

  • Glänzel, W. (2008). Seven myths in bibliometrics. About facts and fiction in quantitative science studies. In H. Kretschmer & F. Havemann (Eds.), Proceedings of WIS fourth international conference on webometrics, informetrics and scientometrics and ninth COLLNET meeting, Berlin.

  • Harman, G. (2000). Allocating research infrastructure grants in post-binary higher education systems: British and Australian approaches. Journal of Higher Education Policy and Management, 22(2), 111–126.

  • Horrobin, D. F. (1990). The philosophical basis of peer review and the suppression of innovation. Journal of the American Medical Association, 263(10), 1438–1441.

  • Lach, S., & Schankerman, M. (2003). Incentives and invention in universities. National Bureau of Economic Research working paper 9727. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1158310.

  • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

  • Moxham, H., & Anderson, J. (1992). Peer review. A view from the inside. Science and Technology Policy, 7–15.

  • Oppenheim, C. (1997). The correlation between citation counts and the 1992 research assessment exercise ratings for British research in genetics, anatomy and archaeology. Journal of Documentation, 53(5), 477–487.

  • Oppenheim, C., & Norris, M. (2003). Citation counts and the research assessment exercise V: archaeology and the 2001 RAE. Journal of Documentation, 59(6), 709–730.

  • Pendlebury, D. A. (2009). The use and misuse of journal metrics and other citation indicators. Archivum Immunologiae et Therapiae Experimentalis, 57(1), 1–11.

  • REF (Research Excellence Framework). 2009. http://www.hefce.ac.uk/pubs/hefce/2009/09_38/#exec. Accessed 21 Jan 2011.

  • Rinia, E. J., van Leeuwen, Th. N., van Vuren, H. G., & van Raan, A. F. J. (1998). Comparative analysis of a set of bibliometric indicators and central peer review criteria: evaluation of condensed matter physics in The Netherlands. Research Policy, 27, 95–107.

  • van Raan, A. F. J. (2008). Scaling rules in the science system: influence of field-specific citation characteristics on the impact of research groups. Journal of the American Society for Information Science and Technology, 59(4), 565–576.

Author information

Correspondence to Giovanni Abramo.

Cite this article

Abramo, G., D’Angelo, C.A. Evaluating research: from informed peer review to bibliometrics. Scientometrics 87, 499–514 (2011). https://doi.org/10.1007/s11192-011-0352-7
