
Peer review research assessment: a sensitivity analysis of performance rankings to the share of research product evaluated

Published in Scientometrics.

Abstract

In national research assessment exercises that take the peer review approach, research organizations are evaluated on the basis of a subset of their scientific production. The size of the subset varies from nation to nation but is typically set as a proportional function of the number of researchers employed at each research organization. However, scientific fertility varies from discipline to discipline, meaning that the representativeness of such a subset also varies by discipline. The rankings resulting from the assessments could therefore be quite sensitive to the share of articles selected for evaluation. The current work examines this issue, developing empirical evidence of variations in rankings due to changes in the size of the subset of products evaluated. The field of observation is the scientific production in the hard sciences of the entire Italian university system, from 2001 to 2003.


Notes

  1. For a detailed analysis of the distribution of average productivity among scientific disciplines, see Abramo et al. (2008).

  2. www.orp.researchvalue.it

  3. Note that these disciplines include 60% of Italy’s total university research personnel. Civil engineering and architecture are not considered because the WoS listings are not sufficiently representative of research output in this area.

  4. The publications considered are those referred to in the WoS as “article” or “review”, and exclude all other types of publications.

  5. Publications in multidisciplinary journals were distributed to the relevant UDAs.

  6. The analysis here and in the next section of the study excludes, for each UDA, those universities with fewer than an average of 5 researchers on staff over the triennium considered.

References

  • Abramo, G., & D’Angelo, C. A. (2009). A decision support system for public research organizations participating in national research assessment exercises. Journal of the American Society for Information Science and Technology, 60(10), 2095–2106.


  • Abramo, G., D’Angelo, C. A., & Di Costa, F. (2008). Assessment of sectoral aggregation distortion in research productivity measurements. Research Evaluation, 17(2), 111–121.


  • Abramo, G., D’Angelo, C. A., & Caprasecca, A. (2009). Allocative efficiency in public research funding: Can bibliometrics help? Research Policy, 38(1), 206–215.


  • DEST. (2007). Department of Education, Science and Training. Research quality framework: Assessing the quality and impact of research in Australia—RQF submission specifications. Canberra: Commonwealth of Australia.

  • ERA. (2009). The Excellence in Research for Australia. Evaluation Guidelines for the 2009 ERA Trial. http://www.arc.gov.au/pdf/ERA_Eval_Guide.pdf (last accessed 5 November 2010).

  • Geuna, A., & Martin, B. R. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304.


  • Hicks, D. (2009). Evolving regimes of multi-university research evaluation. Higher Education, 57, 393–404.


  • Horrobin, D. F. (1990). The philosophical basis of peer review and the suppression of innovation. Journal of the American Medical Association, 263, 1438–1441.


  • Moxam, H., & Anderson, J. (1992). Peer review. A view from the inside. Science and Technology Policy, 5(1), 7–15.


  • NRC. (2009). US National Research Council. http://sites.nationalacademies.org/NRC/index.htm (last accessed 5 November 2010).

  • Oppenheim, C. (1997). The correlation between citation counts and the 1992 research assessment exercise ratings for British research in genetics, anatomy and archaeology. Journal of Documentation, 53, 477–487.


  • Oppenheim, C., & Norris, M. (2003). Citation counts and the research assessment exercise V: Archaeology and the 2001 RAE. Journal of Documentation, 56(6), 709–730.


  • PBRF. (2008). Performance-Based Research Fund – Annual Report 2008. http://www.tec.govt.nz/Documents/Publications/PBRF-AR-2008-for-web.pdf (last accessed 5 November 2010).

  • RAE. (2008). Research Assessment Exercise – 2008. Panel criteria and working methods. http://www.rae.ac.uk/pubs/2006/01/ (last accessed 5 November 2010).

  • REF. (2009). Report on the pilot exercise to develop bibliometric indicators for the Research Excellence Framework. http://www.hefce.ac.uk/pubs/hefce/2009/09_39/ (last accessed 5 November 2010).

  • Rinia, E. J., van Leeuwen, Th. N., van Vuren, H. G., & Van Raan, A. F. J. (1998). Comparative analysis of a set of bibliometric indicators and central peer review criteria, evaluation of condensed matter physics in the Netherlands. Research Policy, 27, 95–107.


  • VTR. (2006). Valutazione Triennale (2001–2003) della Ricerca italiana (Italian Triennial Assessment Exercise). Risultati delle valutazioni dei Panel di Area [Results of the Panel evaluations by Area]. http://vtr2006.cineca.it/php5/vtr_ris_val_panel.php (last accessed 5 November 2010).


Author information


Corresponding author

Correspondence to Giovanni Abramo.


Cite this article

Abramo, G., D’Angelo, C.A. & Viel, F. Peer review research assessment: a sensitivity analysis of performance rankings to the share of research product evaluated. Scientometrics 85, 705–720 (2010). https://doi.org/10.1007/s11192-010-0238-0
