Abstract
National Research Assessment Exercises (NRAEs) aim to improve the returns from public funding of research. Critics argue that they reward citations while undervaluing publications that influence practice, implying that the journals valued least by NRAEs are disproportionately useful to practitioners. Conservation biology offers a test of this criticism because it produces species recovery plans: practitioner-authored blueprints for recovering threatened species. The literature cited in these plans indicates what is important to practitioners’ work. We profiled the journals cited in 50 randomly selected recovery plans from each of the USA, Australia and New Zealand, using ranking criteria from the Australian Research Council and the SCImago Institute. Citations showed no consistent pattern: sometimes higher-ranked publications were represented more frequently, sometimes lower-ranked ones. Recovery plans in all three countries also drew 37% or more of their citations from ‘grey literature’, which is discounted in NRAEs. If NRAEs discourage peer-reviewed publication at any level, they could exacerbate the trend not to publish information useful for applied conservation, possibly harming conservation efforts. While indicating the potential for an impact does not establish that it occurs, it does suggest preventive steps. NRAEs that consider the proportion of papers in top journals may discourage publication in lower-ranked journals, because one way to increase the proportion of outputs in top journals is to stop publishing in lower-ranked ones. Alternatives include evaluating only a user-nominated subset of publications, noting a department’s or an individual’s share of the top publications in a field, or applying innovative new multivariate assessments of research productivity that include social impact.
References
Adler, R., Ewing, J. & Taylor, P. (2008). Citation statistics. A report from the International Mathematical Union (IMU) in cooperation with the International Council for Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). http://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf. Accessed 28 August 2012.
Adler, N. J., & Harzing, A.-W. (2009). When knowledge wins: transcending the sense and nonsense of academic rankings. Academy of Management Learning and Education, 8, 72–95.
ARC (2012). ERA 2012 frequently asked questions. Available from http://www.arc.gov.au/era/faq.htm. Accessed 12 August 2011.
Bollen, J., van de Sompel, H., Hagberg, A., & Chute, R. (2009). A principal component analysis of 39 scientific impact measures. PLoS ONE, 4(6), e6022. doi:10.1371/journal.pone.0006022.
Box, S. (2010). Performance-based funding for public research in tertiary education institutions: Country experiences. In OECD Performance-based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings. Paris: OECD Publishing. doi:10.1787/9789264094611-en.
Broadbent, J. (2010). The UK research assessment exercise: Performance measurement and resource allocation. Australian Accounting Review, 20, 14–23.
Bryant, K. & Calver, M. (2012). Adaptive radiation in Australian journals in the Arbustocene ERA: an empty niche for JANCO? In P. B. Banks, D. Lunney & C. R. Dickman (Eds.), Science under siege (in press). Sydney: Royal Zoological Society of New South Wales.
Butler, L. (2007). Assessing university research: a plea for a balanced approach. Science and Public Policy, 34, 565–574.
Butler, L. (2010). Impacts of performance-based research funding systems: a review of the concerns and the evidence. In OECD Performance-based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings. Paris: OECD Publishing. doi:10.1787/9789264094611-en.
Butler, L., & McAllister, I. (2009). Authors’ response to reviews. Political Studies Review, 7, 84–87.
Calver, M. C., & King, D. R. (2000). Why publication matters in conservation biology. Pacific Conservation Biology, 6, 2–8.
Calver, M. C., Grayson, J., Lilith, M., & Dickman, C. R. (2011). Applying the precautionary principle to the issue of impacts by pet cats on urban wildlife. Biological Conservation, 144, 1895–1901.
Calver, M., Wardell-Johnson, G., Bradley, S., & Taplin, R. (2010). What makes a journal international? A case study using conservation biology journals. Scientometrics, 85, 387–400.
Carr, K. (2011). Improvements to Excellence in Research for Australia. Canberra: Australian Government. Available from http://archive.innovation.gov.au/ministersarchive2011/carr/MediaReleases/Pages/IMPROVEMENTSTOEXCELLENCEINRESEARCHFORAUSTRALIA.html. Accessed 9 April 2011.
Clark, J. A., Hoekstra, J. M., Boersma, P. D., & Kareiva, P. (2002). Improving US Endangered Species Act recovery plans: key findings and recommendations of the SCB recovery plan project. Conservation Biology, 16, 1510–1519.
Colledge, L., De Moya-Anegón, F., Guerrero-Bote, V., López-Illescas, C., El Aisati, M., & Moed, H. F. (2010). SJR and SNIP: Two new journal metrics in Elsevier’s Scopus. Serials, 23, 215–221.
Cooper, S., & Poletti, A. (2011). The new ERA of journal ranking: the consequences of Australia’s fraught encounter with ‘quality’. Australian Universities’ Review, 53, 57–65.
Corsi, M., D’Ippoliti, C., & Lucidi, F. (2010). Pluralism at risk? Heterodox economic approaches and the evaluation of economic research in Italy. American Journal of Economics and Sociology, 69, 1495–1529.
Debachere, M.-C. (1995). Problems in obtaining grey literature. IFLA Journal, 21, 94–98.
Deville, A., & Harding, R. (1997). Applying the precautionary principle. Sydney: The Federation Press.
Edgar, F., & Geare, A. (2010). Characteristics of high- and low-performing university departments as assessed by the New Zealand performance based research funding (PBRF) exercise. Australian Accounting Review, 20, 55–63.
Elton, L. (2000). The UK research assessment exercise: unintended consequences. Higher Education Quarterly, 54, 274–283.
Fairfull, S. J., & Williams, R. J. (2003). Community involvement in natural resource management: Lessons for future water management in catchments of New South Wales. In P. Hutchings & D. Lunney (Eds.), Conserving marine environments: Out of sight, out of mind (pp. 55–61). Sydney: Royal Zoological Society of New South Wales.
Falagas, M. E., & Alexiou, V. G. (2008). The top-ten in journal impact factor manipulation. Archivum Immunologiae Et Therapiae Experimentalis, 56, 223–226.
Gilhus, N. E., & Sivertsen, G. (2009). Publishing affects funding in neurology. European Journal of Neurology, 17, 147–151.
Gowrishankar, J., & Divakar, P. (1999). Sprucing up one’s impact factor (multiple letters). Nature, 401(6751), 321–322.
Hicks, D. (2009). Evolving regimes of multi-university research evaluation. Higher Education, 57, 541–552.
Hicks, D. (2010). Overviews of performance-based research funding systems. In OECD Performance-based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings. Paris: OECD Publishing. doi:10.1787/9789264094611-en.
Hodder, A. P. W., & Hodder, C. (2010). Research culture and New Zealand’s performance-based research fund: Some insights from bibliographic compilations of research outputs. Scientometrics, 84, 1–15.
Horwitz, P. & Wardell-Johnson, G. (2009). Cultural conservation biology. In M. Calver, A. Lymbery, J. McComb & M. Bamford (Eds.), Environmental biology (pp. 559–578). Melbourne: Cambridge University Press.
Jacsó, P. (2010). Comparison of journal impact rankings in the SCImago journal and country rank and the journal citation reports databases. Online Information Review, 34, 642–657.
Lane, J. (2010). Let’s make science metrics more scientific. Nature, 464, 488–489.
Lane, J., & Bertuzzi, S. (2011). Measuring the results of science investments. Science, 331, 678–680.
Lawrence, P. A. (2007). The mismeasurement of science. Current Biology, 17, R583–R585.
Luwel, M. (2010). Highlights and reflections: rapporteur’s report. In OECD Performance-based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings. Paris: OECD Publishing. doi:10.1787/9789264094611-en.
Marsh, H., Smith, B., King, M., & Evans, T. (2012). A new era for research education in Australia? Australian Universities’ Review, 54, 83–93.
Martin, B. R. (2011). The research excellence framework and the ‘impact agenda’. Are we creating a Frankenstein monster? Research Evaluation, 20, 247–254.
McNay, I. (1998). The research assessment exercise (RAE) and after: ‘you never know how it will turn out’. Perspectives: Policy and Practice in Higher Education, 2, 19–22.
Meffe, G. (2006). The success—and challenges—of conservation biology. Conservation Biology, 20, 931–933.
Molas-Gallart, J., & Tang, P. (2011). Tracing ‘productive interactions’ to identify social impacts: an example from the social sciences. Research Evaluation, 20, 219–226.
Northcott, D., & Linacre, S. (2010). Producing spaces for academic discourse: The impact of research assessment exercises and journal quality rankings. Australian Accounting Review, 20, 38–54.
OECD (2010). Performance-based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings. Paris: OECD Publishing. doi:10.1787/9789264094611-en.
Oppenheim, C. (2008). Out with the old and in with the new: the RAE, bibliometrics and the new REF. Journal of Librarianship and Information Science, 40, 147–149.
Ortega-Argueta, A., Baxter, G., & Hockings, M. (2011). Compliance of Australian threatened species recovery plans with legislative requirements. Journal of Environmental Management, 92, 2054–2060.
Oswald, A. J. (2010). A suggested method for the measurement of world-leading research (illustrated with data on economics). Scientometrics, 84, 99–113.
Primack, R. (2009). Why did we reject your paper? Biological Conservation, 142, 1559.
Rafols, I., Leydesdorff, L., O’Hare, A., Nightingale, P., & Stirling, A. (2012). How journal rankings can suppress interdisciplinary research: A comparison between innovation studies and business and management. Research Policy, 41, 1262–1282.
Roa, T., Beggs, J. R., Williams, J., & Mohler, H. (2009). New Zealand’s performance based research funding (PBRF) model undermines Maori research. Journal of the Royal Society of New Zealand, 39, 233–238.
Sampson, K. A., & Comer, K. (2010). When the governmental tail wags the disciplinary dog: some consequences of national funding policy on doctoral research in New Zealand. Higher Education Research and Development., 29, 275–289.
Schuch, S., Bock, J., Leuschner, C., Schaefer, M., & Wesche, K. (2011). Minor changes in orthopteran assemblages of Central European protected dry grasslands during the last 40 years. Journal of Insect Conservation, 15, 811–822.
Schuch, S., Bock, J., Krause, B., Wesche, K., & Schaefer, M. (2012a). Long-term population trends in three grassland insect groups: A comparative analysis of 1951 and 2009. Journal of Applied Entomology, 136, 321–331.
Schuch, S., Wesche, K., & Schaefer, M. (2012b). Long-term decline in the abundance of leafhoppers and planthoppers (Auchenorrhyncha) in Central European protected dry grasslands. Biological Conservation, 149, 75–83.
SCImago (2007). SJR—SCImago Journal & Country Rank. Consejo Superior de Investigaciones Científicas (CSIC), University of Granada, Extremadura, Carlos III (Madrid) & Alcalá de Henares, Spain. Available from http://www.scimagojr.com. Accessed July–August 2010.
SCImago (2012). SCImago institutions rankings. SIR World Rankings 2011: Global ranking. Available from http://www.scimagoir.com/. Accessed 9 April 2012.
Shewan, L. G., & Coats, A. J. S. (2006). The research quality framework and its implications for health and medical research: time to take stock? Medical Journal of Australia, 184, 463–466.
Smith, S., Ward, V., & House, A. (2011). ‘Impact’ in the proposals for the UK’s Research Excellence Framework: shifting the boundaries of academic autonomy. Research Policy, 40, 1369–1379.
Spaapen, J., & van Drooge, L. (2011). Productive interactions as a tool for social impact assessment of research. Research Evaluation, 20, 211–218.
Steele, C., Butler, L., & Kingsley, D. (2006). The publishing imperative: the pervasive influence of publication metrics. Learned Publishing, 19, 277–290.
Stergiou, K. I., & Tsikliras, A. C. (2006). Underrepresentation of regional ecological research output by bibliometric indices. Ethics in Science and Environmental Politics, 6, 15–17.
Stinchcombe, J., & Moyle, L. C. (2002). The influence of the academic conservation biology literature on endangered species recovery planning. Conservation Ecology, 6(2), 15. http://www.consecol.org/vol6/iss2/art15/.
UNESCO (2005). The precautionary principle. World Commission on the Ethics of Scientific Knowledge and Technology (COMEST). United Nations Educational, Scientific and Cultural Organization, Paris.
Visser, G. (2009). Tourism geographies and the South African National Research Foundation’s Researcher Rating System: international connections and local disjunctures. Tourism Geographies, 11, 43–72.
Witten, K., & Hammond, K. (2010). What becomes of social science knowledge: New Zealand researchers’ experiences of knowledge transfer modes and audiences. Kotuitui, 5, 3–12.
Acknowledgments
We thank, without implication, H. Recher, B. Dell, D. Saunders and an anonymous reviewer for detailed and constructive feedback on earlier versions.
Cite this article
Calver, M. C., Lilith, M. & Dickman, C. R. A ‘perverse incentive’ from bibliometrics: could National Research Assessment Exercises (NRAEs) restrict literature availability for nature conservation? Scientometrics 95, 243–255 (2013). https://doi.org/10.1007/s11192-012-0908-1