
Journal Ratings as Predictors of Article Quality in Arts, Humanities, and Social Sciences: An Analysis Based on the Italian Research Evaluation Exercise

Chapter in The Evaluation of Research in Social Sciences and Humanities

Abstract

The aim of this paper is to understand whether the probability of receiving positive peer reviews is influenced by having been published in an independently assessed, high-ranking journal: we interpret a positive relationship between peer evaluation and journal ranking as evidence that journal ratings are good predictors of article quality. The analysis is based on a large dataset of more than 11,500 research articles published in Italy between 2004 and 2010 in the areas of architecture, arts and humanities, history and philosophy, law, and sociology and political sciences. These articles received a score from externally appointed referees during the Italian research assessment exercise (VQR). Journal scores were assigned in a separate, panel-based independent assessment, carried out with a different procedure, covering the academic journals in which Italian scholars had published. Each article's score was compared with that of the journal in which it was published. We first estimate an ordered probit model, assessing whether a paper's probability of receiving a higher score increases with the score of its journal. In a second step, we concentrate on the top papers, estimating the probability that a paper receives an excellent score given that it was published in a top-rated journal. In doing so, we control for characteristics of the paper and its author, including the language of publication, the scientific field and its size, and the author's age and academic status. We add to the journal classification literature by providing, for the first time, a large-scale test of the robustness of expert-based classification.
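For readers less familiar with the methodology, the first estimation step can be summarised as a standard ordered probit latent-variable model. The sketch below is illustrative only; the symbols (journal score, control vector, cut-points) are generic notation, not the chapter's own:

    y_i^* = \beta\,\mathrm{JournalScore}_i + \mathbf{x}_i'\boldsymbol{\gamma} + \varepsilon_i,
    \qquad \varepsilon_i \sim \mathcal{N}(0,1),
    \qquad \mathrm{ArticleScore}_i = k \iff \mu_{k-1} < y_i^* \le \mu_k,

where x_i collects the controls listed above (language of publication, scientific field and its size, author's age and academic status) and the \mu_k are estimated cut-points; a positive and significant \beta is the evidence of interest, namely that publication in a higher-rated journal raises the probability of a higher peer-review score.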

This chapter is a revised and expanded version of Bonaccorsi A., Cicero T., Ferrara A. and Malgarini M., Journal ratings as predictors of articles quality in Arts, Humanities, and Social Sciences: an analysis based on the Italian Research Evaluation Exercise [version 1; referees: 3 approved]. F1000Research 2015, 4:196 (doi: 10.12688/f1000research.6478.1).


Notes

  1. All statistical analyses were performed using Stata version 13 (http://www.stata.com/stata13/).
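As an illustration of how the two estimation steps described in the abstract could be run in Stata, here is a minimal sketch; all variable names (article_score, journal_score, english, field, field_size, author_age, status, excellent_paper, top_journal) are hypothetical placeholders, not the authors' actual dataset or code:

    * Step 1: ordered probit of the article's peer-review score on the
    * rating of the journal in which it appeared, with paper/author controls
    oprobit article_score journal_score i.english i.field field_size author_age i.status

    * Step 2: probit for the probability that a paper receives an "excellent"
    * score, given publication in a top-rated journal (binary indicators)
    probit excellent_paper top_journal i.english i.field field_size author_age i.status

The same specifications could be reproduced in any package offering ordered probit and probit estimators; Stata is shown simply because the note above reports it as the software used.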


Author information

Corresponding author

Correspondence to Andrea Bonaccorsi.



Copyright information

© 2018 Springer International Publishing AG

About this chapter


Cite this chapter

Bonaccorsi, A., Ferrara, A., Malgarini, M. (2018). Journal Ratings as Predictors of Article Quality in Arts, Humanities, and Social Sciences: An Analysis Based on the Italian Research Evaluation Exercise. In: Bonaccorsi, A. (eds) The Evaluation of Research in Social Sciences and Humanities. Springer, Cham. https://doi.org/10.1007/978-3-319-68554-0_11


  • DOI: https://doi.org/10.1007/978-3-319-68554-0_11


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-68553-3

  • Online ISBN: 978-3-319-68554-0

  • eBook Packages: Social Sciences, Social Sciences (R0)
