Abstract
De Marchi and Lorenzetti (Scientometrics 106(1):253–261, 2016) have recently argued that in fields where the journal impact factor (IF) is not calculated, such as the humanities, it is crucial to find other indicators that would allow the relevant community to assess the quality of scholarly journals and the research outputs published in them. The authors suggest that information concerning a journal's rejection rate and the number of subscriptions sold is important and should be used for such assessment. The question addressed by the authors is very important, yet their proposed solutions are problematic. Here I point to some of these problems and illustrate them by considering the field of philosophy as a case in point. Specifically, I argue for four main claims. First, even assuming that IF provides a reliable indicator of journal quality for the assessment of research outputs, De Marchi and Lorenzetti have failed to validate their suggested indicators and proxies. Second, it has not been clarified why, in the absence of IF, other journal-based metrics that are currently available should not be used. Third, the relationship between IF and rejection rate is more complex than the authors suggest. Fourth, accepting the number of subscriptions sold as a proxy would result in discrimination against open access journals. The upshot of my analysis is that the question of how to assess journals and research outputs in the humanities is still far from resolved.
Notes
Others stress that “in arts and humanities, the typical level of journal-to-journal citations is so low that citation-based journal metrics are generally not used in these fields” (Morris et al. 2013, 151).
For some arguments on the primacy of IF qua metric please see, for instance, http://scholarlyoa.com/2015/12/10/dead-metrics/.
For some discussion, see also http://blog.frontiersin.org/2015/12/21/4782/ (Rocha da Silva 2016).
For a rather long list of open access journals in philosophy, please see: https://feministphilosophers.wordpress.com/2016/01/03/oa-journals-with-peer-review/.
References
Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7, 291. doi:10.3389/fnhum.2013.00291.
Craig, I. D., Plume, A. M., McVeigh, M. E., Pringle, J., & Amin, M. (2007). Do open access articles have greater citation impact? A critical review of the literature. Journal of Informetrics, 1(3), 239–248.
De Marchi, M., & Lorenzetti, E. (2016). Measuring the impact of scholarly journals in the humanities field. Scientometrics, 106(1), 253–261.
Haensly, P. J., Hodges, P. E., & Davenport, S. A. (2008). Acceptance rates and journal quality: An analysis of journals in economics and finance. Journal of Business and Finance Librarianship, 14(1), 2–31.
Hammarfelt, B. (2014). Using altmetrics for assessing research impact in the humanities. Scientometrics, 101(2), 1419–1430.
Horvat, M., Mlinaric, A., Omazic, J., & Supak-Smolcic, V. (2015). An analysis of medical laboratory technology journals’ instructions for authors. Science and Engineering Ethics, 1–12. doi:10.1007/s11948-015-9689-2.
Lee, K. P., Schotland, M., Bacchetti, P., & Bero, L. A. (2002). Association of journal quality indicators with methodological quality of clinical research articles. Journal of the American Medical Association, 287(21), 2805–2808.
Morris, S., Barnas, E., & LaFrenier, D. (2013). The handbook of journal publishing. Cambridge: Cambridge University Press.
Moustafa, K. (2014). The disaster of the impact factor. Science and Engineering Ethics, 21(1), 139–142.
Polonioli, A. (2016). Debunking unwarranted defenses of the status quo in the humanities and social sciences. Scientometrics, 1–4. doi:10.1007/s11192-016-1906-5.
Rocha da Silva, P. (2016). Selecting for impact: New data debunks old beliefs. http://blog.frontiersin.org/2015/12/21/4782/.
Sotudeh, H., Ghasempour, Z., & Yaghtin, M. (2015). The citation advantage of author-pays model: The case of Springer and Elsevier OA journals. Scientometrics, 104(2), 581–608.
Swan, A. (2010). The open access citation advantage: Studies and results to date. Technical report, School of Electronics and Computer Science, University of Southampton. http://openaccess.eprints.org/index.
Wang, X., Liu, C., Mao, W., & Fang, Z. (2015). The open access advantage considering citation, article usage and social media attention. Scientometrics, 103(2), 555–564.
Wray, K. B. (2016a). No new evidence for a citation benefit for author-pay open access publications in the social sciences and humanities. Scientometrics, 106(3), 1031–1035.
Wray, K. B. (2016b). Still no new evidence: Author-pay open access in the social sciences and humanities. Scientometrics, 1–3. doi:10.1007/s11192-016-1907-4.
Acknowledgments
I am grateful to the audience of Frontiers weekly round up meetings for the constructive and helpful comments on an earlier version of this paper.
Cite this article
Polonioli, A. Metrics, flawed indicators, and the case of philosophy journals. Scientometrics 108, 987–994 (2016). https://doi.org/10.1007/s11192-016-1941-2