Scientometrics, Volume 108, Issue 2, pp 987–994

Metrics, flawed indicators, and the case of philosophy journals

  • Andrea Polonioli
Abstract

De Marchi and Lorenzetti (Scientometrics 106(1):253–261, 2016) have recently argued that in fields where the journal impact factor (IF) is not calculated, such as the humanities, it is essential to find other indicators that would allow the relevant community to assess the quality of scholarly journals and the research outputs published in them. The authors suggest that information concerning a journal's rejection rate and the number of subscriptions sold is important and should be used for such assessment. The question the authors address is very important, yet their proposed solutions are problematic. Here I point to some of these problems and illustrate them by considering the field of philosophy as a case in point. Specifically, I argue for four main claims. First, even assuming that IF provides a reliable indicator of journal quality for the assessment of research outputs, De Marchi and Lorenzetti have failed to validate their suggested indicators and proxies. Second, it has not been clarified why, in the absence of IF, other journal-based metrics that are currently available should not be used. Third, the relationship between IF and rejection rate is more complex than the authors suggest. Fourth, accepting the number of subscriptions sold as a proxy would result in discrimination against open access journals. The upshot of my analysis is that the question of how to assess journals and research outputs in the humanities is still far from resolved.

Keywords

Impact factor · H-index · SCImago · Altmetrics · Humanities · Philosophy · Subscriptions · Rejection rates · Open access

Acknowledgments

I am grateful to the audience of the Frontiers weekly round-up meetings for their constructive and helpful comments on an earlier version of this paper.

References

  1. Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7, 291. doi: 10.3389/fnhum.2013.00291.
  2. Craig, I. D., Plume, A. M., McVeigh, M. E., Pringle, J., & Amin, M. (2007). Do open access articles have greater citation impact? A critical review of the literature. Journal of Informetrics, 1(3), 239–248.
  3. De Marchi, M., & Lorenzetti, E. (2016). Measuring the impact of scholarly journals in the humanities field. Scientometrics, 106(1), 253–261.
  4. Haensly, P. J., Hodges, P. E., & Davenport, S. A. (2008). Acceptance rates and journal quality: An analysis of journals in economics and finance. Journal of Business and Finance Librarianship, 14(1), 2–31.
  5. Hammarfelt, B. (2014). Using altmetrics for assessing research impact in the humanities. Scientometrics, 101(2), 1419–1430.
  6. Horvat, M., Mlinaric, A., Omazic, J., & Supak-Smolcic, V. (2015). An analysis of medical laboratory technology journals' instructions for authors. Science and Engineering Ethics, 1–12. doi: 10.1007/s11948-015-9689-2.
  7. Lee, K. P., Schotland, M., Bacchetti, P., & Bero, L. A. (2002). Association of journal quality indicators with methodological quality of clinical research articles. Journal of the American Medical Association, 287(21), 2805–2808.
  8. Morris, S., Barnas, E., & LaFrenier, D. (2013). The handbook of journal publishing. Cambridge: Cambridge University Press.
  9. Moustafa, K. (2014). The disaster of the impact factor. Science and Engineering Ethics, 21(1), 139–142.
  10. Polonioli, A. (2016). Debunking unwarranted defenses of the status quo in the humanities and social sciences. Scientometrics, 1–4. doi: 10.1007/s11192-016-1906-5.
  11. Rocha da Silva, P. (2016). Selecting for impact: New data debunks old beliefs. http://blog.frontiersin.org/2015/12/21/4782/.
  12. Sotudeh, H., Ghasempour, Z., & Yaghtin, M. (2015). The citation advantage of author-pays model: The case of Springer and Elsevier OA journals. Scientometrics, 104(2), 581–608.
  13. Swan, A. (2010). The open access citation advantage: Studies and results to date. Technical report, School of Electronics and Computer Science, University of Southampton. http://openaccess.eprints.org/index.
  14. Wang, X., Liu, C., Mao, W., & Fang, Z. (2015). The open access advantage considering citation, article usage and social media attention. Scientometrics, 103(2), 555–564.
  15. Wray, K. B. (2016a). No new evidence for a citation benefit for author-pay open access publications in the social sciences and humanities. Scientometrics, 106(3), 1031–1035.
  16. Wray, K. B. (2016b). Still no new evidence: Author-pay open access in the social sciences and humanities. Scientometrics, 1–3. doi: 10.1007/s11192-016-1907-4.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2016

Authors and Affiliations

  1. Frontiers Media, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland