Does monetary support increase citation impact of scholarly papers?

Abstract

One of the main indicators of a country's scientific development is the number of papers published in high-impact scholarly journals. Many countries have introduced performance-based research funding systems to create a more competitive environment in which prolific researchers are rewarded with subsidies, aiming to increase both the quantity and quality of papers. Yet subsidies do not always function as leverage to improve the citation impact of scholarly papers. This paper investigates the effect of the publication support system of Turkey (TR) on the citation impact of papers authored by Turkish researchers. Based on a stratified probabilistic sample of 4521 TR-addressed papers, it compares citation counts to determine whether supported papers were cited more often than not supported ones, and whether they were published in journals with relatively higher citation impact in terms of journal impact factors (JIF), article influence scores (AIS) and quartiles. Supported and not supported papers received comparable numbers of citations per paper and were published in journals with similar citation impact values. The results of the hurdle model test showed that monetary support is associated with a reduction in the number of uncited papers and with a slight increase in the citation impact of papers with positive (i.e., non-zero) citation counts. The journal-level metrics of JIF, AIS and quartiles are associated neither with papers receiving their first citations nor with higher citation counts. The findings suggest that subsidies are not an effective incentive to improve the citation impact of TR-addressed scholarly papers. Such support programs should therefore be reconsidered.
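The hurdle model mentioned in the abstract separates two processes: whether a paper receives any citations at all, and how many citations cited papers then accumulate. The authors fitted their model in R (see the pscl references below); the following is only an illustrative Python sketch on simulated data, not the paper's actual analysis — the predictors, effect sizes, and sample size are invented, and plain Poisson regression on the positive counts stands in for a proper zero-truncated count model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, PoissonRegressor

rng = np.random.default_rng(0)
n = 2000

# Hypothetical predictors: support status (0/1) and journal impact factor
supported = rng.integers(0, 2, n).astype(float)
jif = rng.gamma(shape=2.0, scale=1.5, size=n)
X = np.column_stack([supported, jif])

# Simulated citation counts: support raises the chance of being cited at all
p_cited = 1.0 / (1.0 + np.exp(-(-0.3 + 0.8 * supported)))
is_cited = rng.random(n) < p_cited
citations = np.where(is_cited, 1 + rng.poisson(np.exp(0.4 + 0.1 * supported)), 0)

# Stage 1 (the "hurdle"): model whether a paper gets any citations
hurdle = LogisticRegression().fit(X, (citations > 0).astype(int))

# Stage 2 (counts): model citation counts among cited papers only
pos = citations > 0
count_model = PoissonRegressor(alpha=0.0).fit(X[pos], citations[pos])

print("hurdle coefficients (supported, JIF):", np.round(hurdle.coef_[0], 2))
print("count  coefficients (supported, JIF):", np.round(count_model.coef_, 2))
```

In this simulation the support indicator carries a positive coefficient in the hurdle stage, mirroring the paper's finding that support is associated mainly with clearing the zero-citation hurdle rather than with large gains in counts.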

Notes

  1.

    A professor may earn the equivalent of 20 years' salary for a Nature or Science paper, and the maximum amount can be as high as 165,000 USD for a single paper (Quan et al. 2017: 491, 494). However, it is worth noting that China has apparently experienced the dire consequences of this policy and recently decided to ban cash rewards for publishing papers in journals listed in citation indexes (Mallapaty 2020). With the new policy, China plans to say farewell to WoS-based indicators (so-called "SCI worship") by moving to "a balanced combination of qualitative and quantitative research evaluation" with stronger local relevance (Zhang and Sivertsen 2020a: 7, 2020b).

  2.

    In this study, we use “papers” or “TR-addressed papers” in general instead of “articles” or “TR-addressed articles”, unless otherwise indicated.

  3.

    We actually selected three different samples (every 50th and 99th record; every 12th and 77th record; and every 12th and 75th record) with the same sample size and compared the descriptive statistics such as means and medians to make sure the stratified probabilistic sampling technique worked properly. As sample statistics were quite similar in all three cases, we report here the findings based on the last one.

  4.

    Note that 49 Arts and Humanities papers that received a total of 289 citations were excluded from further analysis as bibliometric characteristics of Arts and Humanities journals are not listed in JCR.

  5.

    Not all journals in which TR-addressed papers were published had JIF and/or AIS values listed in JCR. The correlation coefficient is based on the 3961 papers published in journals with both values; papers published in journals lacking a JIF and/or AIS were excluded.

  6.

    Note that 4% of all TR-addressed papers were published in journals with no assigned JCR quartiles in 2015 (i.e., journals with no JIFs).

  7.

    Note that 3% of all TR-addressed Science papers indexed in SCI were published in journals with no assigned JCR quartiles in 2015 (i.e., Science journals with no JIFs).

  8.

    In fact, the mean JIF and AIS values of not supported papers with zero citations were even slightly higher than those of supported ones with zero citations.
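The stratified systematic selection described in note 3 — drawing every k-th record within each stratum from some starting offset — can be sketched as follows. The stratum names, sizes, and offset here are hypothetical, chosen only to illustrate the mechanics:

```python
def systematic_sample(records, step, start):
    """Return every `step`-th record beginning at index `start`."""
    return records[start::step]

# Hypothetical strata and sizes; note 3 drew, e.g., every 12th and 75th record
strata = {"Science": list(range(1000)), "Social Science": list(range(300))}
sample = {name: systematic_sample(recs, step=12, start=7)
          for name, recs in strata.items()}
print({name: len(s) for name, s in sample.items()})
```

Drawing several such samples with different steps and offsets and comparing their descriptive statistics, as the authors did, is a simple robustness check on the sampling procedure.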

References

  1. Abramo, G., D’Angelo, C. A., & Di Costa, F. (2019). When research assessment exercises leave room for opportunistic behavior by the subjects under evaluation. Journal of Informetrics, 13(3), 830–840. https://doi.org/10.1016/j.joi.2019.07.006.

  2. Akça, S., & Akbulut, M. (2018). Türkiye’deki yağmacı dergiler: Beall listesi üzerine bir araştırma. Bilgi Dünyası, 19(2), 255–274. https://doi.org/10.15612/BD.2018.695.

  3. Arendt, J. (2010). Are article influence scores comparable across scientific fields? Issues in Science and Technology Librarianship, 60. Retrieved September 16, 2019, from http://www.istl.org/10-winter/refereed2.html.

  4. Article Influence Score. (2019). Retrieved December 1, 2019, from http://help.incites.clarivate.com/incitesLiveJCR/glossaryAZgroup/g4/7790-TRS.html.

  5. Auranen, O., & Nieminen, M. (2010). University research funding and publication performance—An international comparison. Research Policy, 39(6), 822–834. https://doi.org/10.1016/j.respol.2010.03.003.

  6. Baccini, A., De Nicolao, G., & Petrovich, E. (2019). Citation gaming induced by bibliometric evaluation: A country-level comparative analysis. PLoS ONE, 14(9), e0221212. https://doi.org/10.1371/journal.pone.0221212.

  7. Butler, L. (2003). Explaining Australia’s increased share of ISI publications—The effects of a funding formula based on publication counts. Research Policy, 32(1), 143–155. https://doi.org/10.1016/S0048-7333(02)00007-0.

  8. Butler, L. (2004). What happens when funding is linked to publication counts? In H. F. Moed, et al. (Eds.), Handbook of quantitative science and technology research: The use of publication and patent statistics in studies of S&T systems (pp. 389–405). Dordrecht: Kluwer.

  9. Casadevall, A., & Fang, F. C. (2012). Causes for the persistence of impact factor mania. mBio, 5(2). Retrieved September 16, 2019, from http://mbio.asm.org/content/5/2/e00064-14.full.pdf.

  10. Çetinsaya, G. (2014). Büyüme, kalite, uluslararasılaşma: Türkiye yükseköğretimi için bir yol haritası (2nd ed.). Ankara: Yükseköğretim Kurulu. Retrieved December 1, 2019, from https://www.yok.gov.tr/Documents/Yayinlar/Yayinlarimiz/buyume-kalite-uluslararasilasma-turkiye-yuksekogretim-icin-bir-yol-haritasi.pdf.

  11. Checchi, D., Malgarini, M., & Sarlo, S. (2019). Do performance-based research funding systems affect research production and impact? Higher Education Quarterly, 73, 45–69. https://doi.org/10.1111/hequ.12185.

  12. Chen, C. (2012). Predictive effects of structural variation on citation counts. Journal of the American Society for Information Science and Technology, 63(3), 431–449. https://doi.org/10.1002/asi.21649.

  13. De Boer, H., et al. (2015). Performance-based funding and performance agreements in fourteen higher education systems. Report for the Ministry of Culture and Science (Reference: C15HdB014). Enschede: Center for Higher Education Policy Studies University of Twente. Retrieved September 16, 2019, from http://bit.ly/2DZNVWP.

  14. De Rijcke, S., Wouters, P., Rushforth, A. D., Franssen, T., & Hammarfelt, B. M. S. (2016). Evaluation practices and effects of indicator use—A literature review. Research Evaluation, 25(2), 161–169. https://doi.org/10.1093/reseval/rvv038.

  15. Demir, S. B. (2018a). Predatory journals: Who publishes in them and why? Journal of Informetrics, 12(4), 1296–1311. https://doi.org/10.1016/j.joi.2018.10.008.

  16. Demir, S. B. (2018b). Pros and cons of the new financial support policy for Turkish researchers. Scientometrics, 116(3), 2053–2068. https://doi.org/10.1007/s11192-018-2833-4.

  17. Didegah, F., & Thelwall, M. (2013a). Which factors help authors produce the highest impact research? Collaboration, journal and document properties. Journal of Informetrics, 7, 861–873. https://doi.org/10.1016/j.joi.2013.08.006.

  18. Didegah, F., & Thelwall, M. (2013b). Determinants of research citation impact in nanoscience and nanotechnology. Journal of the American Society for Information Science and Technology, 64(5), 1055–1064. https://doi.org/10.1002/asi.22806.

  19. European Commission. (2010). Assessing Europe’s university-based research: Expert Group on Assessment of University-Based Research (EUR24187EN). Retrieved September 16, 2019, from https://ec.europa.eu/research/science-society/document_library/pdf_06/assessing-europe-university-based-research_en.pdf.

  20. Fire, M., & Guestrin, C. (2019). Over-optimization of academic publishing metrics: Observing Goodhart’s Law in action. GigaScience, 8(6), 1–20. https://doi.org/10.1093/gigascience/giz053.

  21. Fischer, I., & Steiger, H.-J. (2018). Dynamics of Journal Impact Factors and limits to their inflation. Journal of Scholarly Publishing, 50(1), 26–36. https://doi.org/10.3138/jsp.50.1.06.

  22. Geuna, A., & Martin, B. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304. https://doi.org/10.1023/B:MINE.0000005155.70870.bd.

  23. Glänzel, W., & Moed, H. F. (2002). Journal impact measures in bibliometric research. Scientometrics, 53(2), 171–193. https://doi.org/10.1023/A:1014848323806.

  24. Gök, A., Rigby, J., & Shapira, P. (2016). The impact of research funding on scientific outputs: Evidence from six smaller European countries. Journal of the Association for Information Science & Technology, 67(3), 715–730. https://doi.org/10.1002/asi.23406.

  25. Good, B., Vermeulen, N., Tiefenthaler, B., & Arnold, E. (2015). Counting quality? The Czech performance-based research funding system. Research Evaluation, 24(2), 91–105. https://doi.org/10.1093/reseval/rvu035.

  26. Hammarfelt, B., & Haddow, G. (2018). Conflicting measures and values: How humanities scholars in Australia and Sweden use and react to bibliometric indicators. Journal of the Association for Information Science & Technology, 69(7), 924–935. https://doi.org/10.1002/asi.24043.

  27. Harley, Y. X., Huysamen, E., Hlungwani, C., & Douglas, T. (2016). Does the DHET research output subsidy model penalise high-citation publication? A case study. South African Journal of Science, 112(5–6), 1–3. https://doi.org/10.17159/sajs.2016/20150352.

  28. Hedding, D. W. (2019). Payouts push professors towards predatory journals. Nature, 565, 267. https://doi.org/10.1038/d41586-019-00120-1.

  29. Herbst, M. (2007). Financing public universities: The case of performance funding. Dordrecht: Springer.

  30. Heywood, J. S., Wei, X., & Ye, G. (2011). Piece rates for professors. Economics Letters, 113(3), 285–287. https://doi.org/10.1016/j.econlet.2011.08.005.

  31. Hicks, D. (2004). The four literatures of Social Science. In H. F. Moed, et al. (Eds.), Handbook of quantitative science and technology research: The use of publication and patent statistics in studies of S&T systems (pp. 473–496). Dordrecht: Kluwer.

  32. Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261. https://doi.org/10.1016/j.respol.2011.09.007.

  33. Hongyang, L. (2017). Lancet restaurant gives medical professionals food for thought. China Daily. Retrieved September 16, 2019, from http://www.chinadaily.com.cn/china/2017-11/02/content_34013235.htm.

  34. Jackman, S., et al. (2020). Package ‘pscl’. Retrieved June 7, 2020, from https://cran.r-project.org/web/packages/pscl/pscl.pdf.

  35. Jonkers, K., & Zacharewicz, T. (2016). Research performance based funding systems: A comparative assessment. Luxembourg: Publications Office of the European Union. Retrieved September 16, 2019, from http://publications.jrc.ec.europa.eu/repository/bitstream/JRC101043/kj1a27837enn.pdf.

  36. Kamalski, J., Huggett, S., Kalinaki, E., Lan, G., Lau, G., Pan, L., & Scheerooren, S. (2017). World of research 2015: Revealing patterns and archetypes in scientific research. Elsevier Analytic Services. Retrieved September 16, 2019, from http://www.doc88.com/p-2032803429898.html.

  37. Kleiber, C., & Zeileis, A. (2016). Visualizing count data regressions using rootograms. The American Statistician, 70(3), 296–303. https://doi.org/10.1080/00031305.2016.1173590.

  38. Koçak, Z. (2019). Predatory publishing and Turkey (editorial). Balkan Medical Journal, 36(4), 199–201. https://doi.org/10.4274/balkanmedj.galenos.2019.2019.4.001.

  39. Lee, A. T. K., & Simon, C. A. (2018). Publication incentives based on journal rankings disadvantage local publications. South African Journal of Science, 114(9/10), 1–3. https://doi.org/10.17159/sajs.2018/a0289.

  40. Liu, F., Guo, W., & Zuo, C. (2018). High impact factor journals have more publications than expected. Current Science, 114(5), 955–956. https://doi.org/10.18520/cs/v114/i05/955-956.

  41. Liu, W., Hu, G., & Gu, M. (2016). The probability of publishing in first-quartile journals. Scientometrics, 106(3), 1273–1276. https://doi.org/10.1007/s11192-015-1821-1.

  42. Lumley, T., Diehr, P., Emerson, S., & Chen, L. (2002). The importance of the normality assumption in large public health data sets. Annual Review of Public Health, 23, 151–169. https://doi.org/10.1146/annurev.publhealth.23.100901.140546.

  43. Mallapaty, S. (2020). China bans cash rewards for publishing papers. Nature, 579, 18. https://doi.org/10.1038/d41586-020-00574-8.

  44. Marx, W., & Bornmann, L. (2013). Journal Impact Factor: “The poor man’s citation analysis” and alternative approaches. European Science Editing, 39(3), 62–63. Retrieved September 16, 2019, from http://www.ease.org.uk/sites/default/files/aug13pageslowres.pdf.

  45. Miranda, R., & Garcia-Carpintero, E. (2019). Comparison of the share of documents and citations from different quartile journals in 25 research areas. Scientometrics, 121(1), 479–501. https://doi.org/10.1007/s11192-019-03210-z.

  46. Moed, H. F., & van Leeuwen, T. N. (1996). Impact factors can mislead. Nature, 381(6579), 186. https://doi.org/10.1038/381186a0.

  47. Mouton, J., & Valentine, A. (2017). The extent of South African authored articles in predatory journals. South African Journal of Science, 113(7/8), 1–9. https://doi.org/10.17159/SAJS.2017/20170010.

  48. Muller, S. M. (2017). Academics as rent seekers: Distorted incentives in higher education, with reference to the South African case. International Journal of Educational Development, 52, 58–67. https://doi.org/10.1016/j.ijedudev.2016.11.004.

  49. Muller, J. Z. (2018). The tyranny of metrics. Princeton, N.J.: Princeton University Press.

  50. Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the Social Sciences and Humanities: A review. Scientometrics, 66, 81–100. https://doi.org/10.1007/s11192-006-0007-2.

  51. Nicolaisen, J., & Frandsen, T. V. (2019). Zero-impact: A large scale study of uncitedness. Scientometrics, 119, 1227–1254. https://doi.org/10.1007/s11192-019-03064-5.

  52. Ochsner, M., Hug, S. E., & Daniel, H.-D. (2014). Setting the stage for the assessment of research quality in humanities. Consolidating the results of four empirical studies. Zeitschrift für Erziehungswissenschaft, 117, 111–132. https://doi.org/10.1007/s11618-014-0576-4.

  53. Osuna, C., Cruz-Castro, L., & Sanz-Menéndez, L. (2011). Overturning some assumptions about the effects of evaluation systems on publication performance. Scientometrics, 86(3), 575–592. https://doi.org/10.1007/s11192-010-0312-7.

  54. Pajić, D. (2014). Globalization of the social sciences in Eastern Europe: Genuine breakthrough or a slippery slope of the research evaluation practice? Scientometrics, 102(3), 2131–2150. https://doi.org/10.1007/s11192-014-1510-5.

  55. Pillay, T. S. (2013). Subject and discipline-specific publication trends in South African medical research, 1996–2011. South African Journal of Science. https://doi.org/10.1590/sajs.2013/20120054.

  56. Quan, W., Chen, B., & Shu, F. (2017). Publish or impoverish: An investigation of the monetary reward system of science in China (1999–2016). Aslib Journal of Information Management, 69(5), 486–502. https://doi.org/10.1108/AJIM-01-2017-0014.

  57. Sætnan, A. R., Tøndel, G., & Rasmussen, B. (2019). Does counting change what is counted? Potential for paradigm change through performance metrics. Research Evaluation, 28(1), 73–83. https://doi.org/10.1093/reseval/rvy032.

  58. Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal, 314(7079), 498–502. https://doi.org/10.1136/bmj.314.7079.497.

  59. Şengör, A. M. C. (2014). How scientometry is killing science. GSA Today, 24(12), 44–45. https://doi.org/10.1130/GSATG226GW.1.

  60. Shao, J., & Shen, H. (2012). Research assessment: The overemphasized impact factor in China. Research Evaluation, 21(3), 199–203. https://doi.org/10.1093/reseval/rvs011.

  61. Sīle, L., & Vanderstraeten, R. (2019). Measuring changes in publication patterns in a context of performance-based research funding systems: The case of educational research in the University of Gothenburg (2005–2014). Scientometrics, 118, 71–91. https://doi.org/10.1007/s11192-018-2963-8.

  62. Sivertsen, G. (2016). Patterns of internationalization and criteria for research assessment in social sciences and humanities. Scientometrics, 107, 357–368. https://doi.org/10.1007/s11192-016-1845-1.

  63. Sivertsen, G. (2019). Understanding and evaluating research and scholarly publishing in the Social Sciences and Humanities (SSH). Data and Information Management, 2(3), 1–11.

  64. Sombatsompop, N., & Markpin, T. (2005). Making an equality of ISI impact factors for different subject fields. Journal of the American Society for Information Science and Technology, 56(7), 676–683. https://doi.org/10.1002/asi.20150.

  65. Teodorescu, D., & Andrei, T. (2014). An examination of “citation circles” for social sciences journals in Eastern European countries. Scientometrics, 99(2), 209–231. https://doi.org/10.1007/s11192-013-1210-6.

  66. Tomaselli, K. G. (2018). Perverse incentives and the political economy of South African academic journal publishing. South African Journal of Science, 114(11/12), 1–6. https://doi.org/10.17159/sajs.2018/4341.

  67. Tonta, Y. (2017a). Does monetary support increase the number of scientific papers? An interrupted time series analysis. Journal of Data and Information Science, 3(1), 19–39. https://doi.org/10.2478/jdis-2018-0002.

  68. Tonta, Y. (2017b). TÜBİTAK Türkiye Adresli Uluslararası Bilimsel Yayınları Teşvik (UBYT) Programının değerlendirilmesi. Ankara: TÜBİTAK ULAKBİM. Retrieved March 28, 2020, from http://ulakbim.tubitak.gov.tr/sites/images/Ulakbim/tonta_ubyt.pdf.

  69. Tonta, Y., & Akbulut, M. (2019). Does monetary support increase citation impact of scholarly papers? In G. Catalano, et al. (eds.), 17th international conference on Scientometrics & Informetrics ISSI2019 with a special STI indicators conference track, 25 September 2019, Sapienza University of Rome, Italy. Proceedings (pp. 1952–1963). Rome: International Society for Scientometrics and Informetrics. Retrieved September 20, 2019, from http://yunus.hacettepe.edu.tr/~tonta/Yayinlar/tonta_ISSI2019.pdf.

  70. TÜBİTAK Türkiye Adresli Uluslararası Bilimsel Yayınları Teşvik Programı Uygulama Esasları. (2015). (237 BK-EK 1). Retrieved September 16, 2019, from http://www.tubitak.gov.tr/sites/default/files/237bk-ek1_0.pdf.

  71. TÜBİTAK Türkiye Adresli Uluslararası Bilimsel Yayınları Teşvik Programı Uygulama Esasları. (2020). Retrieved July 5, 2020, from https://cabim.ulakbim.gov.tr/wp-content/uploads/sites/4/2020/06/2020_UBYT_Program%c4%b1_Uygulama_Usul_ve_Esaslar%c4%b1.pdf.

  72. Van Leeuwen, T. (2013). Bibliometric research evaluations, Web of Science and the Social Sciences and Humanities: A problematic relationship? Bibliometrie—Praxis und Forschung. https://doi.org/10.5283/bpf.173.

  73. Van Leeuwen, T. N., Moed, H. F., Tijssen, R. J. W., Visser, M. S., & Van Raan, A. F. J. (2001). Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance. Scientometrics, 51(1), 335–346. https://doi.org/10.1023/A:1010549719484.

  74. Wilsdon, J., et al. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. London: Sage. https://doi.org/10.13140/RG.2.1.4929.1363.

  75. Wouters, P., et al. (2015). The metric tide: Literature review (supplementary report I to the independent review of the role of metrics in research assessment and management). HEFCE. https://doi.org/10.13140/RG.2.1.5066.3520.

  76. Yuret, T. (2017). Do researchers pay attention to publication subsidies? Journal of Informetrics, 11(2), 423–434.

  77. Zeileis, A., Kleiber, C., & Jackman, S. (2008). Regression models for count data in R. Journal of Statistical Software, 27(8), 1–25. https://doi.org/10.18637/jss.v027.i08.

  78. Zhang, L., Rousseau, R., & Sivertsen, G. (2017). Science deserves to be judged by its contents, not by its wrapping: Revisiting Seglen’s work on journal impact and research evaluation. PLoS ONE, 12(3), e0174205. https://doi.org/10.1371/journal.pone.0174205.

  79. Zhang, L., & Sivertsen, G. (2020a). The new research assessment reform in China and its implementation. Scholarly Assessment Reports, 2(1), 3. https://doi.org/10.29024/sar.15.

  80. Zhang, L., & Sivertsen, G. (2020b). For China’s ambitious research reforms to be successful, they will need to be supported by new research assessment infrastructures (blog post). LSE Impact Blog. Retrieved June 20, 2020 from https://blogs.lse.ac.uk/impactofsocialsciences/2020/06/11/for-chinas-ambitious-research-reforms-to-be-successful-they-will-need-to-be-supported-by-new-research-assessment-infrastructures/.

Acknowledgements

We thank Mr. Mirat Satoğlu of TÜBİTAK ULAKBİM for providing data for supported papers, and Dr. Umut Al of Hacettepe University for reviewing an earlier version of this paper. We also thank the anonymous reviewers for their constructive comments and suggestions.

Corresponding author

Correspondence to Yaşar Tonta.

Cite this article

Tonta, Y., & Akbulut, M. (2020). Does monetary support increase citation impact of scholarly papers? Scientometrics. https://doi.org/10.1007/s11192-020-03688-y

Keywords

  • Citations
  • Impact factor
  • Article influence score
  • Journal quartiles
  • Hurdle model