
Superior identification index: Quantifying the capability of academic journals to recognize good research


Abstract

In this paper we present the “superior identification index” (SII), a metric that quantifies the capability of academic journals to recognize top papers within a specific time window and field of study. Intuitively, SII is the percentage of papers from a journal among the top p% papers in the field. SII provides a flexible framework for making trade-offs between journal quality and quantity: as p rises, it puts more weight on quantity and less on quality. Concerns about the selection of p are discussed, and extended metrics of SII, including the superior identification efficiency (SIE) and the paper rank percentile (PRP), are proposed to sketch other dimensions of journal performance. Based on bibliometric data from the ecological field, we find that as p increases, the correlation between SIE and the journal impact factor (JIF) first rises and then drops, indicating that the JIF most likely reflects “how well a journal identifies the top 26–34% papers in the field”. We hope that the newly proposed SII metric and its extensions will promote quality awareness and provide flexible tools for research evaluation.
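The following is a minimal sketch of the idea described above, not the authors' released code (that is linked in the Acknowledgements). The toy `papers` table is hypothetical, and two readings are assumptions based on the abstract's wording: SII is taken as a journal's share of the field's top-p% most-cited papers, and PRP as a paper's citation-rank percentile within the field; SIE is omitted because its formula is not stated here.

```python
# Sketch of SII and PRP under the assumptions stated above.
# `papers` lists every paper in one field and time window, with its journal
# and citation count.
import pandas as pd

def top_p_set(papers: pd.DataFrame, p: float) -> pd.DataFrame:
    """Return the top p% most-cited papers of the field."""
    k = max(1, int(round(len(papers) * p / 100)))
    return papers.nlargest(k, "citations")

def sii(papers: pd.DataFrame, journal: str, p: float) -> float:
    """Percentage of the field's top-p% papers published in `journal` (assumed reading)."""
    top = top_p_set(papers, p)
    return 100 * (top["journal"] == journal).mean()

def prp(papers: pd.DataFrame) -> pd.Series:
    """Citation-rank percentile of each paper within the field (100 = most cited)."""
    return papers["citations"].rank(pct=True, method="average") * 100

# Hypothetical toy field with three journals
papers = pd.DataFrame({
    "journal":   ["A", "A", "A", "B", "B", "C", "C", "C", "C", "C"],
    "citations": [120, 80, 5, 60, 55, 40, 30, 20, 10, 2],
})
print(sii(papers, "A", p=20))        # journal A's share of the field's top 20%
print(prp(papers).round(1).tolist()) # per-paper rank percentiles
```

Raising p in this sketch enlarges the top set, so high-output journals can capture a larger share of it, which illustrates the quality–quantity trade-off mentioned above.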



Acknowledgements

This study is funded by the National Social Science Fund of China project “Research on Semantic Evaluation System of Scientific Literature Driven by Big Data” (21&ZD329). The authors have no conflicts of interest to declare that are relevant to the content of this article. The code for this study is available on GitHub (https://github.com/hope-data-science/SII). Our access to the Web of Science (WoS) comes through a contract with Thomson Reuters that forbids redistribution of their database; researchers who want the raw data on which to run our analyses can obtain it via a paid subscription to Thomson Reuters.

Funding

The National Social Science Fund of China (21&ZD329).

Author information

Corresponding author

Correspondence to Liying Yang.

Appendix

See Table 1.

Table 1 The JIF and PRP of the 157 investigated ecological journals (including ranking information)


Cite this article

Huang, TY., Yang, L. Superior identification index: Quantifying the capability of academic journals to recognize good research. Scientometrics 127, 4023–4043 (2022). https://doi.org/10.1007/s11192-022-04372-z
