Abstract
Wu et al. (Nature 566:378–382, 2019) introduced a new indicator measuring disruption (\({DI}_{1}\)). Bornmann et al. (Do disruption index indicators measure what they propose to measure? The comparison of several indicator variants with assessments by peers, 2019. https://arxiv.org/abs/1911.08775) compared variants of the disruption index and pointed to \({DI}_{5}\) as an interesting variant. The calculation of a field-specific version of \({DI}_{5}\) (focusing on disruptiveness within the same field) for Scientometrics papers in the current study reveals that this variant may be able to identify landmark papers in scientometrics. This result contrasts with the Scientometrics analysis previously published by Bornmann and Tekles (Scientometrics 120(1):331–336, 2019) based on the original disruption index (\({DI}_{1}\)).
Recently, a new indicator measuring disruption has been introduced (Funk and Owen-Smith 2017; Wu et al. 2019). This indicator is connected to the ‘scientific revolutions’ concept of Kuhn (1962) and tries to identify exceptional research that is characterized by an overthrow of established thinking (Casadevall and Fang 2016). The indicator is based on comparing the cited references of the papers citing a focal paper (FP) with the cited references of the FP itself. The disruptiveness of a FP is determined by the extent to which the cited references of the papers citing the FP overlap with the cited references of the FP. Disruptiveness is indicated if many citing papers do not refer to the FP’s cited references. In that case, the FP seems to present something new that is independent of the FP’s own context.
Disruptiveness is described by Wu et al. (2019) as a weighted index \({DI}_{1}\) (see Fig. 1) calculated for a FP: the difference between the number of papers citing the FP without citing any of its cited references (\({N}_{i}\)) and the number of papers citing both the FP and at least one of its cited references (\({N}_{j}^{1}\)) is divided by the sum of \({N}_{i}\), \({N}_{j}^{1}\), and \({N}_{k}\). \({N}_{k}\) is the number of papers citing at least one of the FP’s cited references without citing FP itself. Wu et al. (2019) claim that disruptive research is reflected in high positive values; developmental research continuing previous research directions is indicated by small negative values.
Bornmann et al. (2019) discussed and empirically investigated \({DI}_{1}\) and some variants. They followed the idea of considering how strongly the cited references of the FP’s citing papers are coupled with the cited references of the FP. This is reflected in the formula in Fig. 1: the subscript of \({DI}_{l}\) corresponds to the threshold for the number of bibliographic coupling links. With \(l=1\), there is no restriction on the number of bibliographic coupling links, which corresponds to the index originally proposed by Wu et al. (2019); \(l=5\) means that a threshold of five bibliographic coupling links is used. Bornmann et al. (2019) compared (among other indicators measuring disruption) the variants \({DI}_{1}\) and \({DI}_{5}\). The empirical results show that \({DI}_{5}\) might be better able than \({DI}_{1}\) (and than the other indicators) to identify disruptive research. Thus, we took this indicator as the starting point for a modified variant that allows focusing on disruptive effects within a certain field.
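Since Fig. 1 is not reproduced here, the index can be stated compactly from the verbal definitions above (a reconstruction from the text, not the original figure):

\[
{DI}_{l} = \frac{{N}_{i} - {N}_{j}^{l}}{{N}_{i} + {N}_{j}^{l} + {N}_{k}}
\]

For \(l=1\), this yields the original index of Wu et al. (2019).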
We calculated this variant for all papers published in Scientometrics and compared the results with \({DI}_{1}\). Bornmann and Tekles (2019) calculated \({DI}_{1}\) for Scientometrics papers and showed a “concentration of disruption values around zero” (p. 332). Upon manual inspection, most of the papers identified by Bornmann and Tekles (2019) with the highest \({DI}_{1}\) values could not be characterized as landmark papers in the scientometrics field. To focus \({DI}_{5}\) on the scientometrics field in this study, we defined \({N}_{j}^{l}\) and \({N}_{k}\) based on the cited references of all papers published in Scientometrics in the same year as the FP, instead of the FP’s cited references alone. The corresponding definitions of \({N}_{j}^{l}\) and \({N}_{k}\) are as follows:
- \({N}_{j}^{l}\): number of papers citing the FP and at least \(l\) of the cited references of all Scientometrics papers published in the same year as the FP
- \({N}_{k}\): number of papers citing at least one of the cited references of all Scientometrics papers published in the same year as the FP, but not the FP itself
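The calculation implied by these definitions can be sketched in a few lines of code. This is a minimal illustration with hypothetical data structures and function names (the actual study was carried out on Scopus data, not with this code):

```python
def disruption_index(fp, citing_papers, reference_pool, refs_of, l=5):
    """Field-specific DI_l for a focal paper (FP).

    fp             -- identifier of the focal paper
    citing_papers  -- set of papers citing the FP
    reference_pool -- cited references of all Scientometrics papers
                      published in the FP's publication year
    refs_of        -- mapping: paper -> set of its cited references
    l              -- bibliographic coupling threshold (l=1 yields DI_1)
    """
    n_i = n_j = 0
    for paper in citing_papers:
        # Bibliographic coupling links between the citing paper and the pool
        links = len(refs_of[paper] & reference_pool)
        if links == 0:
            n_i += 1  # cites the FP but none of the pool's references
        elif links >= l:
            n_j += 1  # cites the FP and at least l pool references
        # citing papers with 1..l-1 links enter neither count
    # N_k: papers citing at least one pool reference but not the FP itself
    n_k = sum(
        1
        for paper, refs in refs_of.items()
        if paper != fp and paper not in citing_papers and refs & reference_pool
    )
    denominator = n_i + n_j + n_k
    return (n_i - n_j) / denominator if denominator else 0.0
```

Note that the definitions above redefine only \({N}_{j}^{l}\) and \({N}_{k}\); the sketch assumes that \({N}_{i}\) is likewise counted relative to the field-specific reference pool.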
Table 1 shows the twenty Scientometrics papers with the highest \({DI}_{5}\) values (to base the values on reliable data, we considered only papers with at least 10 cited references). We selected all publications in Scientometrics available in Scopus for the years 1996–2015 (2145 papers with at least 10 cited references). As the values reveal, they are still close to zero, which is similar to the results based on \({DI}_{1}\). However, the comparison of both results (\({DI}_{1}\) and \({DI}_{5}\)) reveals different foci: as Bornmann and Tekles (2019) show, \({DI}_{1}\) mostly brings out scientometrics papers with a local focus (e.g., issues rooted in countries such as Iberian-American countries, Korea, or Iceland). \({DI}_{5}\), in contrast, is more focused on foundational scientometrics papers dealing with bibliometric performance measures or the state of the art in scientometrics. For example, Narin and Hamilton (1996) discuss “three different types of bibliometrics—literature bibliometrics, patent bibliometrics, and linkage bibliometric” (p. 293).
van Raan (1996) “gives an overview of the potentials and limitations of bibliometric methods for the assessment of strengths and weaknesses in research performance, and for monitoring scientific developments” (p. 397). Martin (1996) is a landmark paper in scientometrics discussing how government-funded basic research can be evaluated using different indicators. The paper by MacRoberts and MacRoberts (1996) is one of the core papers in the critical discussion of evaluative citation analyses. Ho (2004)—the paper with the largest \({DI}_{5}\)—demonstrated the problem of citation analysis by focusing on the reception of an equation published in one paper. The results show that “all of the examined references from the literature review in this study were incorrect for referencing Lagergren’s equation” (p. 175).
Taken as a whole, the results in Table 1 reveal that the field-specific variant of \({DI}_{5}\) may be better able than the original \({DI}_{1}\) to identify disruptive papers that concentrate the attention of researchers within a discipline. This suggests that focusing on the citation links within a specific field (here: scientometrics, as represented by the journal Scientometrics) is a promising approach for improving existing disruption indicators. Further exploration across a variety of datasets, journals, and disciplines is certainly needed to substantiate this possibility.
References
Babu, A. R., & Singh, Y. P. (1998). Determinants of research productivity. Scientometrics, 43(3), 309–329.
Bornmann, L., Devarakonda, S., Tekles, A., & Chacko, G. (2019). Do disruption index indicators measure what they propose to measure? The comparison of several indicator variants with assessments by peers. Retrieved December 6, 2019, from https://arxiv.org/abs/1911.08775.
Bornmann, L., & Tekles, A. (2019). Disruptive papers published in Scientometrics. Scientometrics, 120(1), 331–336. https://doi.org/10.1007/s11192-019-03113-z.
Casadevall, A., & Fang, F. C. (2016). Revolutionary science. mBio, 7(2), e00158. https://doi.org/10.1128/mBio.00158-16.
Chiu, W. T., & Ho, Y. S. (2005). Bibliometric analysis of homeopathy research during the period of 1991 to 2003. Scientometrics, 63(1), 3–23. https://doi.org/10.1007/s11192-005-0201-7.
de Moya-Anegon, F., & Herrero-Solana, V. (1999). Science in America Latina: A comparison of bibliometric and scientific-technical indicators. Scientometrics, 46(2), 299–320. https://doi.org/10.1007/Bf02464780.
Dietz, J. S., Chompalov, I., Bozeman, B., Lane, E. O., & Park, J. (2000). Using the curriculum vita to study the career paths of scientists and engineers: An exploratory assessment. Scientometrics, 49(3), 419–442.
Funk, R. J., & Owen-Smith, J. (2017). A dynamic network measure of technological change. Management Science, 63(3), 791–817. https://doi.org/10.1287/mnsc.2015.2366.
Glänzel, W., Schubert, A., & Czerwon, H. J. (1999). A bibliometric analysis of international scientific cooperation of the European Union (1985–1995). Scientometrics, 45(2), 185–202. https://doi.org/10.1007/Bf02458432.
Gu, Y. N. (2004). Global knowledge management research: A bibliometric analysis. Scientometrics, 61(2), 171–190. https://doi.org/10.1023/B:SCIE.0000041647.01086.f4.
Hicks, D. (1999). The difficulty of achieving full coverage of international social science literature and the bibliometric consequences. Scientometrics, 44(2), 193–215. https://doi.org/10.1007/Bf02457380.
Ho, Y. S. (2004). Citation review of Lagergren kinetic rate equation on adsorption reactions. Scientometrics, 59(1), 171–177.
Hood, W. W., & Wilson, C. S. (2001). The literature of bibliometrics, scientometrics, and informetrics. Scientometrics, 52(2), 291. https://doi.org/10.1023/a:1017919924342.
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
MacRoberts, M. H., & MacRoberts, B. R. (1996). Problems of citation analysis. Scientometrics, 36(3), 435–444.
Martin, B. (1996). The use of multiple indicators in the assessment of basic research. Scientometrics, 36(3), 343–362.
Meyer, M. (2000). What is special about patent citations? Differences between scientific and patent citations. Scientometrics, 49(1), 93–123. https://doi.org/10.1023/A:1005613325648.
Narin, F., & Hamilton, K. S. (1996). Bibliometric performance measures. Scientometrics, 36(3), 293–310. https://doi.org/10.1007/Bf02129596.
Porter, A. L., Kongthon, A., & Lui, J. C. (2002). Research profiling: Improving the literature review. Scientometrics, 53(3), 351–370. https://doi.org/10.1023/A:1014873029258.
Ren, S. L., & Rousseau, R. (2002). International visibility of Chinese scientific journals. Scientometrics, 53(3), 389–405. https://doi.org/10.1023/A:1014877130166.
Schubert, A., & Braun, T. (1996). Cross-field normalization of scientometric indicators. Scientometrics, 36(3), 311–324.
van Eck, N. J., & Waltman, L. (2010). Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics, 84(2), 523–538. https://doi.org/10.1007/s11192-009-0146-3.
van Raan, A. F. J. (1996). Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises. Scientometrics, 36(3), 397–420. https://doi.org/10.1007/Bf02129602.
van Raan, A. F. J. (1997). Scientometrics: State-of-the-art. Scientometrics, 38(1), 205–218. https://doi.org/10.1007/Bf02461131.
van Raan, A. F. J. (2005). Fatal attraction: conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
Wu, L., Wang, D., & Evans, J. A. (2019). Large teams develop and small teams disrupt science and technology. Nature, 566, 378–382. https://doi.org/10.1038/s41586-019-0941-9.
Wu, S., & Wu, Q. (2019). A confusing definition of disruption. Retrieved May 15, 2019, from https://osf.io/preprints/socarxiv/d3wpk/.
Acknowledgements
Open Access funding provided by Projekt DEAL. Work in this publication was partially supported by Federal funds, HHSN271201800040C (N44DA-18-1216), from the National Institute on Drug Abuse, National Institutes of Health, US Department of Health and Human Services. Citation analysis in this paper relied partly on Scopus data as implemented in the ERNIE project, which is collaborative between NET ESolutions Corporation and Elsevier. The content of this publication is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or Elsevier. The authors do not declare any competing interests.
Cite this article
Bornmann, L., Devarakonda, S., Tekles, A. et al. Disruptive papers published in Scientometrics: meaningful results by using an improved variant of the disruption index originally proposed by Wu, Wang, and Evans (2019). Scientometrics 123, 1149–1155 (2020). https://doi.org/10.1007/s11192-020-03406-8