Scientometrics, Volume 94, Issue 3, pp 851–872

Citation time window choice for research impact evaluation

Abstract

This paper aims to inform the choice of citation time window for research evaluation by answering three questions: (1) How accurately do citation counts in short time windows approximate total citations? (2) How does citation ageing vary across research fields, document types, publication months, and total citation counts? (3) Can field normalization improve the accuracy of short citation time windows? We investigate the 31-year lifetime non-self-citation histories of all Thomson Reuters Web of Science journal papers published in 1980. The correlation between non-self-citation counts in each time window and total non-self-citations over all 31 years is calculated; it is lower for more highly cited papers than for less highly cited ones. There are significant differences in citation ageing across research fields, document types, total citation counts, and publication months. However, the within-group differences are more striking: many papers in the slowest-ageing field may still age faster than many papers in the fastest-ageing field. Furthermore, field normalization does not improve the accuracy of short citation time windows. Implications and recommendations for choosing adequate citation time windows are discussed.

Keywords

Citation time window · Citation ageing · Research evaluation · Field normalization

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2012

Authors and Affiliations

Institute for Research Information and Quality Assurance (iFQ), Berlin, Germany
