Assessment of the relevance of journals in research libraries using bibliometrics (a review)

  • V. N. Gureev
  • N. A. Mazov


Originally, most bibliometric research was aimed at improving library collection-management methods and at developing new methods for selecting documents for a collection. This is also true of the impact factor, which was designed to evaluate journals before their inclusion in the collections of research libraries. Thus, the roots of bibliometrics lie in librarianship. In recent years, however, bibliometric studies have mainly been conducted to evaluate scientific developments and individual scientists, and a significant part of such research is carried out by the staff of research libraries. Paradoxically, these approaches are not used in librarianship itself. With the expansion of the range of available tools, the development and use of bibliometric methods for the analysis of scientific information in libraries, especially in the acquisition process, has once again become relevant, in line with changes in science and publishing themselves. This has been noted by both foreign and domestic experts.
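As background for readers unfamiliar with the metric discussed above (this sketch is not from the review itself): Garfield's classic two-year impact factor is a simple ratio of citations to citable items. A minimal illustration in Python, using hypothetical numbers for an imaginary journal:

```python
def impact_factor(citations_to_window, citable_items_in_window):
    """Two-year impact factor for year Y: citations received in Y
    to items the journal published in Y-1 and Y-2, divided by the
    number of citable items it published in those two years."""
    if citable_items_in_window == 0:
        raise ValueError("no citable items in the two-year window")
    return citations_to_window / citable_items_in_window

# Hypothetical journal: 600 citations in 2014 to its 2012-2013 papers,
# of which there were 300 citable items.
print(impact_factor(600, 300))  # -> 2.0
```

Note that what counts as a "citable item" is decided by the database producer, which is one source of the manipulation and abuse criticized in several of the works cited below.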


Keywords: citation analysis; bibliometric analysis; acquisition; content analysis; research libraries; scientific journals




  1. Serials Price Projections for 2013, Ipswich: EBSCO Information Services, 2012.
  2. Meadows, A., Librarians and Societies and Publishers—Oh My!, cited October 29, 2014.
  3. Slashcheva, N.A., Mokhnacheva, Yu.V., and Kharybina, T.N., Izuchenie informatsionnykh potrebnostei pol’zovatelei Pushchinskogo nauchnogo tsentra RAN v Tsentral’noi biblioteke Tsentra (department of the RAS LNS), in Biblioteki natsional’nykh akademii nauk: problemy funktsionirovaniya, tendentsii razvitiya: Nauchno-prakticheskii i teoreticheskii sbornik (Libraries of the National Academies of Sciences: Functioning Problems and Development Trends. Research-and-Practice and Theoretical Collection), Kiev: NLU, 2008, pp. 247–264.
  4. Jurski, D., 2013 Study of Subscription Prices for Scholarly Society Journals: Society Journal Pricing Trends and Industry Overview, Lawrence: Allen Press, 2013.
  5. Priem, J., Taraborelli, D., Groth, P., and Neylon, C., Altmetrics: A Manifesto, cited October 29, 2014.
  6. Artamonova, G.V., Galeev, A.R., and Bazhenova, T.S., An alternative method of assessing the relevance of medical and biological research of Russian NRU, Public Health Manager, 2012, no. 12, pp. 35–41.
  7. Marshakova, I.V., Sistema tsitirovaniya nauchnoi literatury kak sredstvo slezheniya za razvitiem nauki (System of Citation of Scientific Literature as a Means of Tracking the Progress of Science), Moscow: Nauka, 1988.
  8. Moore, W.H., Citation Versus Reputation: Assessing Political Science Journals, Tallahassee: Florida State Univ., 2000.
  9. Vikhreva, G.M., Tsennostnye aspekty otbora dokumentov v fond nauchnoi biblioteki (Value-Based Aspects of Document Selection for a Research Library Collection), Novosibirsk: SPSTL, 2004.
  10. Vikhreva, G.M., On the axiological nature of library selection, Bibliotechnye Fondy: Problemy Resheniya, 2004, no. 6, pp. 1–4.
  11. Shilov, V.V., The core of the library fund, Bibliotechnaya Zhizn’ Kuzbassa, 1998, no. 4, pp. 134–140.
  12. Tsyganov, A.V., A brief description of scientometric indicators based on citation, in Naukometriya i ekspertiza v upravlenii naukoi (Scientometrics and Expert Examination in Science Management), Moscow: IPU RAS, 2013.
  13. Davis, P.M., Where to spend our e-journal money? Defining a university library’s core collection through citation analysis, Portal (Baltimore), 2002, vol. 2(1), pp. 155–166.
  14. Garfield, E., Citation analysis as a tool in journal evaluation, Science, 1972, vol. 178(4060), pp. 471–479.
  15. Fedorets, O.V., The use of a training sample to determine the priority of criteria in a rating system for the evaluation of scientific journals, Control Probl., 2009, no. 1, pp. 59–65.
  16. Hirst, G., Discipline impact factors: a method for determining core journal lists, J. Am. Soc. Inf. Sci., 1978, vol. 29(4), pp. 171–172.
  17. Cawkell, A.E., Evaluating scientific journals with Journal Citation Reports—a case study in acoustics, J. Am. Soc. Inf. Sci., 1978, vol. 29(1), pp. 41–46.
  18. Azarkina, M.A., Organization of the journal fund of a research library: acquisition problems, Bibliotechnoe Delo, 2007, no. 5, pp. 17–20.
  19. Azarkina, M.A., Organization of the journal fund of a research library: acquisition problems, Bibliotechnoe Delo, 2007, no. 6, pp. 41–42.
  20. Pislyakov, V.V., Analiz kontenta vedushchikh elektronnykh resursov aktual’noi zarubezhnoi periodiki, in Kolichestvennyi analiz v ekonomike (Content Analysis of Leading Electronic Resources of Relevant Foreign Periodicals, in Quantitative Analysis in Economics), Moscow: GU-VShE, 2002.
  21. Kirillova, O.V., Andronova, M.B., Divil’kovskaya, T.Yu., and Khachko, O.A., New approaches and results of the information center evaluating the Russian flow of scientific journals: criteria and presentation of the ranked data, Educational Technology & Society, 2006, vol. 9(3), pp. 321–334.
  22. Amin, M. and Mabe, M.A., Impact factors: use and abuse, Medicina (Buenos Aires), 2003, vol. 63(4), pp. 347–354.
  23. The impact factor game: it is time to find a better way to assess the scientific literature, PLoS Medicine, 2006, vol. 3(6), pp. 0707–0708.
  24. Rossner, M., Van Epps, H., and Hill, E., Show me the data, J. Cell Biol., 2007, vol. 179(6), pp. 1091–1092.
  25. Adler, R., Ewing, J., and Taylor, P., A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS), Stat. Sci., 2009, vol. 24(1), pp. 1–14.
  26. Arnold, D.N. and Fowler, K.K., Nefarious numbers, Notices Amer. Math. Soc., 2011, vol. 58(3), pp. 434–437.
  27. Campbell, P., Escape from the impact factor, Ethics Sci. Environ. Polit., 2008, vol. 8(1), pp. 5–7.
  28. Falagas, M.E. and Alexiou, V.G., The top-ten in journal impact factor manipulation, Arch. Immunol. Ther. Exp., 2008, vol. 56(4), pp. 223–226.
  29. González-Alcaide, G., Valderrama-Zurián, J.C., and Aleixandre-Benavent, R., The impact factor in non-English-speaking countries, Scientometrics, 2012, vol. 92(2), pp. 297–311.
  30. Seglen, P.O., Why the impact factor of journals should not be used for evaluating research, Br. Med. J., 1997, vol. 314(7079), pp. 498–502.
  31. San Francisco Declaration on Research Assessment: Putting Science into the Assessment of Research, San Francisco: American Society for Cell Biology, 2013.
  32. Duy, J. and Vaughan, L., Can electronic journal usage data replace citation data as a measure of journal use? An empirical examination, J. Acad. Librariansh., 2006, vol. 32(5), pp. 512–517.
  33. Chrzastowski, T.E., Journal collection cost-effectiveness in an academic chemistry library, Collection Management, 1991, vol. 14(1–2), pp. 85–98.
  34. Pan, E., Journal citation as a predictor of journal usage in libraries, Collection Management, 1978, vol. 2(1), pp. 29–38.
  35. Thelwall, M., Journal impact evaluation: a webometric perspective, Scientometrics, 2012, vol. 92(2), pp. 429–441.
  36. Leydesdorff, L., Alternatives to the journal impact factor: I3 and the top-10% (or top-25%?) of the most-highly cited papers, Scientometrics, 2012, vol. 92(2), pp. 355–365.
  37. Colledge, L., De Moya-Anegón, F., Guerrero-Bote, V., Lopez-Illescas, C., El Aisati, M., and Moed, H.F., SJR and SNIP: two new journal metrics in Elsevier’s Scopus, Serials, 2010, vol. 23(3), pp. 215–221.
  38. Moed, H.F., Measuring contextual citation impact of scientific journals, J. Informetr., 2010, vol. 4(3), pp. 265–277.
  39. González-Pereira, B., Guerrero-Bote, V.P., and Moya-Anegón, F., A new approach to the metric of journals’ scientific prestige: the SJR indicator, J. Informetr., 2010, vol. 4(3), pp. 379–391.
  40. Haddawy, P. and Hassan, S.-U., A comparison of three prominent journal metrics with expert judgement of journal quality, in Context Counts: Pathways to Master Big and Little Data, Leiden, 2014, pp. 238–240.
  41. Tiers for the Australian Ranking of Journals, cited October 29, 2014.
  42. Kim, M.T., Ranking of journals in library and information science—a comparison of perceptual and citation-based measures, Coll. Res. Libr., 1991, vol. 52(1), pp. 24–37.
  43. Lozano, G.A., Larivière, V., and Gingras, Y., The weakening relationship between the impact factor and papers’ citations in the digital age, J. Assoc. Inf. Sci. Technol., 2012, vol. 63(11), pp. 2140–2145.
  44. Ellison, G., Is peer review in decline?, Econ. Inq., 2011, vol. 49(3), pp. 635–657.
  45. Bensman, S.J., The impact factor: its place in Garfield’s thought, in science evaluation, and in library collection management, Scientometrics, 2012, vol. 92(2), pp. 263–275.
  46. Moed, H.F., Colledge, L., Reedijk, J., Moya-Anegón, F., Guerrero-Bote, V., Plume, A., and Amin, M., Citation-based metrics are appropriate tools in journal assessment provided that they are accurate and used in an informed way, Scientometrics, 2012, vol. 92(2), pp. 367–376.
  47. Gross, P.L.K. and Gross, E.M., College libraries and chemical education, Science, 1927, vol. 66(1713), pp. 385–389.
  48. Allen, E.S., Periodicals for mathematicians, Science, 1929, vol. 70(1825), pp. 592–594.
  49. McNeely, J.K. and Crosno, C.D., Periodicals for electrical engineers, Science, 1930, vol. 72(1856), pp. 81–84.
  50. Gross, P.L.K. and Woodford, A.O., Serial literature used by American geologists, Science, 1931, vol. 73(1903), pp. 660–664.
  51. Hooker, R.H., A study of scientific periodicals, Rev. Sci. Instrum., 1935, vol. 6(11), pp. 333–338.
  52. Jenkins, R.L., Periodicals for medical libraries, J. Am. Med. Assoc., 1931, vol. 97(9), pp. 608–610.
  53. Gregory, J., An evaluation of periodical literature from the standpoint of endocrinology, Endocrinology, 1935, vol. 19(2), pp. 213–215.
  54. Hackh, I., The periodicals useful in the dental library, Bull. Med. Libr. Assoc., 1936, vol. 25(1–2), pp. 109–112.
  55. Via, B.J. and Schmidle, D.J., Investing wisely: citation rankings as a measure of quality in library and information science journals, Portal Libr. Acad., 2007, vol. 7(3), pp. 333–373.
  56. Sengupta, I.N., Impact of scientific serials on the advancement of medical knowledge: an objective method of analysis, Int. Libr. Rev., 1972, vol. 4(2), pp. 169–195.
  57. Sengupta, I.N., The growth of biophysical literature, Scientometrics, 1985, vol. 8(5–6), pp. 365–375.
  58. Servi, P.N. and Griffith, B.C., A method for partitioning the journal literature, J. Am. Soc. Inf. Sci., 1980, vol. 31(1), pp. 36–40.
  59. McCain, K.W., Core journal networks and co-citation maps: new bibliometric tools for serials research and management, Libr. Q., 1991, vol. 61(3), pp. 311–336.
  60. Brown, C.H. and Krumm, R.V., Scientific Serials: Characteristics and Lists of Most Cited Publications in Mathematics, Physics, Chemistry, Geology, Physiology, Botany, Zoology, and Entomology, Chicago: Association of College and Reference Libraries, 1956.
  61. Hockings, E.F., Selection of scientific periodicals in an industrial research library, J. Am. Soc. Inf. Sci., 1974, vol. 25(2), pp. 131–132.
  62. Brookes, B.C., Optimum p% library of scientific periodicals, Nature, 1971, vol. 232(5311), pp. 458–461.
  63. Ash, J., Library use of public health materials: description and analysis, Bull. Med. Libr. Assoc., 1974, vol. 62(2), pp. 95–104.
  64. Chambers, G.R. and Healey, J.S., Journal citations in master’s theses: one measurement of a journal collection, J. Am. Soc. Inf. Sci., 1973, vol. 24(5), pp. 397–401.
  65. McCain, K.W. and Bobick, J.E., Patterns of journal use in a departmental library—a citation analysis, J. Am. Soc. Inf. Sci., 1981, vol. 32(4), pp. 257–267.
  66. LaBorie, T. and Halperin, M., Citation patterns in library science dissertations, J. Educ. Librarianship, 1976, vol. 16(4), pp. 271–283.
  67. Drott, C. and Griffith, B.C., An empirical examination of Bradford’s law and the scattering of scientific literature, J. Am. Soc. Inf. Sci., 1978, vol. 29(5), pp. 238–246.
  68. Dubrov, A.P. and Krasikova, O.L., The use of citation analysis for studying the formation of the fund of foreign journals in scientific academic libraries, Nauchn. Tekh. Biblioteki, 1998, no. 6, pp. 26–34.
  69. Pislyakov, V.V. and Lyubushko, E.E., The analysis of scientific and information activities (reading, publications, and citations) of scientists of the Institute of Catalysis of the SB RAS, Kataliz Prom-st., 2007, no. 3, pp. 55–63.
  70. Mazov, N.A., The evaluation of the flow of scientific publications of an academic institution based on the bibliometric analysis of citation, Inform. Technol. Gumanitarnykh Issledovaniyakh, 2011, no. 16, pp. 25–30.
  71. Slashcheva, N.A. and Mokhnacheva, Yu.V., Electronic information in scientometric studies, Nauchn.-Tekhn. Inform., Ser. 1: Organ. Metod. Inf. Rab., 2003, no. 5, pp. 21–27.
  72. Mokhnacheva, Yu.V., Data support of research by academic libraries using bibliometric methods, Cand. Sci. (Pedagogy) Dissertation, Moscow, 2008.
  73. Gureev, V.N. and Mazov, N.A., Themes of the publications of an organization as a basis for forming an objective and optimal repertoire of scientific periodicals, Sci. Tech. Inf. Process., 2013, vol. 40, no. 4, pp. 195–204.
  74. Gureyev, V.N. and Mazov, N.A., Detection of information requirements of researchers using bibliometric analyses to identify target journals, Inf. Tech. Libr., 2013, vol. 32(4), pp. 66–77.
  75. Tenopir, C. and King, D.W., Electronic journals and changes in scholarly article seeking and reading patterns, D-Lib Magazine, 2008, vol. 14(11–12), pp. 1–13.
  76. Podkorytova, N.I., Bosina, L.V., and Lakizo, I.G., The system of centralized acquisition of the CLS SB RAS: results and prospects, Bibliosfera, 2012, no. 5, pp. 54–57.
  77. Derevyanko, A.P. and Kholyushkin, Yu.P., The problem of the qualitative analysis of archaeological publications, in Metodologiya i metodika arkheologicheskikh rekonstruktsii: Sb. nauch. trudov (Methodology and Methods of Archaeological Reconstructions: Collection of Scientific Papers), Novosibirsk: SB RAS, 1994.
  78. Shotton, D., CiTO, the Citation Typing Ontology, J. Biomed. Semant., 2010, vol. 1, suppl. 1, p. S6.
  79. Simkin, M.V. and Roychowdhury, V.P., Read before you cite!, Complex Syst., 2003, no. 14, pp. 262–274.
  80. Simkin, M.V. and Roychowdhury, V.P., Do you sincerely want to be cited? Or: read before you cite, Significance, 2006, vol. 3(4), pp. 179–181.
  81. Sandison, A., Library optimum, Nature, 1971, vol. 234(5328), pp. 368–369.
  82. Tenopir, C., King, D.W., Boyce, P., Grayson, M., and Paulson, K.L., Relying on electronic journals: reading patterns of astronomers, J. Assoc. Inf. Sci. Technol., 2005, vol. 56(8), pp. 786–802.
  83. Halevi, G. and Moed, H.F., Usage patterns of scientific journals and their relationship with citations, in Context Counts: Pathways to Master Big and Little Data, Leiden, 2014, pp. 241–251.
  84. Khaitun, S.D., Naukometriya: Sostoyanie i perspektivy (Scientometrics: Status and Prospects), Moscow: Nauka, 1983.
  85. Garfield, E., KeyWords Plus—ISI’s breakthrough retrieval method. Part 1. Expanding your searching power on Current Contents on Diskette, Current Contents, 1990, vol. 32, pp. 5–9.
  86. Garfield, E. and Sher, I.H., KeyWords Plus™—algorithmic derivative indexing, J. Am. Soc. Inf. Sci., 1993, vol. 44(5), pp. 298–299.
  87. Dhawan, S.M., Phull, S.K., and Jain, P., Documentation notes, J. Doc., 1980, vol. 36(1), pp. 24–32.
  88. Colledge, L. and Verlinde, R., SciVal Metrics Guidebook, Netherlands: Elsevier, 2014.

Copyright information

© Allerton Press, Inc. 2015

Authors and Affiliations

  1. Information and Library Center, Institute of Petroleum Geology and Geophysics, Siberian Branch, Russian Academy of Sciences, Novosibirsk, Russia
