
The application of bibliometric analysis: disciplinary and user aspects

Published in: Scientometrics

Abstract

Bibliometric analysis is increasingly used as a tool within the scientific community, and the interplay between those who refine bibliometric methods and the recipients of such analyses is vital. Production and citation patterns reflect working methodologies across disciplines, both within the specialized Library and Information Science (LIS) field and in the non-specialist (non-LIS) professional field. We extract the literature on bibliometric analyses from Web of Science across all fields of science and analyze the clustering of co-occurring keywords at an aggregate level. This reveals areas of interconnected literature with different impact on the LIS and the non-LIS community. We classify and categorize the most-cited bibliometric articles according to a modified version of the method of Derrick, Jonkers and Lewison (Derrick et al. in Proceedings, 17th international conference on science and technology indicators. STI, Montreal, 2012). The data demonstrate that cross-referencing between the LIS and the non-LIS field is modest in publications outside their main categories of interest, i.e. discussions of various bibliometric issues or strict analyses of various topics. We identify some fields as less well covered bibliometrically.
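The keyword co-occurrence clustering mentioned in the abstract can be sketched in miniature. This is an illustration of the general technique only, not the paper's actual pipeline; the keyword records below are hypothetical stand-ins for Web of Science author keywords, and the threshold and clustering rule (connected components of the co-occurrence graph) are simplifying assumptions.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records, each a set of keywords from one publication.
records = [
    {"h-index", "citation analysis", "research evaluation"},
    {"h-index", "citation analysis"},
    {"patent", "innovation", "technology"},
    {"patent", "innovation"},
    {"citation analysis", "research evaluation"},
]

# Count how often each keyword pair co-occurs within the same record.
cooccurrence = Counter()
for keywords in records:
    for a, b in combinations(sorted(keywords), 2):
        cooccurrence[(a, b)] += 1

# Keep only pairs that co-occur at least `threshold` times, then build
# an undirected adjacency structure over the surviving keywords.
threshold = 2
adjacency = {}
for (a, b), count in cooccurrence.items():
    if count >= threshold:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)

def connected_components(adjacency):
    """Group keywords into clusters via graph connectivity."""
    seen, clusters = set(), []
    for node in adjacency:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            current = stack.pop()
            if current in component:
                continue
            component.add(current)
            stack.extend(adjacency[current] - component)
        seen |= component
        clusters.append(component)
    return clusters

clusters = connected_components(adjacency)
```

On this toy data the procedure yields two clusters, one around evaluation metrics and one around technology and innovation, mirroring at small scale how aggregate keyword maps separate thematic areas.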

Figures 1–8 are available in the full article.

References

  • Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Do metrics matter? Many researchers believe that quantitative metrics determine who gets hired and who gets promoted at their institutions. With an exclusive poll and interviews, Nature probes to what extent metrics are really used that way. Nature, 465(7300), 860–863.

  • Bar-Ilan, J. (2008). Which h-index? A comparison of WoS, Scopus and Google Scholar. Scientometrics, 74(2), 257–271.

  • Bollen, J., & Van de Sompel, H. (2008). Usage impact factor: The effects of sample characteristics on usage-based impact metrics. Journal of the American Society for Information Science and Technology, 59(1), 136–149.

  • Bonnell, A. G. (2016). Tide or tsunami? The impact of metrics on scholarly research. Australian Universities’ Review, 58(1), 54.

  • Bornmann, L., & Daniel, H.-D. (2005). Does the h-index for ranking of scientists really work? Scientometrics, 65(3), 391–392.

  • Bornmann, L., Mutz, R., & Daniel, H. D. (2008). Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine. Journal of the American Society for Information Science and Technology, 59(5), 830–837.

  • Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualisation of results from multi-level models. Online Information Review, 38(1), 43–58.

  • Braun, T., Bergstrom, C. T., Frey, B. S., Osterloh, M., West, J. D., Pendlebury, D., et al. (2010). How to improve the use of metrics. Nature, 465(17), 870–872.

  • Cox, A., Gadd, E., Petersohn, S., & Sbaffi, L. (2017). Competencies for bibliometrics. Journal of Librarianship and Information Science. https://doi.org/10.1177/0961000617728111.

  • Derrick, G., Jonkers, K., & Lewison, G. (2012). Characteristics of bibliometrics articles in library and information sciences (LIS) and other journals. In Proceedings, 17th international conference on science and technology indicators (pp. 449–551). Montreal: STI.

  • Ellegaard, O., & Wallin, J. A. (2015). The bibliometric analysis of scholarly production: How great is the impact? Scientometrics, 105(3), 1809–1831.

  • Garfield, E. (1977). Restating fundamental assumptions of citation analysis. Current Contents, 39, 5–6.

  • Glänzel, W. (1996). The need for standards in bibliometric research and technology. Scientometrics, 35(2), 167–176.

  • Grandjean, P., Eriksen, M. L., Ellegaard, O., & Wallin, J. A. (2011). The Matthew effect in environmental science publication: A bibliometric analysis of chemical substances in journal articles. Environmental Health, 10(1), 96.

  • Hall, C. M. (2011). Publish and perish? Bibliometric analysis, journal ranking and the assessment of research quality in tourism. Tourism Management, 32(1), 16–27.

  • Harvey, L. (2008). Rankings of higher education institutions: A critical review. Routledge: Taylor & Francis.

  • Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261.

  • Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). The Leiden manifesto for research metrics. Nature, 520(7548), 429.

  • Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.

  • Jonkers, K., & Derrick, G. (2012). The bibliometric bandwagon: Characteristics of bibliometric articles outside the field literature. Journal of the Association for Information Science and Technology, 63(4), 829–836.

  • Kaur, J., Radicchi, F., & Menczer, F. (2013). Universality of scholarly impact metrics. Journal of Informetrics, 7(4), 924–932.

  • Larivière, V. (2012). The decade of metrics? Examining the evolution of metrics within and outside LIS. Bulletin of the American Society for Information Science and Technology, 38(6), 12–17.

  • Larivière, V., Archambault, E., Gingras, Y., & Vignola-Gagné, É. (2006). The place of serials in referencing practices: Comparing natural sciences and engineering with social sciences and humanities. Journal of the Association for Information Science and Technology, 57(8), 987–1004.

  • Leydesdorff, L., & Bornmann, L. (2016). The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”. Journal of the Association for Information Science and Technology, 67(3), 707–714.

  • Leydesdorff, L., Wouters, P., & Bornmann, L. (2016). Professional and citizen bibliometrics: Complementarities and ambivalences in the development and use of indicators: A state-of-the-art report. Scientometrics, 109, 2129–2150.

  • Liu, X., Zhang, L., & Hong, S. (2011). Global biodiversity research during 1900–2009: A bibliometric analysis. Biodiversity and Conservation, 20(4), 807–826.

  • Martinez-Pulgarin, D. F., Acevedo-Mendoza, W. F., Cardona-Ospina, J. A., Rodriguez-Morales, A. J., & Paniz-Mondolfi, A. E. (2016). A bibliometric analysis of global Zika research. Travel Medicine and Infectious Disease, 14(1), 55–57.

  • McKechnie, L., & Pettigrew, K. E. (2002). Surveying the use of theory in library and information science research: A disciplinary perspective. Library Trends, 50(3), 406.

  • Petersohn, S. (2016). Professional competencies and jurisdictional claims in evaluative bibliometrics: The educational mandate of academic librarians. Education for Information, 32(2), 165–193.

  • Prebor, G. (2010). Analysis of the interdisciplinary nature of library and information science. Journal of Librarianship and Information Science, 42(4), 256–267.

  • Tranfield, D., Denyer, D., & Smart, P. (2003). Towards a methodology for developing evidence-informed management knowledge by means of systematic review. British Journal of Management, 14(3), 207–222.

  • Van Eck, N. J., & Waltman, L. (2010). Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics, 84(2), 523–538.

  • Van Eck, N. J., & Waltman, L. (2017). Citation-based clustering of publications using CitNetExplorer and VOSviewer. Scientometrics, 111(2), 1053–1070.

  • Van Noorden, R. (2010). A profusion of measures: Scientific performance indicators are proliferating—Leading researchers to ask afresh what they are measuring and why. Richard Van Noorden surveys the rapidly evolving ecosystem. Nature, 465(7300), 864–867.

  • Van Raan, A. F. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.

  • Wallin, J. A. (2005). Bibliometric methods: Pitfalls and possibilities. Basic & Clinical Pharmacology & Toxicology, 97(5), 261–275.

  • Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.

  • Weller, K. (2015). Social media and altmetrics: An overview of current alternative approaches to measuring scholarly impact. In I. M. Welpe, J. Wollersheim, S. Ringelhan, & M. Osterloh (Eds.), Incentives and performance (pp. 261–276). Berlin: Springer.

  • Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S. & Johnson, B. (2015). Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/rg.2.1.4929.1363.

  • Wouters, P. et al. (2015). The Metric Tide: Literature review (supplementary report I to the independent review of the role of metrics in research assessment and management). HEFCE. https://doi.org/10.13140/rg.2.1.5066.3520.

Acknowledgements

The author wishes to thank Mette Bruus, Ph.D., and the anonymous referees for valuable comments and suggestions that improved the article.

Author information

Correspondence to Ole Ellegaard.

Electronic supplementary material

Appendices

Appendix 1

See Table 5.

Table 5 Countries with most publications on bibliometric analysis 1964–2016

Appendix 2

See Table 6.

Table 6 Keywords represented in cluster 5: ‘Technology and innovation’ and their occurrence in the literature on bibliometric analysis

About this article

Cite this article

Ellegaard, O. The application of bibliometric analysis: disciplinary and user aspects. Scientometrics 116, 181–202 (2018). https://doi.org/10.1007/s11192-018-2765-z
