
Scientometrics, Volume 49, Issue 1, pp 7–22

Validation of Bibliometric Indicators in the Field of Microbiology: A Norwegian Case Study

  • Dag W. Aksnes
  • Terje Bruen Olsen
  • Per O. Seglen

Abstract

This paper addresses two related issues regarding the validity of bibliometric indicators for assessing national performance within a particular scientific field: first, the representativeness of a journal-based subject classification, and second, the completeness of the database coverage. Norwegian publishing in microbiology was chosen as a case, using the standard ISI product National Science Indicators on Diskette (NSIOD) as the source database. By applying an "author-gated" retrieval procedure, we found that only 41 percent of all publications in NSIOD-indexed journals that experts classified as microbiology were included under the NSIOD category Microbiology. The set of defining core journals alone is therefore clearly insufficient to delineate this complex biomedical field. Furthermore, a subclassification of the articles into subdisciplines of microbiology revealed systematic differences in representation within NSIOD's Microbiology field; fish microbiology and medical microbiology are particularly underrepresented.

In a second step, individual publication lists were collected from a sample of Norwegian microbiologists and compared with the publications retrieved bibliometrically for the same authors. The results showed that a large majority (94%) of the international scientific production in Norwegian microbiology was covered by the NSIOD database. Insufficient subfield delineation, rather than lack of coverage, therefore appeared to be the main methodological problem in the bibliometric analysis of microbiology.
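Both percentages reported above reduce to simple overlaps between sets of articles. The following minimal Python sketch, using hypothetical article identifiers and a made-up helper `overlap_share` in place of the actual NSIOD records and the authors' publication lists, illustrates how such shares are computed.

    # Illustrative sketch only: hypothetical article identifiers stand in for the
    # actual NSIOD records and the authors' publication lists used in the study.

    def overlap_share(found, reference):
        """Share of the reference set that also appears in the found set."""
        return len(reference & found) / len(reference) if reference else 0.0

    # Step 1: journal-based delineation (expert classification vs. NSIOD category).
    expert_microbiology = {"a1", "a2", "a3", "a4", "a5"}   # expert-classified microbiology articles
    nsiod_microbiology = {"a1", "a2"}                      # articles under NSIOD's Microbiology field
    print(f"Included in NSIOD Microbiology: {overlap_share(nsiod_microbiology, expert_microbiology):.0%}")

    # Step 2: database coverage of the authors' own publication lists.
    author_lists = {"p1", "p2", "p3", "p4"}                # publications reported by the sampled authors
    nsiod_retrieved = {"p1", "p2", "p3"}                   # same authors' publications retrieved from NSIOD
    print(f"NSIOD coverage of publication lists: {overlap_share(nsiod_retrieved, author_lists):.0%}")

The toy numbers here are arbitrary; in the study, the first share was 41% and the second 94%.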




Copyright information

© Akadémiai Kiadó, Budapest 2000

Authors and Affiliations

  • Dag W. Aksnes (1)
  • Terje Bruen Olsen (1)
  • Per O. Seglen (1)
  1. Norwegian Institute for Studies in Research and Higher Education (NIFU), Oslo, Norway
