Scientometrics, Volume 28, Issue 1, pp 61–77

A new approach to defining a multidisciplinary field of science: The case of cardiovascular biology

  • L. A. Rogers
  • J. Anderson

Abstract

This paper describes a new and objective method for tackling the problem of defining a multidisciplinary research area for bibliometric analysis. The test field was cardiovascular biology. A three-stage process was adopted in setting a boundary around this research field:
  1. Appropriate sections of a hierarchical subject classification scheme, Medical Subject Headings (MeSH), were developed into a “MeSH filter” through which papers indexed in MEDLINE were screened.

  2. A panel of cardiovascular experts reviewed the core set of classification terms, identifying irrelevant and missing areas, facilitating the development of a more sophisticated “filter”.

  3. The definition was validated using publication lists from research departments with a known interest in cardiovascular research.

This iterative process resulted in a definition of the field which captured basic and clinical research papers from the international biomedical research community and which was recognisable to experts in the field of cardiovascular research. Importantly, the field boundary also excluded publications which were not relevant to cardiovascular research. The process of involving experts in shaping the field definition also yielded two intangible, but key benefits: (a) it lent credibility to subsequent analyses, the results of which were to be presented to policy-makers in cardiovascular biology, and (b) it served to shape consensus among the cardiovascular experts on the full range of scientific disciplines that are relevant to their field.
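
To make the first stage concrete, the minimal Python sketch below shows how MEDLINE-style records might be screened against a set of MeSH headings. The headings, record fields and identifiers are hypothetical stand-ins for illustration; they are not the filter actually developed in the study.

```python
# Illustrative sketch only: the MeSH headings and record layout below are
# hypothetical stand-ins, not the actual filter developed in the study.

# A "MeSH filter": the set of subject headings taken to define the field.
CARDIO_MESH_FILTER = {
    "Cardiovascular Diseases",
    "Heart",
    "Blood Vessels",
    "Hemodynamics",
}

# MEDLINE-style records reduced to the two fields needed for screening.
records = [
    {"pmid": "0000001", "mesh": ["Heart", "Myocardium", "Rats"]},
    {"pmid": "0000002", "mesh": ["Kidney", "Nephrons"]},
]

def passes_filter(record, mesh_filter):
    """True if any of the record's MeSH headings falls inside the filter."""
    return any(term in mesh_filter for term in record["mesh"])

cardio_papers = [r for r in records if passes_filter(r, CARDIO_MESH_FILTER)]
print([r["pmid"] for r in cardio_papers])  # -> ['0000001']
```

In the study itself, this kind of screening was then refined iteratively: expert review of the term set (stage 2) widens or prunes the filter, and validation against known departmental publication lists (stage 3) checks what the filter captures and misses.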

Analysis of international publishing in cardiovascular research revealed that whilst the UK and US dominate in total numbers of papers, the relative emphasis on cardiovascular research in these countries (as a proportion of all biomedical publishing) is actually quite low, and declining. Japan and Germany, in contrast, appear to give greater emphasis to cardiovascular research in their national portfolios of biomedical science, and between 1988 and 1991 Japan showed a marked increase in activity.
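
The “relative emphasis” referred to here is simply a country's cardiovascular output expressed as a proportion of its total biomedical output. A minimal sketch of that calculation, using made-up counts rather than the paper's data:

```python
# Hypothetical counts, for illustration only; not figures from the paper.
papers = {
    # country: (cardiovascular papers, all biomedical papers)
    "Country A": (12000, 200000),
    "Country B": (6000, 60000),
}

for country, (cardio, biomed) in papers.items():
    share = cardio / biomed  # relative emphasis on cardiovascular research
    print(f"{country}: {share:.1%} of biomedical papers are cardiovascular")
```

On this measure a country can publish fewer cardiovascular papers in absolute terms yet give the field a larger share of its national biomedical portfolio.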

Keywords

MeSH · Bibliometric Analysis · Medical Subject Heading · Field Boundary · Cardiovascular Research

Copyright information

© Akadémiai Kiadó 1993

Authors and Affiliations

  • L. A. Rogers (1)
  • J. Anderson (1)

  1. The Unit for Policy Research in Science and Medicine (PRISM), The Wellcome Trust, London, England
