
Scientometrics, Volume 36, Issue 3, pp 293–310

Bibliometric performance measures

  • F. Narin
  • Kimberly S. Hamilton

Abstract

Three different types of bibliometrics (literature bibliometrics, patent bibliometrics, and linkage bibliometrics) can all be used to address various government performance and results questions. Applications of these three bibliometric types will be described within the framework of Weinberg's internal and external criteria: whether the work being done is good science, efficiently and effectively done, and whether it is important science from a technological viewpoint. The fundamental assumption underlying all bibliometrics is that the frequency with which a set of papers or patents is cited is a measure of the impact or influence of that set. The literature bibliometric indicators are counts of publications and citations received in the scientific literature, together with various derived indicators covering such phenomena as cross-sectoral citation, coauthorship and concentration within influential journals. One basic observation of literature bibliometrics, which carries over to patent bibliometrics, is that of highly skewed distributions: a relatively small number of high-impact patents and papers, and large numbers of patents and papers of minimal impact. The key measure is whether an agency is producing or supporting highly cited papers and patents. The final set of data is in the area of linkage bibliometrics, looking at citations from patents to scientific papers. These are particularly relevant to the external criteria, in that institutions and supporting agencies whose papers are highly cited in patents are making measurable contributions to a nation's technological progress.
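
To make the indicators concrete, the following is a minimal sketch (not taken from the paper) of the kind of tabulation these measures involve: counting citations per paper, flagging the small highly cited tail of the skewed distribution, and counting patent-to-paper citation linkages per supporting agency. The data, the top-decile cutoff, and the field names are illustrative assumptions only.

    from collections import Counter

    # Hypothetical input: each record names the paper cited by a citing document.
    # In literature bibliometrics the citing documents are papers; in linkage
    # bibliometrics they are patents citing the scientific literature.
    paper_citations = ["P1", "P1", "P1", "P2", "P3", "P1", "P4", "P2"]   # paper-to-paper
    patent_citations = ["P1", "P1", "P2"]                                # patent-to-paper

    # Agency (or institution) that supported each cited paper; illustrative only.
    supporting_agency = {"P1": "NIH", "P2": "NIH", "P3": "NSF", "P4": "NSF"}

    # Citation counts per paper: the basic literature-bibliometric indicator.
    counts = Counter(paper_citations)

    # The distribution is typically highly skewed: flag the small set of
    # high-impact papers, here defined (arbitrarily) as the top 10% by count.
    ranked = counts.most_common()
    cutoff = max(1, len(ranked) // 10)
    highly_cited = {paper for paper, _ in ranked[:cutoff]}

    # Linkage bibliometrics: how often each agency's papers are cited by patents.
    linkage_by_agency = Counter(supporting_agency[p] for p in patent_citations)

    print("citations per paper:", dict(counts))
    print("highly cited papers:", highly_cited)
    print("patent citations by supporting agency:", dict(linkage_by_agency))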

Keywords

Skewed Distribution; Technological Progress; Scientific Paper; Minimal Impact; Government Performance


References

  1. F. Narin, "Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity", Report prepared for the National Science Foundation, Contract NSF C-627, NTIS Accession #PB252339/AS, (1976), 456 pp.
  2. A.M. Weinberg, Criteria for scientific choice II: The two cultures, Minerva, 3 (1964) 3–14.
  3. A.M. Weinberg, Criteria for scientific choice, Minerva, 1 (1963) 159–171.
  4. IIT Research Institute, "Technology in Retrospect and Critical Events in Science (TRACES)", Report prepared for the National Science Foundation, Contract NSF-C535, (1968), 72 pp.
  5. National Science Foundation, Science Indicators 1972, Reports of the National Science Board, U.S. Government Printing Office.
  6. P.O. Seglen, The skewness of science, Journal of the American Society for Information Science, 43 (9), (1992) 628–638.
  7. G. Pinski, F. Narin, Citation influence for journal aggregates of scientific publications: theory, with application to the literature of physics, Information Processing and Management, 12 (5), (1976) 297–312.
  8. F. Narin, A. Breitzman, Inventive Productivity, Research Policy, 24 (1995) 507–519.
  9. A.J. Lotka, The frequency distribution of scientific productivity, Journal of the Washington Academy of Science, 16 (1926) 317–323.
  10. F. Narin, H.H. Gee, "An Analysis of Research Publications Supported by NIH, 1973–1980", NIH Program Evaluation Report, U.S. Department of Health and Human Services, Public Health Service, National Institutes of Health, (1986).
  11. J.D. Frame, F. Narin, NIH funding and biomedical publication output, Federation Proceedings, 35 (14) (1976) 2529–2532.
  12. F. Narin, K. Stevens, E. Whitlow, Scientific cooperation in Europe and the citation of multinationally authored papers, Scientometrics, 21 (1991) 313–323.
  13. L.B. Ellwein, P. Kroll, F. Narin, Linkage between research sponsorship and patented eye-care technology, to be published (1996).
  14. J. Anderson, N. Williams, D. Seemungal, F. Narin, D. Olivastro, Human genetic technology: Exploring the links between science and innovation, to be published in Technology Analysis and Strategic Management, (1996).
  15. F. Narin, "Linking Biomedical Research to Outcomes — The Role of Bibliometrics and Patent Analysis", Presented at The Economics Round Table, National Institutes of Health, (1995), 27 pp.
  16. F. Narin, D. Olivastro, Status report — linkage between technology and science, Research Policy, 21 (3) (1992) 237–249.
  17. F. Narin, E. Noma, Is technology becoming science?, Scientometrics, 7 (1985) 369–381.

Copyright information

© Akadémiai Kiadó 1996

Authors and Affiliations

  • F. Narin (1)
  • Kimberly S. Hamilton (1)

  1. CHI, Computer Horizons Inc., Haddon Heights (USA)
