
Bibliometric analysis for science policy: An evaluation of the United Kingdom's research performance in ocean currents and protein crystallography

Abstract

This paper presents the results of a study of Britain's scientific performance in the fields of ocean currents and protein crystallography carried out for the Advisory Board for the Research Councils (ABRC). Using a range of publication and citation indicators, the study aimed to explore the potential value to science policy-making of low-cost scientometric approaches to research evaluation.

Notes and references

  1. Full details of the study can be found in: B. R. MARTIN, J. IRVINE, D. CROUCH, Science Indicators for Research Policy: A Bibliometric Analysis of Ocean Currents and Protein Crystallography, Science Policy Research Unit Occasional Paper No. 23, University of Sussex, Brighton, 1985.

  2. J. IRVINE, B. R. MARTIN, Evaluating big science: CERN's past performance and future prospects, Scientometrics, 7 (1985) 281.

  3. The reasons why peer-review is coming under strain include decreased or negative real growth rates in national science budgets, the concentration of research in certain areas within a few central laboratories and a consequent politicization of the peer-review process, and an intrinsic tendency towards the reproduction of past priorities; see IRVINE and MARTIN, op. cit., note 2.

  4. However, we have also carried out assessments of more applied areas, including mechanical engineering and electronics in Norway [see M. SCHWARZ, J. IRVINE, B. R. MARTIN, K. PAVITT, R. ROTHWELL, Government support for industrial research: Lessons from a study of Norway, R&D Management, 12 (1982) 155] and steel research supported by the European Community.

  5. See H. R. COWARD, J. J. FRANKLIN, L. SIMON, ABRC Science Policy Study: Co-Citation Bibliometric Models, Center for Research Planning, 1984, and W. A. TURNER, A Mapindex of Ruminant Protein Digestion, Service d'Etudes et de Réalisation des Produits d'Information Avancés (SERPIA), Paris, 1984.

  6. The results can be found in: B. R. MARTIN, J. IRVINE, R. TURNER, The writing on the wall for British science, New Scientist, 104 (8 October 1984) 25.

  7. Prominent among the exceptions to this is the work by COLE and COLE on peer review, for example, S. COLE, J. R. COLE, G. A. SIMON, Chance and consensus in peer review, Science, 214, 4523 (1981) 881, and by NARIN and his colleagues, for example, J. D. FRAME, F. NARIN, M. P. CARPENTER, The distribution of world science, Social Studies of Science, 7 (1977) 501, and M. P. CARPENTER, F. NARIN, The adequacy of the Science Citation Index (SCI) as an indicator of international scientific activity, Journal of the American Society for Information Science, 32 (1981) 430. See also the comments on ‘US science in an international perspective’ in Science Indicators 1976 by D. de SOLLA PRICE, Scientometrics, 2 (1980) 423; J. BEN-DAVID, ibid., Scientometrics, 2 (1980) 355. More recently, the Policy Research Group at Leiden University has analyzed the research performance of university departments: see, for example, H. F. MOED, W. J. M. BURGER, J. G. FRANKFORT, A. F. J. VAN RAAN, On the Measurement of Research Performance—The Use of Bibliometric Indicators, University of Leiden, Netherlands, 1983.

  8. For a discussion of one possible approach to this problem of bias, see J. IRVINE, B. R. MARTIN, Basic research in the East and West: A comparison of the scientific performance of high-energy physics accelerators, Social Studies of Science, 15 (1985) 293.

  9. Many previous scientometric studies have tended to take as the unit of analysis the specialty or problem area. One early exception to this is A. J. MATHESON, Centres of chemical excellence, Chemistry in Britain, 8 (1972) 207.

  10. For a summary of previous SPRU work on the development of techniques for evaluating research output and impact, see, for example, IRVINE and MARTIN, op. cit., note 2.

  11. The data-bank also contains a few papers published before this date, although the extent of coverage is less certain; for further details, see F. C. BERNSTEIN, T. F. KOETZLE, G. J. B. WILLIAMS, E. F. MEYER, M. D. BRICE, J. R. RODGERS, O. KENNARD, T. SHIMANOUCHI, M. TASUMI, The protein data bank: A computer-based archival file for macromolecular structures, European Journal of Biochemistry, 80 (1977) 319.

  12. The exact procedure used is described elsewhere, for example in: B. R. MARTIN, J. IRVINE, CERN: Past performance and future prospects: I. CERN's position in world high-energy physics, Research Policy, 13 (1984) 183.

  13. It is instructive to compare these results with the corresponding figures produced from analysis of the NSF Science Literature Indicators Data-Base. The nearest equivalent field to ocean currents contained in the NSF data is ‘oceanography and limnology’, of which ocean currents research constitutes just one element. Britain's world-share of oceanography and limnology publications has averaged only around 5 to 6% over the latter part of the 1970s, suggesting that the United Kingdom may be proportionately stronger in ocean currents than in oceanography and limnology as a whole. On the other hand, the figure of about 8–9% for the UK share of ocean currents papers is not very different from that for the wider field of earth and space sciences (of which oceanography and limnology form a substantial part in the NSF data-base), which has increased from around 8% in 1973 to 9.5% in 1980. For further details, see MARTIN, IRVINE and CROUCH, op. cit., note 1. A parallel set of comparisons was also made for protein crystallography; these are not discussed further in this paper. (An illustrative world-share calculation is sketched after these notes.)

  14. One limitation of the use of numbers of highly cited papers as an indicator is that it fails to differentiate between papers that are highly cited over a period of many years and those having only a short-lived impact. It is therefore important to examine in addition the number of occasions on which highly cited papers receive more than a particular number of citations in a year (a simple sketch of this counting appears after these notes). The figures in Table 13 of MARTIN, IRVINE and CROUCH, op. cit., note 1, show that the 3 British papers cited more than 8 times in a year achieved this on 12 occasions (corresponding to 6% of the world total). Overall, however, the figures for national world-shares of highly cited papers were little different from those in Table 3 above, with the United States continuing to account for over 80%.

  15. See E. GARFIELD, Citation Indexing: Its Theory and Application in Science, Technology, and Humanities, Wiley Interscience, New York, 1979, p. 149.

  16. Details of the journal sets comprising the first three categories obtained using this approach can be found in Tables 14, 15 and 16 in: MARTIN, IRVINE and CROUCH, op. cit., note 1.

  17. The results of the analysis of papers from first and second class journals combined, and first, second and third class combined, are shown in Appendix 2 of MARTIN, IRVINE and CROUCH, op. cit., note 1.

  18. Data on highly cited papers are contained in Tables 20 and 21 of MARTIN, IRVINE and CROUCH, op. cit., note 1, but we shall not go into details here because the overall picture that emerges is very similar to that for all ASFA publications shown in Table 3 above.

  19. With the approach adopted here, it is comparatively simple to take the analysis one step further, and disaggregate the UK statistics to the level of research institutes. Details of the results can be found in: MARTIN, IRVINE and CROUCH, op. cit., note 1, pp. 56–58.

  20. A similar result emerges even if one takes into account the fact that some highly cited papers have a longer-lasting impact than others. Table 42 in MARTIN, IRVINE and CROUCH, op. cit., note 1, lists the number of times highly cited papers earned more than a certain number of citations, and again shows Britain accounting for approximately one quarter of the world total.

  21. For further details, see Table 43 in MARTIN, IRVINE and CROUCH, op. cit., note 1.

  22. Other groups do, however, appear to have accounted for a greater share of major advances if a higher threshold for highly cited papers is selected; see Table 48 in MARTIN, IRVINE and CROUCH, op. cit., note 1.

  23. See IRVINE and MARTIN, op. cit., note 2.

  24. It is, however, arguable whether inter-field comparisons of ‘fundamentality’ could be made in this way across disciplinary boundaries. For example, the cognitive nature of biological specialties such as protein crystallography is such that researchers can legitimately ‘borrow’ ideas and methods from neighbouring fields in a manner that is less common in more ‘restricted’ sciences such as high-energy physics [see R. WHITLEY, Types of science, organizational strategies and patterns of work in research laboratories in different fields, Social Science Information, 17 (1978) 427].

  25. See note 6 above. However, the method does depend on the availability of a reasonably comprehensive publication list for the field concerned, either from an abstracting service or some other source. A different approach may be necessary for fields where there are no relevant categories in the abstracts or where the categories have altered appreciably over time (for example, because of changes in the intellectual focus of the field).

  26. See MARTIN, IRVINE and CROUCH, op. cit., note 1, for full details.

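The world-share figures discussed in note 13 are simple percentages: a country's papers in the field divided by the world total for the period. The following minimal Python sketch illustrates that arithmetic only; the country names, counts and the world_shares function are hypothetical illustrations, not the ASFA or NSF data analysed in the study.

    # Hypothetical publication counts for one field and period (illustrative only;
    # these are not the ASFA or NSF figures analysed in the study).
    counts = {"USA": 420, "UK": 90, "USSR": 75, "France": 60, "Other": 355}

    def world_shares(papers_by_country):
        """Return each country's percentage share of the world publication total."""
        world_total = sum(papers_by_country.values())
        return {country: 100.0 * n / world_total for country, n in papers_by_country.items()}

    for country, share in world_shares(counts).items():
        print(f"{country}: {share:.1f}% of world papers")
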
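Notes 14 and 20 refine the highly-cited-papers indicator by counting the number of occasions (paper-years) on which a paper is cited more than a chosen threshold in a single year, so that sustained impact can be distinguished from short-lived impact. The sketch below illustrates that counting under stated assumptions: the paper names, annual citation counts and the occasions_above_threshold function are hypothetical, with the threshold of 8 citations per year taken from the ocean-currents example in note 14.

    # Hypothetical annual citation counts per paper (year -> citations); illustrative only.
    papers = {
        "paper_A": {1978: 12, 1979: 10, 1980: 9, 1981: 4},  # sustained impact
        "paper_B": {1978: 15, 1979: 3, 1980: 2, 1981: 1},   # short-lived impact
    }

    THRESHOLD = 8  # citations per year, as in the ocean-currents example of note 14

    def occasions_above_threshold(annual_citations, threshold):
        """Count the years in which a paper received more than `threshold` citations."""
        return sum(1 for cites in annual_citations.values() if cites > threshold)

    for name, yearly in papers.items():
        print(name, "-", occasions_above_threshold(yearly, THRESHOLD), "occasions above threshold")
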
Additional information

No order of seniority implied (rotating first authorship). The authors work at the Science Policy Research Unit (SPRU), University of Sussex, where they carry out research on a range of issues concerning policies for basic and applied science.

Cite this article

Crouch, D., Irvine, J. & Martin, B.R. Bibliometric analysis for science policy: An evaluation of the United Kingdom's research performance in ocean currents and protein crystallography. Scientometrics 9, 239–267 (1986). https://doi.org/10.1007/BF02017247
