Bibliometric profiles for British academic institutions: An experiment to develop research output indicators

Abstract

In this paper, we report the results of an exploratory study commissioned by the Advisory Board for the Research Councils to produce bibliometric research profiles for academic and related institutions within the UK. The approach adopted is based on the methodology developed by CHI Research whereby publications from a given institution are weighted according to the influence of the journal in which they appear. Although certain technical limitations were encountered with the approach, the study nonetheless yielded potentially useful information on the comparative research output of British universities and polytechnics.
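
In outline, the influence-weight methodology works as follows (a simplified sketch of the Pinski–Narin formulation cited in note 11 below, given here for orientation only, not the exact CHI implementation). Each journal $k$ receives a weight $w_k$ defined by the homogeneous system

$$
w_k \;=\; \frac{1}{S_k} \sum_j C_{jk}\, w_j ,
$$

where $C_{jk}$ is the number of citations given by journal $j$ to journal $k$ and $S_k$ is the total number of references issued by journal $k$. The weights are thus a dominant-eigenvector solution, conventionally normalised so that the size-weighted mean weight is 1; an institution's publications are then credited according to the weights of the journals in which they appear.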

Notes and references

  1. See, for example, J. IRVINE, B. R. MARTIN, Foresight in Science: Picking the Winners, Frances Pinter (Publishers), London and New York, 1984.

  2. See A. ANDERSON, Research gradings stir emotions, Nature, 322 (24 July 1986) 229.

  3. See, for example, the list of results reported in N. CREQUER, The strengths and weaknesses, Times Higher Education Supplement (30 May 1985), p. 4.

  4. See, for example, ANDERSON, op. cit., note 2.

  5. See R. GILLETT, Serious anomalies in the UGC comparative evaluation of the research performance of psychology departments, Bulletin of the British Psychological Society, 40 (1987) 42.

  6. For example, C. H. LLOYD, The research productivity of UK dental schools in the years 1980–1985, Medical Science Research, 15 (1987) 349.

  7. GILLETT, op. cit., note 5, p. 42. See also R. GILLETT, M. AITKENHEAD, Rank injustice in academic research, Nature, 327 (4 June 1987) 381.

  8. See the papers in this volume of Scientometrics by J. KING and J. McGINNETY.

  9. Additional indicators (e.g. patents) sometimes need to be employed for evaluating certain engineering and applied science specialties — see, for example, J. IRVINE, B. R. MARTIN, Assessing the Impact of SERC Support for Engineering Research, (mimeo), Science and Engineering Research Council, Swindon, 1985; and J. IRVINE, Evaluating Applied Research: Lessons from Japan, Frances Pinter, London, 1988. Bibliometric indicators should be used only with caution in these areas of research.

  10. See, for example, B. R. MARTIN, J. IRVINE, Assessing basic research: some partial indicators of scientific progress in radio astronomy, Research Policy, 12 (1983) 61; H. F. MOED, W. J. M. BURGER, J. G. FRANKFORT, A. F. J. VAN RAAN, The use of bibliometric data for the measurement of university research performance, Research Policy, 14 (1985) 131; and F. NARIN, Evaluative Bibliometrics, National Science Foundation, Monograph 456, Washington DC, 1976.

  11. G. PINSKI, F. NARIN, Citation influence for journal aggregates of scientific publications: theory, with application to the literature of physics, Information Processing and Management, 12 (1976) 297; and F. NARIN, Measuring the research performance of higher education institutions using bibliometric techniques, paper presented at Workshop on Science and Technology Measures in the Higher Education Sector, Paris (10–13 June 1985).

  12. These are estimated to contain at least 75 per cent of all science of international relevance. However, there is an Anglo-Saxon, and, more particularly, a US bias in the SCI. The classification of journals was undertaken within CHI according to the main subfield of papers published in each journal. For very general areas of research, subfield categories such as ‘general physics’ are employed. In the case of the 100 or so journals regarded by CHI as multi-disciplinary, each journal has been classified according to the estimated subfield breakdown of the papers it contains. Details of the procedure employed are outlined in F. NARIN, M. P. CARPENTER, Bibliometric Profiles of UK Universities and Research Institutions, report by CHI Research to the Advisory Board for the Research Councils, London, 1987 (restricted).

  13. For details, see NARIN and CARPENTER, op. cit., note 12.

  14. Full mathematical details can be found in NARIN and CARPENTER, op. cit., note 12, pp. 74–80. An alternative approach is to solve the relevant simultaneous equations directly; however, this involves more operations and is likely to be less accurate. (An illustrative sketch of the iterative computation is given after these notes.)

  15. P. HEALEY, H. ROTHMAN, P. K. HOCH, An experiment in science mapping for research planning, Research Policy, 15 (1986) 233; D. CROUCH, J. IRVINE, B. R. MARTIN, Bibliometric analysis for science policy: an evaluation of the United Kingdom's research performance in ocean currents and protein crystallography, Scientometrics, 9 (1986) 239; and the paper by PHILLIPS and TURNEY in this volume of Scientometrics.

  16. See ROYAL SOCIETY, Evaluation of National Performance in Basic Research, Advisory Board for the Research Councils, Science Policy Studies No. 1, London, 1986; D. C. SMITH, P. M. D. COLLINS, D. M. HICKS, S. WYATT, National performance in basic research, Nature, 323 (23 October 1986) 681; and P. M. D. COLLINS, M. HART, D. M. HICKS, UK performance in basic solid state physics, Physics in Technology, 18 (1987) 72.

  17. See D. HICKS, Beyond serendipity: factors affecting performance in condensed matter physics, paper presented at the 4S Annual Meeting, Worcester, MA (19–22 November 1987).

  18. For further details, see F. GIBB, J. A. PARRISH, ABRC Bibliometric Profiles Project, (mimeo), Department of Information Science, University of Strathclyde, Glasgow, 1986.

  19. There were in fact several stages of iteration in compiling this list, involving both Strathclyde and CHI — see ibid.

  20. P. HACKNEY, Profile Information Presentation System (PIPS), Science Policy Research Unit, University of Sussex, Brighton, 1987.

  21. Even with very large samples of papers, there are at least two reasons why one would not expect perfect correlation. In the influence weight procedure, papers are fractionated, and citations are weighted according to the influence of the citing journal. Neither of these technical adjustments was undertaken in the manual citation analysis (except in the case of a number of experimental high-energy physics papers from one university where the number of collaborating institutions varied between 7 and 19, and where anomalous citation results would have been obtained if those papers had not been fractionated). For details, see B. R. MARTIN, J. IRVINE, Bibliometric Profiles of UK Universities and Research Institutions, a report by the Science Policy Research Unit to the Advisory Board for the Research Councils, London, 1987. (A sketch of these two adjustments is given after these notes.)

  22. See, for example, NARIN and CARPENTER, op. cit., note 12, pp. 12–14.

  23. All these correlations are appreciably lower than those recorded for equivalent data in US universities — see R. ANDERSON, F. NARIN, P. MCALLISTER, Publication ratings vs peer rankings of universities, Journal of the American Society for Information Science, 29 (1978) 91. One possible interpretation is that the UGC rankings are less reliable than the Roose-Anderson ratings of US universities.

  24. For example, SIR GEOFFREY WILKINSON (see N. HALL, Nobel chemist calls for drastic closures, Times Higher Education Supplement (31 October 1986), p. 1); CHRISTOPHER BALL (see O. SURRIDGE, Ball calls for research elite, Times Higher Education Supplement (6 March 1987), p. 3); and the UGC Earth Sciences Review Committee (see UNIVERSITY GRANTS COMMITTEE, Strengthening University Earth Sciences: Report of the Earth Sciences Review, UGC, London, 1987).
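
As an illustration of the iterative influence-weight computation mentioned in note 14, the following minimal Python sketch derives weights for a small, hypothetical closed set of three journals by power iteration, and checks the result against a direct eigenvector solution. The citation matrix, the journal set, and the normalisation convention are invented for illustration; this is not CHI's data or code.

    import numpy as np

    # Hypothetical citation matrix for a closed set of three journals:
    # C[j, k] = number of citations given by journal j to journal k.
    C = np.array([[20.0,  6.0,  4.0],
                  [ 8.0, 30.0,  2.0],
                  [ 3.0,  5.0, 12.0]])

    S = C.sum(axis=1)   # S[k] = total references issued by journal k
    M = C / S           # M[j, k] = C[j, k] / S[k] (each column k divided by S[k])

    # Iterative route: power iteration for the dominant left eigenvector, w = wM.
    w = np.ones(3)
    for _ in range(200):
        w = w @ M
        w /= w.sum()

    # Direct route, for comparison: eigenvectors of the transposed matrix.
    vals, vecs = np.linalg.eig(M.T)
    w_direct = np.real(vecs[:, np.argmax(np.real(vals))])
    w_direct /= w_direct.sum()

    # Normalise so the size-weighted mean influence is 1 (one common convention).
    w *= S.sum() / (w @ S)
    w_direct *= S.sum() / (w_direct @ S)

    print(np.round(w, 4))         # iterative result
    print(np.round(w_direct, 4))  # direct result (should agree)

For a closed journal set the dominant eigenvalue is 1, so the iteration is stable; for large journal sets the iterative route also avoids assembling and solving the full simultaneous system, which is the point made in note 14.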
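
The fractionation and citation-weighting adjustments described in note 21 can likewise be sketched in a few lines. The paper records, institution names, and citing-journal weights below are invented for illustration: each paper's citations are weighted by the influence of the citing journal, and the resulting credit is divided equally among the collaborating institutions.

    from collections import defaultdict

    # Hypothetical records: (collaborating institutions,
    #                        influence weights of the journals citing the paper).
    papers = [
        (["Univ A", "Univ B"],           [1.3, 0.9, 1.1]),
        (["Univ A"],                     [0.8, 1.6]),
        (["Univ B", "Univ C", "Univ A"], [1.0]),
    ]

    scores = defaultdict(float)
    for institutions, citing_weights in papers:
        # Influence-weighted citation count, fractionated across institutions.
        credit = sum(citing_weights) / len(institutions)
        for inst in institutions:
            scores[inst] += credit

    for inst, score in sorted(scores.items()):
        print(f"{inst}: {score:.2f}")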

Additional information

The findings and conclusions presented are those of the authors alone and do not necessarily represent the views of their institutions or the Advisory Board for the Research Councils. Correspondence concerning the paper should be addressed to Martin at SPRU.

Cite this article

Carpenter, M.P., Gibb, F., Harris, M. et al. Bibliometric profiles for British academic institutions: An experiment to develop research output indicators. Scientometrics 14, 213–233 (1988). https://doi.org/10.1007/BF02020076
