Volume 41, Issue 1–2, pp 5–16

New bibliometric techniques for the evaluation of medical schools

  • G. Lewison


Bibliometrics have been used in novel ways to assist with the evaluation of two medical schools, one in England and one in Sweden. The first evaluation was intended to allow the relative strengths in 26 subfields of five component campuses to be estimated. Selective filters for each subfield were defined, many of them with the help of the school's research staff, so that relevant papers could be retrieved from a database on the basis of their title keywords and specialist journals. The campus outputs were then analysed by the research level of the journals (clinical/basic) and their influence. In the second evaluation, nine different indicators of research output were produced so that the school could be compared with four others in Scandinavia. The indicators included measures of output, co-authorship, journal esteem and citations by papers and by patents.
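The subfield filters described above can be sketched in a few lines of code. This is a minimal illustration, not the study's actual method: the subfield name, keywords and journal titles below are invented for the example, whereas the paper's real filters were defined for 26 subfields with the help of the school's research staff. A paper is retained if its title contains one of the subfield's keywords or if it appeared in one of the subfield's specialist journals.

```python
# Hypothetical subfield filter: keyword and journal lists are invented
# examples, not those used in the evaluation described in the paper.
SUBFIELD_FILTERS = {
    "cardiology": {
        "keywords": {"cardiac", "myocardial", "coronary"},
        "journals": {"Circulation", "European Heart Journal"},
    },
}

def matches_subfield(title: str, journal: str, subfield: str) -> bool:
    """Return True if the paper is retrieved by the subfield's filter,
    i.e. its title contains a subfield keyword or it was published in
    one of the subfield's specialist journals."""
    f = SUBFIELD_FILTERS[subfield]
    title_lower = title.lower()
    return journal in f["journals"] or any(k in title_lower for k in f["keywords"])

papers = [
    {"title": "Myocardial infarction outcomes", "journal": "BMJ"},
    {"title": "Gene expression in yeast", "journal": "Cell"},
]
hits = [p for p in papers if matches_subfield(p["title"], p["journal"], "cardiology")]
print(len(hits))  # 1
```

In practice such filters trade precision against recall: a title-keyword match can retrieve off-topic papers, which is why the paper notes the filters were refined with the research staff's help.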







Copyright information

© Akadémiai Kiadó 1998

Authors and Affiliations

  • G. Lewison
  1. Unit for Policy Research in Science and Medicine (PRISM), The Wellcome Trust, London, England
