The evaluation of plant biomass research: A case study of the problems inherent in bibliometric indicators

Abstract

The aim of this project was to evaluate research groups working in the broad field of plant biomass in areas outside the USA and the EC. The assessment had two key elements: the measurement of scientific productivity and the investigation of factors affecting research performance. Research groups were identified from a range of information sources. Data on funding, information access, staffing, publication policy and degree of awareness of other research groups in the field were collected during interviews. Two approaches, bibliometric analysis and peer review, were examined as means of constructing indicators for assessing research output.

Following a critical review of the use of bibliometric indicators in peripheral countries, the results of a study of eight countries are presented. Neither of the two indicators employed proved a particularly successful method of evaluating research, and this finding is discussed in relation to publication patterns, the nature of the research community and the research field under study. Finally, the use of a “peripatetic expert” was found to have some value as a means of assessment.
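
To make the notion of a bibliometric output indicator concrete, the sketch below computes two generic measures of the kind commonly used in such evaluations: a publication count and mean citations per paper for a research group. This is an illustration only, using hypothetical data and group names; the paper does not state that these were the two indicators it employed.

```python
# Minimal sketch of two generic bibliometric indicators (illustration only,
# hypothetical data; not necessarily the indicators used in this study).
from dataclasses import dataclass
from typing import List


@dataclass
class Paper:
    group: str       # research group identifier (hypothetical)
    citations: int   # citation count recorded in an SCI-style database


def publication_count(papers: List[Paper], group: str) -> int:
    """Output indicator: number of papers attributed to a group."""
    return sum(1 for p in papers if p.group == group)


def mean_citations_per_paper(papers: List[Paper], group: str) -> float:
    """Impact indicator: average citations per paper for a group."""
    cites = [p.citations for p in papers if p.group == group]
    return sum(cites) / len(cites) if cites else 0.0


# Hypothetical records for two groups, "A" and "B".
records = [Paper("A", 3), Paper("A", 0), Paper("A", 7), Paper("B", 12)]
print(publication_count(records, "A"))         # 3 papers
print(mean_citations_per_paper(records, "A"))  # about 3.33 citations per paper
```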

Notes and references

  1. D.M. Hicks, Beyond Serendipity: Factors Affecting Research Performance in Condensed Matter Physics, D.Phil. Thesis, University of Sussex, UK, 1989.

  2. A.J. Nederhof, The validity and reliability of evaluation of scholarly performance. In: Handbook of Quantitative Studies of Science and Technology, A.F.J. Van Raan (Ed.), Elsevier Science Publishers B.V. (North Holland), 1988, p. 193.

  3. J. Irvine, B.R. Martin, Assessing basic research: the case of the Isaac Newton telescope, Social Studies of Science, 13 (1983) 49–86.

  4. H. Morita-Lou (Ed.), Science and Technology Indicators for Development, United Nations Science and Technology for Development Series, Westview Press, Boulder and London, 1985, p. 13.

  5. For a review of 24 studies involving bibliometric and non-bibliometric indicators, see F. Narin, Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity, Report in fulfillment of contract number NSF C-627, Computer Horizons Inc., Cherry Hill, New Jersey, US, 1976.

  6. J. Irvine, B.R. Martin, Assessing basic research: some partial indicators of scientific progress in radio astronomy, Research Policy, 12 (1983) 61–90.

  7. J. Irvine, B.R. Martin, CERN: Past performance and future prospects, Research Policy, 13 (1984) 247–284.

  8. Op. cit. note 3, 49–86.

  9. The Institute for Scientific Information (ISI) in Philadelphia scans 3500 of the world's leading scientific journals (ISI-recognized journals) and publishes the SCI, which contains citation data for papers referred to in ISI-recognized journals.

  10. M.J. Moravcsik, The Scientist, 11 (1987).

  11. M.J. Moravcsik, Strengthening the Coverage of Third World Science, final report of the Philadelphia workshop, July 1988, p. 10.

  12. S. Arunachalam, Citation Counts as Indicators of the Science and Technology Capacity of Third World Nations (mimeo). Paper presented at the Annual Meeting of the American Association for the Advancement of Science, 26–31 May 1985, Los Angeles, US.

  13. D. Hicks, B.R. Martin, J. Irvine, Bibliometric techniques for monitoring performance in technologically oriented research: the case of integrated optics, R & D Management, 16 (1986) 211–223.

  14. V. Cano, F.E. Burke, Publication Patterns in Mexican Science (mimeo), 1986.

  15. J.D. Frame, Measuring scientific activity in lesser developed countries, Scientometrics, 2 (1980) 133–135.

  16. L. Velho, J. Krige, Publication and citation practices of Brazilian agricultural scientists, Social Studies of Science, 14 (1984) 45–62.

  17. L. Velho, The meaning of citation in the context of a scientifically peripheral country, Scientometrics, 9 (1986) 71–89.

  18. Op. cit. note 17, 71–89.

  19. C.M. Castro, Ciência e Universidade, Jorge Zahar Publ., Rio de Janeiro, 1985, p. 72.

  20. F. Spagnolo, Brazilian scientists' publications and mainstream science: some policy implications, Scientometrics, 18 (1990) 205–218.

  21. L. Velho, The author and the beholder: how paradigm commitments can influence the interpretation of research results, Scientometrics, 11 (1987) 59–70.

  22. M.J. Moravcsik, In the beholder's eye: a possible reinterpretation of Velho's results on Brazilian agricultural research, Scientometrics, 11 (1987) 53–57.

  23. Y. Yuthavong, Bibliometric indicators of scientific activity in Thailand, Scientometrics, 9 (1986) 139–143.

  24. K.C. Garg, M.K.D. Rao, Bibliometric analysis of scientific productivity: a case study of an Indian physics laboratory, Scientometrics, 13 (1988) 261–269.

  25. F.W. Lancaster, M.A. Porta, K. Plagenz, K. Szymborski, M. Krebs, Factors influencing sources cited by scientists: a case study for Cuba, Scientometrics, 10 (1986) 243–257.

  26. Y.M. Rabkin, H. Inhaber, Science on the periphery: a citation study of three less developed countries, Scientometrics, 1 (1979) 261–274.

  27. Y.M. Rabkin, T.O. Eisemon, J.J. Lafitte-Hoossat, E. McLean Rathgeber, Citation visibility of Africa's science, Social Studies of Science, 9 (1979) 499–506.

  28. S. Arunachalam, K.C. Garg, A small country in a world of big science, Scientometrics, 8 (1986) 301–313.

  29. IMF, International Financial Statistics, XLII (4), Washington, US, 1988.

  30. V. Cano, F.E. Burke, Publication Patterns in Mexican Science (mimeo), 1986.

  31. Op. cit. note 15.

  32. Op. cit. note 17, 71–89.

  33. C.H. Henderson, Personal communication.

  34. C. De Moura Castro, F. Spagnolo, cited in: L. Velho, see note 17.

  35. J.A. Large, The Foreign Language Barrier: Problems in Scientific Communication, Deutsch, London, UK, 1983, p. 35.

  36. TERI is the Tata Energy Research Institute, New Delhi, India.

  37. Op. cit. note 17, 71–89.

  38. A. Schubert, W. Glänzel, T. Braun, World flash on basic research: scientometric datafiles, Scientometrics, 16 (1989) 3–478.

Cite this article

Thomas, S.M. The evaluation of plant biomass research: A case study of the problems inherent in bibliometric indicators. Scientometrics 23, 149–167 (1992). https://doi.org/10.1007/BF02020920
