
Research impact quantification


Abstract

The development of methods for the quantification of research impact has taken a variety of forms: the impact of research outputs on other research, through various forms of citation analysis; the impact of research on technology, through patent-derived data; the economic impact of research projects and programs, through a variety of cost-benefit analyses; the impact of research on company performance, where no relationship with profit has been found, but a strong positive correlation with sales growth has been established; and calculations of the rate of social return on the investment in research.

However, each of these approaches, which have had varying degrees of success, is being challenged by a substantial revision in the understanding of the ways in which research interacts with, and contributes to, other human activities. First, advances in the sociology of scientific knowledge have revealed the complex negotiation processes involved in the establishment of research outcomes and their meanings; in this process, citation is little more than a peripheral formalisation. Second, the demonstration of the limitations of neo-classical economics in explaining the role of knowledge in the generation of wealth, and of the importance of learning processes and interaction in innovation within organisations, has finally overturned the linear model on which so many research impact assessments have been based. A wider examination of the political economy of research evaluation itself reveals the growth of a strong movement towards managerialism, with the application of a variety of mechanisms (foresight, priority setting, research evaluation, research planning) to improve the efficiency of this component of economic activity. However, there are grounds for questioning whether the resulting efficiency gains have, in fact, improved overall performance. A variety of mechanisms are currently being experimented with in a number of countries that provide both the desired accountability and direction for research, but rely less on the precision of measures and more on promoting a research environment conducive to interaction, invention, and connection.


References

  1. For an overview, see OTA, Research Funding as an Investment: Can We Measure the Returns?, US Congress, 1989.

  2. For example, A. F. J. van Raan et al., Science and Technology Indicators, DSWO Press, 1989; H. F. Moed, The Use of Bibliometric Indicators for the Assessment of Research Performance, DSWO Press, 1989.

  3. P. Bourke, L. Butler, ‘Science in our universities: What's done where?’, The Australian, 8 March 1995.

  4. For example, H. Grupp (Ed.), Problems of Measuring Technological Change, Verlag, 1987, and subsequent reports.

  5. For example, R. T. Prinsley, A Review of Research and Development Evaluation, AGPS, Canberra, 1993.


  6. Industry Commission, Research and Development, Canberra, 1994.

  7. G. K. Morbey, R & D expenditures and profit growth, Research Technology Management, 28 (1989) 20.


  8. E. Mansfield, Social returns from R & D, Research Technology Management, Vol. 30, 1991; a recent critical review of the literature is provided in Industry Commission, op. cit., ref. 6, Vol. 3.

  9. S. Cozzens et al., Methods for Evaluating Fundamental Science, RAND, Office of Science and Technology Policy, Washington, 1994.


  10. D. E. Stokes, ‘The Impaired Dialog between Science and Government and What Might Be Done About It’, delivered to Nineteenth Annual AAAS Colloquium on Science and Technology Policy, Washington, April, 1994, p. 2.

  11. For example, the Industry Commission offers the following definitions: “non-rivalry — it can be made available to a number of users simultaneously, at no extra cost to the supplier; and non-excludability — users cannot be denied access to it”, p. A98.

  12. D. J. Teece, Technological change and the nature of the firm, In: G. Dosi et al. (Eds.), Technical Change and Economic Theory, Macmillan, 1988.

  13. M. Callon, Is science a public good?, Science, Technology & Human Values, 19 (1994) 407.


  14. Key references are W. B. Arthur, Competing technologies, increasing returns and lock-in by historical events, Economic Journal, 99 (1989) 116–131, and P. A. David, Clio and the economics of QWERTY, American Economic Review, 75 (1985) 332–337.


  15. M. Callon, op. cit., ref. 13, p. 408.

  16. For example, C. Freeman, L. Soete, New Explorations in the Economics of Technological Change, Pinter, 1990; R. Nelson, Understanding Technological Change as an Evolutionary Process, North Holland, 1987.

  17. OECD, ‘Interactions in Knowledge Systems; Foundation, Policy Implications and Empirical Methods’, DSTI/STP/TIP(94)15, Paris, 1994, p. 4.

  18. J. Fagerberg, International competitiveness, Economic Journal, 98 (1988) 355–374.


  19. J. Ziman, Prometheus Bound: Science in a Dynamic Steady State, Cambridge University Press, 1994.

  20. R. Johnston, Strategic policy for science, In: S. Cozzens et al. (Eds.), The Research System in Transition, Kluwer Press, 1990.

  21. J. Ziman, op. cit., ref. 19, p. 251.

  22. Ibid., p. 263.


Johnston, R. Research impact quantification. Scientometrics 34, 415–426 (1995). https://doi.org/10.1007/BF02018009
