
Federal research impact assessment: Axioms, approaches, applications


IV. Bibliography

  1. Brown, G. E., “Report of the Task Force on the Health of Research,” Chairman's Report to the Committee on Science, Space, and Technology, U.S. House of Representatives, No. 56-819, U.S. Government Printing Office, Washington, DC, 1992.

  2. NAS, “The Government Role in Civilian Technology: Building a New Alliance,” Committee on Science, Engineering, and Public Policy, National Academy of Sciences, National Academy Press, 1992.

  3. Carnegie, “Enabling the Future: Linking Science and Technology to Societal Goals,” Carnegie Commission on Science, Technology, and Government, New York, NY, 1992.

  4. OTA, “Federally Funded Research: Decisions for a Decade,” U.S. Congress, Office of Technology Assessment, OTA-SET-490, U.S. GPO, Washington, DC, May 1991.

  5. OTA, “The Defense Technology Base: Introduction and Overview,” U.S. Congress, Office of Technology Assessment, OTA-ISC-374, March 1988; and “Holding the Edge: Maintaining the Defense Technology Base,” OTA-ISC-420, U.S. GPO, Washington, DC, April 1989.

  6. Narin, F., “The Impact of Different Modes of Research Funding,” in: Evered, D., Harnett, S. (Eds.), The Evaluation of Scientific Research, John Wiley and Sons, Chichester, UK, 1989.

  7. Robb, W. L., “Evaluating Industrial R&D,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  8. Nelson, K. S., Tomsyck, J. P., Sorensen, D. P., “Industrial R&D Program Evaluation Techniques,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  9. Salasin, J., et al., “The Evaluation of Federal Research Programs,” MITRE Technical Report MTR-80W123, June 1980.

  10. Logsdon, J. M., Rubin, C. B., “An Overview of Federal Research Evaluation Activities,” Report, The George Washington University, Washington, DC, April 1985. See also Logsdon, J. M., Rubin, C. B., Federal Research Evaluation Activities, Abt Associates, Cambridge, MA, 1985.

  11. Chubin, D. E., Hackett, E. J., Peerless Science: Peer Review and U.S. Science Policy, State University of New York Press, Albany, NY, 1990.

  12. Chubin, D. E., “Grants Peer Review in Theory and Practice,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  13. Kostoff, R. N., “Evaluating Federal R&D in the U.S.,” in: Bozeman, B., Melkers, J. (Eds.), Assessing R&D Impacts: Method and Practice, Kluwer Academic Publishers, Norwell, MA, 1993.

  14. Kostoff, R. N., “Quantitative/Qualitative Federal Research Impact Evaluation Practices,” Technological Forecasting and Social Change, 45:2, February 1994.

  15. Kostoff, R. N., “Research Impact Assessment: Federal Peer Review Practices,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  16. Barker, K., “The ‘British Model’ — Evaluation by Professionals,” in: Laredo, P., Mustar, P. (Eds.), EC Handbook on Evaluation, 1992.

  17. Cicchetti, D. V., “The Reliability of Peer Review for Manuscript and Grant Submissions: A Cross-Disciplinary Investigation,” Behavioral and Brain Sciences, 14:1, 1991.

  18. Cole, S., Rubin, L., Cole, J., “Peer Review in the National Science Foundation: Phase One of a Study,” National Research Council, 1978, NTIS Acc. No. PB83-192161.

  19. Cole, J., Cole, S., “Peer Review in the National Science Foundation: Phase Two of a Study,” National Research Council, 1981, NTIS Acc. No. PB82-182130.

  20. Cole, S., Cole, J., Simon, G., “Chance and Consensus in Peer Review,” Science, 214, November 1981.

  21. Cozzens, S. E., “Expert Review in Evaluating Programs,” Science and Public Policy, 14:2, April 1987.

  22. DOD, “The Department of Defense Report on the Merit Review Process for Competitive Selection of University Research Projects and an Analysis of the Potential for Expanding the Geographic Distribution of Research,” April 1987, DTIC Acc. No. 88419044.

  23. DOE, “An Assessment of the Basic Energy Sciences Program,” Office of Energy Research, Office of Program Analysis, Report No. DOE/ER-0123, March 1982.

  24. DOE, “Procedures for Peer Review Assessments,” Office of Energy Research, Office of Program Analysis, Report No. DOE/ST-0007P, revised January 1993.

  25. Frazier, S. P., “University Funding: Information on the Role of Peer Review at NSF and NIH,” U.S. General Accounting Office Report No. GAO/RCED-87-87FS, March 1987.

  26. Kostoff, R. N., “Evaluation of Proposed and Existing Accelerated Research Programs by the Office of Naval Research,” IEEE Transactions on Engineering Management, 35:4, November 1988.

  27. Ormala, E., “Nordic Experiences of the Evaluation of Technical Research and Development,” Research Policy, 18, 1989.

  28. OTA, “Research Funding as an Investment: Can We Measure the Returns?”, U.S. Congress, Office of Technology Assessment, OTA-TM-SET-36, U.S. GPO, Washington, DC, April 1986.

  29. Nicholson, R. S., “Improving Research Through Peer Review,” National Research Council, 1987, NTIS Acc. No. PB88-163571.

  30. DOE, “An Evaluation of Alternate Magnetic Fusion Concepts 1977,” DOE/ET-0047, May 1978.

  31. NIST, “Annual Report, 1990,” Visiting Committee on Advanced Technology, January 1991.

  32. Ormala, E., “Impact Assessment: European Experience of Qualitative Methods and Practices,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  33. Roy, R., “Funding Science: The Real Defects of Peer Review and an Alternative to It,” Science, Technology, and Human Values, 10:3, 1985.

  34. King, J., “A Review of Bibliometric and Other Science Indicators and Their Role in Research Evaluation,” Journal of Information Science, 13, 1987.

  35. Kruytbosch, C., “The Role and Effectiveness of Peer Review,” in: Evered, D., Harnett, S. (Eds.), The Evaluation of Scientific Research, John Wiley and Sons, Chichester, UK, 1989.

  36. Bornstein, R. F., “The Predictive Validity of Peer Review: A Neglected Issue,” Behavioral and Brain Sciences, 14:1, 1991.

  37. Bornstein, R. F., “Manuscript Review in Psychology: Psychometrics, Demand Characteristics, and an Alternative Model,” Journal of Mind and Behaviour, 12, 1991.

  38. Narin, F., Olivastro, D., Stevens, K. A., “Bibliometrics – Theory, Practice, and Problems,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  39. Mansfield, E., “Academic Research and Industrial Innovation,” Research Policy, 20, 1991.

  40. Kostoff, R. N., “Semi-Quantitative Methods for Research Impact Assessment,” Technological Forecasting and Social Change, 44:3, November 1993.

  41. Kingsley, G., “The Use of Case Studies in R&D Impact Evaluation,” in: Bozeman, B., Melkers, J. (Eds.), Assessing R&D Impacts: Method and Practice, Kluwer Academic Publishers, Norwell, MA, 1993.

  42. DOD, Project Hindsight, Office of the Director of Defense Research and Engineering, Washington, DC, DTIC No. AD495905, October 1969.

  43. IITRI, “Technology in Retrospect and Critical Events in Science,” Illinois Institute of Technology Research Institute Report, December 1968.

  44. Battelle, “Interactions of Science and Technology in the Innovative Process: Some Case Studies,” Final Report, prepared for the National Science Foundation, Contract NSF-C 667, Battelle Columbus Laboratories, March 19, 1973.

  45. IDA, “DARPA Technical Accomplishments,” Institute for Defense Analyses: Volume I, IDA Paper P-2192, February 1990; Volume II, IDA Paper P-2429, April 1991; Volume III, IDA Paper P-2538, July 1991.

  46. DOE, “Health and Environmental Research: Summary of Accomplishments,” Office of Energy Research, Office of Program Analysis, Report No. DOE/ER-0194, May 1983.

  47. DOE, “Health and Environmental Research: Summary of Accomplishments,” Office of Energy Research, Office of Program Analysis, Report No. DOE/ER-0275, August 1986.

  48. Kostoff, R. N., “Research Impact Quantification,” R&D Management, 24:3, July 1994.

  49. Australia, “Research Performance Indicators Survey,” National Board of Employment, Education and Training, Commissioned Report No. 21, Australian Government Publishing Service, Canberra, Australia, January 1993.

  50. Braun, T., Glänzel, W., Schubert, A., “An Alternative Quantitative Approach to the Assessment of National Performance in Basic Research,” in: Evered, D., Harnett, S. (Eds.), The Evaluation of Scientific Research, John Wiley and Sons, Chichester, UK, 1989.

  51. Braun, T., et al., “Publication Productivity: From Frequency Distribution to Scientometric Indicators,” Journal of Information Science, 16, 1990.

  52. Braun, T., et al., “Scientometric Indicators Datafiles,” Scientometrics, 28:2, 1993.

  53. Schubert, A., Braun, T., “Relative Indicators and Relational Charts for Comparative Assessment of Publication Output and Citation Impact,” Scientometrics, 9:5–6, 1986.

  54. Braun, T., Schubert, A., “Scientometric versus Socio-Economic Indicators: Scatter Plots for 51 Countries: 1978–1980,” Scientometrics, 13:1–2, 1987.

  55. Braun, T., Schubert, A., “The Landscape of National Performances in the Sciences, 1981–1985,” Scientometrics, 20:1, 1991.

  56. Schubert, A., Braun, T., “Three Scientometric Etudes on Developing Countries as a Tribute to Michael Moravcsik,” Scientometrics, 23:1, 1992.

  57. Oberski, J. E. J., “Some Statistical Aspects of Co-citation Cluster Analysis and a Judgement by Physicists,” in: Van Raan, A. F. J. (Ed.), Handbook of Quantitative Studies of Science and Technology, North Holland, 1988.

  58. White, H. D., McCain, K. W., “Bibliometrics,” in: Williams, M. E. (Ed.), Annual Review of Information Science and Technology, 24, 1989.

  59. Narin, F., “Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity” (monograph), National Science Foundation, Contract NSF C-627, NTIS Acc. No. PB252339/AS, March 31, 1976.

  60. Hicks, D., Martin, B., Irvine, J., “Bibliometric Techniques for Monitoring Performance in Technologically Oriented Research: The Case of Integrated Optics,” R&D Management, 16:3, 1986.

  61. NSF, “Science and Engineering Indicators - 1989,” National Science Board Report NSB 89-1, U.S. GPO, Washington, DC, 1989.

  62. Martin, B. R., et al., “Recent Trends in the Output and Impact of British Science,” Science and Public Policy, 17:1, February 1990.

  63. Frame, J. D., “Quantitative Indicators for Evaluation of Basic Research Programs/Projects,” IEEE Transactions on Engineering Management, EM-30:3, August 1983.

  64. McAllister, P. R., Narin, F., Corrigan, J. G., “Programmatic Evaluation and Comparison Based on Standardized Citation Scores,” IEEE Transactions on Engineering Management, EM-30:4, November 1983.

  65. Mullins, N., “Evaluating Research Programs: Measurement and Data Sources,” Science and Public Policy, 14:2, April 1987.

  66. Mullins, N., Snizek, W., Oehler, K., “The Structural Analysis of a Scientific Paper,” in: Van Raan, A. F. J. (Ed.), Handbook of Quantitative Studies of Science and Technology, North Holland, 1988.

  67. Moed, H. F., Van Raan, A. F. J., “Indicators of Research Performance: Applications in University Research Policy,” in: Van Raan, A. F. J. (Ed.), Handbook of Quantitative Studies of Science and Technology, North Holland, 1988.

  68. Irvine, J., “Evaluation of Scientific Institutions: Lessons from a Bibliometric Study of UK Technical Universities,” in: Evered, D., Harnett, S. (Eds.), The Evaluation of Scientific Research, John Wiley and Sons, Chichester, UK, 1989.

  69. Van Raan, A. F. J., “Evaluation of Research Groups,” in: Evered, D., Harnett, S. (Eds.), The Evaluation of Scientific Research, John Wiley and Sons, Chichester, UK, 1989.

  70. Luukkonen, T., “Bibliometrics and Evaluation of Research Performance,” Annals of Medicine, 22:3, 1990.

  71. Luukkonen, T., Stahle, B., “Quality Evaluations in the Management of Basic and Applied Research,” Research Policy, 19, 1990.

  72. Luukkonen, T., Persson, O., Sivertsen, G., “Understanding Patterns of International Scientific Collaboration,” Science, Technology, and Human Values, 17:1, January 1992.

  73. Narin, F., “Bibliometric Techniques in the Evaluation of Research Programs,” Science and Public Policy, 14:2, April 1987.

  74. Carpenter, M. P., Narin, F., “Validation Study: Patent Citations as Indicators of Science and Foreign Dependence,” World Patent Information, 5:3, 1983.

  75. Narin, F., Carpenter, M. P., Woolf, P., “Technological Performance Assessments Based on Patents and Patent Citations,” IEEE Transactions on Engineering Management, EM-31:4, November 1984.

  76. Wallmark, J. T., Sedig, K. G., “Quality of Research Measured by Citation Method and by Peer Review — A Comparison,” IEEE Transactions on Engineering Management, EM-33:4, November 1986.

  77. Collins, P., Wyatt, S., “Citations in Patents to the Basic Research Literature,” Research Policy, 17, 1988.

  78. Narin, F., Olivastro, D., “Technology Indicators Based on Patents and Patent Citations,” in: Van Raan, A. F. J. (Ed.), Handbook of Quantitative Studies of Science and Technology, Elsevier Science Publishers, Amsterdam, 1988.

  79. Van Vianen, B. G., Moed, H. F., Van Raan, A. F. J., “An Exploration of the Science Base of Recent Technology,” Research Policy, 19, 1990.

  80. Narin, F., Olivastro, D., “Status Report: Linkage Between Technology and Science,” Research Policy, 21:3, June 1992.

  81. Carpenter, M. P., Cooper, M., Narin, F., “Linkage Between Basic Research Literature and Patents,” Research Management, 13:2, March 1980.

  82. Narin, F., Noma, E., Perry, R., “Patents as Indicators of Corporate Technological Strength,” Research Policy, 16, 1987.

  83. Narin, F., “Technological Evaluation of Industrial Firms by Means of Patent Investigation,” presented at the VPP Professional Meeting, Nürnberg, Germany, November 13, 1992.

  84. Miller, R., “The Influence of Primary Task on R&D Laboratory Evaluation: A Comparative Bibliometric Analysis,” R&D Management, 22:1, 1992.

  85. Schubert, A., Braun, T., “Reference Standards for Citation Based Assessments,” Scientometrics, 26:1, 1993.

  86. Kostoff, R. N., “Research Impact Assessment,” Proceedings: Third International Conference on Management of Technology, Miami, FL, February 17–21, 1992. Larger text available from the author.

  87. Kostoff, R. N., “Co-Word Analysis,” in: Bozeman, B., Melkers, J. (Eds.), Assessing R&D Impacts: Method and Practice, Kluwer Academic Publishers, Norwell, MA, 1993.

  88. Kostoff, R. N., “Database Tomography: Origins and Applications,” Competitive Intelligence Review, Special Issue on Technology, 5:1, Spring 1994.

  89. Tijssen, R., Van Raan, A., “Mapping Changes in Science and Technology,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  90. Georghiou, L., Giusti, W. L., Cameron, H. M., Gibbons, M., “The Use of Co-nomination Analysis in the Evaluation of Collaborative Research,” in: Van Raan, A. F. J. (Ed.), Handbook of Quantitative Studies of Science and Technology, North Holland, 1988.

  91. Engelsman, E. C., Van Raan, A. F. J., “Mapping of Technology: A First Exploration of Knowledge Diffusion Amongst Fields of Technology,” Research Report to the Ministry of Economic Affairs, CWTS-91-02, Centre for Science and Technology Studies, Leiden, March 1991.

  92. Averch, H., “Economic Approaches to the Evaluation of Research,” in: Kostoff, R. N. (Ed.), Evaluation Review, Special Issue on Research Impact Assessment, 18:1, February 1994.

  93. Link, A., “Methods for Evaluating the Return on R&D Investments,” in: Bozeman, B., Melkers, J. (Eds.), Assessing R&D Impacts: Method and Practice, Kluwer Academic Publishers, Norwell, MA, 1993.

  94. Roessner, J. D., “Use of Quantitative Methods to Support Research Decisions in Business and Government,” in: Bozeman, B., Melkers, J. (Eds.), Assessing R&D Impacts: Method and Practice, Kluwer Academic Publishers, Norwell, MA, 1993.

  95. Kostoff, R. N., “A Cost/Benefit Analysis of Commercial Fusion-Fission Hybrid Reactor Development,” Journal of Fusion Energy, 3:2, 1983.

  96. Mansfield, E., “Basic Research and Productivity Increase in Manufacturing,” The American Economic Review, 70:5, December 1980.

  97. Terleckyj, N., State of Science and Research: Some New Indicators, Westview Press, Boulder, CO, 1977.

  98. Terleckyj, N., “Measuring Economic Effects of Federal R&D Expenditures: Recent History with Special Emphasis on Federal R&D Performed in Industry,” presented at the NAS Workshop on ‘The Federal Role in Research and Development’, November 1985.

  99. Griliches, Z., “Issues in Assessing the Contribution of Research and Development to Productivity Growth,” The Bell Journal of Economics, 10, Spring 1979.

  100. Griliches, Z., “Productivity, R&D, and the Data Constraint,” The American Economic Review, 84:1, March 1994.

  101. Averch, H., “Measuring the Cost-Efficiency of Basic Research Investment: Input-Output Approaches,” Journal of Policy Analysis and Management, 6:3, 1987.

  102. Averch, H., “Exploring the Cost-Efficiency of Basic Research Funding in Chemistry,” Research Policy, 19, 1989.

  103. Odeyale, C. O., Knowledge-Based Systems: Knowledge Representation and Inference Strategies of Effective and Unbiased Military Biomedical and R&D Management, Ph.D. Thesis, Walden University, 1993.

  104. Odeyale, C. O., Kostoff, R. N., “R&D Management Expert Networks: I. Knowledge Representation and Inference Strategies,” HEURISTICS, The Journal of Knowledge Engineering and Technology, 7:1, 1994.

  105. Odeyale, C. O., Kostoff, R. N., “R&D Management Expert Networks: II. Prototype Construction and Validation,” HEURISTICS, The Journal of Knowledge Engineering and Technology, 7:1, 1994.


The views in this paper are solely those of the author and do not represent the views of the Department of the Navy.

Kostoff, R.N. Federal research impact assessment: Axioms, approaches, applications. Scientometrics 34, 163–206 (1995). https://doi.org/10.1007/BF02020420
