
Scientific publications at U.S. federal research laboratories


Abstract

In this paper, we focus on scientific publications as an innovative output from the research efforts at U.S. federal laboratories. The data used relate to Federally Funded Research and Development Centers (FFRDCs). The relationship between R&D expenditures at these federal laboratories and their peer-reviewed scientific publications allows us to make inferences about the return to public-sector R&D. We examine two complementary statistical models. From the first model, we find that a 10% increase in constant-dollar public-sector R&D is associated with a 15.5% to 21.5% increase in scientific publications. From the second model, we find that the annual rate of return generated by an additional $1 million of R&D-based knowledge stock varies across the FFRDCs, averaging about 93 additional scientific publications, with the statistically significant values ranging from about 1 to about 400 additional scientific publications.


Notes

  1. The original version of The President’s Management Agenda is undated; however, online it is referred to as the 2018 version. There have been two updates since the 2018 version.

  2. See Link, Morris, and Van Hasselt (2019), Link and Van Hasselt (2019), and Link and Oliver (2020). Overall, these publications report that a 10% increase in federal agency per capita R&D is associated with more than a 10% increase in new patent applications by the funding agencies.

  3. Link and Scott (2019, 2020) have studied scientific publications at one U.S. federal laboratory, the National Institute of Standards and Technology (NIST). They found evidence that the return on investments of public-sector R&D has declined in the sense that, as time has progressed, more public-sector R&D is required to produce scientific publications. Relatedly, Audretsch et al. (2019) studied scientific publications resulting from public-sector R&D allocated to private-sector firms through the Small Business Innovation Research (SBIR) program. These authors found that a 10% increase in an SBIR project’s budget corresponds to an 11% increase in the number of scientific papers submitted for publication.

  4. Data on patents, licenses, and CRADAs are readily available at the agency level through the Technology Partnerships Office (TPO) at the National Institute of Standards and Technology (NIST), but federal laboratories have historically been reluctant to make public their data on technology transfer activities. This remains the case even in light of President Obama’s (2011) Memorandum, as documented by Link and Oliver (2020). The Board on Science, Technology, and Economic Policy (STEP) at the National Academies (the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine) recently commissioned a study on Advancing Commercialization from the Federal Laboratories. Its draft report to the U.S. Congress calls for greater transparency by federal laboratories about their technology transfer activities. See https://www.nationalacademies.org/our-work/advancing-commercialization-from-the-federal-laboratories.

  5. President Franklin D. Roosevelt commissioned the report from Bush on November 17, 1944, but Roosevelt passed away on April 12, 1945, before the report was delivered.

  6. The Bayh-Dole Act of 1980 brought about the establishment of technology transfer offices at universities.

  7. Federal laboratories, which are by definition government owned (GO), can be distinguished by the organization that operates the laboratory: they can be government operated (GO) or contractor operated (CO). Thus, federal laboratories are referred to as either GOGO laboratories or GOCO laboratories.

  8. See footnote 4 above.

  9. Prefacing each fiscal year report from the TPO is the statement: “This report fulfills the requirement of Title 15 of the United States Code, Section 3710(g)(2), for an annual report summarizing the use of technology transfer authorities by federal agencies.”

  10. Metrics related to scientific publications are conspicuously absent from the TPO’s Federal Laboratory Technology Transfer Database v.2015; see https://www.nist.gov/tpo/reports-and-publications.

  11. The European Commission (EC) takes a broader interpretation of technology transfer metrics than the United States. The EC prefers the term knowledge transfer metrics. “[Knowledge transfer] KT is about getting research and expertise put to use which, by its nature, is wide-ranging and complex” (European Commission, 2020, p. 7). Regarding the scope of KT indicators, the EC also notes that the primary KT channel is publications and presentations (European Commission, 2020, p. 16).

  12. Data on science and engineering (S&E) publications in a selected number of journals by scientists from FFRDCs are available at the aggregate level, not at the agency level or at the laboratory level. See Science and Engineering Indicators 2018, Appendix Table 5-41. A distinction is made between peer-reviewed scientific publications, which are considered in this paper, and, for example, contractor reports or conference presentations.

  13. U.S. Federal Acquisition Regulation (FAR 2.101) defines an FFRDC in the following manner: “Federally Funded Research and Development Centers (FFRDCs) means activities that are sponsored under a broad charter by a Government agency (or agencies) for the purpose of performing, analyzing, integrating, supporting, and/or managing basic or applied research and/or development, and that receive 70 percent or more of their financial support from the Government.” With reference to footnote 7 above, all FFRDCs are GOCO laboratories.

  14. See, for example, Hall and Ziedonis (2001) and Czarnitzki, Kraft, and Thorwarth (2009).

  15. We thank Michael Gibbons, Research and Development Statistics Program Survey Statistician at the National Science Foundation (NSF), for his electronically shared insight about R&D expenditures allocated to FFRDCs.  We also appreciate information learned about data on federally funded R&D at FFRDCs through electronic correspondence with Matt Hourihan, Director of the R&D Budget and Policy Program at the American Association for the Advancement of Science.

  16. Since the measure of labor inputs is used as an alternative to the R&D expenditures, there is no double counting issue that requires interpretation in their main estimating equations. However, Adams and Griliches (1996), in addition to presenting the model with real R&D and then alternatively with S&Es, report (p. 12668) “for good measure … a third specification that includes both S&Es and real R&D per S&E,” as in specification (3) above in our discussion of the literature studying patent applications. Adams and Griliches (1996, p. 12668) in their study of university publications observe, “When we add R&D per S&E as a separate variable the main effect of S&Es is about the same but there is an additional effect, generally somewhat smaller yet still significant, of per capita R&D. These findings suggest that not all research is financed by grants, but that departments with more generous support per researcher are more productive. More of the research in the smaller programs is being supported by teaching funds, because the S&E input measure is larger in these programs relative to real R&D.” Observe from Eq. (3) that we would expect the per capita R&D effect to be smaller than the estimated effect for the scientific labor term because the effect estimated for the R&D per capita is in fact the output elasticity for R&D expenditures, while the estimated coefficient for the S&E input is the sum of the output elasticity of R&D expenditures and the output elasticity for S&E input.
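A short way to see the last point is the following sketch, which assumes a Cobb-Douglas knowledge production function of the general form used in this literature; the symbols below (P for publications, R for real R&D, L for the S&E input, and the elasticities α and γ) are illustrative rather than the paper’s own notation:

```latex
% Illustrative Cobb-Douglas knowledge production function
P = A\,R^{\alpha} L^{\gamma}
  = A\left(\frac{R}{L}\right)^{\alpha} L^{\alpha+\gamma},
\qquad
\ln P = \ln A + \alpha \ln\!\left(\frac{R}{L}\right) + (\alpha+\gamma)\ln L .
```

Written this way, the coefficient on per capita R&D is the R&D output elasticity α alone, while the coefficient on the S&E input is the sum α + γ, which is why the per capita R&D effect is expected to be the smaller of the two.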

  17. An extension of an analysis of scientific publication counts, which we discuss in “Concluding remarks” section below, is to relax the assumption that scientific publications are homogeneous in terms of their knowledge content. Adjustments for the heterogeneity of scientific publications, possibly through citation analyses, should be considered in future research. Of course, that would require that federal laboratories not only make available their publication counts, but also their listing of the scientific publications. From our experience, U.S. federal laboratories, as we discussed in footnote 4 above, are even reluctant to provide details of scientific publication counts.

  18. RD is converted first to constant dollars, then to natural logarithms, and then the lagged value enters the equation.

  19. Consider the year 2005. Scientific publications are observed by calendar year and would be in the period January 1 to December 31 of 2005. R&D expenditures are for fiscal years, so the R&D for 2005 would be for the period October 1 of 2004 through September 30 of 2005. R&D is already lagged a bit, and then the additional one-year lag is applied in the estimation of the model.

  20. See https://fred.stlouisfed.org/series/A191RD3A086NBEA (2012 = 100).

  21. See Jankowski (1993) for the use of the implicit price deflator for converting current dollar R&D to constant dollar R&D.
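Footnotes 18 through 21 together describe how the R&D series enters the estimation: fiscal-year R&D is first deflated to constant 2012 dollars with the GDP implicit price deflator, then converted to natural logarithms, and then lagged one year. The sketch below illustrates that order of operations under assumed data structures; the DataFrame layout, the column names (ffrdc, year, rd_nominal), and the function name are hypothetical, not part of the paper’s data.

```python
import numpy as np
import pandas as pd

def build_lagged_log_rd(df: pd.DataFrame, deflator: pd.Series) -> pd.DataFrame:
    """df: one row per FFRDC-year with nominal R&D (hypothetical columns
    'ffrdc', 'year', 'rd_nominal'); deflator: GDP implicit price deflator
    indexed by year, 2012 = 100 (FRED series A191RD3A086NBEA)."""
    out = df.sort_values(["ffrdc", "year"]).copy()
    # 1) convert current-dollar R&D to constant (2012) dollars
    out["rd_constant"] = out["rd_nominal"] * 100.0 / out["year"].map(deflator)
    # 2) take natural logarithms of the constant-dollar series
    out["ln_rd"] = np.log(out["rd_constant"])
    # 3) apply the one-year lag within each laboratory before estimation
    out["ln_rd_lag1"] = out.groupby("ffrdc")["ln_rd"].shift(1)
    return out
```

Because the R&D figures are for fiscal years (October through September) while publications are counted by calendar year, the explicit one-year lag in step 3 comes on top of the partial lead already built into the fiscal-year timing noted in footnote 19.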

  22. Adding trend to the specification, we have to drop an additional time dummy because otherwise a linear combination of the column of ones for the intercept and the time dummies equals trend. Thus, in the fully specified models with the year effects, the two specifications (1) and (2) with all of the year effects are identical: the 14 year dummies in specification (1), or trend and 13 year dummies in specification (2), span the same space, and the tests of joint significance are identical.
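The collinearity described in this note can be verified mechanically. The sketch below is illustrative only (the span of years is assumed for the example, not taken from the paper’s estimation sample): with an intercept and a full set of year dummies, appending a linear trend column does not increase the rank of the design matrix.

```python
import numpy as np

years = np.arange(2004, 2019)                                   # one illustrative observation per year
intercept = np.ones(len(years))
dummies = (years[:, None] == years[1:][None, :]).astype(float)  # year dummies for 2005-2018 (2004 in the intercept)
trend = (years - years.min() + 1).astype(float)                 # 1, 2, ..., 15

X = np.column_stack([intercept, dummies])                       # intercept + full set of year dummies
X_with_trend = np.column_stack([X, trend])                      # same matrix with trend appended
# trend = intercept + sum_t (t - 2004) * dummy_t, so the rank does not change
print(np.linalg.matrix_rank(X), np.linalg.matrix_rank(X_with_trend))  # 15 15
```

Dropping one more year dummy when trend is included therefore leaves a specification that spans exactly the same space, which is the point about specifications (1) and (2).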

  23. See National Science Foundation, National Center for Science and Engineering Statistics, FFRDC Research and Development Survey.

  24. See Link and Scott (2012).

  25. See https://www.energy.gov/eere/analysis/downloads/aggregate-economic-return-investment-us-doe-office-energy-efficiency-and.

  26. The variable trend is not included because, with the fully specified model, a specification with trend included is identical to the estimated model. If trend were added, then in addition to dropping the time dummy for 2018 (because we lose those observations from the construction of the dependent variable), for 2003 (because of the variable for lagged R&D over quantity), and for 2004 (because it is left in the intercept), another of the year dummy variables would have to be dropped because otherwise a linear combination of the time dummies and the intercept would equal the variable trend. The two models, one without trend but with year dummy variables for 2005 through 2017, and the other with trend and year dummy variables for 2006 through 2017, are equivalent because the sets of time effects span the same space.

  27. Using the robust standard errors, the F-statistic for the average equal to zero is F(1, 81) = 9.44, with p value = 0.0029. Using the clustered standard errors, the F-statistic is F(1, 7) = 17.61, with p value = 0.0041.

  28. The finding about the rate of return here is two orders of magnitude larger than the rate of return reported in Link and Scott (2019) for the federal laboratory NIST during the NIST era (i.e., since the National Bureau of Standards was reorganized and renamed as the National Institute of Standards and Technology): 7.3 scientific publications for an additional $10 million in knowledge stock, or roughly 0.73 publications per additional $1 million, compared with an average of about 93 publications per additional $1 million here. Perhaps the difference is related to the many channels through which NIST transfers knowledge, or possibly to the different operating structures of the labs. See footnote 7 above. The FFRDCs are GOCO laboratories, while NIST is a GOGO laboratory.

  29. The primary data on R&D are in thousands of dollars. See the note to Table 3.

  30. The finding about the rate of technological change can be compared with the finding in Link and Scott (2020) for the federal laboratory NIST for which a strong negative rate of change in the shift factor was found. While a difference might have been anticipated because of the different operating structures, in fact the results for most laboratories are similar, with one of the two exceptions being statistically insignificant. See footnote 7 above; the FFRDCs in our sample are all GOCO laboratories, while NIST is a GOGO laboratory.

  31. The average is statistically significant. Using the robust standard errors, F(1, 81) = 15.44, with p value = 0.0002. Using the clustered standard errors, F(1, 7) = 34.38, with p value = 0.0006.

  32. The elasticity of the research output of scientific publications with respect to the stock of scientific knowledge (specified in Eq. (7)) approaches the elasticity of output with respect to R&D expenditure as the annual depreciation rate of knowledge stock approaches 100%.
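A sketch of why the two elasticities converge: under a conventional perpetual-inventory formulation of the knowledge stock (the specific form below is an assumption for illustration; the paper’s Eq. (7) is not reproduced here), the stock collapses to the most recent R&D flow when the annual depreciation rate δ reaches 100%:

```latex
% Perpetual-inventory knowledge stock with annual depreciation rate \delta (illustrative)
K_t = R_{t-1} + (1-\delta)\,K_{t-1} = \sum_{s \ge 1} (1-\delta)^{s-1} R_{t-s},
\qquad
\lim_{\delta \to 1} K_t = R_{t-1},
```

so that, in the limit, the elasticity of publications with respect to the knowledge stock is the same as the elasticity with respect to lagged R&D expenditure.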

  33. Again, see footnote 4 above.

  34. Much of the literature on CRADAs is reviewed in Chen et al. (2018).

  35. See Choudhry and Ponzio (2020).

References

  • Adams, J., & Griliches, Z. (1996). Measuring science: An exploration. Proceedings of the National Academy of Sciences of the United States of America, 93(23), 12664–12670.
  • Audretsch, D. B., Link, A. N., & van Hasselt, M. (2019). Knowledge begets knowledge: University knowledge spillovers and the output of scientific papers from U.S. Small Business Innovation Research (SBIR) projects. Scientometrics, 121, 1367–1383.
  • Bush, V. (1945). Science—The endless frontier. Washington, DC: National Science Foundation.
  • Chen, C., Link, A. N., & Oliver, Z. T. (2018). U.S. federal laboratories and their research partners: A quantitative case study. Scientometrics, 115, 501–517.
  • Choudhry, V., & Ponzio, T. A. (2020). Modernizing federal technology transfer metrics. Journal of Technology Transfer, 45, 544–559.
  • Czarnitzki, D., Kraft, K., & Thorwarth, S. (2009). The knowledge production of ‘R’ and ‘D’. Economics Letters, 105, 141–143.
  • European Commission. (2020). Knowledge transfer metrics: Towards a European-wide set of harmonized indicators. Luxembourg: European Union.
  • Griliches, Z. (1979). Issues in assessing the contribution of research and development to productivity growth. Bell Journal of Economics, 10(1), 92–116.
  • Hall, B. H., & Ziedonis, R. H. (2001). The patent paradox revisited: An empirical study of patenting in the U.S. semiconductor industry, 1979–1995. RAND Journal of Economics, 32, 101–128.
  • Jankowski, J. E., Jr. (1993). Do we need a price index for industrial R&D? Research Policy, 22(3), 195–205.
  • Link, A. N., Morris, C. A., & van Hasselt, M. (2019). The impact of public R&D investments on patenting activity: Technology transfer at the U.S. Environmental Protection Agency. Economics of Innovation and New Technology, 28(5), 536–546.
  • Link, A. N., & Oliver, Z. T. (2020). Technology transfer and U.S. public sector innovation. Northampton, MA: Edward Elgar Publishing.
  • Link, A. N., & Scott, J. T. (2012). The theory and practice of public-sector R&D economic impact analysis (NIST Planning Report 11-1). Gaithersburg, MD: National Institute of Standards and Technology.
  • Link, A. N., & Scott, J. T. (2019). Technological change in the production of new scientific knowledge: A second look. Economics of Innovation and New Technology. https://doi.org/10.1080/10438599.2019.1705004.
  • Link, A. N., & Scott, J. T. (2020). Creativity-enhancing technological change in the production of scientific knowledge. Economics of Innovation and New Technology, 29(5), 489–500.
  • Link, A. N., & van Hasselt, M. (2019). A public sector knowledge production function. Economics Letters, 174, 64–66.
  • Mansfield, E. (1998). Academic research and industrial innovation: An update of empirical findings. Research Policy, 26, 773–776.
  • National Science Board. (2018). Science and engineering indicators 2018 (NSB-2018-1). Alexandria, VA: National Science Foundation. https://www.nsf.gov/statistics/indicators/.
  • Obama, B. (2011). Presidential memorandum—Accelerating technology transfer and commercialization of federal research in support of high-growth businesses. Washington, DC: The White House.
  • Shelton, R. D. (2008). Relations between national research investment and publication output: Application to an American paradox. Scientometrics, 74, 191–205.
  • Stam, E., & van de Ven, A. (2019). Entrepreneurial ecosystem elements. Small Business Economics. https://doi.org/10.1007/s11187-019-00270-6.
  • Terleckyj, N. E. (1974). The effects of R&D on the productivity growth of industries: An exploratory study. Washington, DC: National Planning Association.
  • The President’s Management Agenda. (undated). https://www.whitehouse.gov/omb/management/pma/.
  • United States Government Accountability Office (GAO). (2014). Federally funded research centers: Agency reviews of employee compensation and center performance (GAO-14-593). Washington, DC: United States Government Accountability Office.

Author information

Correspondence to Albert N. Link.

Cite this article

Link, A.N., Scott, J.T. Scientific publications at U.S. federal research laboratories. Scientometrics 126, 2227–2248 (2021). https://doi.org/10.1007/s11192-020-03854-2
