Assessing the societal impact of publicly funded research

The Journal of Technology Transfer

Abstract

The paper offers a critical overview of recent conceptual and methodological endeavors to elevate the policy saliency of the societal impacts of publicly funded research. Beginning with an overview of the historical context for the contemporary rise to prominence of societal impacts as a criterion for allocating and assessing public research funds, it unpacks embedded, compound propositions that connect the governance of science to the design and implementation of assessment methodologies that satisfy the joint criteria of policy relevance and technical rigor. In doing so, it highlights analytical and methodological differences between ex ante rationales for increased attention to societal impacts and ex post assessments of the character and magnitude of these impacts. It next appraises the utility of different modes of evaluation, singling out those it deems best suited to the tasks at hand, while questioning the soundness of other contemporary approaches. Its closing section calls attention to the problematic, indeed at points chimerical, character of endeavors to link the political and normative elements embedded in calls for increased attention to societal impacts with structured program evaluations.


Notes

  1. Of the 101 publications, reports and other documents cited in Bornmann’s (2013) comprehensive literature review, none are to JTT.

  2. As noted by the Canadian Academy of Health Sciences in its review of evaluation studies of health research, “While there are a number of excellent frameworks and a few international reviews of ‘ROI’ in other countries, there is no accepted international standard framework or indicators, and there is no agreement on a standard approach to determining the value of health research” (Canadian Academy of Health Sciences 2009, p. 49).

  3. Heightened recent calls for attention to societal impacts are not by themselves reliable indicators that less attention is being paid to societal impacts than in earlier periods, or that the underlying societal conditions towards which public research funding is directed have worsened. Policy agendas are shaped by several influences other than unsatisfactory societal conditions, among them rising expectations, perceptions of relative deprivation, and advocacy by persuasive sets of policy entrepreneurs (Baumgartner and Jones 1993; Kingdon 1995).

  4. In February 2015, the U.S. Dietary Guidelines Advisory Committee recommended for the first time that food system sustainability be an integral part of dietary guidance: “…consistent evidence suggests that such a dietary pattern is not only more healthful but also is associated with less environmental impact than the average American diet” (Science, 9 October 2015, pp. 165–166).

  5. OMB is careful to note, though, that these indicators “do not measure the impacts of Government policies” (op. cit., p. 51).

  6. “Economics offers an excellent set of tools for estimating costs and benefits, but tells us little about who benefits and who suffers” (Sovacool and Dworkin 2014, p. 363).

  7. Disability-adjusted life years, the single strongest predictor of NIH funding, has been found to account for between 33 and 39% of the variance in funding across diseases, with little evidence that the alignment between burden of disease and research funding has improved over time (Gillum et al. 2011; see also Sampat 2012 for a fuller analysis of these findings).

  8. Phrased in terms of the public values model of the public failure of mechanisms for value articulation, “Political processes and social cohesion insufficient to ensure effective communication and processing of public values” (Bozeman and Sarewitz, op. cit., p. 124).

  9. As expressed by John Holdren, Assistant to the President for Science and Technology, “The fact is that nobody can predict where new understandings in fundamental research will ultimately lead, and what benefits to society will ultimately result. Even in applied research, it is rarely possible to predict with confidence whether the work will achieve its intended goal or not, never mind what ultimate benefit might follow from achieving that goal” (May 2, 2013).

  10. “The internet, Facebook, and Twitter didn’t cause the revolutions, but like television in Eastern Europe in 1989, technology accelerated the pace of events” (Engel 2016, p. 152).

  11. “…economic benefits were quantified by comparing actual technological progress to counterfactual scenarios under which DOE technical expertise, technology infrastructure, and financial support were not available and PV (photovoltaic) module companies pursued their technology R&D strategies without DOE support” (Gallagher, op. cit., p. 49).

  12. Thus, Godin and Dore write, “We still have, forty years after the first demands for impact indicators, to rely on case studies to quantify, very imperfectly, dimensions other than the economic one” (Godin and Dore, op. cit., p. 1). Similarly, Cozzens and Snoek write, “In terms of methods, the research literature is dominated by case studies, with a scattering of survey research; neither is particularly helpful for evaluation or performance monitoring purposes” (2010, p. 2).

  13. The propulsive force of concerns about income inequality that undergirds much of the attention to societal impacts is now also visible in evaluation research, with ethics presented as the “last frontier of evaluation” (see Evaluation for an Equitable Society, edited by Donaldson and Picciotto 2016, especially Scriven 2016, pp. 11–48).

  14. I am indebted to a reviewer for leading me to so explicitly state the implications of what I have written.

References

  • Agribusiness Accountability Project. (1978). Hard tomatoes, hard times. Cambridge, MA: Schenkman Publishing Company.


  • Baumgartner, F., & Jones, B. (1993). Agendas and instability in American politics. Chicago, IL: University of Chicago Press.


  • Boekestein, A., Diederen, P., Jongen, W., Rabbinge, R., & Rutten, H. (Eds.). (1999). Towards an agenda for agricultural research in Europe. Wageningen: Wageningen Pers.


  • Bornmann, L. (2013). What is societal impact of research and how can it be assessed: A literature survey. Journal of the American Society for Information Science and Technology, 64, 217–233.


  • Bozeman, B., & Boardman, C. (2009). Broad impacts and narrow perspectives: Passing the buck on science and social impacts. Social Epistemology, 23, 183–198.


  • Bozeman, B., & Kingsley, G. (2013). Research value mapping and evaluation: Theory and practice. In A. Link & N. Vonortas (Eds.), Handbook on the theory and practice of program evaluation (pp. 166–189). Cheltenham: Edward Elgar.


  • Bozeman, B., & Sarewitz, D. (2005). Public values and public failure in U.S science policy. Science and Public Policy, 32, 119–136.


  • Braun, D. (2006). The mix of policy rationales in science and technology policy. Melbourne Journal of Politics, 31, 8–35.


  • Brooks, H. (1965). Future needs for basic research. In U.S. House of Representatives, op. cit., pp. 77–110.

  • Brown, G. (1994). Moving from science in the service of a vigilant society to science in the service of a humane society: The promises and the pitfalls. In A. Teich, S. Nelson, & C. McEnaney (Eds.), AAAS science and technology policy yearbook-1994 (pp. 211–219). Washington, DC: American Association for the Advancement of Science.


  • Canadian Academy of Health Sciences. (2009). Making an impact: Report of the panel on the return on investments in health research.

  • Carson, R. (1962). Silent spring. Boston, MA: Houghton Mifflin.


  • Cozzens, S. (2007). Distributive justice in science and technology policy. Science and Public Policy, 34, 85–94.


  • Cozzens, S., & Snoek, M. (2010). Knowledge to policy. Paper prepared for the “Workshop on the Science of Science Measurement”, Washington, DC, December 2–3.

  • David, P. (1990). The dynamo and the computer: An historical perspective on the modern productivity paradox. American Economic Review, Papers and Proceedings, 80, 355–361.


  • Donaldson, S., & Picciotto, R. (Eds.). (2016). Evaluation for an equitable society. Greenwich, CT: Information Age Publishing Inc.


  • Donovan, C. (2011). State of the art in assessing research impact: Introduction to a special issue. Research Evaluation, 20, 175–179.


  • Elzinga, A., & Jamison, A. (1995). Changing policy agendas in science and technology. In S. Jasanoff, G. Markle, J. Petersen, & T. Pinch (Eds.), Handbook of science and technology studies (pp. 572–597). Thousand Oaks, CA: SAGE.


  • Engel, R. (2016). And then all hell broke loose. New York: Simon and Schuster.


  • Feller, I. (2011). The promises and limitations of performance measures. In S. Olson & S. Merrill (Eds.), Measuring the impacts of federal investments in research (pp. 119–152). Washington, DC: National Academies Press.


  • Gallagher, M., Link, A., & O’Connor, A. (2012). Public investment in energy research. Cheltenham: Edward Elgar.


  • Gaunand, A., Hocde, A., Lemarie, S., Matt, M., & de Turckheim, E. (2015). How does public agricultural research impact society? A characterization of various patterns. Research Policy, 44, 849–861.


  • Gillum, L., Gouveia, C., Dorsey, E., Pletcher, M., Mathers, C., McCulloch, C., et al. (2011). NIH funding levels and burden of disease. PLoS ONE, 6, e16837.


  • Godin, B., & Dore, C. (2005). Measuring the impacts of science: Beyond the economic dimension. INRS Urbanisation, Culture et Société. Paper presented at the Helsinki Institute for Science and Technology Studies, Helsinki, Finland.

  • Goldenberg, E. (1983). The three faces of evaluation. Journal of Policy Analysis and Management, 2, 515–525.


  • Gordon, R. (2016). The rise and fall of American growth. Princeton, NJ: Princeton University Press.


  • Guston, D., & Keniston, K. (1994). Introduction: The social contract for science. In D. Guston & K. Keniston (Eds.), The fragile contract (pp. 1–41). Cambridge, MA: MIT Press.


  • Hegde, D., & Mowery, D. (2008). Politics and funding in the U.S. public biomedical R&D system. Science, 322, 1797–1798.


  • Holbrook, J. B., & Frodeman, R. (2011). Peer review and the ex ante assessment of societal impacts. Research Evaluation, 20, 239–246.


  • Holdren, J. (2013). Keynote presentation to the AAAS forum on science and technology policy, Washington, DC.

  • Joly, P. B., Gaunand, A., Colinet, L., Laredo, P., Lemarie, S., & Matt, M. (2015). ASIRPA: A comprehensive theory-based approach to assessing the societal impacts of a research organization. Research Evaluation, 24, 440–453.


  • Jordan, G. (2013). Logic modeling: A tool for designing program evaluations. In Link and Vonortas, op. cit., pp. 143–165.

  • Kessler, D. (2016). The opioid epidemic we failed to foresee. New York Times, May 7, 2016.

  • Kingdon, J. (1995). Agendas, alternatives, and public policies. New York: HarperCollins.


  • Kline, S., & Rosenberg, N. (1986). An overview of innovation. In R. Landau & N. Rosenberg (Eds.), The positive sum strategy (pp. 275–305). Washington, DC: National Academy Press.


  • Kostoff, R. (1994). Assessing research impact: Semiquantitative methods. Evaluation, 18, 11–19.


  • Kreilkamp, K. (1971). Hindsight and the real world of science policy. Science Studies, 1, 43–66.


  • Lane, J. (2009). Assessing the impact of science funding. Science, 324, 1273–1275.


  • Laredo, P., & Mustar, P. (2001). General conclusion: Three major trends in research and innovation policies. In P. Laredo & P. Mustar (Eds.), Research and innovation policies in the new global economy (pp. 497–509). Cheltenham: Edward Elgar.


  • Link, A., O’Connor, A., & Scott, T. (2015). Battery technology for electric vehicles. London: Routledge.


  • Link, A., & Vonortas, N. (Eds.). (2013). Handbook on the theory and practice of program evaluation. Cheltenham: Edward Elgar.


  • Lynas, M. (2015). Europe turns against science. New York Times, October 25, 2015, SR6.

  • Marburger, J. (2005). Keynote address to the American Association for the Advancement of Science, 30th forum on science and technology policy, Washington, DC, April 21, 2005. Cited in R. Crease (Ed.). (2015). Science policy up close. Cambridge, MA: Harvard University Press.

  • Mark, M., Henry, G., & Julnes, G. (2000). Evaluation. San Francisco, CA: Jossey-Bass.


  • Martin, B. (2011). The research excellence framework and the ‘impact agenda’: Are we creating a Frankenstein monster? Research Evaluation, 20, 247–254.


  • Mayne, J. (2012). Contribution analysis: Coming of age. Evaluation, 18, 270–280.


  • Mervis, J. (1995). Asking science to measure up. Science, 267(6), 20–22.


  • Mokyr, J. (2002). The gifts of Athena. Princeton, NJ: Princeton University Press.


  • Mowery, D. (2009). What does economic theory tell us about mission-oriented R&D? In D. Foray (Ed.), The new economics of technology policy (pp. 131–147). Cheltenham: Edward Elgar Publishing.


  • National Academies-National Research Council. (2001). Energy research at DOE: Was it worth it? Washington, DC: National Academy Press.


  • National Academies-National Research Council. (2014a). Capturing change in science, technology, and innovation (R. Litan, A. Wyckoff, & K. Husbands Fealing, Eds.). Washington, DC: National Academies Press.

  • National Academies-National Research Council. (2014b). Science of science and innovation policy: Principal investigators’ conference. Washington, DC: National Academies Press.


  • National Academies-National Research Council. (2016). Genetically engineered crops: Experiences and prospects. Washington, DC: National Academies Press.


  • OECD. (2004). Science and innovation policy: Key challenges and opportunities. Paris: OECD.


  • Parke, R., & Seidman, D. (1978). Social indicators and social reporting. Annals of the American Academy of Political and Social Science, 435, 1–22.


  • Pavitt, K. (1998). The social shaping of the national science base. Research Policy, 27, 793–805.


  • Perutz, M. (1991). Is science necessary? New York: Oxford University Press.


  • Rogers, E. (1995). Diffusion of innovations (4th ed.). New York: Free Press.


  • Rosenberg, N. (1972). Factors affecting the diffusion of technology. Explorations in Economic History, 10, 3–33.


  • Rosenberg, N. (1982). Learning by using. In Inside the black box: Technology and economics (pp. 120–140). Cambridge: Cambridge University Press.


  • Rudd, R., Aleshire, N., Zibbell, J., & Gladden, R. (2016). Increases in drug and opioid overdose deaths: United States, 2000–2014. Morbidity and Mortality Weekly Report, 64, 1378–1382.

  • Ruegg, R., & Jordan, G. (2011). Guide for conducting benefit-cost evaluation of realized impacts of public R&D programs. Washington, DC: U.S. Department of Energy.


  • Sampat, B. (2012). Mission-oriented biomedical research at the NIH. Research Policy, 41, 1729–1741.


  • Sapolsky, H., & Taylor, M. (2011). Politics and the science of science policy. In K. Husbands-Fealing, J. Lane, J. Marburger III, & S. Shipp (Eds.), The science of science policy (pp. 31–55). Palo Alto, CA: Stanford University Press.


  • Sarewitz, D. (2016). Saving science. The New Atlantis, 49, 4–40.


  • Schubert, T., & Schmoch, U. (2010). New public management in science and incentive-compatible resource-allocation based on indicators. In D. Jansen (Ed.), Governance and performance in the German public research sector (pp. 3–18). Berlin: Springer.


  • Scriven, M. (2016). The last frontier of evaluation: Ethics. In Donaldson and Picciotto, op. cit., pp. 11–48.

  • Smith, B. (1990). American science policy since World War II. Washington, DC: The Brookings Institution.

  • Sovacool, B., & Dworkin, M. (2014). Global energy justice. Cambridge: Cambridge University Press.


  • Spaapen, J., & van Drooge, L. (2011). Introducing ‘productive interactions’ in social impact assessment. Research Evaluation, 20, 211–218.


  • U.S. House of Representatives, Committee on Science and Astronautics. (1965). Basic research and national goals. Report by the National Academy of Sciences.

  • U.S. House of Representatives, Committee on Science and Astronautics. (1967). Applied science and technological progress. Report by the National Academy of Sciences.

  • U.S. House of Representatives, Committee on Science and Astronautics. (1969). Technology: Processes of assessment and choice. Report by the National Academy of Sciences. Washington, DC: U.S. Government Printing Office.


  • U.S. Office of Management and Budget. (2017). Budget of the U.S. Government, Fiscal Year 2017: Analytical perspectives. Performance and management, Chapter 5.

  • van Drooge, L., & Spaapen, J. (2015). Towards a network approach of research evaluation. In Impacts of agricultural research: Towards an approach of societal values. IMPAR conference, Book of Abstracts, Paris, November 3–4, 2015.

  • Veblen, T. (1918). The higher learning in America. New York: B.W. Huebsch.


  • von Hippel, E. (2005). Democratizing innovation. Cambridge, MA: MIT Press.


  • Walker, J. (1969). The diffusion of innovations among the American states. American Political Science Review, 63, 880–899.


  • Watts, S. M., George, M. D., & Levey, D. (2015). Achieving broader impacts in the National Science Foundation, Division of Environmental Biology. BioScience, February 2015.

  • Weiss, C. (1988). Evaluation for decisions: Is anybody there? Does anybody care? Evaluation Practice, 9, 5–20.


  • Whitley, R. (2011). Changing governance and authority relations in the public sciences. Minerva, 49, 359–385.


  • Woodhouse, E., & Sarewitz, D. (2007). Science policies for reducing inequities. Science and Public Policy, 34, 139–149.


Author information

Correspondence to Irwin Feller.

Additional information

I have benefited greatly from the insightful and constructive critiques of two anonymous reviewers.

About this article

Cite this article

Feller, I. Assessing the societal impact of publicly funded research. J Technol Transf 47, 632–650 (2022). https://doi.org/10.1007/s10961-017-9602-z
