
Counting what really counts? Assessing the political impact of science

  • A. Gaunand
  • L. Colinet
  • P.-B. Joly
  • M. Matt

Abstract

The production of scientific knowledge is expected to benefit society in a variety of ways. However, despite the many theoretical models available in the literature, there are few practical frameworks for assessing the different dimensions of the effect of research on society, including its political impacts. As part of the ASIRPA approach, in this paper we propose an ordinal rating scale for assessing the political impact of research, built on a review of the literature, qualitative evidence of political impact gleaned from a collection of case studies, and an expert panel. The resulting metric uses a 1–5 scale to rate the intensity of the political impact of research according to generic criteria associated with each rating level. Routine application of this scale in case study research is increasing, enabling robust, simple, and consistent self-assessment of political impacts across cases to complement qualitative case study descriptions. The methodology used to design the rating scale prompted the panel experts to make their evaluation rationales explicit and to justify their judgments, increasing the transparency of the assessments. We believe that the benefits of assigning an ordinal measure to the political impact of research outweigh the risks of misuse of an impact number. The advantages include influencing political agenda-setting by showing what really matters, providing opportunities for scaling up analyses of multidimensional impacts and for identifying impact-generating mechanisms, and promoting learning about and discussion of the value systems reflected in the assessment.
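To make the scale concrete, the sketch below shows one way such a 1–5 ordinal instrument could be encoded for cross-case self-assessment. This is an illustrative Python sketch only: the level descriptors are hypothetical placeholders, not the generic criteria defined in the paper.

    # Hypothetical encoding of a 1-5 ordinal political-impact scale.
    # The criteria texts are placeholders, not the paper's descriptors.
    from dataclasses import dataclass

    CRITERIA = {
        1: "Findings noted by policy actors; no documented use",
        2: "Findings cited in policy debates or consultations",
        3: "Findings shape the framing of a policy issue (agenda-setting)",
        4: "Findings directly inform a specific regulation or programme",
        5: "Findings anchor a major, durable policy change",
    }

    @dataclass
    class CaseAssessment:
        case_id: str
        rating: int    # ordinal level on the 1-5 scale
        evidence: str  # qualitative justification, as in the case studies

        def __post_init__(self) -> None:
            if self.rating not in CRITERIA:
                raise ValueError("rating must be an integer from 1 to 5")

        def describe(self) -> str:
            return f"{self.case_id}: level {self.rating} ({CRITERIA[self.rating]})"

    # Two hypothetical cases, rated for comparison across a portfolio.
    cases = [
        CaseAssessment("case-A", 3, "Cited in a parliamentary report"),
        CaseAssessment("case-B", 5, "Basis of a new national regulation"),
    ]
    for case in cases:
        print(case.describe())

Because the scale is ordinal rather than cardinal, cross-case aggregation is more safely done with medians or distributions of levels than with arithmetic means.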

Keywords

Ex-post evaluation · Political impact of science · Metric · Panel experts · ASIRPA

JEL Classification

H43 · H83 · O33 · O38 · A13

Acknowledgements

The analysis was made possible by financial support from INRA (France) through the ASIRPA Project (Socio-economic Analysis of the diversity of Impacts of Public Agricultural Research).

Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. INRA, Delegation for Evaluation, Paris Cedex 07, France
  2. LISIS, ESIEE Paris, CNRS, UPEM, Université Paris-Est, Marne-la-Vallée Cedex 2, France
  3. Directorate College, INRA, Paris Cedex 07, France
  4. INRA, CNRS, Grenoble INP, Univ. Grenoble-Alpes, GAEL, Grenoble, France
