Despite considerable advances in impact evaluation methodologies and toolkits, policy research continues to face persistent challenges in achieving the evaluation trifecta: identifying effects, isolating mechanisms, and influencing policy. Evaluation studies are routinely hampered in establishing valid counterfactuals by endogeneity and selection effects surrounding policy reform. Robust evaluations must also contend with heterogeneity in treatment, staggered timing, and variation in uptake. Finally, on practical grounds, researchers frequently struggle to involve policymakers and practitioners throughout the research process, engagement that is needed to build the trust required for policy influence. While appropriate evaluation methodologies are difficult to generalize across contexts, prominent policy interventions such as governance reforms to improve health services delivery nonetheless demand rigorous and comprehensive evaluation strategies that produce valid results and engage policymakers. Drawing on illustrations from our research on health sector decentralization in Honduras, we present a quasi-experimental, multi-method, and participatory approach that addresses these persistent challenges to policy evaluation.
This project was completed with financial support from the National Science Foundation (Award Numbers DGE-1144083 & SMA-1328688), Social Science Research Council, University of Colorado Boulder, and University of Chicago. We are especially grateful for the support and assistance we received from staff at the Ministry of Health in Honduras and the Regional Health Authority of Intibucá. All errors and omissions are our own.
Compliance with ethical standards
Conflict of interest
The authors declare they have no conflicts of interest.
All procedures performed in the study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Approval for the research was received from the University of Colorado Boulder Institutional Review Board (Protocol #12-0318).
Informed consent was obtained from all individual participants included in the study.