Despite considerable advances in developing new and more sophisticated impact evaluation methodologies and toolkits, policy research continues to suffer from persistent challenges in achieving the evaluation trifecta: identifying effects, isolating mechanisms, and influencing policy. For example, evaluation studies are routinely hampered by problems of establishing valid counterfactuals due to endogeneity and selection effects with respect to policy reform. Additionally, robust evaluation studies often must contend with heterogeneity in treatment, staggered timing, and variation in uptake. And finally, on practical grounds, researchers frequently struggle to involve policymakers and practitioners throughout the research process in order to engender the type of trust needed for policy influence. While it can be difficult to generalize about appropriate evaluation methodologies across contexts, prominent policy interventions like governance reforms for improving health services delivery nonetheless demand rigorous and comprehensive evaluation strategies that can produce valid results and engage policymakers. Drawing on illustrations from our research on health sector decentralization in Honduras, in this paper we present a quasi-experimental, multi-method, and participatory approach that addresses these persistent challenges to policy evaluation.
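The abstract's central methodological challenge, establishing a valid counterfactual in the absence of randomization, is often addressed with quasi-experimental comparisons such as difference-in-differences. The sketch below is a generic, illustrative two-period DiD calculation on synthetic numbers; it is not the authors' actual estimation strategy, and the clinic-level outcomes and variable names are entirely hypothetical. The estimator is only credible under a parallel-trends assumption between treated and comparison units.

```python
# Minimal sketch of a two-period difference-in-differences (DiD)
# estimate, one common quasi-experimental design for approximating
# a counterfactual. All data are synthetic and illustrative, and
# the estimator assumes parallel trends between the two groups.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Classic 2x2 DiD: (change in treated group) minus (change in controls)."""
    mean = lambda xs: sum(xs) / len(xs)
    treated_change = mean(treated_post) - mean(treated_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Hypothetical clinic-level outcomes (e.g., monthly patient visits)
treated_pre  = [100, 110, 105]   # reform clinics, before decentralization
treated_post = [130, 140, 135]   # reform clinics, after decentralization
control_pre  = [ 90,  95, 100]   # comparison clinics, before
control_post = [100, 105, 110]   # comparison clinics, after

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(effect)  # treated gained 30, controls gained 10 -> DiD = 20.0
```

In practice a regression formulation with unit and period fixed effects is preferred, since it accommodates covariates, staggered adoption, and clustered standard errors, all concerns the abstract flags.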
The presence of major evaluation institutions, like the International Initiative for Impact Evaluation (3ie), Innovations for Poverty Action (IPA), and the Abdul Latif Jameel Poverty Action Lab (J-PAL), has helped bridge gaps between scholars and policymakers.
A full description of the reform, as well as the MOH’s priorities, is included in the supplemental appendix.
As of late 2016, all health centers in this state had been decentralized under the reform.
Bennett (2008) defines causal process tracing as an analytical technique that examines “…evidence within an individual case, or a temporally and spatially bound instance of a specified phenomenon, to derive and/or test alternative explanations of that case…” (p. 704). See also Collier (2011) and Ricks and Liu (2018).
Andersson, K.: Understanding decentralized forest governance: an application of the institutional analysis and development framework. Sustain. Sci. Pract. Policy 2(1), 25–35 (2006)
Bamberger, M.: Innovations in the use of mixed methods in real-world evaluation. J. Dev. Eff. 7(3), 317–326 (2015)
Baron, R.M., Kenny, D.A.: The moderator–mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. J. Pers. Soc. Psychol. 51(6), 1173–1182 (1986)
Bennett, A.: Process tracing: a Bayesian approach. In: Box-Steffensmeier, J., Brady, H., Collier, D. (eds.) Oxford Handbook of Political Methodology. Oxford University Press, Oxford (2008)
Blume, G., Scott, T., Pirog, M.: Empirical innovations in policy analysis. Policy Stud. J. 42(S1), S33–S50 (2014)
Brady, H.: Data-set observations versus causal-process observations: the 2000 U.S. presidential election. In: Brady, H., Collier, D. (eds.) Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd edn. Rowman & Littlefield Publishers, Lanham (2010)
Brownson, R.C., et al.: Getting the word out: new approaches for disseminating public health science. J. Public Health Manag. Pract. 24(2), 102 (2018)
Cash, D., et al.: Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making. SSRN Scholarly Paper, Social Science Research Network, Rochester. https://papers.ssrn.com/abstract=372280 (2002). Accessed 9 Oct 2018
Cash, D.W., et al.: Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. 100(14), 8086–8091 (2003)
Clark, W.C., van Kerkhoff, L., Lebel, L., Gallopin, G.C.: Crafting usable knowledge for sustainable development. Proc. Natl. Acad. Sci. 113(17), 4570–4578 (2016)
Collier, D.: Understanding process tracing. PS Polit. Sci. Polit. 44(4), 823–830 (2011)
Dixit, A.: Evaluating recipes for development success. World Bank Res. Obs. 22(2), 131–157 (2007). https://doi.org/10.1093/wbro/lkm005
Dunning, T.: Natural Experiments in the Social Sciences: A Design-Based Approach. Cambridge University Press, New York (2012)
Fu, A.Z., Dow, W.H., Liu, G.G.: Propensity score and difference-in-difference methods: a study of second-generation antidepressant use in patients with bipolar disorder. Health Serv. Outcomes Res. Methodol. 7(1–2), 23–38 (2007)
Grimmelikhuijsen, S., Jilke, S., Olsen, A.L., Tummers, L.: Behavioral public administration: combining insights from public administration and psychology. Public Adm. Rev. 77(1), 45–56 (2017)
Habyarimana, J., Humphreys, M., Posner, D.N., Weinstein, J.M.: Coethnicity: Diversity and the Dilemmas of Collective Action. Russell Sage Foundation, New York (2009)
Imai, K., Keele, L., Tingley, D., Yamamoto, T.: Unpacking the black box of causality: learning about causal mechanisms from experimental and observational studies. Am. Polit. Sci. Rev. 105(4), 765–789 (2011)
Imbens, G.W., Rubin, D.B.: Causal Inference for Statistics, Social, and Biomedical Sciences. Cambridge University Press, New York (2015)
Imbens, G.W., Wooldridge, J.M.: Recent developments in the econometrics of program evaluation. J. Econ. Lit. 47(1), 5–86 (2009)
Jung, H., Pirog, M.A.: What works best and when: accounting for multiple sources of pure selection bias in program evaluations. J. Policy Anal. Manag. 33, 752–777 (2014). https://doi.org/10.1002/pam.21764
Kapiszewski, D., MacLean, L., Read, B.: Field Research in Political Science. Cambridge University Press, New York (2014)
King, G., et al.: A ‘politically robust’ experimental design for public policy evaluation, with application to the Mexican universal health insurance program. J. Policy Anal. Manag. 26(3), 479–506 (2007)
Kreif, N., Grieve, R., Radice, R., Sekhon, J.S.: Regression-adjusted matching and double-robust methods for estimating average treatment effects in health economic evaluation. Health Serv. Outcomes Res. Methodol. 13(2), 174–202 (2013)
Lindner, S., McConnell, K.J.: Difference-in-differences and matching on outcomes: a tale of two unobservables. Health Serv. Outcomes Res. Methodol. (2018). https://doi.org/10.1007/s10742-018-0189-0
Matson, P., Clark, W.C., Andersson, K.: Pursuing Sustainability: A Guide to the Science and Practice, 1st edn. Princeton University Press, Princeton (2016)
Ministry of Health (MOH), Government of Honduras: Marco Conceptual Político Y Estratégico de La Reforma Del Sector de Salud (2009)
Ministry of Health (MOH), Government of Honduras: Plan Nacional de Salud 2010–2014 (2010)
Morgan, S.L., Winship, C.: Counterfactuals and Causal Inference: Methods and Principles for Social Research. Cambridge University Press, New York (2007)
Moynihan, D.: A great schism approaching? Towards a micro and macro public administration. J. Behav. Public Adm. 1(1) (2018). https://doi.org/10.30636/jbpa.11.15
Normand, S.-L.T., Wang, Y., Krumholz, H.M.: Assessing surrogacy of data sources for institutional comparisons. Health Serv. Outcomes Res. Methodol. 7(1), 79–96 (2007)
O’Neill, K.: Decentralization as an electoral strategy. Comp. Polit. Stud. 36(9), 1068–1091 (2003)
O’Neill, S., et al.: Estimating causal effects: considering three alternatives to difference-in-differences estimation. Health Serv. Outcomes Res. Methodol. 16(1–2), 1–21 (2016)
Oakerson, R.J., Parks, R.B.: The study of local public economies: multi-organizational, multi-level institutional analysis and development. Policy Stud. J. 39(1), 147–167 (2011)
Ostrom, E.: Understanding Institutional Diversity. Princeton University Press, Princeton (2005)
Ostrom, V., Tiebout, C.M., Warren, R.: The organization of government in metropolitan areas: a theoretical inquiry. Am. Polit. Sci. Rev. 55(4), 831–842 (1961)
Pearl, J.: Causal inference in the health sciences: a conceptual introduction. Health Serv. Outcomes Res. Methodol. 2(3–4), 189–220 (2001)
Posner, M.A., et al.: Comparing standard regression, propensity score matching, and instrumental variables methods for determining the influence of mammography on stage of diagnosis. Health Serv. Outcomes Res. Methodol. 2(3–4), 279–290 (2001)
Ricks, J.I., Liu, A.H.: Process-tracing research designs: a practical guide. PS Polit. Sci. Polit. 51, 842–846 (2018)
Rosenbaum, P.R., Rubin, D.B.: The central role of the propensity score in observational studies for causal effects. Biometrika 70(1), 41–55 (1983)
Rubin, D.B.: Using propensity scores to help design observational studies: application to the tobacco litigation. Health Serv. Outcomes Res. Methodol. 2(3–4), 169–188 (2001)
Rubin, D.B.: For objective causal inference, design trumps analysis. Ann. Appl. Stat. 2(3), 808–840 (2008)
Sekhon, J.S.: Opiates for the matches: matching methods for causal inference. Annu. Rev. Polit. Sci. 12(1), 487–508 (2009)
Stokes, S.C., Dunning, T., Nazareno, M., Brusco, V.: Brokers, Voters, and Clientelism: The Puzzle of Distributive Politics. Cambridge University Press, New York (2013)
Tannahill, A., Kelly, M.P.: Layers of complexity in interpreting evidence on effectiveness. Public Health 127(2), 164–170 (2013)
White, H.: An introduction to the use of randomised control trials to evaluate development interventions. J. Dev. Eff. 5(1), 30–49 (2013)
Wright, G.D., Andersson, K.P., Gibson, C.C., Evans, T.P.: Decentralization can help reduce deforestation when user groups engage with local government. Proc. Natl. Acad. Sci. 113(52), 14958–14963 (2016)
Zarychta, A.: Making Social Services Work Better for the Poor: Evidence from a Natural Experiment with Health Sector Decentralization in Honduras. Working Paper (2018)
This project was completed with financial support from the National Science Foundation (Award Numbers DGE-1144083 & SMA-1328688), Social Science Research Council, University of Colorado Boulder, and University of Chicago. We are especially grateful for the support and assistance we received from staff at the Ministry of Health in Honduras and the Regional Health Authority of Intibucá. All errors and omissions are our own.
Conflict of interest
The authors declare they have no conflicts of interest.
All procedures performed in the study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Approval for the research was received from the University of Colorado Boulder Institutional Review Board (Protocol #12-0318).
Informed consent was obtained from all individual participants included in the study.
Zarychta, A., Andersson, K.P., Root, E.D. et al. Assessing the impacts of governance reforms on health services delivery: a quasi-experimental, multi-method, and participatory approach. Health Serv Outcomes Res Method 19, 241–258 (2019). https://doi.org/10.1007/s10742-019-00201-8
Keywords
- Impact evaluation
- Policy analysis
- Causal inference
- Mixed methods