Assessing the impacts of governance reforms on health services delivery: a quasi-experimental, multi-method, and participatory approach

Abstract

Despite considerable advances in developing new and more sophisticated impact evaluation methodologies and toolkits, policy research continues to suffer from persistent challenges in achieving the evaluation trifecta: identifying effects, isolating mechanisms, and influencing policy. For example, evaluation studies are routinely hampered by problems of establishing valid counterfactuals due to endogeneity and selection effects with respect to policy reform. Robust evaluation studies must also contend with heterogeneity in treatment, staggered timing, and variation in uptake. Finally, on practical grounds, researchers frequently struggle to involve policymakers and practitioners throughout the research process in order to engender the trust needed for policy influence. While it can be difficult to generalize about appropriate evaluation methodologies across contexts, prominent policy interventions like governance reforms for improving health services delivery nonetheless demand rigorous and comprehensive evaluation strategies that can produce valid results and engage policymakers. Drawing on illustrations from our research on health sector decentralization in Honduras, in this paper we present a quasi-experimental, multi-method, and participatory approach that addresses these persistent challenges to policy evaluation.
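
To make the counterfactual problem concrete, the sketch below pairs propensity-score matching with a two-period difference-in-differences regression, one common way to approximate a valid comparison group when reform uptake is selective (cf. Fu et al. 2007; Lindner and McConnell 2018 in the references). It is a minimal illustration under stated assumptions, not the estimator used in the paper: the data frame, its unit_id, decentralized, post, and utilization columns, and the covariate list are all hypothetical.

```python
# Minimal, illustrative sketch only (hypothetical data and column names):
# propensity-score matching followed by a two-period difference-in-differences
# regression, in the spirit of Fu et al. (2007) and Lindner and McConnell (2018).
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_did(df, covariates, outcome="utilization",
                treat="decentralized", post="post"):
    """Match treated to control units on pre-period covariates,
    then estimate a DiD regression on the matched panel."""
    pre = df[df[post] == 0].copy()

    # 1. Propensity scores: probability of adopting the reform,
    #    modeled from pre-period covariates.
    ps = LogisticRegression(max_iter=1000).fit(pre[covariates], pre[treat])
    pre["pscore"] = ps.predict_proba(pre[covariates])[:, 1]

    # 2. One-to-one nearest-neighbor matching (with replacement)
    #    on the estimated propensity score.
    treated = pre[pre[treat] == 1]
    control = pre[pre[treat] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    keep = pd.concat([treated["unit_id"],
                      control.iloc[idx.ravel()]["unit_id"]])

    # 3. DiD on the matched sample; under parallel trends the
    #    treat-by-post interaction is the effect on the treated.
    sample = df[df["unit_id"].isin(keep)]
    fit = smf.ols(f"{outcome} ~ {treat} * {post}", data=sample).fit(
        cov_type="cluster", cov_kwds={"groups": sample["unit_id"]})
    return fit.params[f"{treat}:{post}"], fit
```

Staggered adoption and variation in uptake, the second challenge named above, would call for event-study or group-time extensions of this two-period setup; the matching step here simply restricts the comparison to control units that resemble adopters before the reform.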

Notes

  1. The presence of major evaluation institutions, like the International Initiative for Impact Evaluation (3ie), Innovations for Poverty Action (IPA), and the Abdul Latif Jameel Poverty Action Lab (J-PAL), has helped bridge gaps between scholars and policymakers.

  2. A full description of the reform, as well as the MOH’s priorities, is included in the supplemental appendix.

  3. As of late 2016, all health centers in this state had been decentralized under the reform.

  4. Bennett (2008) defines causal process tracing as an analytical technique that examines “…evidence within an individual case, or a temporally and spatially bound instance of a specified phenomenon, to derive and/or test alternative explanations of that case…” (p. 704). See also Collier (2011) and Ricks and Liu (2018).

  5. Dunning (2012) and Kapiszewski et al. (2014) are notable exceptions.

References

  1. Andersson, K.: Understanding decentralized forest governance: an application of the institutional analysis and development framework. Sustain. Sci. Pract. Policy 2(1), 25–35 (2006)

  2. Bamberger, M.: Innovations in the use of mixed methods in real-world evaluation. J. Dev. Eff. 7(3), 317–326 (2015)

  3. Baron, R.M., Kenny, D.A.: The moderator–mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. J. Pers. Soc. Psychol. 51(6), 1173–1182 (1986)

  4. Bennett, A.: Process tracing: a Bayesian approach. In: Box-Steffensmeier, J., Brady, H., Collier, D. (eds.) Oxford Handbook of Political Methodology. Oxford University Press, Oxford (2008)

  5. Blume, G., Scott, T., Pirog, M.: Empirical innovations in policy analysis. Policy Stud. J. 42(S1), S33–S50 (2014)

  6. Brady, H.: Data-set observations versus causal-process observations: the 2000 U.S. presidential election. In: Brady, H., Collier, D. (eds.) Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd edn. Rowman & Littlefield Publishers, Lanham (2010)

  7. Brownson, R.C., et al.: Getting the word out: new approaches for disseminating public health science. J. Public Health Manag. Pract. 24(2), 102 (2018)

  8. Cash, D., et al.: Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making. SSRN Scholarly Paper. Social Science Research Network, Rochester (2002). https://papers.ssrn.com/abstract=372280. Accessed 9 Oct 2018

  9. Cash, D.W., et al.: Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. 100(14), 8086–8091 (2003)

  10. Clark, W.C., van Kerkhoff, L., Lebel, L., Gallopin, G.C.: Crafting usable knowledge for sustainable development. Proc. Natl. Acad. Sci. 113(17), 4570–4578 (2016)

  11. Collier, D.: Understanding process tracing. PS Polit. Sci. Polit. 44(4), 823–830 (2011)

  12. Dixit, A.: Evaluating recipes for development success. World Bank Res. Obs. 22(2), 131–157 (2007). https://doi.org/10.1093/wbro/lkm005

  13. Dunning, T.: Natural Experiments in the Social Sciences: A Design-Based Approach. Cambridge University Press, New York (2012)

  14. Fu, A.Z., Dow, W.H., Liu, G.G.: Propensity score and difference-in-difference methods: a study of second-generation antidepressant use in patients with bipolar disorder. Health Serv. Outcomes Res. Methodol. 7(1–2), 23–38 (2007)

  15. Grimmelikhuijsen, S., Jilke, S., Olsen, A.L., Tummers, L.: Behavioral public administration: combining insights from public administration and psychology. Public Adm. Rev. 77(1), 45–56 (2017)

  16. Habyarimana, J., Humphreys, M., Posner, D.N., Weinstein, J.M.: Coethnicity: Diversity and the Dilemmas of Collective Action. Russell Sage Foundation, New York (2009)

  17. Imai, K., Keele, L., Tingley, D., Yamamoto, T.: Unpacking the black box of causality: learning about causal mechanisms from experimental and observational studies. Am. Polit. Sci. Rev. 105(4), 765–789 (2011)

  18. Imbens, G.W., Rubin, D.B.: Causal Inference for Statistics, Social, and Biomedical Sciences. Cambridge University Press, New York (2015)

  19. Imbens, G.W., Wooldridge, J.M.: Recent developments in the econometrics of program evaluation. J. Econ. Lit. 47(1), 5–86 (2009)

  20. Jung, H., Pirog, M.A.: What works best and when: accounting for multiple sources of pure selection bias in program evaluations. J. Policy Anal. Manag. 33, 752–777 (2014). https://doi.org/10.1002/pam.21764

  21. Kapiszewski, D., MacLean, L., Read, B.: Field Research in Political Science. Cambridge University Press, New York (2014)

  22. King, G., et al.: A ‘politically robust’ experimental design for public policy evaluation, with application to the Mexican universal health insurance program. J. Policy Anal. Manag. 26(3), 479–506 (2007)

  23. Kreif, N., Grieve, R., Radice, R., Sekhon, J.S.: Regression-adjusted matching and double-robust methods for estimating average treatment effects in health economic evaluation. Health Serv. Outcomes Res. Methodol. 13(2), 174–202 (2013)

  24. Lindner, S., McConnell, K.J.: Difference-in-differences and matching on outcomes: a tale of two unobservables. Health Serv. Outcomes Res. Methodol. (2018). https://doi.org/10.1007/s10742-018-0189-0

  25. Matson, P., Clark, W.C., Andersson, K.: Pursuing Sustainability: A Guide to the Science and Practice, 1st edn. Princeton University Press, Princeton (2016)

  26. Ministry of Health (MOH), Government of Honduras: Marco Conceptual Político y Estratégico de la Reforma del Sector de Salud (2009)

  27. Ministry of Health (MOH), Government of Honduras: Plan Nacional de Salud 2010–2014 (2010)

  28. Morgan, S.L., Winship, C.: Counterfactuals and Causal Inference: Methods and Principles for Social Research. Cambridge University Press, New York (2007)

  29. Moynihan, D.: A great schism approaching? Towards a micro and macro public administration. J. Behav. Public Adm. 1(1) (2018). https://doi.org/10.30636/jbpa.11.15

  30. Normand, S.-L.T., Wang, Y., Krumholz, H.M.: Assessing surrogacy of data sources for institutional comparisons. Health Serv. Outcomes Res. Methodol. 7(1), 79–96 (2007)

  31. O’Neill, K.: Decentralization as an electoral strategy. Comp. Polit. Stud. 36(9), 1068–1091 (2003)

  32. O’Neill, S., et al.: Estimating causal effects: considering three alternatives to difference-in-differences estimation. Health Serv. Outcomes Res. Methodol. 16(1–2), 1–21 (2016)

  33. Oakerson, R.J., Parks, R.B.: The study of local public economies: multi-organizational, multi-level institutional analysis and development. Policy Stud. J. 39(1), 147–167 (2011)

  34. Ostrom, E.: Understanding Institutional Diversity. Princeton University Press, Princeton (2005)

  35. Ostrom, V., Tiebout, C.M., Warren, R.: The organization of government in metropolitan areas: a theoretical inquiry. Am. Polit. Sci. Rev. 55(4), 831–842 (1961)

  36. Pearl, J.: Causal inference in the health sciences: a conceptual introduction. Health Serv. Outcomes Res. Methodol. 2(3–4), 189–220 (2001)

  37. Posner, M.A., et al.: Comparing standard regression, propensity score matching, and instrumental variables methods for determining the influence of mammography on stage of diagnosis. Health Serv. Outcomes Res. Methodol. 2(3–4), 279–290 (2001)

  38. Ricks, J.I., Liu, A.H.: Process-tracing research designs: a practical guide. PS Polit. Sci. Polit. 51, 842–846 (2018)

  39. Rosenbaum, P.R., Rubin, D.B.: The central role of the propensity score in observational studies for causal effects. Biometrika 70(1), 41–55 (1983)

  40. Rubin, D.B.: Using propensity scores to help design observational studies: application to the tobacco litigation. Health Serv. Outcomes Res. Methodol. 2(3–4), 169–188 (2001)

  41. Rubin, D.B.: For objective causal inference, design trumps analysis. Ann. Appl. Stat. 2(3), 808–840 (2008)

  42. Sekhon, J.S.: Opiates for the matches: matching methods for causal inference. Annu. Rev. Polit. Sci. 12(1), 487–508 (2009)

  43. Stokes, S.C., Dunning, T., Nazareno, M., Brusco, V.: Brokers, Voters, and Clientelism: The Puzzle of Distributive Politics. Cambridge University Press, New York (2013)

  44. Tannahill, A., Kelly, M.P.: Layers of complexity in interpreting evidence on effectiveness. Public Health 127(2), 164–170 (2013)

  45. White, H.: An introduction to the use of randomised control trials to evaluate development interventions. J. Dev. Eff. 5(1), 30–49 (2013)

  46. Wright, G.D., Andersson, K.P., Gibson, C.C., Evans, T.P.: Decentralization can help reduce deforestation when user groups engage with local government. Proc. Natl. Acad. Sci. 113(52), 14958–14963 (2016)

  47. Zarychta, A.: Making Social Services Work Better for the Poor: Evidence from a Natural Experiment with Health Sector Decentralization in Honduras. Working Paper (2018)

Funding

This project was completed with financial support from the National Science Foundation (Award Numbers DGE-1144083 & SMA-1328688), Social Science Research Council, University of Colorado Boulder, and University of Chicago. We are especially grateful for the support and assistance we received from staff at the Ministry of Health in Honduras and the Regional Health Authority of Intibucá. All errors and omissions are our own.

Author information

Corresponding author

Correspondence to Alan Zarychta.

Ethics declarations

Conflict of interest

The authors declare they have no conflicts of interest.

Ethical approval

All procedures performed in the study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Approval for the research was received from the University of Colorado Boulder Institutional Review Board (Protocol #12-0318).

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 1827 kb)

About this article

Cite this article

Zarychta, A., Andersson, K.P., Root, E.D. et al. Assessing the impacts of governance reforms on health services delivery: a quasi-experimental, multi-method, and participatory approach. Health Serv Outcomes Res Method 19, 241–258 (2019). https://doi.org/10.1007/s10742-019-00201-8

Keywords

  • Impact evaluation
  • Policy analysis
  • Causal inference
  • Mixed methods