The European Journal of Development Research, Volume 31, Issue 2, pp 139–162

Bridging to Action Requires Mixed Methods, Not Only Randomised Control Trials

Wendy Olsen
Commentary

Abstract

Development evaluation refers to evaluating projects and programmes in development contexts. Some evaluations are too narrow. Narrow within-discipline impact evaluations are weaker than multidisciplinary, mixed-methods evaluations. A two-step process leads toward profoundly better arguments in assessing the impact of a development intervention. The first step is setting out the arena for discussion, including identifying the various entities in the social, political, cultural and natural environment surrounding the chosen problem. The second step, once this arena has been declared, is to bring the project and the triangulation of data to bear upon logical arguments, with clear, transparent reasoning leading to a set of conclusions. In this second step we do need scientific methods such as peer review and data, but, crucially, the impact-evaluation process must not rest upon a single data type, such as survey data. It is dangerous and undesirable to have the entire validity of the conclusions rest upon randomised control trials, or even upon a mixture of data types. Different contributions to knowledge arise within the evaluation process, including from the interaction of people during action research, ethnography, case-study methods, process tracing and qualitative methods. The cement holding my argument together is that multiple logics are used (retroductive, deductive and inductive, in particular). Deductive mathematics should not dominate the evaluation of an intervention, as randomised control trials on their own lend themselves to worrying fallacies about causality. I show this using Boolean fuzzy-set logic. An indicator of high-quality development evaluation is the use of multiple logics in a transparent way.
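To make the fuzzy-set point concrete, below is a minimal illustrative sketch, not taken from the article, of the Boolean fuzzy-set consistency measure used in Charles Ragin's fuzzy-set Qualitative Comparative Analysis. All membership scores are invented for the example: T stands for a hypothetical intervention, C for a supportive local context, and Y for the outcome.

    # Minimal sketch (invented data): fuzzy-set consistency in the style
    # of Ragin's fuzzy-set QCA, showing conjunctural causation.

    def consistency(condition, outcome):
        # Consistency of `condition` as a sufficient condition for
        # `outcome`: sum of case-wise min(x, y) divided by sum of x.
        return sum(min(x, y) for x, y in zip(condition, outcome)) / sum(condition)

    # Hypothetical fuzzy membership scores for eight cases.
    T = [0.9, 0.8, 0.9, 0.7, 0.2, 0.1, 0.8, 0.9]  # received the intervention
    C = [0.9, 0.8, 0.1, 0.2, 0.9, 0.8, 0.9, 0.1]  # supportive local context
    Y = [0.9, 0.7, 0.2, 0.3, 0.3, 0.2, 0.8, 0.1]  # improved outcome

    # Fuzzy intersection (logical AND) is the case-wise minimum.
    T_and_C = [min(t, c) for t, c in zip(T, C)]

    print(f"T sufficient for Y?       {consistency(T, Y):.2f}")        # ~0.62
    print(f"C sufficient for Y?       {consistency(C, Y):.2f}")        # ~0.70
    print(f"T AND C sufficient for Y? {consistency(T_and_C, Y):.2f}")  # ~0.97

Under these invented scores, T alone (consistency ≈ 0.62) and C alone (≈ 0.70) fall short of the conventional 0.8 sufficiency threshold, while the conjunction T AND C (≈ 0.97) passes it: the intervention 'works' only in supportive contexts. A randomised control trial estimating a single average effect of T would blur exactly this conjunctural pattern, which is the kind of causal fallacy the abstract warns against.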

Keywords

Evaluation · Randomised control trials · Comparative case-study research · Methodology · Retroduction · Mixed-methods · Impact evaluation

Résumé

Development evaluation refers to the evaluation of projects and programmes in the development context. Some evaluations are too narrow. Narrow, within-discipline evaluations are weaker than multidisciplinary evaluations using mixed methods. To evaluate the impact of development interventions, a two-step process helps us build much stronger arguments. The first step is to establish the arena of discussion, including the social, political, cultural and natural entities that surround the chosen problem. The second step, once this arena has been defined, is to use the project and the triangulation of data in a logical way, producing clear and transparent reasoning that leads to conclusions. This second step requires scientific methods such as peer review, the use of data, and so on. It is crucial that the impact-evaluation process not rest on a single type of data, such as survey data. For the entire validity of the conclusions to rest on randomised control trials (RCTs) alone, or even on a combination of different data types, is at once dangerous and undesirable. The evaluation process contributes to knowledge in different ways, including through the interactions of people during research, ethnography, case-study methods, process tracing and qualitative methods. It is essential that multiple logics be used (retroductive, deductive and inductive, in particular). Deductive mathematics must not dominate the evaluation of interventions, since RCTs on their own lend themselves to worrying fallacies about causality; this is demonstrated using Boolean fuzzy-set logic. An indicator of high-quality evaluation of interventions is the use of multiple logics in a transparent way.


Copyright information

© European Association of Development Research and Training Institutes (EADI) 2019

Authors and Affiliations

1. Department of Social Statistics, University of Manchester, Manchester, UK
