Designing Complex Development Programs

  • Apollo M. Nkwake


Approaches to designing development programs vary: some program designs emphasize stakeholder involvement, others focus on the program environment, and still others emphasize the sequence of change depicted in program results (change frameworks). This chapter examines change frameworks for designing complex development programs, including the logical framework approach (LFA), the theory of change approach (TOCA), and participatory impact pathways analysis (PIPA), and the extent to which each enables stakeholders to explicate and question implicit assumptions.


Keywords: Logical framework approach · Theory of change · Examining assumptions · Pathways of change · Participatory impact pathways analysis


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Apollo M. Nkwake, Questions LLC, Maryland, USA