Abstract
Diagnostic assumptions state stakeholders’ expectations or beliefs about the major and minor causes of core problems. Because an intervention is designed to address the causes of a problem, diagnostic assumptions are crucial to a normative theory and need to be examined from design, implementation, and evaluation perspectives. This chapter appraises the use of the policy scientific approach in explicating diagnostic assumptions, and an Alternative Causes Approach is proposed.
We believe it is fundamentally important for us to examine the assumptions underlying our ways of doing things and our attitudes toward educational objectives and processes. Some of these attitudes were acquired by mere gregarious assent. Now that we have better means of checking up on our assumptions, we should proceed to overhaul the whole question of organizing, managing, and teaching our public schools. This is a big job—too big to be undertaken all at once. Under these circumstances the most important things should be done first. We suggest that they are not the minutiae but the broad basic facts underlying the larger aspects of education. In short, the time has come when we may properly examine the assumptions which we have considered sufficient and see whether, with the existing means of measurement and analysis, these assumptions maintain their validity.
Editorial Comment, Educational Research Bulletin, 1923, p. 276
References
Berberet, H. M. (2006). Putting the pieces together for queer youth: A model of integrated assessment. Child Welfare, 85(2), 361–377.
Chen, H. T. (2005). Practical program evaluation: Assessing and improving planning, implementation, and effectiveness. Newbury Park: Sage Publications.
Editorial Comment. (1923). Fundamental assumptions. Educational Research Bulletin, 2(16), 276.
Ehren, M. C. M., Leeuw, F. L., & Scheerens, J. (2005). On the impact of the Dutch Educational Supervision Act: Analyzing assumptions concerning the inspection of primary education. American Journal of Evaluation, 26(1), 60–76.
European Communities (EC) (1999a). Project cycle management training handbook. Brussels: EC.
European Commission. (1999b). Joint relex service for the management of community aid to non-member countries (SCR). Resources, relations with the other institutions, evaluation, and information evaluation. Sussex: EC.
Fitzpatrick, J. (2002). Dialogue with Stewart Donaldson. American Journal of Evaluation, 23(3), 347–365.
Kautto, P., & Similä, J. (2005). Recently introduced policy instruments and intervention theories. Evaluation, 11(1), 55–68.
Kruse-Levy, N., Senefeld, S., Sitha, A., & Arturo, A. (2007). Bridges of hope socioeconomic reintegration project: Report of a follow-up survey with clients living with HIV and AIDS. Baltimore: Catholic Relief Services (CRS).
Leeuw, F. L. (2003). Reconstructing program theories: Methods available and problems to be solved. American Journal of Evaluation, 24(1), 5–20.
Lozare, B. V. (2010). Summative evaluation surveys: Ethiopia, Rwanda and Uganda. Catholic Relief Services-Avoiding Risk, Affirming Life Program. Baltimore: Catholic Relief Services (CRS).
Makayi, M., Judge, D., Nyungula, M., & Leather, C. (2006). Evaluation of cash transfer pilot project in Western Province, Zambia. Oxford: Oxfam GB.
Mayne, J. (2011). Contribution analysis: Addressing cause and effect. In K. Forss, M. Marra, & R. Schwartz (Eds.), Evaluating the complex: Attribution, contribution, and beyond (pp. 53–96). New Brunswick, NJ: Transaction Publishers.
Marcano, L., & Ruprah, I. J. (2008). An impact evaluation of Chile’s progressive housing program. Washington, DC: Inter-American Development Bank (Working Paper: OVE/WP-06/08).
Ouane, A. (2002). Key competencies in lifelong learning. Institutionalizing lifelong learning: Creating conducive environments for adult learning in the Asian context. Hamburg: UNESCO Institute for Education.
Social Impact (1999). Problem tree analysis. Retrieved November 22, 2011, from http://webarchive.nationalarchives.gov.uk/+/http://www.dfid.gov.uk/FOI/tools/chapter_03.htm
van Noije, L., & Wittebrood, K. (2010). Fighting crime by fighting misconceptions and blind spots in policy theories: An evidence-based evaluation of interventions and assumed causal mechanisms. American Journal of Evaluation, 31(4), 499–516.
Wandersman, A., Imm, P., Chinman, M., & Kaftarian, S. (2000). Getting to outcomes: A results-based approach to accountability. Evaluation and Program Planning, 23, 389–395.
Copyright information
© 2013 Springer Science+Business Media New York
About this chapter
Cite this chapter
Nkwake, A.M. (2013). Diagnostic Assumptions. In: Working with Assumptions in International Development Program Evaluation. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-4797-9_8
DOI: https://doi.org/10.1007/978-1-4614-4797-9_8
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-4796-2
Online ISBN: 978-1-4614-4797-9
eBook Packages: Humanities, Social Sciences and Law; Social Sciences (R0)