Abstract
Predicting the outcome or success of any public policy is notoriously difficult. Professionals responsible for the development, implementation and evaluation of education policy are faced with the thorny problem of identifying the impact of different education policies upon educational practice. Problems in determining the impact of a policy are compounded by the ways in which the same policy can be implemented in surprisingly different ways in different contexts at the local level. This leads to what Bardach (1977) describes as an ‘implementation deficit’, where well-intentioned policy reforms fail to be realized in practice. In addition, competing claims made for gains in achievement and educational improvement, based on the measurement of the outcomes of an educational policy, can be relatively easy to fabricate in situations where it is not difficult for stakeholders on all sides to ‘game’ a system based upon centrally imposed, crude targets and blunt measures of ‘outcomes’.
Traditionally, the development, implementation and evaluation of educational policy have been based upon a kind of positivist-technocratic policy science which lays claim to neutrality, asserts the ability to separate policy from politics, employs formal ‘scientific’ methods, and systematically collects and analyses data in order to arrive at what are claimed to be detached policy decisions grounded in science, hard evidence and facts.
However, assumptions that formal scientific knowledge and hard evidence on their own can secure the successful implementation of policy are quite seriously flawed for at least two reasons. Firstly, they are obliged to engage the services of an epistemic imposter which lays claim not only to absolute knowledge but also to ownership of the powers of prescient judgment. Secondly, they rely at the implementation stage upon a posturing, somewhat overbearing and rather high-maintenance accomplice, ‘command and control’. This chapter contends that our knowledge about issues in the implementation of education policy is acutely under-researched and that policy professionals need to involve the implementers of education policy, namely education practitioners, in the evaluation of policies which aspire to bring about improvements in educational practice. It argues that while technical-rational, positivist approaches to policy development, implementation and evaluation may at first appear to be rational and intuitively appealing, they are in the long run unhelpful in bringing about real and sustainable change and educational improvement because they overlook and underestimate the challenges of the ‘implementation deficit’. It concludes that the time has come for policy professionals and practitioners in education to work alongside each other, recognizing the importance of local knowledge and lived experience, in order to explore what a more democratic-pragmatic-interpretive model of policy development, implementation and evaluation might look like in practice.
References
Bardach, E. (1977). The Implementation Game: What Happens After a Bill Becomes a Law. Cambridge, MA: MIT Press.
Biesta, G., & Burbules, N. C. (2003). Pragmatism and Educational Research. Lanham, MD: Rowman and Littlefield.
Coffield, F. (2008). Just Suppose Teaching and Learning Became the First Priority. London: Learning and Skills Network.
Coffield, F. (2017). Will the Leopard Change its Spots?: A New Model of Inspection for Ofsted. London: Institute of Education Press, University College London.
Davies, H. T. O., Nutley, S., & Walter, I. (2005). Approaches to assessing research impact: Report of the ESRC symposium on assessing the non-academic impact of research. Retrieved April 9, 2007, from http://www.esrcsocietytoday.ac.uk/ESRCInfoCentre/Forums/attach.aspx?a564
Dewey, J. (1910). How We Think. Boston: Heath.
Dewey, J. (1916). Introduction to Essays in Experimental Logic. In J. A. Boydston (Ed.), The Middle Works 1899–1924 (Vol. 10, pp. 320–369). Carbondale: Southern Illinois University Press.
Dewey, J. (1933). How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process (Rev. ed.). Boston: Heath.
Fielding, M. (2003). The Impact of Impact. Cambridge Journal of Education, 33(2), 289–295.
Gardner, J., Holmes, B., & Leitch, R. (2008). Where There Is Smoke, There Is (the Potential for) Fire: Soft Indicators of Research and Policy Impact. Cambridge Journal of Education, 38(1), 89–104.
Geertz, C. (2000). Available Light: Anthropological Reflections on Philosophical Topics. Princeton, NJ: Princeton University Press.
Gregory, M. (2009). Shaped by Stories: The Ethical Power of Narratives. Notre Dame, IN: University of Notre Dame Press.
Gregson, M., & Nixon, L. (2009). Assessing Effectiveness: Ways of Seeing Impact. International Journal of Interdisciplinary Social Sciences: Annual Review, 3(12), 67–74.
Gregson, M., & Todd, B. (2019). Realizing Standards of Quality in Vocational Education and Training. In Handbook of Vocational Education and Training for the Changing World of Work. Switzerland: Springer. ISBN:978-3-319-94531-6.
Hajer, M. A., & Wagenaar, H. (2003). Deliberative Policy Analysis: Understanding Governance in the Network Society. New York: Cambridge University Press.
Hyland, T. (2018). Embodied Learning in Vocational Education and Training. Journal of Vocational Education and Training, 71(1). https://doi.org/10.1080/13636820.2018.1517129.
Keep, E. (2006). State Control of the English Education and Training System—Playing with the Biggest Train Set in the World. Journal of Vocational Education and Training, 58(1), 47–64.
Midgley, M. (1996). Utopias, Dolphins and Computers. London: Routledge.
Parlett, M., & Hamilton, D. (1972). Evaluation As Illumination: A new approach to the study of innovatory programmes, occasional paper No. 9. Centre for Research in the Educational Sciences, University of Edinburgh.
Sarason, S. (1998). The Predictable Failure of Educational Reform. San Francisco: Jossey-Bass.
Scott, J. C. (1998). Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven, CT: Yale University Press.
Sennett, R. (2009). The Craftsman. London: Penguin.
Silver, H. (2003). Re-viewing Impact. Cambridge Journal of Education, 33(2).
Stake, R. E. (1967). The Countenance of Educational Evaluation. Teachers College Record, 68, 523–540.
Thagard, P., & Beam, C. (2004). Epistemological Metaphors and the Nature of Philosophy. Metaphilosophy, 35(4), 504–516.
Wagenaar, H., & Cook, S. D. N. (2003). Understanding Policy Practices: Action, Dialectic and Deliberation in Policy Analysis. In M. A. Hajer & H. Wagenaar (Eds.), Deliberative Policy Analysis: Understanding Governance in the Network Society. New York: Cambridge University Press.
Copyright information
© 2020 The Author(s)
Cite this chapter
Gregson, M., Kessell-Holland, P. (2020). Bringing Practitioners Back In. In: Gregson, M., Spedding, P. (eds) Practice-Focused Research in Further Adult and Vocational Education. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-38994-9_12
Publisher Name: Palgrave Macmillan, Cham
Print ISBN: 978-3-030-38993-2
Online ISBN: 978-3-030-38994-9