
Journal of Educational Change, Volume 16, Issue 1, pp 79–99

Knowledge creation as an approach to facilitating evidence informed practice: Examining ways to measure the success of using this method with early years practitioners in Camden (London)

  • Chris Brown
  • Sue Rogers

Abstract

This paper has three key aims. First, it examines our attempts to use knowledge creation activity as a way of developing evidence informed practice amongst a learning community of 36 early years practitioners in the London Borough of Camden. Second, it illustrates how we approached the measurement of evidence use, engaging with two separate measurement scales: the ‘ladder of research use’ and Hall and Hord’s (Implementing change: patterns, principles and potholes, Allyn and Bacon, Boston, 2001) ‘levels of use’ scale. Finally, we examine the ‘trustworthiness’ of our approaches to measuring evidence use, which we explored via in-depth semi-structured interviews and the analysis of meeting notes. Our findings appear encouraging: they suggest that knowledge creation activity provides an effective way of communicating research and keeping it top of mind, and our data appear to support the trustworthiness of our measurement scales as a means of ascertaining levels of evidence use. At the same time, the approach we have developed has its limitations: it is only really applicable to situations where researchers work regularly with practitioners on areas of practice development that are intended to become evidence-informed. We suggest, however, that in school systems such as England’s, where schools or alliances of schools are expected to lead their professional development activity, often in partnership with universities, such instances are likely to increase in number.

Keywords

Evidence informed practice · Measures of evidence informed practice · Expertise in evidence use · Measuring evidence use · Knowledge creation · Early years · Early years foundation stage · EYFS

References

  1. Anning, A. (2006). Early years education: Mixed messages and conflicts. In D. Kassem, E. Mufti, & J. Robinson (Eds.), Education studies: Issues and critical perspectives. Maidenhead: Open University Press.
  2. Biesta, G. (2007). Why ‘what works’ won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1–22.
  3. Brown, C. (2013). Making evidence matter: A new perspective on evidence-informed policy making in education. London: IOE Press.
  4. Brown, C. (2014). Evidence informed policy and practice in education: A sociological grounding. London: Bloomsbury.
  5. Cartwright, N. (2013). Knowing what we are talking about: Why evidence doesn’t always travel. Evidence & Policy, 9(1), 97–112.
  6. Cherney, A., Povey, J., Head, B., Boreham, P., & Ferguson, M. (2012). What influences the utilisation of educational research by policy-makers and practitioners? The perspectives of academic educational researchers. International Journal of Educational Research, 53, 23–34.
  7. Cooper, A., Levin, B., & Campbell, C. (2009). The growing (but still limited) importance of evidence in education policy and practice. Journal of Educational Change, 10(2–3), 159–171.
  8. Department for Education (2011). Early Years Foundation Stage: Report on the evidence (Tickell Review). London: Department for Education.
  9. Department for Education (2012). Early Years Foundation Stage Framework 2012. http://www.foundationyears.org.uk/early-years-foundation-stage-2012/. Accessed on 8 Jul 2012.
  10. Earl, L., & Timperley, H. (2008). Understanding how evidence and learning conversations work. In L. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement. Netherlands: Springer.
  11. Earley, P., & Porritt, V. (2013). Evaluating the impact of professional development: The need for a student-focused approach. Professional Development in Education. doi:10.1080/19415257.2013.798741.
  12. Edwards, C., Gandini, L., & Forman, G. (Eds.). (2012). The hundred languages of children: Reggio Emilia in transformation (3rd ed.). California: ABC-CLIO.
  13. Flyvbjerg, B. (2001). Making social science matter. Cambridge: Cambridge University Press.
  14. Goldacre, B. (2013). Building evidence into education. https://www.gov.uk/government/news/building-evidence-into-education. Accessed on 27 Jan 2014.
  15. Greany, T. (2014). Are we nearly there yet? Progress, issues and next steps for a self-improving system. Inaugural professorial lecture to the Institute of Education, 18 March 2014.
  16. Hall, G. E., & Hord, S. M. (2001). Implementing change: Patterns, principles and potholes. Boston: Allyn and Bacon.
  17. Hargreaves, D. (1996). The Teacher Training Agency Annual Lecture 1996: Teaching as a research based profession: Possibilities and prospects. http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/TTA%20Hargreaves%20lecture.pdf. Accessed on 14 Jan 2013.
  18. Hargreaves, D. (2010). Creating a self-improving school system. Nottingham: National College for School Leadership.
  19. Hillage, J., Pearson, R., Anderson, A., & Tamkin, P. (1998). Excellence in research on schools. London: DfEE.
  20. Knott, J., & Wildavsky, A. (1980). If dissemination is the solution, what is the problem? Knowledge: Creation, Diffusion, Utilization, 1(4), 537–578.
  21. Landry, R., Amara, N., & Lamari, M. (2003). The extent and determinants of utilization of university research in government agencies. Public Administration Review, 63(2), 192–205.
  22. Levin, B. (2013). To know is not enough: Research knowledge and its use. Review of Education, 1(1), 2–31.
  23. Lincoln, Y., & Guba, E. (1985). Naturalistic inquiry. Newbury Park, CA: Sage Publications.
  24. Loucks, S. F., Newlove, B. W., & Hall, G. E. (1975). Measuring levels of use of the innovation: A manual for trainers, interviewers and raters. Austin: The University of Texas, Research and Development Center for Teacher Education.
  25. MacLure, M. (2005). ‘Clarity bordering on stupidity’: Where’s the quality in systematic review? Journal of Education Policy, 20(4), 393–416.
  26. März, V., & Kelchtermans, G. (2013). Sense-making and structure in teachers’ reception of educational reform: A case study on statistics in the mathematics curriculum. Teaching and Teacher Education, 29, 13–24.
  27. Moss, G. (2013). Research, policy and knowledge flows in education: What counts in knowledge mobilisation? Contemporary Social Science: Journal of the Academy of Social Sciences, 8(3), 237–248.
  28. Nonaka, I., & Takeuchi, H. (1995). The knowledge creating company: How Japanese companies create the dynamics of innovation. New York: Oxford University Press.
  29. Nutley, S. M., Walter, I., & Davies, H. T. O. (2007). Using evidence: How research can inform public services. Bristol: The Policy Press.
  30. Rexvid, D., Blom, B., Evertsson, L., & Forssen, A. (2012). Risk reduction technologies in general practice and social work. Professions and Professionalism, 2(2), 1–18.
  31. Rogers, S. (2014). An enabling pedagogy: Meanings and practices. In J. Moyles, J. Payler, & J. Georgeson (Eds.), Early years foundations: Critical issues. Maidenhead: Open University Press.
  32. Rogers, S., & Brown, C. (2014). Developing early years learning and pedagogy: An evaluation report. Unpublished report for Camden Partnership for Excellence in Education.
  33. Seashore-Louis, K. (2010). Learning communities in learning schools: Developing the social capacity for change. In C. Day (Ed.), Routledge international handbook of teacher and school development. London: Routledge.
  34. Stoll, L. (2008). Leadership and policy learning communities: Promoting knowledge animation. In B. Chakroun & P. Sahlberg (Eds.), Policy learning in action: European Training Foundation Yearbook 2008. Torino, Italy: European Training Foundation.
  35. Stoll, L. (2009). Knowledge animation in policy and practice: Making connections. Paper presented at the Annual Meeting of the American Educational Research Association as part of the symposium Using Knowledge to Change Policy and Practice. www.oise.utoronto.ca/rspe/UserFiles/File/Publications%20Presentations/AERA%2009%20knowledge%20animation%20paper%20Stoll.pdf. Accessed on 23 Jan 2014.
  36. Stoll, L. (2012). Stimulating learning conversations. Professional Development Today, 14(4), 6–12.
  37. Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning communities: A review of the literature. Journal of Educational Change, 7, 221–258.
  38. Tooley, J., & Darby, D. (1998). Educational research: A critique. London: Ofsted.
  39. Virtanen, A., & Tynjälä, P. (2008). Students’ experiences of workplace learning in Finnish VET. European Journal of Vocational Training, 44, 199–213.
  40. Wenger, E., McDermott, R., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Boston, MA: Harvard Business School Press.
  41. Whitebread, D., Coltman, P., Pino Pasternak, D., Sangster, C., Grau, V., Bingham, S., et al. (2009). The development of two observational tools for assessing metacognition and self-regulated learning in young children. Metacognition and Learning, 4(1), 63–85.

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. Institute of Education, University of London, London, UK
