Analysis and Discussion of Classroom and Achievement Data to Raise Student Achievement

Chapter
Part of the Studies in Educational Leadership book series (SIEL, volume 17)

Abstract

In New Zealand, there is evidence that analysing data in teams can lead to improvements in student achievement. Data discussions in professional learning communities were an important component of research and development interventions in three clusters of New Zealand schools (n = 48 schools). These interventions significantly improved student achievement over three years, and the achievement gains were sustained after the interventions ended. In this chapter, the authors focus on a central feature of these data discussions: understanding classroom instruction in relation to student achievement patterns. The chapter also discusses the importance of interdependence between schools and external experts, of greater pedagogical content knowledge to link classroom instruction to achievement results, and of the creation and use of school artefacts (e.g., data analysis reports) to facilitate effective data use.


Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

Woolf Fisher Research Centre, Faculty of Education, The University of Auckland, Auckland, New Zealand
