Journal of Educational Change, Volume 18, Issue 4, pp 465–494

Continuous improvement in the public school context: Understanding how educators respond to plan–do–study–act cycles

  • Ariel Tichnor-Wagner
  • John Wachen
  • Marisa Cannata
  • Lora Cohen-Vogel

Abstract

The past five years have witnessed growing support among government institutions and educational foundations for applying continuous improvement research (CIR) in school settings. CIR responds to the challenge of implementing effective educational innovations at scale by working with practitioners in local contexts to understand “what works, for whom, and under what conditions.” CIR pursues system improvement through plan–do–study–act (PDSA) cycles: repeated, small-scale tests of change. This comparative case study of two urban school districts examined how innovation design teams took up PDSA in their work to improve high school student outcomes, and how they perceived PDSA as an approach to innovation development, adaptation, and implementation. Findings revealed both possibilities and challenges for implementing PDSA. Nearly all participants reported finding value in PDSA, pointing to connections with previous experiences and to PDSA training as helping to build capacity. However, we found mixed levels of enthusiasm for actually conducting PDSA cycles, as well as capacity constraints around time and data collection.

Keywords

School improvement · Continuous improvement research · School districts

Acknowledgements

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305E100030 to Vanderbilt University. The opinions expressed herein are those of the authors and do not represent views of the Institute or the U.S. Department of Education.

Copyright information

© Springer Science+Business Media Dordrecht 2017

Authors and Affiliations

  • Ariel Tichnor-Wagner (1)
  • John Wachen (2)
  • Marisa Cannata (3)
  • Lora Cohen-Vogel (2)

  1. ASCD, Alexandria, USA
  2. University of North Carolina at Chapel Hill, Chapel Hill, USA
  3. Vanderbilt University, Nashville, USA
