What works may hurt: Side effects in education

Abstract

Medical research is often held up as a model for education to emulate. Education researchers have been urged to adopt randomized controlled trials, the more “scientific” research method believed to have driven the advances in medicine. But a much more important lesson education should borrow from medicine has been ignored: the study of side effects. Medical research is required to investigate not only the intended effects of any medical intervention but also its unintended adverse effects, or side effects. In contrast, educational research tends to focus only on proving the effectiveness of practices and policies in pursuit of “what works,” and it has generally ignored the potential harms that can result from what works. This article presents evidence that side effects are inseparable from effects; both are outcomes of the same intervention. It further argues that studying and reporting side effects as part of studying effects will help advance education by settling long-fought battles over practices and policies and by breaking the vicious cycle of pendulum swings in education.

Notes

  1. http://dictionary.cambridge.org/us/dictionary/english/side-effect.

  2. Direct instruction (di) here refers to the general pedagogical approach characterized by explicit instruction. It includes both the lowercase and uppercase senses, di and DI (Rosenshine 2008).

  3. The National Institute for Direct Instruction published a 102-page bibliography of writings on Direct Instruction in 2015, each page containing about 12 entries (National Institute for Direct Instruction 2015).

  4. Although the What Works Clearinghouse found the effects to be small or indiscernible in its reviews of two programs using direct instruction (What Works Clearinghouse 2006, 2007).

References

  1. Adams, G. L., & Engelmann, S. (1996). Research on direct instruction: 25 years beyond DISTAR. Seattle, WA: Educational Achievement Systems.

  2. Baker, K. (2007). Are international tests worth anything? Phi Delta Kappan, 89(2), 101–104.

  3. Barker, B. (2010). The pendulum swings: Transforming school reform. Sterling, VA: Trentham Books.

  4. Becker, W. C., & Gersten, R. (1982). A follow-up of follow through: The later effects of the direct instruction model on children in fifth and sixth grades. American Educational Research Journal, 19(1), 75–92.

  5. Berliner, D. C. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18–20.

  6. Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in education? Lessons from Switzerland and the US. European Journal of Education, 46(1), 101–116. doi:10.1111/j.1465-3435.2010.01462.x.

  7. Bonawitz, E., Shafto, P., Gweon, H., Goodman, N. D., Spelke, E., & Schulz, L. (2011). The double-edged sword of pedagogy: Instruction limits spontaneous exploration and discovery. Cognition, 120(3), 322–330.

  8. Brent, G., & DiObilda, N. (1993). Effects of curriculum alignment versus direct instruction on urban children. The Journal of Educational Research, 86(6), 333–338.

  9. Brown, E. (2016, April 27). U.S. high school seniors slip in math and show no improvement in reading. The Washington Post. Retrieved from https://www.washingtonpost.com/local/education/us-high-school-seniors-slip-in-math-and-show-no-improvement-in-reading/2016/04/26/9b8f033e-0bc8-11e6-8ab8-9ad050f76d7d_story.html?utm_term=.7a9d458243cd.

  10. Bryk, A. S. (2015). Learning to improve: How America’s schools can get better at getting better. Cambridge, MA: Harvard Education Press.

  11. Buchsbaum, D., Gopnik, A., Griffiths, T. L., & Shafto, P. (2011). Children’s imitation of causal action sequences is influenced by statistical and pedagogical evidence. Cognition, 120(3), 331–340.

  12. Campbell, D. T. (1976). Assessing the impact of planned social change. Hanover, New Hampshire, USA. Retrieved from https://www.globalhivmeinfo.org/CapacityBuilding/OccasionalPapers/08AssessingtheImpactofPlannedSocialChange.pdf.

  13. Common Core State Standards Initiative. (2011). Common core state standards initiative. Retrieved from http://www.corestandards.org/.

  14. Cuban, L. (1990). Reforming again, again, and again. Educational Researcher, 19(1), 3–13.

  15. Darling-Hammond, L., & Lieberman, A. (Eds.). (2012). Teacher education around the world. New York: Routledge.

  16. Dean, D., & Kuhn, D. (2007). Direct instruction vs. discovery: The long view. Science Education, 91(3), 384–397.

  17. Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251.

  18. Feniger, Y., & Lefstein, A. (2014). How not to reason with PISA data: An ironic investigation. Journal of Education Policy. doi:10.1080/02680939.2014.892156.

  19. Figazzolo, L. (2009). Impact of PISA 2006 on the education policy debate. Retrieved from http://download.ei-ie.org/docs/IRISDocuments/ResearchWebsiteDocuments/2009-00036-01-E.pdf.

  20. Ginsberg, R., & Kingston, N. (2014). Caught in a vise: The challenges facing teacher preparation in an era of accountability. Teachers College Record, 116(1), n1.

  21. Gunn, B., Biglan, A., Smolkowski, K., & Ary, D. (2000). The efficacy of supplemental instruction in decoding skills for Hispanic and non-Hispanic students in early elementary school. Journal of Special Education, 34(2), 90–103.

  22. Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London, New York: Routledge.

  23. Hempenstall, K. (2012, November 11). Reviews supporting direct instruction program effectiveness. Retrieved from https://www.nifdi.org/news/hempenstall-blog/403-reviews-supporting-direct-instruction-program-effectiveness.

  24. Hempenstall, K. (2013, October 10). Why does direct instruction evoke such rancour? Retrieved from https://www.nifdi.org/news-latest-2/blog-hempenstall/389-why-does-direct-instruction-evoke-such-rancour.

  25. Hess, F. M. (2011). Our achievement-gap mania. National Affairs, Fall 2011 (Number 9), 113–129.

  26. Hout, M., & Elliott, S. W. (Eds.). (2011). Incentives and test-based accountability in education. Washington, DC: National Academies Press.

  27. Jensen, B. (2012). Catching up: Learning from the best school systems in East Asia. Retrieved from Melbourne: http://www.grattan.edu.au/publications/129_report_learning_from_the_best_main.pdf.

  28. Kaestle, C. (1985). Education reform and the swinging pendulum. Phi Delta Kappan, 66(6), 422–423.

  29. Kaplan, K. (2016, December 9). Cooling cap helps cancer patients preserve their hair during chemotherapy, clinical trial shows. The Los Angeles Times. Retrieved from http://www.latimes.com/science/sciencenow/la-sci-sn-cooling-scalp-chemotherapy-20161209-story.html.

  30. Kapur, M. (2014). Productive failure in learning math. Cognitive Science, 38(5), 1008–1022.

  31. Kapur, M. (2016). Examining productive failure, productive success, unproductive failure, and unproductive success in learning. Educational Psychologist, 51(2), 289–299.

  32. Kapur, M., & Bielaczyc, K. (2012). Designing for productive failure. Journal of the Learning Sciences, 21(1), 45–83.

  33. Klein, D. (2007). A quarter century of US ‘math wars’ and political partisanship. BSHM Bulletin: Journal of the British Society for the History of Mathematics, 22(1), 22–33.

  34. Kreiner, S., & Christensen, K. B. (2014). Analyses of model fit and robustness. A new look at the PISA scaling model underlying ranking of countries according to reading literacy. Psychometrika. doi:10.1007/s11336-013-9347-z.

  35. Lamb, S., & Fullarton, S. (2002). Classroom and school factors affecting mathematics achievement: A comparative study of Australia and the United States using TIMSS. Australian Journal of Education, 46(2), 154–171.

  36. Loveless, T. (2006). How well are American students learning? Retrieved from Washington, DC: http://www.brookings.edu/~/media/Files/rc/reports/2006/10education_loveless/10education_loveless.pdf.

  37. Loveless, T. (2014). PISA’s China problem continues: A response to Schleicher, Zhang, and Tucker. Retrieved from http://www.brookings.edu/research/papers/2014/01/08-shanghai-pisa-loveless.

  38. McMurrer, J. (2007). Choices, changes, and challenges: Curriculum and instruction in the NCLB era. Retrieved from Washington, DC: http://www.cep-dc.org/displayDocument.cfm?DocumentID=312.

  39. Meyer, L. A. (1984). Long-term academic effects of the direct instruction project follow through. The Elementary School Journal, 84(4), 380–394.

  40. Meyer, H.-D., & Benavot, A. (2013). PISA, power, and policy: The emergence of global educational governance. Oxford: Oxford University Press.

  41. Morrison, H. (2013, December 1). Pisa 2012 major flaw exposed. Retrieved from https://paceni.wordpress.com/2013/12/01/pisa-2012-major-flaw-exposed/.

  42. Mullis, I. V. S., Martin, M. O., & Foy, P. (with Olson, J.F., Preuschoff, C., Erberber, E., Arora, A., & Galia, J.). (2008). TIMSS 2007 international mathematics report: Findings from IEA’s trends in international mathematics and science study at the fourth and eighth grades. Retrieved from Chestnut Hill, MA.

  43. Mullis, I. V. S., Martin, M. O., Foy, P., & Arora, A. (2012). TIMSS 2011 international results in mathematics. Retrieved from Chestnut Hill, MA.

  44. National Institute for Direct Instruction. (2015). Writings on direct instruction: A bibliography. Retrieved from Eugene, OR.

  45. National Research Council. (1999). Global perspectives for local action: Using TIMSS to improve U.S. mathematics and science education. Washington, D.C.: National Academy Press.

  46. Nelson, D. I. (2002). Using TIMSS to inform policy and practice at the local level (CPRE Policy Briefs, CPRE-36). Retrieved from Pennsylvania: http://www.gse.upenn.edu/cpre/. Full text: http://www.cpre.org/Publications/rb36.pdf.

  47. Nichols, S. L., & Berliner, D. C. (2007). Collateral damage: How high-stakes testing corrupts America’s schools. Cambridge, MA: Harvard Education Press.

  48. No Child Left Behind Act of 2001, Pub. L. No. 107–110 (2002).

  49. OECD. (2011). Strong performers and successful reformers in education: Lessons from PISA for the United States. Retrieved from Paris: http://dx.doi.org/10.1787/9789264096660-en.

  50. OECD. (2013a). Programme for international student assessment (PISA). Retrieved from http://www.oecd.org/pisa/aboutpisa/.

  51. OECD. (2013b). Ready to learn: Students’ engagement, drive, and self-beliefs. Retrieved from Paris: http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-III.pdf.

  52. OECD. (2014). PISA 2012 results: What students know and can do: Student performance in mathematics, reading and science (Volume I) [Revised edition February 2014]. Retrieved from Paris: http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-i.htm.

  53. OECD. (2016). PISA 2015 results (volume I): Excellence and equity in education. Retrieved from Paris: http://dx.doi.org/10.1787/9789264266490-en.

  54. Pearson, P. D. (2004). The reading wars. Educational Policy, 18(1), 216–252.

  55. Peterson, P. L. (1979). Direct instruction: Effective for what and for whom. Educational Leadership, 37(1), 46–48.

  56. Ravitch, D. (2013). Reign of error: The hoax of the privatization movement and the danger to America’s public schools. New York: Knopf.

  57. Roehler, L. R., & Duffy, G. G. (1982). Matching direct instruction to reading outcomes. Language Arts, 59(5), 476–480.

  58. Rosenshine, B. (2008). Five meanings of direct instruction. Retrieved from Lincoln, IL: http://www.centerii.org/search/Resources%5CFiveDirectInstruct.pdf.

  59. Sahlberg, P. (2011). Finnish lessons: What can the world learn from educational change in Finland? New York: Teachers College Press.

  60. Sanchez, C. (Writer). (2013). El Paso schools cheating scandal: Who’s accountable? Washington DC: NPR.

  61. Schleicher, A. (2013, December 3). What we learn from the PISA 2012 results. Retrieved from http://oecdeducationtoday.blogspot.com/2013/12/what-we-learn-from-pisa-2012-results.html.

  62. Schmidt, W. H. (1999). Facing the consequences: Using TIMSS for a closer look at U.S. mathematics and science education. Dordrecht; Boston: Kluwer Academic Publishers.

  63. Schwerdt, G., & Wuppermann, A. C. (2011). Sage on the stage: Is lecturing really all that bad? Education Next, 11(3), 63–67.

  64. Shakeel, M. D., Anderson, K. P., & Wolf, P. J. (2016). The participant effects of private school vouchers across the globe: A meta analytic and systematic review. Retrieved from Little Rock: http://www.uaedreform.org/downloads/2016/05/the-participant-effects-of-private-school-vouchers-across-the-globe-a-meta-analytic-and-systematic-review-2.pdf.

  65. Shavelson, R. J., & Towne, L. (Eds.). (2002). Scientific research in education. Washington DC: National Academy Press.

  66. Sjøberg, S. (2012). PISA: Politics, fundamental problems and intriguing results. Recherches en Education, 14. Retrieved from http://www.recherches-en-education.net/spip.php?article140.

  67. Slavin, R. E. (1989). PET and the pendulum: Faddism in education and how to stop it. Phi Delta Kappan, 70(10), 752–758.

  68. Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21.

  69. Slavin, R. E. (2008). Perspectives on evidence-based research in education—What works? Issues in synthesizing educational program evaluations. Educational Researcher, 37(1), 5–14.

  70. Sneader, W. (2005). Drug discovery: A history. Hoboken, NJ: Wiley.

  71. Stewart, W. (2013, December 3). Is Pisa fundamentally flawed? Retrieved from http://www.tes.co.uk/article.aspx?storycode=6344672.

  72. Swanson, H. L., & Sachse-Lee, C. (2000). A meta-analysis of single-subject-design intervention research for students with LD. Journal of Learning Disabilities, 33(2), 114–136.

  73. Tarver, S. G. (1998). Myths and truths about direct instruction. Effective School Practices, 17(1), 18–22.

  74. Telegraph Reporters. (2016, July 23). Half of primary school pupils to follow Chinese style of learning maths with focus on whole-class teaching. Retrieved from http://www.telegraph.co.uk/education/2016/07/12/half-of-primary-school-pupils-to-follow-chinese-style-of-learnin/.

  75. Tienken, C. H., & Zhao, Y. (2013). How common standards and standardized testing widen the opportunity gap. In P. L. Carter & K. G. Welner (Eds.), Closing the opportunity gap: What America must do to give every child an even chance (pp. 113–122). New York: Oxford University Press.

  76. Toppo, G., Amos, D., Gillum, J., & Upton, J. (2011, March 17). When test scores seem too good to believe. USA TODAY. Retrieved from http://www.usatoday.com/news/education/2011-03-06-school-testing_N.htm.

  77. Tucker, M. (Ed.). (2011). Surpassing Shanghai: An agenda for American education built on the world’s leading systems. Boston: Harvard Education Press.

  78. Tucker, M. (2014). Chinese lessons: Shanghai’s rise to the top of the PISA league tables. Retrieved from Washington, DC: http://www.ncee.org/wp-content/uploads/2013/10/ChineseLessonsWeb.pdf.

  79. U.S. Department of Health and Human Services (Food and Drug Administration Center for Drug Evaluation and Research (CDER) Center for Biologics Evaluation and Research (CBER)). (2012). Guidance for industry and investigators: Safety reporting requirements for INDs and BA/BE studies. Retrieved from Silver Spring, MD: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM227351.pdf.

  80. Vogell, H. (2011, July 6). Investigation into APS cheating finds unethical behavior across every level. The Atlanta Journal-Constitution. Retrieved from http://www.ajc.com/news/investigation-into-aps-cheating-1001375.html.

  81. What Works Clearinghouse. (2006). Reading mastery. Retrieved from http://ies.ed.gov/ncee/wwc/interventionreport.aspx?sid=417.

  82. What Works Clearinghouse. (2007). Direct instruction. Retrieved from http://ies.ed.gov/ncee/wwc/interventionreport.aspx?sid=139.

  83. What Works Clearinghouse. (2014). What works clearinghouse: Procedures and standards handbook version 3.0. Retrieved from Washington, DC.

  84. What Works Clearinghouse. (2015). Frequently asked questions. Retrieved from http://ies.ed.gov/ncee/wwc/Document.aspx?sid=15-wwc.

  85. Zhao, Y. (2012a, December 11). Numbers can lie: What TIMSS and PISA truly tell us, if anything? Retrieved from http://zhaolearning.com/2012/12/11/numbers-can-lie-what-timss-and-pisa-truly-tell-us-if-anything/.

  86. Zhao, Y. (2012b). World class learners: Educating creative and entrepreneurial students. Thousand Oaks, CA: Corwin.

  87. Zhao, Y. (2014). Who’s afraid of the big bad dragon: Why China has the best (and worst) education system in the world. San Francisco: Jossey-Bass.

  88. Zhao, Y. (2015). Lessons that matter: What we should learn from Asian school systems. Retrieved from Melbourne: http://www.mitchellinstitute.org.au/reports/lessons-that-matter-what-should-we-learn-from-asias-school-systems/.

  89. Zhao, Y. (2016a). Counting what counts: Reframing education outcomes. Bloomington, IN: Solution Tree Press.

  90. Zhao, Y. (2016b). From deficiency to strength: Shifting the mindset about education inequality. Journal of Social Issues, 72(4), 716–735.

  91. Zhao, Y. (2016c, November 29). It must be the chopsticks: The less reported findings of 2015 TIMSS and explaining the East Asian outstanding performance. Retrieved from http://zhaolearning.com/2016/11/29/it-must-be-chopsticks-the-less-reported-findings-of-2015-timss-and-explaining-the-east-asian-outstanding-performance/.

  92. Zhao, Y. (2016d, November 29). Stop copying others: TIMSS lessons for America. Retrieved from http://zhaolearning.com/2016/11/30/stop-copying-others-timss-lessons-for-america/.

Acknowledgements

Ken Frank of Michigan State University, James Basham and Jason Travers of University of Kansas, and Yechen Zhao of Stanford University read drafts of the manuscript and provided invaluable suggestions.

Author information

Corresponding author

Correspondence to Yong Zhao.

About this article

Cite this article

Zhao, Y. What works may hurt: Side effects in education. J Educ Change 18, 1–19 (2017). https://doi.org/10.1007/s10833-016-9294-4

Keywords

  • Educational research
  • Methodology
  • RCT
  • Direct instruction
  • International assessment
  • Side effects
  • PISA
  • Educational policy
  • Educational reform