Medical research is often held up as a model for education research to emulate. Education researchers have been urged to adopt randomized controlled trials, the more “scientific” research method believed to have produced medicine’s advances. But a far more important lesson that education could borrow from medicine has been ignored: the study of side effects. Medical research is required to investigate both the intended effects of any medical intervention and its unintended adverse effects, or side effects. Educational research, in contrast, tends to focus only on proving the effectiveness of practices and policies in its pursuit of “what works,” and has generally ignored the potential harms that can result from what works. This article presents evidence that side effects are inseparable from effects: both are outcomes of the same intervention. It further argues that studying and reporting side effects as part of studying effects would help advance education by settling long-fought battles over practices and policies and by breaking the vicious cycle of pendulum swings in education.
Direct instruction (di) here refers to the general pedagogical approach characterized by explicit instruction. It includes both the lower-case and upper-case senses of the term (Rosenshine 2008).
The National Institute for Direct Instruction published a 102-page bibliography of writings on Direct Instruction in 2015, with each page containing about 12 entries (National Institute for Direct Instruction 2015).
Adams, G. L., & Engelmann, S. (1996). Research on direct instruction: 25 years beyond DISTAR. Seattle, WA: Educational Achievement Systems.
Baker, K. (2007). Are international tests worth anything? Phi Delta Kappan, 89(2), 101–104.
Barker, B. (2010). The pendulum swings: Transforming school reform. Sterling, VA: Trentham Books.
Becker, W. C., & Gersten, R. (1982). A follow-up of follow through: The later effects of the direct instruction model on children in fifth and sixth grades. American Educational Research Journal, 19(1), 75–92.
Berliner, D. C. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18–20.
Bieber, T., & Martens, K. (2011). The OECD PISA study as a soft power in education? Lessons from Switzerland and the US. European Journal of Education, 46(1), 101–116. doi:10.1111/j.1465-3435.2010.01462.x.
Bonawitz, E., Shafto, P., Gweon, H., Goodman, N. D., Spelke, E., & Schulz, L. (2011). The double-edged sword of pedagogy: Instruction limits spontaneous exploration and discovery. Cognition, 120(3), 322–330.
Brent, G., & DiObilda, N. (1993). Effects of curriculum alignment versus direct instruction on urban children. The Journal of Educational Research, 86(6), 333–338.
Brown, E. (2016, April 27). U.S. high school seniors slip in math and show no improvement in reading. The Washington Post. Retrieved from https://www.washingtonpost.com/local/education/us-high-school-seniors-slip-in-math-and-show-no-improvement-in-reading/2016/04/26/9b8f033e-0bc8-11e6-8ab8-9ad050f76d7d_story.html?utm_term=.7a9d458243cd.
Bryk, A. S. (2015). Learning to improve: How America’s schools can get better at getting better. Cambridge, MA: Harvard Education Press.
Buchsbaum, D., Gopnik, A., Griffiths, T. L., & Shafto, P. (2011). Children’s imitation of causal action sequences is influenced by statistical and pedagogical evidence. Cognition, 120(3), 331–340.
Campbell, D. T. (1976). Assessing the impact of planned social change. Hanover, New Hampshire, USA. Retrieved from https://www.globalhivmeinfo.org/CapacityBuilding/OccasionalPapers/08AssessingtheImpactofPlannedSocialChange.pdf.
Common Core State Standards Initiative. (2011). Common core state standards initiative. Retrieved from http://www.corestandards.org/.
Cuban, L. (1990). Reforming again, again, and again. Educational Researcher, 19(1), 3–13.
Darling-Hammond, L., & Lieberman, A. (Eds.). (2012). Teacher education around the world. New York: Routledge.
Dean, D., & Kuhn, D. (2007). Direct instruction vs. discovery: The long view. Science Education, 91(3), 384–397.
Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251.
Feniger, Y., & Lefstein, A. (2014). How not to reason with PISA data: An ironic investigation. Journal of Education Policy. doi:10.1080/02680939.2014.892156.
Figazzolo, L. (2009). Impact of PISA 2006 on the education policy debate. Retrieved from http://download.ei-ie.org/docs/IRISDocuments/ResearchWebsiteDocuments/2009-00036-01-E.pdf.
Ginsberg, R., & Kingston, N. (2014). Caught in a vise: The challenges facing teacher preparation in an era of accountability. Teachers College Record, 116(1), n1.
Gunn, B., Biglan, A., Smolkowski, K., & Ary, D. (2000). The efficacy of supplemental instruction in decoding skills for Hispanic and non-Hispanic students in early elementary school. Journal of Special Education, 34(2), 90–103.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London, New York: Routledge.
Hempenstall, K. (2012, November 11). Reviews supporting direct instruction program effectiveness. Retrieved from https://www.nifdi.org/news/hempenstall-blog/403-reviews-supporting-direct-instruction-program-effectiveness.
Hempenstall, K. (2013, October 10). Why does direct instruction evoke such rancour? Retrieved from https://www.nifdi.org/news-latest-2/blog-hempenstall/389-why-does-direct-instruction-evoke-such-rancour.
Hess, F. M. (2011). Our achievement-gap mania. National Affairs, Fall 2011 (Number 9), 113–129.
Hout, M., & Elliott, S. W. (Eds.). (2011). Incentives and test-based accountability in education. Washington, DC: National Academies Press.
Jensen, B. (2012). Catching up: Learning from the best school systems in East Asia. Retrieved from Melbourne: http://www.grattan.edu.au/publications/129_report_learning_from_the_best_main.pdf.
Kaestle, C. (1985). Education reform and the swinging pendulum. Phi Delta Kappan, 66(6), 422–423.
Kaplan, K. (2016, December 9). Cooling cap helps cancer patients preserve their hair during chemotherapy, clinical trial shows. The Los Angeles Times. Retrieved from http://www.latimes.com/science/sciencenow/la-sci-sn-cooling-scalp-chemotherapy-20161209-story.html.
Kapur, M. (2014). Productive failure in learning math. Cognitive Science, 38(5), 1008–1022.
Kapur, M. (2016). Examining productive failure, productive success, unproductive failure, and unproductive success in learning. Educational Psychologist, 51(2), 289–299.
Kapur, M., & Bielaczyc, K. (2012). Designing for productive failure. Journal of the Learning Sciences, 21(1), 45–83.
Klein, D. (2007). A quarter century of US ‘math wars’ and political partisanship. BSHM Bulletin: Journal of the British Society for the History of Mathematics, 22(1), 22–33.
Kreiner, S., & Christensen, K. B. (2014). Analyses of model fit and robustness. A new look at the PISA scaling model underlying ranking of countries according to reading literacy. Psychometrika. doi:10.1007/s11336-013-9347-z.
Lamb, S., & Fullarton, S. (2002). Classroom and school factors affecting mathematics achievement: A comparative study of Australia and the United States Using TIMSS. Australian Journal of Education, 46(2), 154–171.
Loveless, T. (2006). How well are American students learning? Retrieved from Washington, DC: http://www.brookings.edu/~/media/Files/rc/reports/2006/10education_loveless/10education_loveless.pdf.
Loveless, T. (2014). PISA’s China problem continues: A response to Schleicher, Zhang, and Tucker. Retrieved from http://www.brookings.edu/research/papers/2014/01/08-shanghai-pisa-loveless.
McMurrer, J. (2007). Choices, changes, and challenges: Curriculum and instruction in the NCLB era. Retrieved from Washington, DC: http://www.cep-dc.org/displayDocument.cfm?DocumentID=312.
Meyer, L. A. (1984). Long-term academic effects of the direct instruction project follow through. The Elementary School Journal, 84(4), 380–394.
Meyer, H.-D., & Benavot, A. (2013). PISA, power, and policy: The emergence of global educational governance. Oxford: Oxford University Press.
Morrison, H. (2013, December 1). Pisa 2012 major flaw exposed. Retrieved from https://paceni.wordpress.com/2013/12/01/pisa-2012-major-flaw-exposed/.
Mullis, I. V. S., Martin, M. O., & Foy, P. (with Olson, J.F., Preuschoff, C., Erberber, E., Arora, A., & Galia, J.). (2008). TIMSS 2007 international mathematics report: Findings from IEA’s trends in international mathematics and science study at the fourth and eighth grades. Retrieved from Chestnut Hill, MA.
Mullis, I. V. S., Martin, M. O., Foy, P., & Arora, A. (2012). TIMSS 2011 international results in mathematics. Retrieved from Chestnut Hill, MA.
National Institute for Direct Instruction. (2015). Writings on direct instruction: A bibliography. Retrieved from Eugene, OR.
National Research Council. (1999). Global perspectives for local action: Using TIMSS to improve U.S. mathematics and science education. Washington, D.C.: National Academy Press.
Nelson, D. I. (2002). Using TIMSS to inform policy and practice at the local level. CPRE policy briefs. Access ERIC: FullText (CPRE-36). Retrieved from Pennsylvania: http://www.gse.upenn.edu/cpre/. For full text: http://www.cpre.org/Publications/rb36.pdf. Provider: OCLC.
Nichols, S. L., & Berliner, D. C. (2007). Collateral damage: How high-stakes testing corrupts America’s schools. Cambridge, MA: Harvard Education Press.
No Child Left Behind Act of 2001, Pub. L. No. 107–110 (2002).
OECD. (2011). Strong performers and successful reformers in education: Lessons from PISA for the United States. Retrieved from Paris: http://dx.doi.org/10.1787/9789264096660-en.
OECD. (2013a). Programme for international student assessment (PISA). Retrieved from http://www.oecd.org/pisa/aboutpisa/.
OECD. (2013b). Ready to learn: Students’ engagement, drive, and self-beliefs. Retrieved from Paris: http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-III.pdf.
OECD. (2014). PISA 2012 results: What students know and can do: Student performance in mathematics, reading and science (Volume I) [Revised edition February 2014]. Retrieved from Paris: http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-i.htm.
OECD. (2016). PISA 2015 results (volume I): Excellence and equity in education. Retrieved from Paris: http://dx.doi.org/10.1787/9789264266490-en.
Pearson, P. D. (2004). The Reading wars. Educational Policy, 18(1), 216–252.
Peterson, P. L. (1979). Direct instruction: Effective for what and for whom. Educational Leadership, 37(1), 46–48.
Ravitch, D. (2013). Reign of error: The hoax of the privatization movement and the danger to America’s public schools. New York: Knopf.
Roehler, L. R., & Duffy, G. G. (1982). Matching direct instruction to reading outcomes. Language Arts, 59(5), 476–480.
Rosenshine, B. (2008). Five meanings of direct instruction. Retrieved from Lincoln, IL: http://www.centerii.org/search/Resources%5CFiveDirectInstruct.pdf.
Sahlberg, P. (2011). Finnish lessons: What can the world learn from educational change in Finland?. New York: Teachers College Press.
Sanchez, C. (Writer). (2013). El Paso schools cheating scandal: Who’s accountable? Washington DC: NPR.
Schleicher, A. (2013, December 3). What we learn from the PISA 2012 results. Retrieved from http://oecdeducationtoday.blogspot.com/2013/12/what-we-learn-from-pisa-2012-results.html.
Schmidt, W. H. (1999). Facing the consequences: Using TIMSS for a closer look at U.S. mathematics and science education. Dordrecht; Boston: Kluwer Academic Publishers.
Schwerdt, G., & Wuppermann, A. C. (2011). Sage on the stage: Is lecturing really all that bad? Education Next, 11(3), 63–67.
Shakeel, M. D., Anderson, K. P., & Wolf, P. J. (2016). The participant effects of private school vouchers across the globe: A meta-analytic and systematic review. Retrieved from Little Rock: http://www.uaedreform.org/downloads/2016/05/the-participant-effects-of-private-school-vouchers-across-the-globe-a-meta-analytic-and-systematic-review-2.pdf.
Shavelson, R. J., & Towne, L. (Eds.). (2002). Scientific research in education. Washington DC: National Academy Press.
Sjøberg, S. (2012). PISA: Politics, fundamental problems and intriguing results. Recherches en Education, 14. Retrieved from http://www.recherches-en-education.net/spip.php?article140.
Slavin, R. E. (1989). PET and the pendulum: Faddism in education and how to stop it. Phi Delta Kappan, 70(10), 752–758.
Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21.
Slavin, R. E. (2008). Perspectives on evidence-based research in education—What works? Issues in synthesizing educational program evaluations. Educational Researcher, 37(1), 5–14.
Sneader, W. (2005). Drug discovery: A history. Hoboken, NJ: Wiley.
Stewart, W. (2013, December 3). Is Pisa fundamentally flawed? Retrieved from http://www.tes.co.uk/article.aspx?storycode=6344672.
Swanson, H. L., & Sachse-Lee, C. (2000). A meta-analysis of single-subject-design intervention research for students with LD. Journal of Learning Disabilities, 33(2), 114–136.
Tarver, S. G. (1998). Myths and truths about direct instruction. Effective School Practices, 17(1), 18–22.
Telegraph Reporters. (2016, July 23). Half of primary school pupils to follow Chinese style of learning maths with focus on whole-class teaching. Retrieved from http://www.telegraph.co.uk/education/2016/07/12/half-of-primary-school-pupils-to-follow-chinese-style-of-learnin/.
Tienken, C. H., & Zhao, Y. (2013). How common standards and standardized testing widen the opportunity gap. In P. L. Carter & K. G. Welner (Eds.), Closing the opportunity gap: What America must do to give every child an even chance (pp. 113–122). New York: Oxford University Press.
Toppo, G., Amos, D., Gillum, J., & Upton, J. (2011, March 17). When test scores seem too good to believe. USA TODAY. Retrieved from http://www.usatoday.com/news/education/2011-03-06-school-testing_N.htm.
Tucker, M. (Ed.). (2011). Surpassing Shanghai: An agenda for American education built on the world’s leading systems. Boston: Harvard Education Press.
Tucker, M. (2014). Chinese lessons: Shanghai’s rise to the top of the PISA league tables. Retrieved from Washington, DC: http://www.ncee.org/wp-content/uploads/2013/10/ChineseLessonsWeb.pdf.
U.S. Department of Health and Human Services (Food and Drug Administration Center for Drug Evaluation and Research (CDER) Center for Biologics Evaluation and Research (CBER)). (2012). Guidance for industry and investigators: Safety reporting requirements for INDs and BA/BE studies. Retrieved from Silver Spring, MD: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM227351.pdf.
Vogell, H. (2011, July 6). Investigation into APS cheating finds unethical behavior across every level. The Atlanta Journal-Constitution. Retrieved from http://www.ajc.com/news/investigation-into-aps-cheating-1001375.html.
What Works Clearinghouse. (2006). Reading mastery. Retrieved from http://ies.ed.gov/ncee/wwc/interventionreport.aspx?sid=417.
What Works Clearinghouse. (2007). Direct instruction. Retrieved from http://ies.ed.gov/ncee/wwc/interventionreport.aspx?sid=139.
What Works Clearinghouse. (2014). What works clearinghouse: Procedures and standards handbook version 3.0. Retrieved from Washington, DC.
What Works Clearinghouse. (2015). Frequently asked questions. Retrieved from http://ies.ed.gov/ncee/wwc/Document.aspx?sid=15-wwc.
Zhao, Y. (2012a, December 11). Numbers can lie: What TIMSS and PISA truly tell us, if anything? Retrieved from http://zhaolearning.com/2012/12/11/numbers-can-lie-what-timss-and-pisa-truly-tell-us-if-anything/.
Zhao, Y. (2012b). World class learners: Educating creative and entrepreneurial students. Thousand Oaks, CA: Corwin.
Zhao, Y. (2014). Who’s afraid of the big bad dragon: Why China has the best (and worst) education system in the world. San Francisco: Jossey-Bass.
Zhao, Y. (2015). Lessons that matter: What we should learn from Asian school systems. Retrieved from Melbourne: http://www.mitchellinstitute.org.au/reports/lessons-that-matter-what-should-we-learn-from-asias-school-systems/.
Zhao, Y. (2016a). Counting what counts: Reframing education outcomes. Bloomington, IN: Solution Tree Press.
Zhao, Y. (2016b). From deficiency to strength: Shifting the mindset about education inequality. Journal of Social Issues, 72(4), 716–735.
Zhao, Y. (2016c, November 29). It must be the chopsticks: The less reported findings of 2015 TIMSS and explaining the East Asian outstanding performance. Retrieved from http://zhaolearning.com/2016/11/29/it-must-be-chopsticks-the-less-reported-findings-of-2015-timss-and-explaining-the-east-asian-outstanding-performance/.
Zhao, Y. (2016d, November 29). Stop copying others: TIMSS lessons for America. Retrieved from http://zhaolearning.com/2016/11/30/stop-copying-others-timss-lessons-for-america/.
Ken Frank of Michigan State University, James Basham and Jason Travers of University of Kansas, and Yechen Zhao of Stanford University read drafts of the manuscript and provided invaluable suggestions.
Zhao, Y. What works may hurt: Side effects in education. J Educ Change 18, 1–19 (2017). https://doi.org/10.1007/s10833-016-9294-4
- Educational research
- Direct instruction
- International assessment
- Side effects
- Educational policy
- Educational reform