
The Status of the Testing Effect for Complex Materials: Still a Winner

  • Commentary
  • Educational Psychology Review

Abstract

The target articles in the special issue address a timely and important question concerning whether practice tests enhance learning of complex materials. The consensus conclusion from these articles is that the testing effect does not obtain for complex materials. In this commentary, I discuss why this conclusion is not warranted either by the outcomes reported in the target articles or by the available evidence from prior research. Importantly, the weight of the available evidence does not alter the prescription for teachers and students to use practice testing to enhance learning of complex materials. However, the special issue highlights the need for more empirical and theoretical work on test-enhanced learning for complex materials, to further examine when and why these effects may be limited and to inform efforts to optimize test-enhanced learning for educationally relevant materials and tasks.

Notes

  1. I computed Cohen’s d here and below using pooled standard deviation (as per Cortina and Nouri 2000).

  2. I computed achieved power here and below using G*Power 3.1.9.2 (Faul et al. 2007); a computational sketch of this and the preceding effect size calculation follows these notes.

  3. A few other studies that included delayed criterion tests compared example-problem conditions to problem-only conditions, and the general finding is that example-problem conditions outperform problem-only conditions (e.g., Carroll 1994; Salden, Aleven, Renkl, and Schwonke 2009; Ward and Sweller 1990). Note that the analogous comparison in the testing effect literature would be study-test conditions vs. test-only conditions, and research has consistently shown study-test conditions to outperform test-only conditions. Thus, the modal outcomes in the worked example and testing effect literatures are consistent with one another on this front (i.e., the directional effect of study-test vs. test-only does not appear to be moderated by material complexity).

  4. Darabi et al.’s article stated that initial instruction included a description of how to troubleshoot malfunctions of components in the plant, but a reviewer noted that they did not state explicitly whether these instructions specifically described how to solve troubleshooting problems like those encountered in the practice phase. With less specific initial instruction, one would arguably expect a weaker testing effect, to the extent that problem-solving performance during practice would be lower (i.e., less effective practice tests).

References

  • Carroll, W. M. (1994). Using worked examples as an instructional support in the algebra classroom. Journal of Educational Psychology, 86, 360–367.

  • Cortina, J. M., & Nouri, H. (2000). Effect size for ANOVA designs. Thousand Oaks: Sage.

  • Darabi, A. A., Nelson, D. W., & Palanki, S. (2007). Acquisition of troubleshooting skills in a computer simulation: worked example vs. conventional problem solving instructional strategies. Computers in Human Behavior, 23, 1809–1819.

  • de Jonge, M., Tabbers, H. K., & Rikers, R. M. J. P. (2015). The effect of testing on the retention of coherent and incoherent text material. Educational Psychology Review.

  • Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14, 4–58.

  • Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.

  • Gates, A. I. (1917). Recitation as a factor in memorizing. Archives of Psychology, 40.

  • Leahy, W., Hanham, J., & Sweller, J. (2015). High element interactivity information during problem solving may lead to failure to obtain the testing effect. Educational Psychology Review.

  • Pashler, H., Bain, P., Bottge, B., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing instruction and study to improve student learning (NCER 2007–2004). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.

  • Rawson, K. A., & Dunlosky, J. (2011). Optimizing schedules of retrieval practice for durable and efficient learning: how much is enough? Journal of Experimental Psychology: General, 140, 283–302.

  • Roediger, H. L., III, Putnam, A. L., & Smith, M. A. (2011). Ten benefits of testing and their applications to educational practice. Psychology of Learning and Motivation, 55, 1–36.

  • Rowland, C. A. (2015). The effect of testing versus restudy on retention: a meta-analytic review of the testing effect. Psychological Bulletin.

  • van Gog, T., & Kester, L. (2012). A test of the testing effect: acquiring problem-solving skills from worked examples. Cognitive Science, 36, 1532–1541.

  • van Gog, T., & Sweller, J. (2015). Not new, but nearly forgotten: the testing effect decreases or even disappears as the complexity of learning materials increases. Educational Psychology Review.

  • van Gog, T., Kester, L., Dirkx, K., Hoogerheide, V., Boerboom, J., & Verkoeijen, P. P. J. L. (2015). Testing after worked example study does not enhance delayed problem-solving performance compared to restudy. Educational Psychology Review.

  • Ward, M., & Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7, 1–39.

Acknowledgments

The author would like to thank John Dunlosky for helpful input on the content of this article.

Author information

Correspondence to Katherine A. Rawson.

Cite this article

Rawson, K.A. The Status of the Testing Effect for Complex Materials: Still a Winner. Educ Psychol Rev 27, 327–331 (2015). https://doi.org/10.1007/s10648-015-9308-4
