Red X’s and Green Checks: A Model of How Students Engage with Online Homework


Abstract

In many university mathematics courses, homework accounts for the majority of students’ interaction with mathematics content, yet we know little about students’ activity as they complete homework. This paper presents an empirically based model of students’ activity as they complete an online homework assignment. I developed the model from analyses of video recordings of nine Calculus II students completing an online homework assignment and from follow-up interviews with the students about the homework session. In the context of the introduced model, I present two additional findings. First, students’ activity when solving online homework problems is cyclic and resembles mathematicians’ activity when problem solving; the online platform contributes to this by verifying answers and giving students multiple tries per problem. Second, students leverage their multiple tries per question, together with the ability to submit parts of a question individually, to obtain intermediate feedback. They use this feedback as formative assessment to guide their work on the remainder of the problem.




Notes

  1. The students in Ellis et al. (2015) were at PhD-granting institutions selected as part of the Characteristics of Successful Programs in College Calculus project (cf. Bressoud et al. 2015); details about whether the students were mathematics majors or whether the course was common for all science students were not provided. Subjects in Krause and Putnam’s (2016) study were enrolled in a mainstream calculus course.

  2.

    This paper is based on the same data set described in Dorko (2018) and refines the model presented there.

  3.

    As a result of this randomization, the numbers that appear in the student examples later in the paper may differ from those shown in this section.

  4.

    This branch of the model characterizes instances in my data in which students decided (via their own verification, not the online program’s) that their answer was incorrect. It is possible that a student might have worked on a problem, self-verified an answer, and then submitted it. I did not ask students if they self-verified answers that I knew were correct, so “self-verify” is not an explicit component of the model. I have described this branch as representing instances of students self-verifying their answers because students told me in the second interview that when they did not submit an answer and tried a different approach, it was because they sensed their answer or method was incorrect.


References

  1. Adiredja, A., & Zandieh, M. (2017). Using intuitive examples from women of color to reveal nuances about basis. In A. Weinberg, C. Rasmussen, J. Rabin, M. Wawro, & S. Brown (Eds.), Proceedings of the 20th Annual Conference on Research in Undergraduate Mathematics Education (pp. 346–359). San Diego, CA.

  2. Artigue, M., Haspekian, M., & Corblin-Lenfant, A. (2014). Introduction to the Theory of Didactical Situations (TDS). In A. Bikner-Ahsbahs & S. Prediger (Eds.), Networking of Theories as a Research Practice in Mathematics Education (pp. 47–65). Switzerland: Springer International Publishing.


  3. Bressoud, D., Mesa, V., & Rasmussen, C. (2015). Insights and Recommendations from the MAA National Study of College Calculus. Washington, DC: Mathematical Association of America.


  4. Brousseau, G. (1997). Theory of didactical situations in mathematics. Dordrecht, The Netherlands: Kluwer.

  5. Butler, M., & Zerr, R. (2005). The use of online homework systems to enhance out-of-class student engagement. The International Journal for Technology in Mathematics Education, 12(2), 51–58.


  6. Carlson, M. P., & Bloom, I. (2005). The cyclic nature of problem solving: An emergent multidimensional problem-solving framework. Educational Studies in Mathematics, 58, 45–75.


  7. Cohen, D., Raudenbush, S., & Ball, D. (2003). Resources, instruction, and research. Educational Evaluation and Policy Analysis, 25(2), 119–142.


  8. Dorko, A. (2018). Red x’s and green checks: A preliminary study of student learning from online homework. In A. Weinberg, C. Rasmussen, J. Rabin, M. Wawro, & S. Brown (Eds.), Proceedings of the 21st Annual Conference on Research in Undergraduate Mathematics Education (pp. 46–60). San Diego, CA.

  9. Dorko, A. (2019). Professors’ intentions and student learning in an online homework assignment. In A. Weinberg, D. Moore-Russo, H. Soto, & M. Wawro, Proceedings of the 22nd Annual Conference on Research in Undergraduate Mathematics Education. (pp. 172–179). Oklahoma City, OK.

  10. Dorko, A. (2020). What do we know about student learning from online mathematics homework. In J. P. Howard & J. F. Rivers (Eds.), Teaching and Learning Mathematics Online. New York, NY: CRC Press.

  11. Ellis, J., Hanson, K., Nuñez, G., & Rasmussen, C. (2015). Beyond plug and chug: An analysis of calculus I homework. International Journal of Research in Undergraduate Mathematics Education, 1(1), 268–287.


  12. Gage, M., Pizer, A., & Roth, V. (2001). WeBWorK: An internet-based system for generating and delivering homework problems.

  13. Gage, M., Pizer, A., & Roth, V. (2002). WeBWorK: Generating, delivering, and checking math homework via the Internet. In ICTM2 International Congress for Teaching of Mathematics at the Undergraduate Level. Hersonissos, Greece.

  14. Halcrow, C., & Dunnigan, G. (2012). Online homework in calculus I: Friend or foe? PRIMUS, 22(8), 664–682.


  15. Hauk, S., & Segalla, A. (2005). Student perceptions of the web-based homework program WeBWorK in moderate enrollment college algebra classes. Journal of Computers in Mathematics and Science Teaching, 24(3), 229.

  16. Herbst, P., & Chazan, D. (2012). On the instructional triangle and sources of justification for actions in mathematics teaching. ZDM – The International Journal on Mathematics Education, 44(5), 601–612.


  17. Hirsch, L., & Weibel, C. (2003). Statistical evidence that web-based homework helps. MAA Focus, 23(2), 14.


  18. Krause, A., & Putnam, R. (2016). Online calculus homework: The student experience. In T. Fukawa-Connelly, N. Infante, M. Wawro, & S. Brown (Eds.), Proceedings of the 19th Annual Conference on Research in Undergraduate Mathematics Education (pp. 266–280). Pittsburgh, PA.

  19. Lester, F. (1994). Musings about mathematical problem-solving research: The first 25 years in JRME. Journal for Research in Mathematics Education, 25(6), 660–675.


  20. Lithner, J. (2003). Students’ mathematical reasoning in university textbook exercises. Educational Studies in Mathematics, 52(1), 29–55.


  21. President’s Council of Advisors on Science and Technology (PCAST). (2011). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Report to the President. Washington, DC: Executive Office of the President.


  22. Raines, J. M., & Clark, L. M. (2013). Analyzing the effectiveness of tutorial learning aids in a course management system. Journal of Studies in Education, 3(3), 120–136.


  23. Rogawski, J., & Adams, C. (2015). Calculus: Early Transcendentals. New York, NY: W. H. Freeman and Company.

  24. Roth, V., Ivanchenko, V., & Record, N. (2008). Evaluating student response to WeBWorK, a web-based homework delivery and grading system. Computers & Education, 50, 1462–1482.


  25. Schoenfeld, A. H. (1983). The wild, wild, wild, wild, wild world of problem solving: A review of sorts. For the Learning of Mathematics, 3, 40–47.


  26. Schoenfeld, A. H. (1992). Learning to think mathematically: Problem solving, metacognition, and sense-making in mathematics. In D. Grouws (Ed.), Handbook for Research on Mathematics Teaching and Learning (pp. 334–370). New York, NY: MacMillan.

  27. Strauss, A., & Corbin, J. (1994). Grounded theory methodology: An overview. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 273–285). Thousand Oaks, CA: Sage Publications.

  28. Suzuki, J. (2003). Using online quizzes: A report from the trenches. HRFocus, 23(9), 8–10.


  29. Weibel, C., & Hirsch, L. (2002). WeBWorK effectiveness in Rutgers Calculus. Preprint, 18 pages.

  30. Zerr, R. (2007). A quantitative and qualitative analysis of the effectiveness of online homework in first-semester calculus. Journal of Computers in Mathematics and Science Teaching, 26(1), 55–73.



Acknowledgements

I thank Kevin Moore for his help in preparing this manuscript and the RUME community for the opportunity to present a previous version of this work.

Author information



Corresponding author

Correspondence to Allison Dorko.

Ethics declarations

Conflict of Interest

The author declares no conflict of interest.



Cite this article

Dorko, A. Red X’s and Green Checks: A Model of How Students Engage with Online Homework. Int. J. Res. Undergrad. Math. Ed. 6, 446–474 (2020).



Keywords

  • Online homework
  • Instructional triangle
  • Didactic contract
  • Problem solving