Introduction

From the earliest studies, it has been shown that technology-enhanced mathematics education can significantly contribute to various aspects of students’ learning processes and learning outcomes. Research on the use of digital technology has grown rapidly (e.g., Ball et al., 2018; Drijvers, 2019; Hillmayr et al., 2020). It ranges from providing quick and reliable feedback (Drijvers, 2019); supporting students’ self-regulation and analysis of their mistakes (Radović et al., 2019); creating simulations for exploring mathematical problems (Hillmayr et al., 2020); seeing relations between algebraic and geometric representations of objects (Godwin & Sutherland, 2004); learning with applets and dynamic images (Ball et al., 2018); to enabling students to practice various skills and techniques at the speed and pace that suits them (Metwally et al., 2021).

The results of empirical studies have recognized numerous pearls (e.g., Drijvers, 2019), perils (e.g., Hillmayr et al., 2020), and challenges (Bray & Tangney, 2017), as well as the opportunities that technology-enhanced learning brings for both students and teachers (e.g., Hillmayr et al., 2020). Metwally et al. (2021) argued that technology has the potential to enhance students’ cognitive potential, support the development of problem-solving and higher-order thinking skills, and expand students’ knowledge. Moreover, technology in learning is often associated with an increase in students’ motivation during learning (Radović et al., 2019). Radović and Passey (2016) argued that the responsibility for learning is shifted to students so that they develop skills of self-management and self-assessment. However, there are many factors that influence the outcome of technology integration into mathematics learning (Bray & Tangney, 2017). Studies often emphasize technology design, curriculum specificities, teacher practices, and the underlying instructional design (Godwin & Sutherland, 2004; Lim & Oakley, 2013). To harness its full potential, it is of utmost importance to thoroughly understand the educational context and underlying pedagogical principles (Drijvers, 2019; Hillmayr et al., 2020). To focus the discussion of previous research on the educational problem that is central to this manuscript, we concentrate on a specific aspect of technology-enhanced learning practice: homework in mathematics.

Researchers have studied various technological approaches developed for homework, including online homework platforms that automatically assign homework, management systems that improve the efficacy of homework organization and control, and anti-plagiarism systems that reduce copying and increase engagement during homework (Zhai et al., 2023). Other researchers have introduced different tools that enhance the process, such as MathAid (Viberg et al., 2023) or ASSISTments (Murphy et al., 2020). The outcomes of these initiatives revealed positive characteristics that include providing feedback on answers and analysing mistakes (Ceviker et al., 2022), adaptive questions with appropriate levels of knowledge and difficulty (Zhai et al., 2023), and personalized study plans based on collected data targeting each student's individual strengths and weaknesses (Serhan & Almeqdadi, 2020).

Moreover, technology for homework has made it possible for students to practice and improve their understanding at their own pace and comfort (Ceviker et al., 2022; Diara, 2023; Magalhães et al., 2020), with automatic feedback immediately available (Diara, 2023; Zhai et al., 2023). This also enables teachers to shift their focus from grading to adapting their instruction to meet students’ needs (Murphy et al., 2020). For instance, teachers can analyse common errors and misconceptions among students and adjust their subsequent lessons accordingly (Ceviker et al., 2022; Diara, 2023; Murphy et al., 2020).

Nonetheless, integrating technology into mathematics homework comes with several challenges. Viberg et al. (2023) stressed the need to shift the teacher's role from instructor to facilitator, which requires a planned and structured approach to instruction. In recent research, Murphy et al. (2020) recognized that students perceive the quality of homework as higher when the homework process supports their learning and is relevant to what happens in class, including teacher follow-up. Other scholars emphasize the importance of relevant and innovative teaching and learning methods, highlighting the challenge for teachers to design effective learning experiences that combine homework technologies with appropriate pedagogical methods (Magalhães et al., 2020; Viberg et al., 2023).

It is becoming clear that technology will not unfold its pedagogical potential, outlined in the literature, on its own. Rather, effective and efficient integration of digital technology is about the instructional interplay between educational technology, teaching strategies, and student learning practices (Bray & Tangney, 2017; Metwally et al., 2021). In order to achieve a comprehensive understanding (Viberg et al., 2023), the next section of this paper establishes the theoretical and research foundation for homework activities, examining both the advantages and the challenges faced by students and teachers. The “Research questions for this study” section then defines the research questions, exploring how technology can best be utilized to enhance the benefits of homework activities. The “Research methods” section introduces the research methodology, including the study’s context and the experimental and control learning environments. The following sections present the data analysis and research findings, concluding with the discussion, limitations, and conclusions.

Homework

The term “homework” refers to tasks assigned to students that are meant to be completed outside of regular school hours (Cooper et al., 2006; Magalhães et al., 2020). In Serbia’s education system, homework serves an important pedagogical purpose with several prominent characteristics: it is assigned after each lesson, typically takes students between half an hour and an hour to complete, and aims to help students review and reinforce the material learned in class, while also allowing teachers to identify areas where students may be struggling or misunderstanding the concepts being taught. Teachers and educators have long viewed homework as an important way for students to practice and reinforce material taught in class (Ceviker et al., 2022). However, the value of homework continues to be debated both in the scientific community and in broader society, particularly in the United States (Gill & Schlossman, 2004; Murphy et al., 2020). Interestingly, historical reports from the late nineteenth and early twentieth centuries reveal that homework was not always viewed as a critical component of the learning process. Critics referred to it as “mechanical schooling,” with some even suggesting that it could be detrimental to student learning (Heffernan, 2019, p. 80). Gill and Schlossman (2004) note that the most negative attitudes toward homework were present in 1901, when California law abolished homework for children under 15 and limited it in public high schools. However, during the space race of the 1960s, homework became more widespread and was mandated at all levels of education, with policies dictating the number of hours students were required to work (Cooper et al., 2006).

In current mathematics education and research, data on homework point to both positive and undesirable influences on students’ learning and learning outcomes (Heffernan, 2019; Metwally et al., 2021; Scheerens et al., 2013). Existing literature indicates that traditional, paper-based homework assignments may be viewed by students as a mundane aspect of their education, often leading to negative attitudes towards them (Ceviker et al., 2022; Cooper et al., 2006; Magalhães et al., 2020). This can result in procrastination, frustration, and a sense of burden to complete assignments, which can in turn lead students to copy from one another (Magalhães et al., 2020). Studies also imply that students’ interest in homework can often diminish, especially when its characteristics are not balanced (e.g., amount, difficulty, pedagogical value, instructional rationale) (Cooper et al., 2006; Corno & Xu, 2004).

Despite these perils, homework is still valued by students (Murphy et al., 2020). Several literature reviews (Cooper et al., 2006; Fan et al., 2017; Magalhães et al., 2020) recognized three categories of benefits: (1) direct impact on learning and achievement, (2) connecting math learning at home and at school, and (3) increasing teachers’ knowledge of students' thinking and understanding.

Learning outcomes

Numerous research studies comparing students who regularly do homework with those who do not have shown that completing homework has a direct positive impact on learning outcomes (Cooper et al., 2006; Fan et al., 2017). Homework is highly valued for its potential to develop time management skills, study habits, and self-regulation, and to lead to better academic performance and higher grades (Corno & Xu, 2004; Fan et al., 2017). According to Huyen Tham et al. (2020), students have reported that homework can facilitate the development of a self-study routine, reduce stress levels, and enhance their sense of learning autonomy. A review study by Cooper et al. (2006) found a positive relationship between homework completion and academic achievement of approximately 0.60 SD. Although a more recent meta-analysis by Baş et al. (2017) found a smaller average effect of around 0.20, it was still significant. However, the amount of time spent on homework is still a topic of scientific debate. Metwally et al. (2021) conducted a review that showed a positive relationship between homework time and student achievement, including retention of knowledge, exam and final grades, homework assignments completed, and overall performance. However, as noted by Scheerens et al. (2013), who reviewed 128 research articles, the results are inconclusive, with 32% of studies showing negative effects, 33% showing no significant effects, and 35% showing positive effects.

Connecting mathematics learning at home and in schools

Homework not only has academic value, but also fosters student responsibility and blurs the line between formal and informal learning contexts, facilitating knowledge acquisition both inside and outside the classroom (Radović & Passey, 2016; Radović et al., 2019). This is demonstrated in the findings of Murphy et al. (2020), Diara (2023), and Ceviker et al. (2022), which identified benefits such as a shift toward self-regulated learning, increased flexibility and autonomy for students, and a better connection between home and school activities. Homework breaks down the traditional boundary between home and school learning, allowing classroom materials and educational obligations to be applied to informal learning situations (Fig. 1).

Fig. 1 An overview of learning settings and learning activities (Radović & Passey, 2016)

Enhancing teachers’ knowledge of students’ thinking from analyzing their homework

As Cooper and colleagues (2006) argued, one way to gain insight into students’ knowledge and skills is through the evaluation of homework. Moreover, it can help teachers not only deepen their understanding of students’ ways of thinking and their level of understanding of mathematical concepts, but also plan instruction (Ceviker et al., 2022). Such an approach allows teachers to respond in a timely manner to clarify perceived ambiguities and to adapt homework assignments and follow-up classroom discussions to students’ needs and to demonstrated misconceptions (poorly formed, fragile, or missing concepts) (Murphy et al., 2020). In addition, analysing students’ homework allows teachers to be aware of the individual needs of each student (Cooper et al., 2006; Radović et al., 2019; Zhai et al., 2023). Murphy et al. (2020) found that the type of intervention or tools used can impact teachers’ ability to target specific problems during their classroom review of homework.

Homework copying between students

Despite the potential benefits of completing homework, students in traditional learning settings with paper-based homework often resort to copying it (Diara, 2023; Zhai et al., 2023). Academic dishonesty undermines the student’s integrity and places a burden on teachers, who must invest time and effort into preventing it or handling the aftermath when their efforts fail (Emerson & Smith, 2022). According to Sweet’s (2017) report, 80% of surveyed students admitted to copying homework at least once a month. Similarly, Felder's (2011) study found that 49% of surveyed students engaged in unauthorized collaboration on homework. Students often justify their misconduct as a means to manage their workload given their time and resource constraints (Magalhães et al., 2020). Other reasons cited by Palazzo et al. (2010) include difficult homework problems that require too much time and a lack of interest in the learning that comes with homework. Felder (2011) suggests that this behavior is more prevalent in academic environments that prioritize students’ academic performance over the quality and quantity of their knowledge acquisition.

When students copy homework, teachers lack accurate information about their work, which can lead to incorrect assumptions about their mathematical abilities and the causes of their errors (Palazzo et al., 2010; Radović et al., 2019). This creates an ongoing challenge for teachers and researchers to develop more effective approaches to encourage and incentivize students to complete homework regularly, as well as to assist teachers in analyzing homework to better understand students' thought processes and plan their instruction accordingly (Heffernan, 2019).

Research questions for this study

Numerous studies examined the empirical relationship between homework and academic achievement, motivation, and self-regulation (Heffernan, 2019; Magalhães et al., 2020). Studies have also analyzed practices such as copying homework and possible remedies (Felder, 2011; Palazzo et al., 2010; Radović et al., 2019). Another strand of literature considered educational technology as an enabling tool for students’ homework engagement and academic gains (e.g., Cooper et al., 2006; Hillmayr et al., 2020; Murphy et al., 2020; Radović & Passey, 2016).

In researching the instructional interplay between educational technology, teaching strategies, and student learning practices, this study operates under two main postulates while acknowledging the potential benefits and drawbacks of homework. The first postulate is that the use of technology in education can bring many benefits, but these vary in terms of implementation, student activities, and the teacher’s role. The second postulate is that homework is not necessarily effective, and its success depends on its characteristics and pedagogical implementation. This study was conducted to address a gap in research and evaluate students' learning gains in different learning environments where technology was implemented and students’ and teachers’ activities were affected by the method of homework technology implementation. The study sought to answer one main research question:

RQ: To what extent do student learning outcomes differ in different learning environments where technology was implemented and students’ and teachers’ activities were affected by the method of homework technology implementation?

Research methods

To evaluate different instructional interplays between technology and homework activities, this study employs a mixed-methods research design (quantitative and qualitative) with pre- and post-testing. Three variants of learning environments were designed: (1) Traditional homework (control group, CON); (2) Technology-supported homework with final solutions (experimental group 1, EXP1); and (3) Technology-supported homework with explanations (experimental group 2, EXP2). The detailed differences are explained in the “Treatment and the context of the study” section.

The study was conducted over a period of 8 weeks. Multiple data sources were used: a PreTest assessment (initial knowledge test), a knowledge test every second week (progress tests), and a PostTest assessment at the end of the study (assessing students' overall knowledge acquired during the intervention). Additionally, students' individual responses to homework tasks were collected and used in the qualitative part of the analysis.

Participants

This study took place in four different elementary schools in Serbia. Participants were 325 students from 12 different classroom cohorts who gave written consent to participate in the study. There were 165 fifth-, 103 sixth-, 35 seventh-, and 22 eighth-grade students (aged 11 to 14). Classes were randomly assigned to one of three conditions within each school: CON group (n = 120), EXP1 group (n = 97), and EXP2 group (n = 101).

Treatment and the context of the study

The experimental conditions were three variations of the mathematics learning environment. The different interactions between educational technology, teaching practices and students’ learning activities are shown in Table 1.

Table 1 The interplay between educational technology, teaching practice, and student learning activities across different research groups

In the control condition, students did homework as usual, without using educational technology. At the end of each lesson, the teacher provided students with a list of homework assignments that were the same for all students. In the subsequent lesson, during the introduction (usually lasting a few minutes), the teacher briefly reviewed the students' paper-based solutions, addressed any issues with the tasks, resolved any doubts raised by students, and proceeded with the scheduled lesson material.

Students in the experimental groups (EXP1 and EXP2) completed their homework on the eZbirka web platform, which differed from the control group in several ways. Firstly, the tasks were slightly randomized for each student, making it difficult for them to copy from others. Secondly, the platform provided automatic feedback on students’ submissions, allowing them to self-assess and compare their answers with rubrics. This feature allowed students to practice as many times as they wanted. In contrast, the control group completed homework as usual, with no automatic feedback and no opportunity for redoing their work. Teachers in the experimental groups also had the advantage of checking students’ results before the next lesson, which enabled them to address any issues or concerns before class began. This personalized approach allowed teachers to adjust their teaching plans in advance according to the needs of their students.

The experimental conditions differed in the type of solution submitted by students. In EXP1, students provided short answers, enabling teachers to check if the homework was completed and whether the answers were correct. However, teachers were unable to determine the nature of any errors made by the students (such as calculation errors or misconceptions). In contrast, students in EXP2 not only provided the final answer but also detailed the steps they took to arrive at the solution and the reasoning behind their chosen problem-solving strategy. If they were unable to solve the problem, they also explained the reason for their difficulty. This provided valuable insight to the teachers, who could use it to address any misunderstandings in the next lesson.

Each homework assignment comprised six tasks intended to aid students in comprehending the material taught after each lesson. The tasks were adjusted to align with the curriculum and encompassed various levels of complexity and mathematical competencies. Using the eZbirka web platform (depicted in Fig. 2), students typed in their solutions for each task in the designated answer field. The answers were then saved in the database and promptly available for teachers to review and analyze. Upon submission, students were given feedback and had the option to self-assess their solutions. If the feedback helped them enhance their learning and comprehension, they could attempt another homework assignment with different tasks (Fig. 2).

Fig. 2 An example of homework for the teaching unit “Addition and subtraction of fractions with the same denominator” for students in the eighth grade

Results

Analysis of knowledge test results

Since the data were normally distributed, a parametric test was performed. Analyses of variance (ANOVA) were used to determine whether there was a statistically significant difference between the study groups in terms of knowledge developed. The significant results, adjusted with the correction for multiple testing, were further examined by post hoc tests and pairwise comparisons between groups.
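
For illustration, the group comparison described above could be reproduced with a short script such as the sketch below. It is illustrative only: it assumes the scores are stored in a pandas DataFrame with hypothetical columns group and score (read from an assumed file name), and it uses Tukey's HSD as an example post hoc procedure; the actual software and post hoc test used in the study are not specified here.

    import pandas as pd
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical layout: one row per student, with the research condition
    # ("CON", "EXP1", "EXP2") and the score on a given knowledge test.
    df = pd.read_csv("knowledge_test_scores.csv")  # assumed file and columns

    # One-way ANOVA across the three conditions
    scores_by_group = [g["score"].values for _, g in df.groupby("group")]
    f_stat, p_value = stats.f_oneway(*scores_by_group)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

    # Post hoc pairwise comparisons (Tukey HSD) if the omnibus test is significant
    if p_value < 0.05:
        tukey = pairwise_tukeyhsd(endog=df["score"], groups=df["group"], alpha=0.05)
        print(tukey.summary())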

The comparison of the mean values and the effect of the different learning environments on students’ learning performance and test results are shown in Table 2. Although the participants in the EXP2 group tended to score higher on the PreTest, the ANOVA test showed no statistically significant difference between the groups in terms of students' performance on the PreTest.

Table 2 Comparison of test performance by group

Regarding students’ results on the assessments of knowledge during the study (Knowledge Tests 1–4), we can observe some significant differences between the groups. As for Knowledge Test 1, the analysis of the results shows that the students of the three groups achieved relatively similar results (Table 2 and Fig. 3). After the first 2 weeks, the results of the experimental groups began to improve (with several statistically significant differences depending on the research condition). As for Knowledge Test 2, the ANOVA showed that there was a statistically significant difference in the mean test score between groups (F(2, 296) = 3.97, p = 0.02). The post hoc multiple comparisons test showed that the mean test score differed significantly only between the control group and EXP2 (Mean Diff = 0.525, p = 0.02). As for Knowledge Test 3, analysis of the results showed no statistical difference. For Knowledge Test 4, however, the ANOVA showed that there was again a statistically significant difference in the mean test score between the groups (F(2, 111) = 3.8, p = 0.02). The post hoc multiple comparison test showed that the mean test score was again significantly different only between the control group and EXP2 (Mean Diff = 0.878, p = 0.02).

Fig. 3 Comparison of mean knowledge test scores by group

Finally, the posttest scores of the three groups of students were compared (Table 2). The results of the ANOVA showed a statistically significant difference in the mean test score between the groups with respect to students’ final knowledge test (F(2, 313) = 4.47, p = 0.01). The post hoc multiple comparisons test showed that the mean test score was again significantly different only between the control group and EXP2 (Mean Diff = 0.574, p = 0.01). Students in the Technology-assisted homework with explanations (EXP2) group developed significantly more knowledge.

Pearson's correlation analysis was performed to determine the relationship between the knowledge test results of students in EXP2. The analysis suggested that students' results were all significantly correlated (Table 3): higher scores on the knowledge tests were associated with higher scores on the final test.

Table 3 Pearson’s correlation coefficients
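
For completeness, the correlation reported in Table 3 could be computed as in the following sketch; the paired score arrays are hypothetical example values used purely for illustration, not data from the study.

    from scipy import stats

    # Hypothetical paired scores for EXP2 students (aligned by student):
    # an interim knowledge test and the final PostTest.
    knowledge_test = [3.0, 4.5, 2.0, 5.0, 3.5]  # assumed example values
    post_test = [3.5, 4.0, 2.5, 5.0, 4.0]       # assumed example values

    r, p = stats.pearsonr(knowledge_test, post_test)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")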

Analysis of students’ homework task-solving descriptions

The purpose of this qualitative part of the study is to analyze and illustrate students’ errors and misconceptions while working on homework tasks (Prakitipong & Nakamura, 2006; Trance, 2013). To this end, Newman’s error analysis scheme was used to code errors into four types: (1) Comprehension (errors in understanding the task), (2) Transformation (errors in problem transformation), (3) Mathematical processing (errors in process skills), and (4) Encoding (errors in writing answers) (Newman, 1977, 1983; Trance, 2013). From the homework solutions of students in the EXP2 group, we selected characteristic tasks and answers to illustrate the extent to which teachers were able to analyze and classify student errors as a reference for choosing appropriate teaching strategies for the next lesson to reduce and even eliminate student errors and misconceptions. This process could not be replicated with homework solutions from the EXP1 group because the students’ thinking processes and problem-solving steps were often missing.

Homework task: comprehension errors

With comprehension errors, students misunderstand the requirements of the task: they do not understand the meaning of symbols or questions, or they misunderstand mathematical terms. This category also includes students' inability to determine what is known and what is required in the problem (Newman, 1977, 1983). The most common ways of making these types of errors were selecting information incorrectly, not being able to distinguish between relevant and irrelevant information (e.g., using all the information provided in a task or neglecting relevant information), or not being able to recall information that was not provided directly in the task.

Task

In the lesson “Solve systems of equations using substitution”, students should check their knowledge by solving the following task: “Solve the system of equations using substitution: x = y + 23, x + 2 = −1”

Solution

The statement obtained from the student: “I don’t know how to work with the system of equations without the variable y in the second equation.”

The student had solved all the previous tasks in the same lesson, where he had to calculate and handle systems of equations. In this case, the student was confused because one of the variables was missing. The error occurred because the student was not able to use information that was not provided directly in the task.
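
For reference, a correct solution (our own illustration, not part of the student's record) uses the second equation directly: x + 2 = −1 gives x = −3, and substituting into x = y + 23 gives y = x − 23 = −3 − 23 = −26.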

Task

In the lesson “Equations with an unknown factor”, students should check their knowledge of equations. One of the tasks is: “What is the size of side b of a rectangle, if the size of side a = 3.5 cm and the area of the rectangle P = 14 cm²?”

Solution

The explanation obtained from the student: “I get the unknown side if I divide the area by the size of the known side, but how can we convert centimeters to fractions?”

This lesson incorporates solving real-world examples of equations that contain fractions. The student demonstrates procedural knowledge of solving equations, but also a degree of misunderstanding of the relationship between the unit of measurement and the measured value.
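
For reference, the intended computation (our own illustration) is b = P / a = 14 cm² / 3.5 cm = 4 cm; no conversion of centimeters to fractions is actually required.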

Homework task: transformation errors

A transformation error occurs when the student incorrectly transforms the problem into a mathematical model such as an equation, picture, graph, or table. This type of error is also noted when students try to answer the task without using all the required mathematical procedures, or when they use an incorrect operation or mathematical concept without analyzing whether it is applicable.

Task

In the lesson “Percentage,” with the aim of refreshing students’ knowledge about the concept and characteristics of percentage, one of the questions reads, “Mrs. Petrovic bought 1 kg of shelled walnut. When she cleaned it, she got 400 g of clean walnuts. Determine what percentage fell on the shell.”

Solution

The explanation given by the student: “Using the ratio: 1000 g : 400 g = 100 : x, we get that x = 40%. Therefore, on the shell 140% is gone.”

The student set up the proportion correctly and was able to solve it, but did not understand how to answer the authentic, real-world question; the learner did not understand the mathematical concept of percentage. The result reported as a solution suggests that the student answered without analyzing whether the solution was even possible.
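
For reference, a correct answer (our own illustration) follows from the same proportion: the clean walnuts account for 400/1000 = 40%, so the shell accounts for 100% − 40% = 60%.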

Homework task: mathematical processing errors

Errors of this type correspond to students’ failure to apply mathematical rules or procedures, or to a lack of practice in solving mathematical problems. A process-skill error occurs when the student uses the correct procedure but makes mistakes in calculation or computation. These include errors in solving algebraic expressions or functions, errors in arithmetic, and errors in mathematical interpretation.

Task

In the lesson “Application of multiplication and division of fractions”, students must apply their knowledge of multiplication and division of fractions to real-life examples. One of the tasks is: “At the beginning of the school year, 400 notebooks were sold in a bookstore. The first day 3/5 of the total quantity was sold, and 5/8 of the rest was sold on the second day. How many notebooks were sold on the third day?”

Solution

The statement obtained from the student: “3/5 equals 220 notebooks, 5/8 equals 250 notebooks. That means on the third day 490 notebooks were sold.”

The teacher may notice the student’s poor handling of fractions (three-fifths of 400 does not equal 220), as well as the lack of understanding of the mathematical requirements (in the second part of the answer, “5/8 equals 250” is calculated from the original value given in the task requirement, 400, and not from what is left after the first day’s sales).
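
For reference, the intended calculation (our own illustration) is: on the first day 3/5 · 400 = 240 notebooks were sold, leaving 400 − 240 = 160; on the second day 5/8 · 160 = 100 were sold; therefore 400 − 240 − 100 = 60 notebooks were sold on the third day.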

Task

In the unit “More Complex Equations,” students were expected to solve equations in which one variable occurs more than once: “Solve the equation: (a) 2x + 3x + 5.3 + 1.4 = 9.2; (b) 4x + 12.5 + 3.8 = x − 10.”

Solution

The explanation obtained from the student: “I didn’t realize that in the task there can be two variables x, and I don’t know which of the arithmetic operations to start from.”

In this case, the learner self-identified a gap in knowledge. The error corresponds to the student’s failure to perform the mathematical process of grouping (combining) like terms.
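
For reference, grouping the like terms in part (a) (our own illustration) gives 2x + 3x + 5.3 + 1.4 = 9.2, that is 5x + 6.7 = 9.2, so 5x = 2.5 and x = 0.5; part (b) is solved analogously by first collecting the x terms on one side.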

Homework task: mathematical encoding errors

An encoding error is an error in writing the final answer correctly: the student is unable to show that the answer is true, to validate the mathematical solution in terms of the real-world problem, or to write a conclusion. This error is often reflected in an impossible or unrealistic answer.

Task

In the teaching unit “Construction of the perpendicular bisector,” one of the assignments verifies that students are able to use their knowledge to solve practical problems: “Milos drew a circle but forgot to indicate the center point. Help him find the center point. How can you construct it?”

Solution

Student’s explanation: “I measured the diameter of the circle and divided it in half to find the point from which all points are equidistant.”

Although the student showed that he understood the requirements of the assignment, his solution suggests that he was unable to correctly apply mathematical laws to solve the real problem. The solution given by the student cannot be carried out: a diameter by definition passes through the center of the circle, so without knowing where the center is, it is impossible to determine the diameter.

Task

In the unit “Application of numerical expressions” students are expected to apply their knowledge of forming numerical expressions to solve problems that may arise in everyday life. The task, which students must solve, is as follows: “Marko has to divide 1 kg of sugar among five bowls. How many kilograms of sugar should there be in each bowl?”

Solution

The explanation obtained from the student: “1 kg * 5 = 5 kg. Each bowl contains 5 kg of sugar.”

This is an example of the student’s failure to interpret a mathematical answer as a solution that fits into the real-world context of a task. His answer of 5 kg is, within the context of this task, an answer that makes no sense.
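
For reference, the intended computation (our own illustration) is 1 kg ÷ 5 = 0.2 kg, so each bowl should contain 0.2 kg (200 g) of sugar.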

Discussion

The literature by Cooper et al. (2006), Heffernan (2019), Bray and Tangney (2017), Radović et al. (2019), and others suggests that students and teachers should utilize technology to enhance the homework process. However, to facilitate wider adoption of technology, it is essential to provide teachers with adequate support to develop appropriate teaching strategies and promote effective interactions (Viberg et al., 2023). The goal of the present study was to provide empirical evidence on how to optimize learning outcomes for students in both home and school settings (Bray & Tangney, 2017; Hillmayr et al., 2020) by researching the interaction between educational technology, teaching practices, and student learning activities. The study yielded both qualitative and quantitative findings, which provide several important points of discussion.

The present study found statistically significant differences in students’ grades between the Technology-assisted homework with explanations (EXP2) group and the Traditional homework group. This was demonstrated through two knowledge tests during the study, as well as the final knowledge test at the end of the experimental period. To some extent, this result is consistent with prior research on assigning and collecting student homework. For instance, Murphy et al. (2020) noted that information about common wrong answers for each task helped teachers to address students’ cognitive issues and promote better comprehension. In the present study, students who provided explanations for their problem-solving activities (EXP2) enabled teachers to better identify those who struggled with various aspects of the homework and mathematical concepts. As Murphy et al. (2020) observed in an earlier study, providing more and better explanations may be especially helpful for lower-performing students. One could interpret students’ activities as providing them with more time for self-reflection and awareness of their understanding. By writing down the steps they took to arrive at a solution, the reasoning behind their chosen problem-solving strategy, or the reason for their difficulty, students were able to better clarify their understanding.

However, the present study brings another important result to be discussed. While technology can be advantageous for homework activities, its benefits may not always be statistically significant for students’ learning. Specifically, the present study found no statistically significant differences in learning outcomes between students who used the same technologies for homework but submitted only final solutions (EXP1) and those who completed traditional homework. This outcome suggests that effective practices must be developed through interaction between teachers and students, and, as Viberg et al. (2023) have noted, the development of these practices should be led by teachers. Moreover, according to Zhai et al. (2023), the effectiveness of technology depends on the role of the teacher and students’ activities. Therefore, for technology integration in education, it is essential to have a comprehensive understanding of the educational context, pedagogical principles, and lesson design. This aligns with previous research on the subject, which emphasizes the challenges of effectively integrating digital technology into the mathematics classroom (Bray & Tangney, 2017; Hillmayr et al., 2020). Below, we discuss several specific practical features of technology integration in mathematics education during homework based on the study's findings.

The qualitative component of the study showcased how teachers could effectively analyze and comprehend students’ errors and misconceptions while reviewing their work. In the EXP2 group, teachers were able to adjust their instruction to cater to their students’ needs by scrutinizing patterns of misconceptions in homework, which was not feasible with students’ homework in the EXP1 group. Through a systematic process that involves identifying patterns of errors and misunderstandings, analysing their causes, adapting lessons to meet students' needs, and implementing corrective action, teachers made significant progress in comprehending and addressing students’ difficulties in learning mathematics (Murphy et al., 2020; Viberg et al., 2023). This study demonstrated that this pedagogical process was highly beneficial for student learning.

Limitations

Several limitations of this study must be considered. First, we only measured students’ direct learning outcomes—knowledge development. Studies are needed that measure significant variables other than grades and take into account motivation, satisfaction during learning, the development of self-regulatory skills, etc. Second, in this study we did not evaluate the teachers’ perspective on the process, which may help to better understand the overall effect of technology integration in education. The third limitation pertains to the implementation of technologies in homework specifically for mathematics courses. Therefore, the findings of this research cannot be generalized to other STEAM subjects (such as science, technology, engineering, and the arts) or to other learning activities. The fourth limitation is the relatively short period of experimentation, which was limited to 8 weeks. To obtain more accurate results, it would be worthwhile to conduct a longitudinal study that spans an entire semester or academic year. It is also essential to recognize the strong possibility of unobserved confounding variables, including unobserved mediators that may be correlated with targeted homework activities and student achievement, such as homework time, number of tasks completed, homework completion rates, and cognitive load during students’ learning.

In future research, we will try to apply this research method and expand this work to larger samples of students and teachers to allow for more comprehensive findings.

Conclusions

In conclusion, this study has contributed to the field in two significant ways. Firstly, we conducted a rigorous experiment that intersected three crucial elements of contemporary education: (a) technology, (b) mathematics homework practices, and (c) teacher-student interactions. The results demonstrate that this overlap provides a promising area for intervention in the ongoing pursuit of enhancing students’ mathematics achievement, as well as the teachers’ activities of identifying patterns of errors and misunderstandings, analysing their causes, adapting lessons to suit students’ needs, and implementing corrective measures. Secondly, we show that technology cannot independently establish effective connections between learning environments and situations. The critical factors are the pedagogical activities that the technology supports, how it is implemented in the learning process, and the teaching approach employed (Bray & Tangney, 2017; Drijvers, 2019; Radović et al., 2019). It is essential to have a comprehensive understanding of the educational context, pedagogical principles, and lesson design.

Taken together, the findings of this study suggest four practical recommendations that can contribute to the successful implementation of technology for homework activities (also presented in Table 1 and Fig. 4):

  1. Provide opportunities for students to articulate their problem-solving methods and thought processes. Students should use language that reflects their comprehension of the subject matter and the complexity of the concept or problem they are describing.

  2. Use students’ answers to gain valuable insights into their cognitive processes, identify any misconceptions, and assess their level of understanding. This information can then be used to tailor the teaching approach and provide targeted guidance that addresses student needs.

  3. Provide students with timely feedback on their homework performance or a rubric for self-assessment to enhance their learning experience. By receiving immediate feedback, students can identify areas of strength and weakness and take steps to address any gaps in their understanding.

  4. Allocate sufficient time in the next class to briefly summarize the main errors and misconceptions that students exhibited. Adjust the teaching approach to address common areas of difficulty and create a classroom culture that values learning from mistakes and encourages students to ask questions and seek clarification.

Fig. 4 The schema of successful implementation of technology for homework activities (Radović et al., 2019)