Introduction

Teachers prescribe homework with diverse intentions, such as revising or practicing what has been worked on in class, expanding academic skills, preparing new content, or ensuring that every student participates in their own learning process (Epstein & Van Voorhis, 2001). At this point, achieving a homework prescription that effectively favors school engagement and promotes student autonomy in learning has become a challenge (Rodríguez et al., 2020). With the ultimate purpose of turning school homework into a valuable educational resource, this quasi-experimental study was designed to verify the extent to which implementing the MITCA method (a homework implementation method; Valle & Rodríguez, 2020) contributes to students’ school engagement.

The study of school engagement has acquired great relevance in education in the last decade (Boekaerts, 2016; Christenson et al., 2012), to the extent that it has been linked, among other variables, to early school dropout (Wang & Fredricks, 2014) and the disruptive behavior of adolescents (O’Toole & Due, 2015; Wang & Fredricks, 2014). Indeed, the available evidence has linked school engagement with satisfaction in school (King & Gaerlan, 2014) and even life satisfaction among students (Liang et al., 2016; Martin et al., 2014).

In the academic field, previous research has shown that achievement and learning are positively linked to school engagement (e.g., Froiland & Worrell, 2016; Motti-Stefanidi et al., 2015; Tomás et al., 2016). The interaction of school engagement with academic motivation has also been documented in terms of perceived competence (Li et al., 2010), effort expenditure, and persistence (Skinner et al., 2009).

Prescription of quality homework

In an attempt to ensure that homework contributes to promoting a more positive attitude toward school (Buijs & Admiraal, 2013; Ramdass & Zimmerman, 2011), favors academic involvement, and protects students’ emotional well-being and perception of competence (Dettmers et al., 2011; Ramdass & Zimmerman, 2011), various authors have identified conditions for the prescription of quality homework (see, e.g., Coutts, 2004; Marzano & Pickering, 2007; Rosário et al., 2018; Rosário et al., 2019; Vatterott, 2010). Taking these contributions into consideration, along with other studies that have incorporated homework into teaching programs aimed at improving organizational habits and student autonomy (Akioka & Gilmore, 2013; Breaux et al., 2019; Flunger et al., 2021; Gambill et al., 2008; Langberg, 2011), the evidence related to homework characteristics and correction practices is compiled below.

Evidence on the conditions for prescribing quality homework

One of the conditions that seems to attract the greatest consensus for prescribing quality homework is the need to establish and make explicit the purpose of the tasks (Coutts, 2004; Marzano & Pickering, 2007; Rosário et al., 2019). Ensuring that the goal of the assignments is clear and consistent with the type of tasks appears to be a fundamental element of quality homework prescription. Furthermore, it seems relevant to design tasks that students perceive as meaningful, useful, and interesting. Along these lines, assigning varied activities that both attend to students’ needs and preferences and are adjusted to the academic curriculum could be understood as a condition for the prescription of quality homework (Akioka & Gilmore, 2013; Flunger et al., 2021; Marzano & Pickering, 2007; Rosário et al., 2019; Vatterott, 2010).

Evidence on homework correction practices

Making homework a valuable educational resource surely requires feedback that incorporates frequent and personalized comments on the homework itself, including the correct answers where applicable. Teachers’ feedback on homework should also include notes for improvement and/or advice that can be extrapolated to exam situations (Akioka & Gilmore, 2013). In addition to constituting a reference for the teacher, homework correction and feedback would contribute to students’ own self-assessment (Rosário et al., 2019) and, specifically, to the recognition of their difficulties (Bang, 2012; Epstein & Van Voorhis, 2001).

In summary, quality homework must consider, at least, conditions related to its prescription and correction; given that, to the best of our knowledge, no existing program or method specifies the set of guidelines to follow, the MITCA method (Valle & Rodríguez, 2020) was designed.

MITCA and school engagement

The MITCA method has been designed by establishing five conditions:

  1. Not only revision or post-topic tasks should be prescribed; similar proportions of revision, organization, and production tasks, including pre-topic tasks, should also be assigned (Varied Tasks);
  2. Tasks are described by the mental work they involve and the content they address (Specific Tasks);
  3. The teacher must convey the usefulness, interest, importance, and/or applicability of homework (Worthwhile Tasks);
  4. Tasks are prescribed weekly and the students establish their own timeslots in which to do them (Weekly Tasks); and
  5. Tasks are corrected weekly, either in the classroom or individually, differentiating between aspects to improve and positive points (Evaluated Tasks).

These five conditions for the prescription of homework are synthesized as Varied, Specific, Worthwhile, Weekly, and Evaluated. Thus, the method envisages that students, when dealing with homework at home, not only carry out information-identification tasks (e.g., marking, writing out, and/or literal reviewing) and organizational tasks (e.g., differentiating and ordering ideas), but are also involved in more constructive forms of participation (e.g., paraphrasing or writing an opinion) and interactive ones (e.g., preparing an explanation for others or defending an argument in public) (Varied Tasks—STEP 1 of the MITCA method).

The Varied Tasks condition, which implicitly calls for the increased prescription of more elaborate and generally more challenging extension tasks for learners, should positively affect learners’ cognitive and behavioral engagement (see, e.g., Dunlosky et al., 2013; Fiorella & Mayer, 2015).

On the other hand, taking into account the evidence on the need to establish and explain the purpose of the tasks, and based on the TASC conditions developed by McCardle et al. (2016) for establishing learning purposes, MITCA directs the teacher to define the prescribed tasks in terms of cognitive operation and content (Specific Tasks—STEP 2 of the MITCA method). Setting clear purposes for homework is expected to allow progress to be monitored, difficulties to be recognized, and revision opportunities to increase (McCardle et al., 2016). Furthermore, it has been linked to student engagement in school (Shernoff, 2013).

MITCA maintains that the subjective value attributed to the prescribed tasks can be improved when expectations are adjusted, intrinsic interest is accommodated as far as possible, and the instrumental value of the tasks is identified (Worthwhile Tasks—STEP 3 of the MITCA method) (Eccles & Wigfield, 2002). Aligning homework with the syllabus and integrating it into classroom activities would enhance students’ perception of its usefulness, thereby contributing to their motivation and behavioral engagement in academic tasks (Núñez et al., 2019).

In this sense, it is understood that attributing some kind of recognition to tasks (for example, “this type of task will be in the exam” or “the best ones will be presented in class”) or instrumental value (e.g., “they will help you learn to shop wisely in the sales or to speak in public”) will improve affective school engagement (Katz & Assor, 2006).

In summary, based on the literature on the characteristics of the prescribed tasks, it is hypothesized that the Varied, Specific, and Worthwhile Tasks conditions of the MITCA method improve and/or help sustain cognitive, behavioral, and/or emotional engagement with school.

The MITCA method proposes a weekly task prescription, encouraging the teacher to collaborate with students in establishing their own timetable for completing the tasks during the first six weeks of implementation of the method (Weekly Tasks—STEP 4 of the MITCA method). Various programs aiming to promote students’ autonomy and self-regulation through homework have considered similar conditions, such as planning schoolwork outside of school hours, estimating the time required to complete tasks (Langberg, 2011), and providing instructions for organizing agendas and materials (Gambill et al., 2008).

Finally, MITCA also includes the informative and motivating feedback condition as a feedback strategy (Evaluated Tasks—STEP 5 of the MITCA method). It is understood that feedback that provides individual information on improvements and guides on aspects to improve—informative feedback—becomes an educational resource capable of optimizing the learner’s self-regulatory skills and increasing their academic engagement. In fact, we have evidence to suggest that feedback that incorporates both criticism and praise, directed at controllable aspects, such as effort or dedication, will contribute to students’ motivational engagement (Cunha et al., 2018; Fong et al., 2019).

Likewise, it is expected that the weekly assignment of tasks (Weekly Tasks—STEP 4 of the MITCA method), together with regular informative and motivating feedback (Evaluated Tasks—STEP 5 of the MITCA method), contributes to improving or sustaining students’ school engagement.

To verify whether prescribing homework under these five MITCA conditions for 12 school weeks does indeed contribute to students’ motivational, cognitive, and behavioral engagement, this quasi-experimental study was conducted with a control group (CG) and pre- and post-intervention measures.

Materials and methods

This study follows a quasi-experimental research design with a control group and an experimental group evaluated at two time points (pre-test and post-test). Specifically, the MITCA method was implemented for 12 weeks—a full term of the Spanish academic year—in the subjects of Spanish, Galician, and Mathematics. These three subjects were chosen because they are core subjects in the Spanish academic curriculum: they are common across the grades explored and are given greater weight within the curriculum.

A quasi-experimental study (with classes assigned by convenience to the experimental group, EG, or the control group, CG) was designed to observe the impact of using the MITCA method for 12 school weeks on school engagement, differentiating between cognitive, motivational, and behavioral engagement with school, in 5th- and 6th-grade Primary School students. Data were collected for the dependent variables at two measurement time points (pre-test and post-test). The two groups that participated in this research are therefore characterized by the following conditions:

  • Control condition: A group of teachers with their respective pupils who assign and complete homework according to the teachers’ own convictions, without prior training.

  • Experimental condition: A group of teachers with their respective pupils who assign and complete homework following the MITCA method, with prior training in the method and weekly follow-up by the researchers.

Participants

In this study, a total of 43 teachers participated, teaching either Spanish or Galician Language and/or Mathematics to the 5th- and 6th-grade students in Primary Education. Specifically, there were 23 teachers from the 5th grade and 20 teachers from the 6th grade. The student sample comprised 964 individuals, consisting of 469 boys and 495 girls. These participants were drawn from 20 Primary Education schools located in the Autonomous Community of Galicia (Spain). While convenience sampling was employed to select the participants, the research implementation procedure utilized official channels associated with the teacher training centers of the government of Galicia. This approach ensured that all Primary Schools within the target population had an equal opportunity to be included in the sample.

There were 17 schools in the experimental group, consisting of 11 public schools and 6 subsidized schools. All these schools were situated in an urban context, and the socio-economic level of the area was classified as medium–high according to the Spanish national statistics (National Institute of Statistics, 2020).

In this research, the decision was made to focus the analysis on students in the 5th and 6th grades of Primary Education. These grades were chosen as they represent the final years required to complete this educational stage before transitioning to Secondary Education, which often takes place in a different school center. In Spain, Primary school students have one teacher as their tutor for all or most subjects, except for specialized areas such as English or Physical Education. Meanwhile, Secondary Education students have different teachers for each subject.

The sample was differentiated into an experimental group and a control group. The experimental group was made up of 24 teachers (12 from the 5th grade and 12 from the 6th grade) and 533 students (270 from the 5th grade and 262 from the 6th grade), while the control group was made up of 19 teachers (11 from the 5th grade and 8 from the 6th grade) and 431 students (263 from the 5th grade and 168 from the 6th grade).

All participants were evaluated before and after the intervention. Ethical and biosafety implications were previously approved within the framework of the project developed for this research. Specifically, the study adhered to the guidelines outlined in the Declaration of Helsinki and was conducted in accordance with the ethical standards set by the Ethics Committee of the University of A Coruña, given its involvement with human participants.

Instruments

Students’ school engagement was measured with the Spanish version of the School Engagement Measure (SEM; Fredricks et al., 2005) validated by Ramos-Díaz et al. (2016), which contains 19 items with a 5-point Likert-type response format (where 1 is “never” and 5 is “always”). The exploratory factor analysis of the items in the Spanish adaptation replicates the original structure, differentiating behavioral engagement (α = .82; example items: “In class I pay attention”; “When I am in class I dedicate myself to work (study)”; “I follow the rules set at my school”), emotional engagement (α = .80; example items: “I have fun in class”; “I am happy at school”; “I like being at school”), and cognitive engagement (α = .72; example items: “I read extra books about things we do at school”; “I try to watch TV shows about things we do at school”; “When I read a book I ask myself questions to make sure I understand what I read”).
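
To make the reported reliability coefficients concrete, the following is a minimal sketch in Python (not taken from the original study; the response matrix is simulated) of how a Cronbach’s alpha such as those cited for the SEM subscales can be computed from a respondents-by-items matrix of Likert scores.

```python
# Minimal sketch (not from the original study): Cronbach's alpha for a
# hypothetical Likert-scale subscale, as reported for the SEM subscales.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores (1-5)."""
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the subscale sum
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example with simulated responses from 200 students on a 5-item subscale
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(200, 1))        # shared "engagement" component
noise = rng.integers(-1, 2, size=(200, 5))      # item-specific noise
scores = np.clip(base + noise, 1, 5)
print(round(cronbach_alpha(scores), 2))         # reliability of the simulated subscale
```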

Procedure

Several channels were used to select the sample. The participation of teachers who taught Spanish or Galician and/or Mathematics in the 5th and 6th grade of Primary Education was requested through the six Training and Resource Centers (CFRs) dependent on the Xunta de Galicia (the Government of the Autonomous Community of Galicia, Spain). The official social networks of the Autonomous Center for Training and Innovation (CAFI) of this same institution and the social networks of the research group in Educational Psychology of the University of A Coruña were also used. Four meetings were arranged, one in each of the provinces of the autonomous community of Galicia (Spain), to present the objective of the research to the teachers who showed interest in participating. Twenty-four of the teachers who attended this first meeting agreed to implement the homework prescription method for 12 weeks during the second school term of the academic year—the academic year for Primary Education in Spain is divided into three terms—(experimental group).

Nineteen teachers agreed to participate as a control group, committing themselves to continue with their conventional homework practice without incorporating any change during the 12 weeks of implementation of the method. All the teachers who participated in the intervention received a training seminar of approximately one hour on the principles and conditions for the prescription and correction of homework following the MITCA method (Vieites, 2022).

In the middle of the intervention, 6 weeks after the implementation of the method began, the teachers in the experimental group attended a second meeting whose purpose was to collect impressions on the suitability of the method and on how it fitted into their habits, routines, and particular circumstances.

On Monday or Tuesday of each of the 12 weeks of the intervention, teachers in the experimental group prescribed homework for their students in the classroom. The prescription included (a) the mental process involved (e.g., identify, organize, solve), (b) the academic content addressed (e.g., adjectives, numbers, quantity problems), and (c) the value expressly attributed to those tasks (e.g., the usefulness, interest, importance, and/or applicability of the homework). The prescription of different types of tasks—review vs organization vs production tasks; post-topic vs pre-topic tasks—had to be proportional by the end of the 12-week intervention. From week two of the intervention, the teachers in the experimental group reported not only the tasks they prescribed but also the evaluation procedure adopted. During the 12 weeks of intervention, the correction of the prescribed tasks had to be either individualized—the notebooks were collected and corrected, indicating errors and strong points—or solved in class aloud and/or on the board, one task at a time. The use of both correction procedures was to be proportional by the end of the intervention.

Throughout the intervention period, the participating teachers communicated the homework they prescribed to the students in the experimental group via email. They were encouraged to ask questions and share any difficulties they encountered while incorporating the method into their teaching practice. Moreover, they received constructive feedback from the research group on the content of the tasks assigned, along with suggestions for improvement and corrections.

The data on the variables under study were collected during school hours by research collaborators, with the prior consent of the management team and the students’ families. The variables related to homework and student school engagement were obtained in the second term of the 2020–21 academic year.

Data analysis

To address the hypotheses of this research, intergroup differences were analyzed both before and after the 12-week intervention. Pre-test and post-test comparisons were conducted with the type of prescription—conventional vs MITCA—as a factor and the three dimensions of school engagement (cognitive, behavioral, and emotional engagement) as dependent variables. The differences were interpreted using Cohen’s (1988) criteria, whereby Cohen’s d values below 0.20 indicate no effect, values between 0.20 and 0.49 indicate a small effect, values between 0.50 and 0.79 indicate a moderate effect, and values of 0.80 and above indicate a large effect.
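
As an illustration of the comparisons described above, here is a minimal sketch in Python (using simulated scores rather than the study data) of an independent-samples t test between control and experimental groups, together with a pooled-SD Cohen’s d interpreted against the thresholds just listed.

```python
# Minimal sketch with simulated data (not the study dataset): independent-samples
# t test and pooled-SD Cohen's d for a post-test engagement comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(loc=3.4, scale=0.8, size=431)       # hypothetical CG post-test scores
experimental = rng.normal(loc=3.6, scale=0.8, size=533)  # hypothetical EG post-test scores

t, p = stats.ttest_ind(experimental, control)            # two-sided independent t test

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d with pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

d = cohens_d(experimental, control)
label = ("no effect" if abs(d) < 0.20 else
         "small" if abs(d) < 0.50 else
         "moderate" if abs(d) < 0.80 else "large")
print(f"t = {t:.3f}, p = {p:.3f}, d = {d:.2f} ({label})")
```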

Results

Since the sample selection procedure was convenience-based, our initial objective was to examine whether there were any baseline differences in cognitive, behavioral, and emotional engagement between the control group (n = 431) and the experimental group (n = 533).

Table 1 contains pre-test and post-test data, including means, standard deviations, skewness, kurtosis, and the number of students in each group.

Table 1 Descriptive statistics of the variables of school engagement

While no significant differences were found in cognitive or behavioral engagement between the two groups, significant differences were found in emotional engagement, favoring the experimental group (t = 2.147, p < .05, d = .14) (see Fig. 1).

Fig. 1 Pre-test group differences in school engagement

The analysis of the means suggests significant post-test differences between students who participated in the 12 weeks of MITCA intervention and those who did not, both in emotional engagement (t = 4.185, p < .001, d = .28) and in behavioral engagement (t = 3.610, p < .001, d = .24). No significant differences were found in cognitive engagement (see Fig. 2).

Fig. 2 Post-test group differences in school engagement

According to our results, the group of students who did not participate in the 12-week MITCA significantly worsened their emotional (t = 4.084, p < .001, d = .21), behavioral (t = 3.743, p < .001, d = .20), and cognitive (t = 2.17, p < .05, d = .11) engagement (see Fig. 3).

Fig. 3 Pre-test and post-test differences in the control group

Although the pre-post differences for the group that participated in MITCA did not reach statistical significance, higher average scores were observed for all three dimensions of school engagement after 12 weeks (see Fig. 4).

Fig. 4 Pre-test and post-test differences in the experimental group

Discussion

Previous research suggests that the decline in school engagement typically begins in the last years of primary school, which is the focus of this study (Archambault & Dupéré, 2017; Bae et al., 2020; Fredricks et al., 2019). In line with this, our findings indicate a significant decline in school engagement among participants in the control group, whereas no such decline is observed among MITCA students. Similarly, some studies have indicated that school engagement tends to decrease progressively as students progress through their schooling (see, e.g., Rosário et al., 2019). To examine the potential impact of appropriate homework prescription on students’ school engagement, this research aims to evaluate the effectiveness of the MITCA method in the two final grades of Primary Education (Vieites, 2022).

Overall, the findings of this study support the hypothesis from previous literature regarding the usefulness of homework as a strategy to improve students’ school engagement (Fong et al., 2016). Specifically, after 12 weeks of MITCA implementation, participants reported following the rules set by their school, paying attention in class, and engaging in studying or performing teacher-assigned tasks—behavioral engagement—to a greater extent than students in the control group. The students in the experimental group also reported being happier at school, having more fun, and enjoying themselves more in the classroom—emotional engagement—after 12 weeks than the students who served as controls.

As suggested by previous literature, the diversification of tasks assigned for home completion (Varied Tasks—STEP 1 of the MITCA method) may have contributed to consolidating behavioral engagement and facilitating cognitive engagement in the experimental group. This diversification allows for better adaptation of tasks to students’ learning rhythms and styles (e.g., Rosário et al., 2019; Vatterott, 2010). Furthermore, the specification of processes and content (Specific Tasks—STEP 2 of the MITCA method) and the emphasis on the value of tasks when assigned (Worthwhile Tasks—STEP 3 of the MITCA method) enable students to establish clear connections between homework, classroom activities, and academic achievement. These factors may explain the sustained sense of responsibility and behavioral disposition observed among students in the experimental group after 12 weeks of intervention (e.g., Katz & Assor, 2006; Shernoff, 2013). The observed improvement in behavioral and emotional school engagement may also be attributed to the feedback condition proposed by the MITCA method (Evaluated Tasks—STEP 5 of the MITCA method), as it contributes to students’ sense of control over the learning process and enhances their confidence (Fong et al., 2019).

In this study, it was expected that cognitive engagement would be higher among students in the experimental group after the intervention. This expectation was based on the cognitive operationalization of homework (Specific Tasks—STEP 2 of the MITCA method) and the inclusion of varied tasks (Varied Tasks—STEP 1 of the MITCA method), which implicitly promote more active, constructive, and interactive learning approaches (Dunlosky et al., 2013; Fiorella & Mayer, 2015). While cognitive disengagement was observed in the control group, no significant differences in cognitive engagement were observed between the groups after 12 weeks of intervention.

We interpret that the ability to self-regulate and the use of deep learning strategies associated with cognitive engagement (Fredricks et al., 2004; Wang et al., 2016) may require a more extensive intervention. In this regard, we note that studies such as Wang and Fredricks (2014) and Quin et al. (2017), conducted in Secondary Education, obtained significant intervention results for emotional and behavioral engagement, but not for cognitive engagement. The absence of delayed measures in the research design, which would allow long-term trends to be observed, is assumed to be a limitation of this work. At the same time, the incorporation of resources or instructional strategies specifically aimed at cognitively supporting students’ approach to tasks at home is proposed as a future line of work.

In any case, we note at this point that different authors have warned about a certain disarticulation of the construct of cognitive engagement, which could also be more specific to the subject or content than other dimensions of school engagement (Quin et al., 2017).

The items measuring cognitive engagement focus on tasks such as revision work, self-monitoring strategies, and self-evaluation. However, the cognitive operationalization of the MITCA method may not fully capture the cognitive engagement construct after only 12 weeks, as it lacks qualitative aspects. Li and Lajoie (2022), in presenting an integrated model of cognitive engagement in self-regulated learning, highlight the importance of incorporating the qualitative side of cognitive engagement, concerning students’ learning strategies and their adaptation. Future research could explore interventions that consider both quantitative and qualitative aspects of cognitive engagement, integrating them into the intervention design.

Additionally, it is worth noting that the intervention time for this dimension of engagement may not have been sufficient. It would be valuable to investigate whether longer intervention periods (e.g., the entire school year instead of one term) would lead to more significant changes. Furthermore, considering delayed measures could provide further insights into the long-term effects of the intervention.

Finally, while positive trends are observed when comparing the results of the control group and the experimental group, the absence of statistically significant results may be attributed to the relatively short intervention time of 12 weeks, during which only the prescription and correction conditions of homework were modified. Notably, engagement measures in which significant changes have yet to appear tend to require more time to exhibit noticeable effects. Additionally, it is important to acknowledge a limitation of this study: the absence of delayed measures. Further research with extended intervention periods and delayed measurements could provide deeper insights into the long-term impact of the MITCA method on students’ engagement.

Conclusion

Educational practices aimed at improving and/or maintaining school engagement appear to be key to securing the foundations of learning and preventing later disengagement or dropout. Beyond its potential impact on academic performance, the results of this investigation should be interpreted with regard to the relevance of school engagement for the prevention of school failure or dropout, satisfaction in school, and even personal happiness and self-fulfillment (Clark & Malecki, 2019; Gutiérrez et al., 2017; Liang et al., 2016; Martin et al., 2014; Rodríguez-Fernández et al., 2016).

Previous research has sought to promote student engagement in Primary Education through various means, such as providing classroom support, fostering curiosity, or emphasizing self-awareness of one’s engagement with school tasks (e.g., Schardt et al., 2019; Vaz et al., 2015). In this context, the MITCA method could specify and complement practices and educational strategies specifically aimed at promoting or maintaining school engagement, by differentiating the characteristics of the prescribed tasks, the frequency of prescription, and the type of feedback provided.

In terms of the characteristics of the prescribed tasks, two specific strategies can be proposed to improve and/or maintain student engagement in the final years of Primary Education. First, ensuring that students understand the purpose of the assigned task by specifying the mental processes, the academic content involved, and the attributed value (Coutts, 2004; Marzano & Pickering, 2007; Rosário et al., 2018; Rosário et al., 2019; Vatterott, 2010). Second, prescribing varied tasks, specifically incorporating elaborative and pre-topic tasks (Akioka & Gilmore, 2013; Flunger et al., 2021; Rosário et al., 2019; Vatterott, 2010). These strategies can be considered specific approaches to enhance and/or sustain student engagement at the end of this educational stage.

Similarly, the weekly prescription of tasks, together with teacher support in managing learning time (Gambill et al., 2008; Langberg, 2011), and the combination of individualized correction (personalized comments on the homework itself, pointing out errors and strong points) with whole-class correction (reading answers aloud or discussing them in class, one task at a time) (Akioka & Gilmore, 2013; Rosário et al., 2019), can be understood as specific steps to promote school engagement in light of the findings of this study.

The absence of follow-up measures, which prevents considering the long-term effects of the intervention in general and the potential changes in cognitive engagement specifically, is understood as a limitation in this study. Although the current results suggest the promising value of this intervention, which involved simple and feasible modifications in the practices of participating teachers, future research should consider including follow-up measures.

While we understand that the end of the Primary Education stage can provide a crucial framework for the promotion of self-regulatory skills and is particularly sensitive to the decline in school engagement, the impact of quality homework prescription in other educational stages remains open for future investigation.