1 Introduction

Theories from social psychology propose that from very early on, humans hold beliefs about the malleability of intelligence (Dweck, 1999, 2006). Two beliefs are usually opposed: the belief that intelligence is fixed (fixed mindset) and the belief that intelligence can grow with effort and training (growth mindset). Fixed and growth mindsets develop during childhood (Schroder et al., 2017) and are shaped through interactions with parents (Gunderson et al., 2013), teachers (Rattan et al., 2012), and classmates (Goudeau & Cimpian, 2021; King, 2020). Importantly, having a growth mindset over a fixed mindset is beneficial for well-being at school (Zeng et al., 2016) and academic success (Blackwell et al., 2007), particularly amongst underprivileged students (Sisk et al., 2018; Yeager et al., 2016).

Against this background, numerous mindset interventions have been developed to encourage students to adopt a growth mindset (Aronson et al., 2002; Blackwell et al., 2007; Good et al., 2003; Sarrasin et al., 2018; Yeager et al., 2013) and have been recognized as an effective tool to combat school inequality (Burnette et al., 2023; Huillery et al., 2021; Paunesku et al., 2015). However, recent publications have raised questions about the effectiveness of mindset interventions on performance, indicating that their impact may be most pronounced within specific demographics, notably underprivileged students, while demonstrating limited or even negligible effects on a broader scale (Macnamara & Burgoyne, 2023; Sisk et al., 2018). Although some of these studies have faced scrutiny regarding their methodology (Tipton et al., 2023), they have undeniably shed light on a crucial and so far often underestimated aspect: the inherent variability in the outcomes of these interventions. Consequently, this growing body of research has spurred investigations into the underlying mechanisms that influence the effectiveness of mindset interventions and the various factors that play a role in shaping their impact. Notably, the long-lasting effects of mindset interventions have been attributed to the “snowball effect”, which represents a virtuous circle between adopting a growth mindset following an intervention and experiencing positive outcomes, such as a better grade in an exam (Kenthirarajah & Walton, 2015). Indeed, the snowball effect proposes that students who have learned that their intelligence can grow put this idea to the test by working and persevering more, which can then lead to better outcomes. This positive experience in turn reinforces the growth mindset and motivates the students to work and persevere even more.
However, to generate such a snowball effect, students’ efforts need to be encouraged (Walton & Yeager, 2020; Yeager & Walton, 2011). Walton and Yeager proposed the concept of psychological affordance, using the “seed and soil” metaphor to illustrate how and when interventions that target students’ intelligence mindsets can be beneficial. Just as a seed needs to be planted in fertile soil, mindset interventions need to be delivered in environments with effort-supporting and challenge-seeking norms to positively affect students’ achievement. For example, recent studies conducted in the United States highlighted the moderating impact of peer norms (Yeager et al., 2019) and of the teachers’ mindset (Yeager et al., 2021). The math grades of students improved following a mindset intervention, but only if their math teacher had a growth mindset, as measured by a 2-item scale. Intriguingly, this effect was observed even though the mindset interventions were delivered to the students online, not by the teacher. This finding suggests that teachers’ mindsets play a crucial role in the effectiveness of mindset interventions even when the teachers are not directly involved in delivering them.

This result is consistent with previous work showing significant effects of teachers’ mindsets on their practices, such as the type of feedback they provide to students. For instance, Rattan et al. (2012) demonstrated that teachers with a fixed mindset are more prone to offering comforting feedback to students encountering failure, thereby inadvertently conveying to the students that their teacher doubts their capacity for improvement. It is thus possible that students exposed to growth mindset ideas through an intervention, but confronted with fixed mindset ideas from their teacher, might be less inclined to cultivate a growth mindset and act upon it.

However, until now, the moderating impact of teachers on the efficacy of mindset interventions has only been tested by solely measuring the teachers’ pre-existing mindset. It has never been directly manipulated by administering short mindset interventions to the teachers. Thus, evidence for a causal relationship between the teachers’ mindset and the efficacy of mindset interventions on the students is still lacking. This is an important question, because positively affecting the mindset of students by delivering a short intervention to teachers would be an easy, affordable way to improve the efficiency of interventions targeting students and encourage them to adopt a growth mindset.

Here, we addressed this gap by testing whether a short mindset intervention for teachers improves the impact of mindset interventions delivered to middle-school students from underprivileged areas. Our specific focus was on mathematics mindset, as this domain tends to be more strongly associated with a fixed mindset compared to other subjects taught in middle school. Prior research has consistently demonstrated that mathematics is often linked to the notion of “innate talent” (Gunderson et al., 2017; Leslie et al., 2015). Interestingly, even expertise in mathematics does not provide immunity to this notion, as studies have shown that mathematics teachers tend to hold more fixed mindsets compared to their colleagues teaching social sciences or language (Jonsson et al., 2012).

2 Methods

The experiment was conducted following the Helsinki Declaration. We received ethical approval from the Sorbonne University Ethical Committee and the Versailles Academy. Consent from teachers and the students’ parents was obtained before data collection. The study’s design and analyses were registered after data collection was completed, but before data analyses were conducted (https://osf.io/8ybzf/?view_only=d38f100f3565492d8f598f7e92d23b79).

2.1 Participants

Six middle schools from underprivileged areas (Priority Education Network, “Réseau d’Education Prioritaire”) in the Paris area were selected by the Versailles Academy. In France, schools belong to a Priority Education Network when they are located in urban areas that are pockets of poverty. The criteria are: (1) a large proportion of students from low socio-economic backgrounds, (2) a large proportion of students receiving government-sponsored scholarships following socio-economic eligibility criteria, (3) a large proportion of students living in city housing projects, and (4) a large proportion of students having repeated grades during kindergarten or primary school (Footnote 1). These middle schools were randomly assigned to one of three conditions: control condition (C0), student intervention condition (C1), and combined intervention condition (teacher intervention + student intervention) (C2). Randomization was performed at the level of the middle school to avoid spillover effects, because teachers from the same school usually share their experiences and practices with each other. In total, 29 classrooms of 6th-grade students were initially considered eligible for participation. However, two classrooms from one of the schools in the combined intervention condition (C2) were ultimately excluded due to the absence of the math teacher during the teacher intervention. Consequently, a total of 27 classrooms were included in the study. Only students whose parents provided consent participated in the study and responded to the questionnaires. At T1, 296 students responded, and at T2, 314 participants responded. After eliminating participants who entered an incorrect anonymous personal code (preventing us from matching their pre- and post-intervention data) and those who participated only at T1 or T2, the sample size was reduced to 196 participants.
Further data screening led to the removal of 5 participants due to incomplete data, 9 participants whose teachers missed the teacher mindset intervention (from the 2 excluded classes), and 8 participants who exhibited random response patterns (e.g., consistently responding “0” to all items). Additionally, 3 participants were excluded due to a software error. Since no information was gathered regarding students without parental consent, direct comparisons with the included students are not feasible. Nevertheless, it is important to highlight that all students in the included classes, irrespective of data collection, received a mindset intervention at some point during the academic year. It is therefore unlikely that teachers or students consciously declined consent for participation based on their attitude toward mindset interventions.

Ultimately, only students who met the criteria of having parental consent, participating at both T1 and T2, and completing the entire questionnaire were included in the final analyses, resulting in a population of N = 171 students (see Table 1).

To ensure the confidentiality and anonymity of the students, the exact total number of students per classroom and the names of students whose parents did not provide consent were not disclosed to the researchers. However, based on the typical classroom size in Priority Network middle schools, which usually varies from 18 to 25 students, the approximate total number of eligible students was around 200 for both the student intervention group (C1) and the combined intervention group (C2), and 140 for the no-intervention group (C0). At each school, the percentage of participants with parental consent who were present during both the pretest and post-test assessments ranged from 23 to 40%. It is important to note that the low participation rate can be attributed to the data collection period in the 2021–2022 fall-winter semester, which coincided with a peak of COVID-19 infections, resulting in increased student absences. Moreover, discussions with the school principals revealed that economic instability and the ongoing pandemic significantly impacted parental responsiveness. Many parents were less engaged, responding only to mandatory school communications, which contributed to the lower rate of parental consent for student participation in the study. Similarly, out of 21 eligible teachers, 20 participated in the study, with one teacher being absent during the intervention for C2. However, only 9 teachers completed the questionnaires at both the pre- and post-test stages. This limited teacher participation can be attributed to the overwhelming demands placed on educators during the pandemic, resulting in reduced responsiveness to questionnaires that were not directly linked to their daily teaching practices, unlike the mindset intervention. Notably, all participating teachers were mathematics teachers.
It is important to highlight that the mathematics curriculum for 6th graders in France remains consistent across schools (https://www.education.gouv.fr/les-programmes-du-college-3203), ensuring that students from all included classrooms received uniform instruction encompassing geometry and arithmetic topics.

Table 1 Description and comparison of the three condition groups

2.2 Experimental protocol

Middle schools were randomly assigned to one of the following conditions:

  • Control Condition (C0): no intervention.

  • Student interventions condition (C1): Mindset intervention for students (3 sessions).

  • Combined interventions condition (C2): Mindset intervention for teachers (1 session) and then mindset intervention for students (3 sessions).

Data were collected during the same time period for all conditions. The first data collection (T1) was conducted at the beginning of the school year (October) and the second (T2) after the Christmas holidays (January). For the student intervention condition (C1) and the combined intervention condition (C2), the first data collection was quickly followed by a logistical meeting at which the student interventions were presented. This meeting focused mostly on the logistical components of the interventions (number of interventions, materials, planning). For teachers in the combined intervention condition (C2), this meeting was complemented by a short mindset intervention. The first mindset intervention for students was conducted a few weeks later for the C1 and C2 conditions. Importantly, baseline group comparisons revealed no discernible differences in group composition, with equivalent age and gender distributions (see Table 1) and comparable mindset levels (see Table 2).

2.3 Materials

2.3.1 Student interventions

Three one-hour mindset interventions from the Energie Jeunes program were conducted in class. Each intervention was conducted by trained volunteers. Volunteers conducted the interventions during their available time, without receiving any compensation, and represented a variety of backgrounds, including active workers, students, and retirees. All volunteers were trained in a comprehensive one-day workshop prior to the study. The workshop comprised theoretical and practical training on wise interventions, such as growth mindset interventions. In each classroom, the intervention was conducted by two volunteers, with the prerequisite that at least one of them was an expert in the field of mindset interventions. In more detail, these expert volunteers had already delivered numerous interventions and had received additional training in applying wise interventions in the field. To ensure the fidelity of the intervention and avoid any subjective interpretation, all critical messages were conveyed through videos and games rather than being delivered directly by the volunteers. In this context, the role of the volunteers was primarily centered on facilitating smooth transitions between activities and providing support to encourage active student participation, rather than on delivering explicit mindset messages.

During the sessions, students participated in games, watched videos, and engaged in personal reflection exercises designed to help them develop a growth mindset. The impact of such interventions has been assessed in a large-scale impact study, showing a positive impact on students’ achievement and a negative impact on school fatalism (Huillery et al., 2021).

2.3.2 Teacher intervention

The teacher intervention was organized in two steps. A few days before the logistical meeting, teachers in the combined intervention condition (C2) received a video by email. The main message of the video was that all students can improve in math and that some are too young to acknowledge that errors are part of the learning process. At the end of the video, teachers were encouraged to reflect on practices that they already had (or could develop) in their classes to promote a growth mindset and to help students not be afraid of errors. At the end of the logistical meeting, teachers in the combined intervention condition were invited to share their feelings about the video and the practices they had thought about with their colleagues. The primary goal of this intervention was to convey a positive social norm promoting the adoption of a growth mindset by the teachers and, through it, to equip the teachers with practical strategies to integrate this mindset seamlessly into their classrooms. By emphasizing the significance of nurturing students’ growth mindset, teachers can also reinforce their personal belief in the importance of adopting such a mindset, as supported by the “saying-is-believing” effect (Higgins, 1999). In addition, helping teachers identify specific behaviors and instructions that can be implemented in their classrooms is crucial to ensure meaningful changes in their practices, beyond nurturing their conceptual agreement with growth mindset principles (Barger et al., 2022).

2.3.3 Data collected among the students (Footnote 2)

We tested the hypothesis that the mindset of students will change more if their teachers also receive an intervention by collecting data concerning the students’ intelligence mindset and math mindset. As a change in a student’s mindset generally affects their motivation and achievement (Sarrasin et al., 2018), we also collected data about their perception of the teachers’ mindset, causal attributions, attitude toward challenges, score anticipation, and scores on standardized math tests. A description of these additional variables and the results of the analyses are presented in Supplementary Materials.

Data were collected using REDCap electronic data capture tools hosted at the Paris Brain Institute (Harris et al., 2009, 2019). Most of the responses were given using visual analog scales ranging from 0 to 100. Training questions were proposed at the beginning of the questionnaire to help students familiarize themselves with the scales. All questionnaires (for students and for teachers) can be found in the Supplementary Materials.

2.3.3.1 Socio-demographic data (Footnote 2)

Data concerning the participants’ age and gender were collected. Please note that we could not provide information about the socio-economic status of students’ families because this data was protected by the schools’ administrations and could not be shared with us. However, all included middle schools were labeled “Réseau d’éducation prioritaire – Priority Education Network.”

Because the students knew that the study was not an evaluation or test, we did not expect that reporting socio-demographic information would affect the responses to the subsequent questionnaires (Goudeau et al., 2020). It is worth noting that there were no differences between female and male students in math performance, main effect of gender: F(2, 168) = 0.163, p = 0.85, η² = 1.93e-03. To some extent, this non-significant main effect of gender argues against the possibility that reporting socio-demographic information at the beginning of the experiment affected participants’ responses.

2.3.3.2 Intelligence mindset and mathematics mindset (Footnote 2)

For the first data collection (T1), the students’ mindset was measured using the three-item Growth Mindset Scale (Dweck, 1999) translated into French. Based on Yeager et al. (2016), who recommended adding math-specific mindset items, the same scale was adapted to measure the math mindset by replacing “intelligence” with “skills in mathematics”. Scores were collected on a visual analog scale ranging from 0 to 100. Scores were then reversed so that lower scores reflected a fixed mindset and higher scores a growth mindset. As revealed by interactions between students and experimenters during the first data collection, the fixed mindset items were not easily understood by the students. Indeed, the experimenters had to systematically explain the items to students in each classroom. To avoid this difficulty during the second data collection, mindsets were measured using the Implicit Theories of Intelligence scale (TIDI, Fonseca et al., 2007). We chose this scale because it had been tested and validated for French adolescents and because its items were shorter and more straightforward. This scale comprises three fixed mindset items and three growth mindset items. Responses were also collected on a visual analog scale ranging from 0 to 100. As a validity check, we assessed the correlation between T1 fixed mindset and T2 fixed mindset to ensure that, despite changes in mindset, responses to these items remained interrelated. The results revealed a significant positive correlation, r = 0.18, p = 0.019. This correlation is considered small based on Cohen’s criteria (1988), which is not particularly surprising given that the intervention was designed to influence this variable. To provide a more meaningful interpretation, it would be valuable to compare this correlation to those observed in other pre-post mindset interventions. Regrettably, most studies on mindset interventions do not report pre-post mindset score correlations.
To our knowledge, only two studies have provided such information, with opposite results. Williams et al. (2018) reported a negative correlation of −0.11 between fixed mindset ratings given before and after a mindset intervention, using the Conception of the Nature of Athletic Ability Questionnaire 2. In contrast, Burnette et al. (2018) reported a positive correlation of 0.58 between mindset scores measured before and after a mindset intervention, using the 3 fixed mindset items of Dweck’s 1999 scale. Additional research is required to establish the expected magnitude and direction of the correlation between mindset scores at different points in time, especially following mindset interventions.

For T1, the mindset score was calculated by averaging the scores for the three items and dividing this average by 100. For T2, a fixed mindset score was calculated by averaging the responses to the three fixed mindset items, and the same operation was carried out for the growth mindset items. We then divided the growth mindset score by the sum of the growth and fixed mindset scores. For example, if a student had a growth mindset score of 5 and a fixed mindset score of 2, their mindset score was calculated as 5 / (5 + 2) ≈ 0.71. Both scales measured mindset and yielded comparable scores ranging from 0 (very fixed mindset) to 1 (very growth mindset).
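As a minimal illustration of these two scoring schemes, the computation can be sketched as follows (the function and variable names are ours, not the authors’ analysis code):

```python
def score_t1(items):
    """T1 score: mean of the three (reversed) 0-100 items, rescaled to 0-1."""
    return sum(items) / len(items) / 100

def score_t2(growth_items, fixed_items):
    """T2 score: growth average divided by the sum of growth and fixed averages."""
    growth = sum(growth_items) / len(growth_items)
    fixed = sum(fixed_items) / len(fixed_items)
    return growth / (growth + fixed)

# Worked example from the text: growth = 5, fixed = 2 -> 5 / (5 + 2)
print(round(score_t2([5, 5, 5], [2, 2, 2]), 2))  # 0.71
print(round(score_t1([80, 60, 70]), 2))          # 0.7
```

Both functions map responses onto the same 0 (very fixed) to 1 (very growth) range, which is what makes T1 and T2 scores comparable despite the change of scale.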

2.4 Statistical analyses

We tested our main hypothesis with the contrast method (Brauer & McClelland, 2005). This method allowed us to test the precise prediction that the growth mindset of students in the intervention for students group would increase more than that for students in the no-intervention group and less than students in the interventions for students and teachers group.

The adoption of the contrast method is recommended when investigating the directionality of linear effects across multiple groups (Brauer & McClelland, 2005; Judd, 2000; Judd & McClelland, 1989). This approach presents a distinct advantage over a one-way analysis of variance, which would subsequently require post-hoc contrast analyses to elucidate the direction of the observed effect. In our specific context, such post-hoc analyses would have yielded power ranging from 0.10 to 0.54, notably below the optimal 0.80 to 0.90 range (Serdar et al., 2021). Given this lack of power associated with post-hoc t-tests, the adoption of contrast analyses aligns better with our study’s objectives, thereby enhancing the sensitivity of our analysis (Brauer & McClelland, 2005).

We created a model with two orthogonal centered contrasts. The first contrast corresponded to the main hypothesis, with the no-intervention group coded −1, the student intervention group coded 0, and the teacher + student intervention group coded 1. The second orthogonal contrast tested the residual effect; in it, the no-intervention group was coded 1, the student intervention group −2, and the teacher + student intervention group 1. These contrasts were entered into an ANOVA to test the impact of the condition on the evolution of the mindset between the testing time points. Following Brauer and McClelland (2005), two conditions must be satisfied to validate the main hypothesis: the p value should be < .05 for the contrast of interest (contrast 1), but not for the residual effect contrast (contrast 2). This means that the first contrast, which tested the linear effect of the intervention type, explains a significant part of the variance (Brauer & McClelland, 2005).
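The contrast coding described above can be sketched as follows (a minimal illustration in plain Python, not the authors’ analysis script; group labels follow the paper):

```python
# Group order: C0 (no intervention), C1 (student intervention), C2 (combined).
contrast_1 = [-1, 0, 1]   # contrast of interest: linear effect of group
contrast_2 = [1, -2, 1]   # residual contrast

# Both contrasts are centered (codes sum to zero) and mutually orthogonal,
# so together they fully partition the between-group variance.
assert sum(contrast_1) == 0 and sum(contrast_2) == 0
assert sum(a * b for a, b in zip(contrast_1, contrast_2)) == 0

# Each student's row in the design matrix receives their group's two codes;
# e.g. a student in C1 is coded (0, -2) on the two predictors.
codes = dict(zip(["C0", "C1", "C2"], zip(contrast_1, contrast_2)))
print(codes["C1"])  # (0, -2)
```

The orthogonality check is what licenses the two-part decision rule: because the contrasts do not overlap, a significant first contrast together with a non-significant second one isolates the hypothesized linear trend.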

3 Results

3.1 Impact of the intervention on students

We tested our main hypothesis by comparing the change in students’ mindset between two testing time points: at baseline, before mindset interventions (T1), and after mindset interventions (T2) between three groups of middle-school students: a control group, who did not receive a mindset intervention, a group of students who received a mindset intervention, and a group of students who received a mindset intervention and who were taught by teachers who also received a mindset intervention. We hypothesized that the mindset intervention for students would increase their growth mindset but that this effect would be stronger when associated with an intervention for teachers.

The contrast analysis (Brauer & McClelland, 2005), which compared the evolution of mindsets across testing time points between groups, confirmed a linear effect of group (Footnote 3). Indeed, the first contrast, which tested the main hypothesis of a linear group effect, was significant (F(1, 167) = 8.98, p < .01, η² = 0.05), but the second orthogonal contrast, which tested the residual effect, was not (F(1, 167) = 0.09, p = .76, η² = 5.98e-04). This finding indicates that the growth mindset of students developed more when they received an intervention, and even more so when their teachers also received an intervention.

Cohen’s d values obtained from two-tailed t-tests were used to assess the impact of time on mindset within each group. These values support the idea that the combined student + teacher intervention group (C2) experienced the most pronounced shift toward a growth mindset (no intervention: d = 0.89, intervention for students: d = 1.50, intervention for students and teachers: d = 2.04). Additionally, the Cohen’s d values associated with the differences in mindset evolution between the intervention groups were 0.17 between the no-intervention group (C0) and the student intervention group (C1), and 0.36 between the student intervention group (C1) and the combined student + teacher intervention group (C2) (see Fig. 1).
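For reference, the two kinds of effect size reported here (within-group change over time, and between-group difference in change) can be sketched as follows. This is a generic illustration with toy data and helper names of our choosing, not the authors’ analysis code:

```python
import statistics as st

def cohens_d_paired(t1, t2):
    """Within-group change: mean of the T2-T1 differences over their SD."""
    diffs = [b - a for a, b in zip(t1, t2)]
    return st.mean(diffs) / st.stdev(diffs)

def cohens_d_independent(x, y):
    """Between-group difference: mean difference over the pooled sample SD."""
    nx, ny = len(x), len(y)
    pooled = (((nx - 1) * st.stdev(x) ** 2 + (ny - 1) * st.stdev(y) ** 2)
              / (nx + ny - 2)) ** 0.5
    return (st.mean(x) - st.mean(y)) / pooled

# Toy data only, for illustration:
print(cohens_d_paired([1, 2, 3], [2, 4, 6]))       # 2.0
print(cohens_d_independent([2, 3, 4], [1, 2, 3]))  # 1.0
```

In the study, the within-group d values would be computed on each group’s T1 and T2 mindset scores, and the between-group d values on the per-student change scores of two groups at a time.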

Fig. 1
figure 1

Distribution of changes in growth mindsets across experimental conditions. a Effect of mindset intervention. The boxplots display the distributions of the difference between growth mindsets adopted at baseline (T1, 1st assessment) and after the interventions (T2, 2nd assessment) for each experimental condition. The black lines indicate the medians (50th percentile), green boxes the interquartile range (25th to 75th percentile), and whiskers the lowest and highest score, respectively. Jitter elements to the left of each boxplot show individual participants. b Frequency of mindset changes among students and experimental conditions. The heatmap shows the percentage of students in each experimental condition within six categories of mindset change, ranging from negative changes (which indicate a weaker mindset at the 2nd assessment relative to baseline) to strong changes of more than 80%

Because initial levels of mindset differed marginally between groups, F(2, 167) = 2.96, p = .05, and because two different scales assessed the mindset at T1 and T2, we conducted an additional analysis testing the impact of group on mindset at T2 while controlling for the level of mindset at baseline. We used the same contrasts as in the first analysis, with the no-intervention group coded −1, the student intervention group coded 0, and the teacher + student intervention group coded 1. This time, our dependent variable was mindset at T2, and we added the mindset at baseline to the model to control for this initial level. Results showed a significant first contrast for the main effect of group (F(1, 166) = 4.46, p = .03, η² = 0.03) and a non-significant residual contrast (F(1, 166) = 0.03, p = 0.86, η² = 1.83e-04), revealing that, even after controlling for initial levels of mindset, students in the student intervention group held more of a growth mindset after the interventions than students in the control group, but less than students in the teacher + student intervention group.

Table 2 Zero-order correlations between mindset variables, mean and baseline equivalence for the students

3.2 Impact of the intervention on teachers

Out of the 21 teachers included in the study, only 9 responded to the questionnaire at both T1 and T2, preventing us from conducting statistical analyses regarding changes in mindset. Interestingly, among the teachers who did respond, all of them held a growth math mindset by the end of the data collection period. Additionally, at the conclusion of the data collection, a majority of teachers in C1 and C2 expressed the belief that poor achievement in students was more closely linked to a lack of effort than to a lack of innate talent. However, given the small teacher sample size, this finding should be corroborated by future studies conducted with larger sample sizes (see Table 3).

Table 3 Number of teachers in our sample with a fixed/mixed/growth intelligence and math mindset, and explaining failure by a lack of talent / a lack of work and talent / a lack of work, at T1 and T2

4 Discussion

4.1 General discussion

In this study, we directly targeted growth mindsets of teachers to test whether such direct manipulation affected the efficiency of growth mindset interventions for middle-school students from underprivileged areas. We found evidence that delivering teacher mindset interventions, in addition to mindset interventions for students, was beneficial for encouraging students to adopt a growth mindset.

This finding is in accordance with the results of previous research, which showed that mindset interventions are more effective when delivered to students whose teachers also adopted a growth mindset (Yeager et al., 2021). In the aforementioned study, the intervention condition among students interacted with the teachers’ mindset; the teachers’ mindset positively affected the efficacy of the mindset intervention on math grades relative to the control intervention. The results also showed a significant impact of the intervention on math grades for students whose teacher more strongly endorsed a growth mindset (i.e., mindset score above the mean score + 1 SD) but not for students whose teacher more strongly endorsed a fixed mindset (i.e., mindset score below the mean score − 1 SD).

Here, we have taken an additional step forward by directly testing the effect of exposing teachers to a short mindset intervention before delivering mindset interventions to students, aiming to enhance students’ adoption of a growth mindset. Our findings provide valuable contributions to the ongoing efforts in identifying effective contexts for interventions targeting the growth mindset of middle-school students (Yeager et al., 2022). This encouraging result holds significant promise, as delivering a mindset intervention to teachers represents a low-cost and time-effective solution, making it relatively easy to implement.

Our finding that the combined intervention most strongly changed the students’ mindset toward a growth mindset raises the question of the underlying mechanisms, in particular, how the teachers can influence students’ adoption of stronger growth mindsets. Many studies have been conducted to test the impact of the students’ perception of their teachers’ mindset (Gutshall, 2016), achievement goals (Shim et al., 2013), and feedback (Rattan et al., 2012), but have mostly focused on how these variables, which were measured for the teachers, influenced the students’ goals and mindsets. We found that the interventions tended to change the students’ perception of their teacher’s mindset (see Supplementary Material), although more evidence is still needed to shed light on how the students’ representations of their teachers’ beliefs can determine the efficacy of mindset interventions.

4.2 Limitations

We initially planned to ask teachers about their mindsets at the two timepoints. However, our study was conducted during a period when teachers were still overwhelmed by implementing the health restrictions related to the COVID-19 pandemic. Thus, the number of teachers included was relatively small, and very few teachers responded to the mindset questionnaires at both testing timepoints (i.e., T1 and T2). This restricted sample size did not allow us to test the role of potential moderators, such as causal attributions and how teachers orient their feedback (comforting or strategic), which could have contributed to the observed effects. We therefore encourage more studies to test (1) whether the impact of mindset interventions on student mindsets is determined by the type of goals teachers adopt or the type of feedback teachers give before and after participating themselves in a mindset intervention, and (2) how these variables, and the way the students perceive them, mediate the effect of the teachers' mindset intervention on the students' change in mindset.

It is worth noting that other variables that are generally associated with growth and fixed mindsets and reflect the long-term effects of the interventions were also included in the present study. As reported in the supplement, we did not find an impact of the interventions on the causal attributions of exam outcomes, which measured how much students attributed their success and failure in school to internal (e.g., own responsibility and engagement) or external factors (e.g., question framing of exams, peers, teachers) and how much they considered these outcomes to be the consequence of work or talent. Similarly, we did not find any effects of the interventions on challenge seeking or on anticipated and obtained mathematics scores. These null results are not surprising, because the effect sizes of mindset interventions on these secondary long-term outcome variables are generally measured in larger samples (Huillery, 2021; Rege et al., 2021) and over a longer period of time. For example, even if students start to adopt a growth mindset after having participated in a mindset intervention, they need time to increase their effort investment and show an improvement in their skills (Kenthirarajah & Walton, 2015). This eventually leads to an improvement in school performance, but only in the long term. In our study, mindsets were measured at baseline and then a second time, only a few weeks after the interventions. It is thus possible that the results would have been different if the questionnaires had been completed later in the academic school year.

While the widely used 3-item mindset scale (Dweck, 1999) has been commonly employed to assess intelligence mindset in middle school students (Jones et al., 2009; King, 2020; Sarkar, 2021), our participants required assistance in understanding the items and responding to them during the initial data collection. To address this issue, we utilized Da Fonseca et al.'s (2007) scale at T2, which proved to be more suitable for our participants. This observation aligns with previous research indicating that negative statements can be particularly challenging for children (Borgers et al., 2000). Although both scales were developed to measure the same concept (intelligence mindset), no study has directly compared students' responses to these scales. Therefore, one may argue that the students' responses were affected by the change of scale. However, despite not hypothesizing any differential impact of the scale change across groups, it was reassuring to find similar results when examining the impact of the intervention condition on (1) mindset change between T1 and T2, and (2) mindset at T2. We call for future studies that empirically compare the use of different instruments to measure mindsets within the same sample.

Previous studies have also examined the moderating role of inter-individual differences between students, such as achievement level (Huillery et al., 2021), socio-economic background (Sisk et al., 2018), and gender (Gouëdard, 2021; Huillery et al., 2021). Notably, students who are at a high risk of school dropout, low achieving, or from underprivileged backgrounds are more responsive to mindset interventions. Our sample size did not allow us to conduct such sub-group analyses of inter-individual differences. Our data were collected directly in the field and were not part of a pre-existing database; thus, the enrollment of each individual participant required parental consent. Moreover, the sub-sample available for within-participant comparisons across testing timepoints was considerably smaller than the total number of participants initially recruited for the study, primarily due to dropouts between testing timepoints. Because we lack information about the students whose parents did not provide consent and the participants who missed one of the data collections, we cannot discount the possibility that these individuals might have responded differently to the mindset intervention. Consequently, it remains plausible that effect sizes might have varied if data had been collected at both timepoints across the entire sample. This question should be addressed by future studies.

4.3 Conclusion

Overall, our results suggest that delivering a short intervention to teachers can be a good strategy to strengthen the growth mindset of middle school students from underprivileged areas. This finding is important, because adding a short intervention for teachers does not substantially increase the cost of growth mindset interventions and could be a cost-effective way to further increase the impact of the program.