Understanding the complexity of the natural world is empowering, providing us with a sense that we can explain phenomena and solve difficult problems. To engage in civic discourse about how to effect positive change requires an understanding of systems and how the components of a system can interact to produce emergent behaviour that is more than the sum of its parts. This understanding is the essence of system modelling. System models are powerful tools for investigating the world around us, allowing scientists and students to make sense of phenomena, represent complex causal relationships, solve problems, and share ideas. Therefore, modelling, systems thinking, and supports for associated cognitive skills such as causal reasoning should be significant components of every student’s education.

In a previous publication (Bielik et al., 2019), we presented our framework of four aspects of system modelling. In this study, we expand this framework by focusing on the following research question: how are students engaging with the four aspects of system modelling, and what are the challenges they encounter in the process? To explore this question, we provide empirical evidence in support of student use of these aspects of system modelling practice and how they changed over the course of a 10th grade instructional unit in chemistry. By analyzing students’ responses and produced artefacts, we deepen our understanding of these aspects while addressing the opportunities and challenges students face when engaging with curricular materials that include modelling environments. This is followed by a set of recommendations for educators and curricular materials developers who aim to support students as they learn system modelling.

Literature Review

Scientific Modelling Practice

Models are generative epistemological tools, consisting of components and the relationships between them (Harrison & Treagust, 2000; Nicolaou & Constantinou, 2014; Schwarz et al., 2009), and are essential to scientists and engineers for representing a system and for explaining phenomena and predicting possible outcomes (Harrison & Treagust, 2000; National Research Council, 2012). Developing and using models is one of the core scientific and engineering practices described in A Framework for K-12 Science Education (National Research Council, 2012). Schwarz et al. (2009) defined the modelling practice as the ability to construct, use, evaluate, and revise models of the natural world. These prior works influenced how we discuss the four aspects of system modelling practice. To build scientific models, students need appropriate modelling tools that include scaffolds to support their ability to build and use models as explanatory and predictive mechanistic tools of phenomena (Clement, 2000; Lehrer & Schauble, 2006), but most students have few opportunities to meaningfully engage with models (Louca & Zacharia, 2012; Nicolaou & Constantinou, 2014; Schwarz et al., 2009).

Systems Thinking

Systems thinking is required for investigating and learning about complex systems and typically includes ideas such as the ability to consider the system boundaries, the components of the system, the interactions between system components and between different subsystems, and emergent properties and behaviour of the system (Passmore et al., 2014; Russ et al., 2008; Wilensky & Resnick, 1999). The difficulties of understanding complex systems are well documented (Booth Sweeney & Sterman, 2000; Dörner, 1980; Hmelo-Silver & Pfeffer, 2004; Jacobson & Wilensky, 2006). We suggest that engaging in modelling through the use of a system modelling tool with appropriate scaffolds can support students in developing a systems thinking perspective.

System Dynamics Modelling Practices

System dynamics (SD) was created at MIT in the 1950s to help humans think about complex systems (Forrester, 1968). With the introduction of computer simulations based on stock and flow models (Richmond et al., 1987), SD became more accessible and eventually began to be used in educational settings with children and young adults (Gould-Kreutzer, 1993). Within the five stages of SD modelling (problem identification, system conceptualization, model formulation, testing, and using), Martinez-Moyano and Richardson (2013) compiled a list of 27 best practice statements regarded of highest importance by practitioners and experts in SD. These include identifying the components of the system and the relationships between them, iteratively developing the structure of the models by adding detail as needed, comparing simulated with real behaviour, and iteratively testing and validating the model. Students face many challenges in thinking about complex dynamic systems (Stratford et al., 1998; Tadesse & Davidsen, 2020), and there has long been hope within the SD community for new tools to introduce these challenging ideas to an audience not as constrained by age and ability (Gould-Kreutzer, 1993).
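To make the stock-and-flow idea behind SD simulation concrete, the following minimal sketch integrates a single stock with Euler steps. This is a generic illustration of the SD style, not the modelling tool used in this study; all names and values are invented.

```python
# Minimal stock-and-flow simulation in the system dynamics style:
# a single stock (e.g., water in a tank) with a constant inflow and
# an outflow proportional to the stock, integrated with Euler steps.
# All names and numbers are illustrative, not from the study.

def simulate_stock(initial_stock, inflow, drain_rate, dt, steps):
    """Return the stock level after each Euler integration step."""
    stock = initial_stock
    history = []
    for _ in range(steps):
        outflow = drain_rate * stock          # flow depends on the stock
        stock += (inflow - outflow) * dt      # net flow accumulates
        history.append(stock)
    return history

levels = simulate_stock(initial_stock=0.0, inflow=10.0,
                        drain_rate=0.5, dt=0.1, steps=200)
# The stock approaches the equilibrium inflow / drain_rate = 20.
```

Running such a model and comparing its trajectory with observed behaviour is exactly the kind of iterative testing the best-practice statements describe.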

Causal Reasoning

Understanding a system, explaining it, and troubleshooting its problems often require an understanding of the causal relationships between components in the system, so that one can predict and compare with real-world system behavior (Jonassen & Ionas, 2008). Student difficulties with causal reasoning have been well studied (e.g., Schauble, 1996; Zimmerman, 2007). The causal structures in complex systems can be especially difficult to reason through and to teach about (Perkins & Grotzer, 2000; Yoon et al., 2017). According to Jonassen and Ionas (2008), system modelling tools are the only class of tools that enable learners to model both the covariational and mechanistic attributes of causal relationships necessary for full causal reasoning.

Given the theoretical background discussed above, there is a need for better understanding of how instructional practices and technology tools can support students in system modelling.

Four Aspects of System Modelling Practice

In Bielik et al. (2019), we presented a theoretical examination of four aspects of system modelling practice illustrated with several student exemplars. These aspects are as follows: (1) defining the boundaries of the system by including components in the model that are relevant to the phenomenon under investigation, (2) determining appropriate relationships between components in the model, (3) using evidence and reasoning to construct, use, evaluate, and revise models, and (4) interpreting the behavior of a model to determine its usefulness in explaining and making predictions about phenomena. The first two aspects primarily grew out of challenges we observed when students construct system models. The second two aspects are related to our initial design ideas for a tool that would support sensemaking with models through comparative data analysis and ease of model construction. In the “Analysis” section, we describe criteria to evaluate whether students engage with each aspect and the results of applying those criteria to student work.

Materials and Methods

This study is part of a larger design-based research project aimed at understanding how students learn when given the opportunity to use a modelling tool that allows them to create an instantiation of their conceptual understanding in the form of an external, runnable model. Their models can generate output that feeds back into the students’ understanding of the phenomena and leads to iterations in both their conceptual understanding and their external, runnable models. The focus of the present analysis is a single instructional unit that provided opportunities for students to revise their models multiple times. In particular, we focus on evidence of students’ engagement with the aspects as they revised their models.

The Modelling Tool

The modelling tool, a free, Web-based, open-source tool, was created by the authors in partnership with other project team members. It is designed to scaffold student learning so that young students, beginning in middle school, can engage in systems thinking through constructing, using, evaluating, and revising models. The tool facilitates the modelling of a system and also makes it possible to calculate and visualize model output without requiring students to write equations or code (Damelin et al., 2017).

With our modelling tool, students begin by dragging images that represent components to a canvas and then linking the components with arrows to specify relationships (Fig. 1). For the system model to become a runnable model, each component is treated as a variable that can be calculated by the modelling engine. The next step for students is to define each relationship link in the model so that the impact of one variable on all of the variables to which it is linked can be calculated. Variable values are defined using a low-to-high scale. Students construct a verbal description of how one variable affects another by using drop-down menus, e.g., “An increase in temperature causes volume to [increase] by [about the same].” The resulting relationship is also depicted by a graph showing a visual representation of this relationship (Fig. 1). Defining relationships with words helps students overcome the mathematical obstacles typically associated with creating system models and allows them to focus on a conceptual understanding of the relationships between variables (Damelin et al., 2017).
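As a rough sketch of the underlying idea, a word-defined relationship of this kind can be mapped to a simple function on a normalized low-to-high scale. The code below is purely illustrative and is not the tool's actual calculation engine; the relation phrases, function shapes, and the `evaluate` helper are all assumptions.

```python
# Illustrative sketch (NOT the tool's actual engine): a word-defined
# relationship maps an input variable on a 0-100 "low to high" scale
# to an output value. "about the same" is modeled as a slope-1 linear
# response; other phrases get other simple shapes. All hypothetical.

RELATIONS = {
    ("increase", "about the same"): lambda x: x,
    ("increase", "more and more"):  lambda x: (x / 100) ** 2 * 100,
    ("decrease", "about the same"): lambda x: 100 - x,
}

def evaluate(cause_value, direction, amount):
    """Compute the effect variable's value on the 0-100 scale."""
    return RELATIONS[(direction, amount)](cause_value)

# "An increase in temperature causes volume to increase by about the same."
volume = evaluate(cause_value=75, direction="increase", amount="about the same")
```

The point of such a design is that students choose a qualitative shape for each link rather than writing the function themselves.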

Fig. 1 Initial model of pair I (S13 and S20)

To “run” the model, a student uses a slider to move an independent variable through a range of values. To facilitate analysis of the impact of this variable on the rest of the model, our modelling tool integrates CODAP, the Common Online Data Analysis Platform, a tool designed to support student exploration of data, including comparisons of model-output and real-world data sets (Finzer & Damelin, 2016).

Context: High School Chemistry Unit About Emergent Properties of Gases

This study involves data collected from an enactment of a high school curricular unit conducted in the spring of 2017 in an honors chemistry class. The enactment included 11 lessons of about 70 min each. Prior to starting the unit on the emergent properties of gases, all students completed an Introduction to Modelling unit, which directly addressed the issue of appropriate variables for a system model. The chemistry unit focused on the emergent properties of gases and was co-designed by the authors, together with a high school chemistry teacher who enacted the unit in her classroom. The unit was designed to explore the anchoring phenomenon of an oil tanker that imploded after being steam cleaned, as shown in an online video. This builds towards the driving question of the unit: “How can something that can’t be seen crush a 67,000-pound oil tanker made of half-inch steel?” A full description of the curricular unit can be found in Appendix 1. Descriptions of phenomena, activities, and expected target models in the unit are provided in Table 1.

Table 1 Description of the gas laws instructional unit and expected target models


Data were collected from 20 tenth grade students (14 male, 6 female) in a public school in the northeastern USA. Students came from an average socioeconomic background (27.5% were eligible for free or reduced-price lunch); the school population was 88.5% White, 5.5% Hispanic, 2.5% Asian, and 1.5% African-American. The teacher was an experienced chemistry teacher. Researchers observed the enactment and supported the teacher.

Data Sources

Model Reflection Questions

Throughout the unit, students engaged in offline and online activities. The online component included illustrations, readings, labs, embedded assessments, opportunities to develop and revise models in the modelling tool, and text prompts with reflection questions on the model construction and revisions. Student reflections on their models were valuable for exploring how and why they created their models, the reasons behind changes to their models, and their progress toward the aspects of system modelling practice considered here.

After the first model construction activity, students were asked the following question (among others): What are you still uncertain about in your model? Following each subsequent model revision, students were asked the following: (i) What did you change in your most recent model? (ii) What were your reasons for making these changes? (iii) What are you still uncertain about in your model? Students responded to these questions, which were embedded in the online curriculum, in pairs or individually (depending on whether their partner was absent). However, due to student absences, the total number of responses for each revision is less than 20. A total of 9 pairs of student responses were used for analysis, referred to below as pairs A–I.

Student Models

Student models were automatically collected via use of the Web-based modelling tool and analyzed by two members of the team. Students had four opportunities to modify their models, each following an activity designed to help them explore a feature of the phenomenon in a more focused way, typically through a lab experiment. Therefore, each succeeding model revision added content and relationships as students learned about them. Each revision was also preceded by peer review of the models and class discussion. Students worked in pairs on the models; nine of the ten student pairs submitted complete sets of models.

Student Interviews

In semi-structured interviews conducted by two of the authors during and after the unit, six students (five males, one female) who had worked in pairs were interviewed individually. They were asked to describe their models and to show how their models could explain the unit’s driving question of an oil tanker implosion. The interviews were videotaped and transcribed.


Analysis

Model Reflection Questions

Using the list of aspects from Bielik et al. (2019), the second author examined a sample of student responses to the model reflection questions and developed a tentative set of observables that gave evidence for engagement or lack of engagement with each aspect. Two team members who were experienced modelers then provided feedback on the validity of the observables. The resulting list was used as coding criteria, applied to additional student answers, and refined in an iterative process. Once the wording of the coding criteria had stabilized, the first two authors independently applied the codes to the entire set of student responses, achieving inter-rater reliability of kappa = 0.73. They then discussed differences to reach full consensus. Coding criteria are provided in Appendix 2.

Student Models

In the course of a single unit, it was not practical to evaluate improvement in the modelling practice in terms of the models alone. To a large extent, changes in the models were due to students adding new variables to represent new features of the phenomenon as they learned about them. In addition, as students learned more about the four aspects, they sometimes disassembled their models and began rebuilding them due to new ways of thinking about the system. Therefore, the emphasis was not on producing a perfect final model, but on the process of modelling. However, evidence from the models can be triangulated with other results, particularly with respect to Aspects 1 and 2. To this end, the variables and connections in the models were analyzed independently from the written explanations. The coding criteria are presented in the Analysis section. For Aspect 1, defining the boundaries of the system by including components in the model that are relevant to the phenomenon under investigation, the models were examined jointly by one of the authors with a researcher/observer who was familiar with how the boundaries were defined during the classroom implementation of the curricular unit. For Aspect 2, determining appropriate relationships between components in the model, two team members independently coded the models for the presence of indirect or redundant connections and had 100% agreement.

Student Interviews

The interview transcripts were reviewed as a potential source of additional information about student thinking. However, only limited use is made of these data in the present analysis. Two of the models discussed in this paper were from a pair of students we interviewed, and we use quotations from one of their final interviews to suggest a possible explanation for some surprising features of their models and written explanations.

Results and Discussion

Examples of student responses are used to illustrate how the four aspects of modelling practice can be used as a lens for analyzing student progress in engaging in modelling, systems thinking, and causal reasoning. Student quotations are drawn from the written responses to the reflection questions and in one instance from a student interview transcript. Screenshots of models from two student pairs are used as exemplars. Together these data provide a window into how students exhibited aspects of system modelling practice and allow us to characterize changes that occurred during the unit.

Aspect 1: Defining the Boundaries of the System by Including Components in the Model That Are Relevant to the Phenomenon Under Investigation

This aspect is characterized by the following two features:

A. Distinguishing between objects and variables.

Directing students to define the components in their models as measurable variables rather than as objects, so that they can run a simulation of their model, is not a simple task. It requires explicit focus from the teacher and repeated experiences. In the enacted unit, the teacher emphasized this issue when supporting the students in developing their models. Although most students produced initial models that included only variables that could be defined on a low-to-high scale, there were some exceptions. For example, in pair I's initial model (Fig. 1), the components Change in temp and Amount of pressure are defined appropriately as variables, but the component named Components of elements outside describes an object rather than a variable with specific characteristics, and cannot be defined on a low-to-high scale in any practical way. These students might have intended to describe the ratio of different elements in the air and to imply that the ratio might affect the air pressure. In any case, these students removed this variable in their first model revision (Fig. 2).

Fig. 2 First model revision of pair I

B. Choosing relevant variables through consideration of appropriate size and scope.

To exemplify this feature, consider the model in Fig. 1: Change in temp is a variable of appropriate size and scope for the tanker phenomenon; temperature changes have an important effect on the outcome. Elevation, however, is an example of a variable that is not of appropriate size or scope. In their first revision (Fig. 2), these students associated elevation with Density of gas inside vs outside the tanker. Although the ratio of densities outside and inside is crucial to the phenomenon, the density outside does not change appreciably in the scenario and so does not help to explain the phenomenon. After a class discussion in which models containing elevation were discussed, these students (pair I) removed both Elevation and Density of gas inside vs outside, as they were not needed within the scope of the students' explanatory model. After their second model revision (Fig. 3), they wrote, “We did not think the elevation and density of the molecules in the tanker was crucial to the model.”

Fig. 3 Second model revision of pair I

In their responses to the model reflection questions, students commonly mentioned that what they changed in their models was related to Aspect 1, in particular, variables (over 50% of responses after each model revision; over 80% after the second and third revisions; see Table 2). Students’ responses as to why they made these changes fell into one or more of the following categories: to have them better explain and describe the phenomenon under investigation, to make the models more correct, to have the models make more sense or be more logical, to include important or missing variables, to have more specific variables, or to remove variables that are not significant.

Table 2 Evidence for Aspects 1–4 in student written explanations

Aspect 1 Discussion

Although the Introduction to Modelling unit had directly addressed the issue of appropriate variables for a system model, in the early models of the chemistry unit (initial model through Revision 2), seven of nine student pairs included variables that were outside the boundaries of the tanker system. Of these seven, only two pairs failed to make progress; in over half the final models all variables were relevant (see Table 3). Moreover, students’ written responses show active engagement in thinking about which variables should be included to best represent that system, considering both size and scope, and ensuring that each variable represented a measurable quantity (see Table 2).

Table 3 Number of variables that were outside the size or scope in student models

Aspect 2: Determining Appropriate Relationships Between Components in the Model

This aspect is characterized by the following two features:

A. Defining scientifically accurate relationships to represent interactions between variables.

There are several ways a link between two variables could be incorrect.

1. There may be no relationship between variable A and variable B.

2. There is a relationship, but the way the relationship is defined does not match the real-world behavior of the interaction between variable A and variable B.

3. The direction of causality is reversed.

An example of this feature can be found in the initial model of pair I in Fig. 1; the sequence of two connections that links Size of tank to Amount of pressure does not match the real-world behavior of the system. The students changed this during their first model revision. As S13 wrote, “I got rid of this part of the model because an increase in volume does not lead to an increase in pressure. Actually, an increase in volume causes a decrease in pressure” (Fig. 4, third revision of pair I model). There was an additional change in their next revision (Fig. 5) that was also related to feature A: a change in the direction of causality between Amount of pressure and Volume. In the tanker scenario, a change in volume was not what caused the decrease in pressure; rather, the decrease in pressure caused the volume to change suddenly. These students continued to change relationships throughout each model revision, which was typical for all students.
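The correction S13 describes follows directly from the ideal gas law and can be checked numerically; the values below are arbitrary and for illustration only.

```python
# Numeric check of the relationship S13 corrected: at fixed temperature
# and amount of gas, increasing the volume *decreases* the pressure
# (Boyle's law, a consequence of the ideal gas law P = nRT / V).
# All values here are arbitrary illustrative numbers.

R = 8.314  # ideal gas constant, J/(mol*K)

def pressure(n_mol, temp_k, volume_m3):
    return n_mol * R * temp_k / volume_m3

p_small = pressure(n_mol=1.0, temp_k=300.0, volume_m3=0.010)
p_large = pressure(n_mol=1.0, temp_k=300.0, volume_m3=0.020)
# Doubling the volume halves the pressure: p_large equals p_small / 2.
```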

Fig. 4 Third model revision of pair I

Fig. 5 Fourth model revision of pair I

In their responses to the model reflection questions about what they changed in their models, most students mentioned the relationships between the variables (74% and 75% after the first and second revisions, 100% after the third and fourth revisions; see Table 2). In their written reasons for making changes to these relationships, students mentioned: having the model make better sense and be more logical, aligning the model with new findings from experiments, making the model better fit what was learned in the lessons, having the model better explain the phenomenon under study, and including the necessary and relevant relationships.

B. Defining direct relationships between variables.

There are two ways problems with the directness of relationships manifest themselves:

1. There may be large gaps in the causal chain.

2. There may be indirect relationships included between variables.

Gaps in causal chains sometimes resulted when students leapfrogged one or more steps in a causal chain of variables. Students were encouraged to add microscopic causal mechanisms, helping them to explain why one variable had an effect on another. Pair I students explicitly mentioned adding Kinetic Energy to fill a gap they perceived in their causal chain. After their third revision (Fig. 5), they wrote, “We added kinetic energy as a variable in between temperature and molecule speed in our model and had kinetic energy increase the same amount as temperature. These changes help respond to the unit’s driving question because they show how as temperature increases, so does kinetic energy and molecule speed about the same.” This is a situation where the model was used to explicate covariational and mechanistic attributes of the relationship between temperature and pressure. In a gas, temperature is defined in terms of average kinetic energy of the molecules. Explicitly including kinetic energy as a variable in their model allowed a direct link to molecule speed, which, in turn, allowed a direct link between speed and number of collisions, which provided a direct link to gas pressure (Fig. 4).

In this same model, the students added a causal mechanism, Number of molecule collisions, between Molecule speed and Amount of pressure in the tanker. The single link that also connects Molecule speed and Amount of pressure represents an indirect relationship. It complicates the model and adds no new information, and it was later removed.
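The causal chain these students built (temperature to kinetic energy to molecule speed to collisions to pressure) mirrors kinetic theory, in which temperature sets the average kinetic energy and hence the root-mean-square molecular speed. A small numeric illustration, with approximate constants for nitrogen:

```python
import math

# Kinetic-theory illustration of the students' causal chain:
# temperature -> average kinetic energy -> molecule speed.
# KE_avg = (3/2) k T and v_rms = sqrt(3 k T / m).
# Constants are approximate values for an N2 molecule.

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_N2 = 4.65e-26      # approximate mass of one N2 molecule, kg

def v_rms(temp_k):
    """Root-mean-square molecular speed at a given temperature (m/s)."""
    return math.sqrt(3 * K_B * temp_k / M_N2)

# Raising the temperature raises the rms speed, and with it the
# collision frequency and pressure, as in the students' model.
speed_at_300k = v_rms(300.0)  # roughly 500 m/s for N2
```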

Five of the nine student pairs who submitted complete model sets had indirect links in all of their model revisions (Table 4). However, after an initial increase, the number of indirect links dropped in the final models: the class's models contained 10 indirect links in total at the third revision but only 6 at the final revision.

Table 4 Number of indirect links in student models

Aspect 2 Discussion

Defining appropriate relationships between variables in the system under study is crucial for producing a model that is useful for understanding and making predictions about a phenomenon. Examples of student work illustrate the two features related to this aspect: defining scientifically accurate relationships between variables and defining direct relationships between variables. Students made steady improvements in defining the relationships between variables. These improvements took the form of relationships that showed appropriate directionality in cause and effect, correct definitions in how one variable affects the other, and more direct linkage between variables. Nevertheless, indirect links were widespread within student models, remaining pervasive in their third revision, where over half of the models had one or two such links. In their final models, there were fewer indirect links. A possible explanation is that students were learning new mechanisms and adding new links during each revision, and in their final models noticed that these new links made some of their earlier links unnecessary.

Aspect 3: Using Evidence and Reasoning to Construct, Use, Evaluate, and Revise Models

After designing and carrying out experiments, including using a molecular dynamics simulation of gas behavior, students revised their models. Some students cited evidence from the experiments and/or simulations as inspiration for changes in their models. Explicit references to evidence from experiments or other class activities were mentioned after revisions 1 and 2 (21% and 13% of the responses, respectively), but not mentioned after the third and fourth revisions (Table 2). This could have been due to the structure and order of the questions asked of the students, or perhaps because students tended to focus more on having the appropriate variables and relationships in their later model revisions. Observation notes suggest that more could have been done to support students in making links between their experimental results and the predictions their models generated.

An example where a student cited evidence from experimental results was S21’s written comments after the first model revision: “The biggest reason as to why I made the changes was because of the recent experiment that we conducted in which we learned about the relationship between pressure and volume. Also, based on this new knowledge, I removed things I thought weren’t applicable to the model.” Sometimes the link between experiment and model revision was more tenuous. For instance, to explain why she and her partner added a link to their model, S11 wrote, “We connected amount of air to pressure…. We did this because we learned that the amount of air affects pressure.” She did not mention where she had learned this relationship, although she correctly answered questions about an experiment she had conducted that explored it.

Aspect 3 Discussion

The instructional unit included explicit cycles of evidence gathering through exploration of phenomena and activities and model building to promote development of this aspect of modelling practice (see Table 1). Model revisions suggest that students were incorporating new evidence into their models as they encountered it, although they seldom cited empirical evidence or their reasoning about it as justification for those revisions. One reason may be that the teacher and curricular materials did not ask students to draw explicit connections between the sources of evidence and the changes they made to their models. Due to the critical nature of using evidence to justify model design, a greater emphasis on this could have been incorporated into class discussions and model reflection questions.

Aspect 4: Interpreting the Behaviour of a Model to Determine Its Usefulness in Explaining and Making Predictions About Phenomena

S13 and S20 exhibited this aspect of system modelling practice. They created a model that was testable, compared it with expected real-world behavior, and revised it accordingly. They also revised it to more clearly address the unit’s driving question, drawing a line of cause and effect all the way from Steam Cleaning to Implosion of Tanker (Fig. 6), although aspects of that chain were still problematic. They ran their model multiple times, and attempted several visualizations of the output, but appeared to have trouble deciding which relationships would be useful to explore. Nevertheless, they exhibited a clear ability to interpret the behavior of the model in terms of the visual representations of relationships, writing after their final revision, “We added steam cleaning because it increased the temperature, which increased the kinetic energy, which increased the molecule speed and number of collisions, which increased the pressure of the tank.” This is an impressive five-link chain that correctly described most of their model.

Fig. 6 Third model revision of pair E (S7 and S17)

However, many of the students struggled with this aspect. This struggle was evident in how pair E students (S7 and S17) had air pressure in tanker as the outcome variable in all of their models (the last two of which are shown in Figs. 6 and 7), but never explicitly connected this to the driving question in any of their model reflection responses. Nonetheless, it was evident from their interviews that both students understood the connection between air pressure in tanker and the driving question about the tanker implosion. To interpret the behavior of their model, they ran it and constructed graphs of the relationships between three different variables and air pressure in tanker. Two of their graphs showed expected relationships; one did not. They could have used this unexpected behavior to evaluate the appropriateness of their model to address the driving question and to troubleshoot relationships in their model, but they did not do so. An excerpt from the final interview with S17 offers one possible explanation for why they were unable to do this on their own. After the student explained their model (Fig. 7) and exhibited some understanding of the phenomena involved, the interviewer asked him to trace lines of cause and effect between Size of tanker and air pressure in tanker. S17 responded: “… this is saying that the bigger the size of the tanker, the larger the amount of air in the tanker. So if there’s more air in the tanker, I’m guessing that it would have a higher air pressure.” In this short, two-link causal chain, S17’s reasoning about each separate link would be correct only if the tanker could change size for the first relationship (bigger tanker means more air), but remain a constant volume for the second relationship (more air means higher pressure in tanker). Further consideration might have shown him that this reflected an inconsistency in the logic of his model and perhaps in his conceptual understanding of the cumulative effects of relationships.

Fig. 7

Fourth model revision of pair E

In written responses to the model reflection questions, nine students (from five of the pairs) did mention changing their model to better explain why the tanker imploded or to connect the behavior of their model to real-world behavior, but most did so only once (Table 2).

Aspect 4 Discussion

Students varied in how explicitly their models were designed around the driving question versus a more generic model of the emergent properties of gases. Regardless of the model focus, some students appeared to evaluate the behavior of their model one link at a time but did not evaluate the entire chain of cause and effect extending from input variables to the outcome for the driving question. These results indicate a limitation in the extent to which these students interpreted the behavior of their models in relation to the models' usefulness in explaining real-world phenomena. This suggests that additional supports to bring the focus of the modelling activity back to answering the driving question may be needed to help students achieve this aspect of system modelling practice. We believe this should be explicitly addressed in teachers' professional learning programs, in the curriculum, and in classroom instruction.

Concluding Discussion

In line with the goals of scientific modelling practice (National Research Council, 2012; Schwarz et al., 2009), we found evidence of students engaging with system modelling, especially Aspects 1 and 2, as they constructed, used, evaluated, and revised their models to explain the phenomenon under investigation. This engagement was demonstrated by determining, testing, and revising the boundaries of the system and the relationships between the variables in the models, as suggested by Harrison and Treagust (2000). This was not an easy and straightforward process in many cases. Nonetheless, most students were able to make progress toward achieving Aspects 1 and 2 of system modelling practice.

In Table 2, we can see that the students' focus on Aspect 1, considering which variables to include and how to label them, decreased during the last model revision. This makes sense because, at this point in the unit, all aspects of the phenomenon had been introduced, and the focus was on refining the relationships rather than on adding new variables. In fact, we do see that attention to Aspect 2, considering the relationships and how to revise them, increased to 100% for the last two model revisions.

However, focus on Aspects 3 and 4, using evidence and relating the model behaviour to the real world, also waned. A focus on these aspects may have peaked during the first revision, after students had watched videos and completed an experiment. One takeaway is that the connection to the real world (Aspects 3 and 4) may need to be strengthened at points in the unit when students are not conducting an experiment, especially near the end of the unit when this information could help them evaluate and revise their models. In addition, students may need a different kind of scaffolding in how to use real-world information to help them improve their models.

The four aspects of system modelling practice (Bielik et al., 2019) were used here as a way to evaluate student engagement in the process of understanding phenomena through constructing, using, evaluating, and revising models, as well as changes in that engagement as a curriculum unit progressed. We suggest that these aspects can also provide a framework for curricular designers who want to promote young students’ systems thinking, causal reasoning, and modelling practice. The four aspects can also be useful as an epistemic framework for teachers and students when reflecting on how to construct, use, evaluate, and revise models in the classroom.

The students in this study experienced several challenges with causal reasoning. These challenges relate to Aspects 3 and 4 and included providing evidence and reasoning for their chosen variables and relationships in the models and explaining how their models address the driving question of the unit. These challenges align with those described by Schauble (1996) and Koslowski and Masnick (2002). However, these students improved their model-based explanations with each model revision, which suggests that the technology-rich environment and curricular materials supported students' causal reasoning abilities. For instance, the students used the models as a common referent when discussing cause and effect in the system.

One limitation of this study is that, as mentioned in the context section, it was conducted with honors students with high academic achievement; future studies should test these conclusions in other classrooms.

Challenges students face when engaging in systems thinking and understanding complex models were evident in our results. It may not be realistic to expect students to achieve a high level of proficiency in all four aspects of system modelling practice following a brief Introduction to Modelling unit and a single instructional unit. While our results indicate that the modelling tool and curricular supports provided students with a strong foundation to develop their systems thinking, students will likely require repeated experiences in multiple learning environments to achieve mastery of the modelling practice. It is important to mention that the unit was revised following the enactment described in this study to focus on engaging students in comparing their model outputs to their collected experimental data from their self-generated laboratory experiments.

In conclusion, students progressed in Aspects 1 and 2 with their ability to choose appropriate variables, determine relevant relationships, and clarify causal mechanisms to make their relationships more direct. Less success was observed in Aspects 3 and 4 with respect to using evidence to support model design and explicitly linking the overall behaviour of the model to the driving question about the phenomenon under investigation. Our classroom implementation suggests that identifying appropriate curricular and teacher supports may be key.


Based on the in-depth analysis of the results of the implementation described here, we provide the following recommendations for teachers wishing to provide an introductory system modelling experience for students. These can be implemented in activities that engage secondary students in scientific systems modelling, especially those that take advantage of technology-rich modelling environments. In addition to the specific recommendations below, we suggest that the four aspects of system modelling can be used as an epistemic framework to support teachers and students when engaging in the modelling practice and when constructing, using, evaluating, and revising models.

  1.

    Focus on using evidence to support evaluation of model components and the relationships defined between them. Running simulations to evaluate the outcome of a model in comparison with real-life data is an important advantage of using a modelling tool. This is a crucial checkpoint in each step of constructing, using, evaluating, and revising the model. In the results of the unit enactment, we found that students did not often evaluate their models in comparison with real-life data without explicit support. This is in line with findings from Chinn and Brewer (1993), who note that when a conflict between real-world data and model output occurs, students do not necessarily consider revising their models or theory, but sometimes ignore or disregard conflicting results. Although it is beyond the scope of this study, we suggest that teachers explicitly address these cases and support students in resolving such conflicts. This recommendation should be further examined in future research. We also recommend using activities in the curricula and learning materials, such as text boxes in the software, to direct students to explicitly state the goal of their model and the evidence they have to support their claims.

  2.

    Evaluate models in whole-class and small-group discussions. Interviews suggest that getting students to talk through their models and graphs can be helpful in prompting them to recognize inconsistencies and problematic model behavior. Student-centered discussions are powerful tools for promoting the sharing of ideas related to the phenomenon being modelled and for engaging in activities that support growth in system modelling practice. These discussions can include peer review and whole-class discourse around student models.

  3.

    Frequently revisit the overarching phenomenon and the driving question the models are intended to address. Students can easily lose the big picture of what they are modelling. In the enacted unit, the teacher often directed the students to consider the driving question, and results indicate that students focused on making sense of the anchoring phenomenon (the imploding tanker) in their models. We suggest that teachers frequently refer back to the driving question and collect student questions and comments related to it. We also suggest that the driving question be consistently visible to students while they develop and revise their models. This could be done by adding a text box in the modelling tool that includes the driving question.