Introduction

The increased use of technology in education in recent years affects mathematics teaching and learning processes in classrooms (Trgalová & Tabach, 2023), in particular through the use of technology for formative assessment (FA) (Haelermans & Ghysels, 2013; Sung et al., 2016). Many of these technologies are prompted by approaches that call for providing information that helps teachers make sense of the ideas students generate on the spot and use these ideas to inform instructional decisions during the lesson, helping students move toward the mathematical goals of the lesson (Fahlgren et al., 2021). FA platforms provide dashboards for teachers that present different analyses of their students’ answers, referred to as learning analytics (LA) visualizations, using information visualization techniques (Duval, 2011; Schwendimann et al., 2016; Verbert et al., 2014).

Previous studies have focused primarily on the design, development, and evaluation of teacher analytics tools; however, few have evaluated the effects of these tools on teacher practice and student learning (Aldon & Panero, 2020; Holstein et al., 2018; Mavrikis et al., 2019; Verbert et al., 2014). The growing use of such tools encourages examination of the interaction between technologies that provide information to teachers about their students’ work and teachers’ formative assessment practices. Few studies have examined teachers’ practices in the classroom when interacting with LA (Clark-Wilson, 2010; Martinez-Maldonado et al., 2014), and some of these studies have shown that real-time teacher interaction with LA can enhance students’ learning outcomes. This suggests that LA visualizations affect teachers’ knowledge, lesson plans, and what is taught in class (Holstein et al., 2018; Xhakaj et al., 2017). These studies focused on student behavior and the correctness of answers. Despite the increase in research on teachers’ practice when interacting with LA, teachers’ practice in real classrooms when interacting with students’ answers to example-eliciting tasks (EETs) and with learning analytics that analyze students’ work methods and the mathematical characteristics of their answers remains insufficiently understood.

The ability to help teachers effectively use technology in FA depends on understanding the relationship between promoting student mathematics learning, the role of technology, and teachers’ practices (Koehler et al., 2014). This study used the Formative Assessment in Science and Mathematics Education (FaSMEd) framework (Cusi et al., 2024) to describe teachers’ practices based on three factors: key areas of FA, the timing of the process, and the functionalities of the technology used. Using this framework, we present LA visualizations, describe how teachers utilized them in their teaching of linear and quadratic functions, and explore how the features of each report influenced teachers’ FA practices. We discuss the findings concerning the value of LA visualizations in helping teachers gain insights into their students’ answers and respond accordingly, and we draw conclusions regarding their potential and actual use across mathematical content.

Literature Review

Formative Assessment Platforms in Mathematics Education

In this study, we focus on FA, conceptualized as “all those activities that are undertaken by teachers, and/or by their students, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged” (Black & Wiliam, 1998). Recently, researchers have begun to consider technology as a tool for enhancing teaching strategies including FA (Aldon et al., 2015). Apart from the traditional functions of technology in the mathematics learning and teaching process, broadly framed by Sinclair and Robutti (2020) as support for the organization of the teacher’s work (producing worksheets, keeping grades) and supporting new ways of doing and representing mathematics, Trgalová and Tabach (2023) added a new and relevant function: support provided to the teacher while enacting technology-based mathematics activities that are often provided with features that allow teachers to monitor learners’ activities. Typically, FA technologies provide an automatic assessment or evaluation of the students’ work for either the students (Sangwin, 2013) or teachers (Stacey et al., 2018), or both (Abu-Raya & Olsher, 2021; Pearce-Lazard et al., 2010; Stacey et al., 2018). Technology can be used to support formative feedback in many ways (Spector et al., 2016), one of which is in the format of learning analytics.

LA is defined by Siemens and Baker (2012, p. 1) as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” The combination of LA and visual information, referred to as visual learning analytics, aims to provide teachers with visual information that they can interpret and on which they can base decisions, taking the educational situation into account. Visual LA is a single display that aggregates various indicators about learners, learning processes, and learning contexts into one or more visualizations (Schwendimann et al., 2016). This information serves the practical orchestration of the lesson, that is, the design and real-time management of multiple classroom activities, learning processes, and teaching actions (Dillenbourg & Jermann, 2010).

The teacher’s dashboard presents the visual LA of students’ work, making students’ ideas visible to teachers according to the analysis principles of each platform. The LA provided may include task-independent (TI) or task-dependent (TD) indicators, or both. TI indicators refer to aspects of the student’s interaction with the task but do not require knowledge of the concrete task the student is working on (e.g., student behavior and the number of tasks completed). In contrast, TD indicators require access to knowledge about the particular task the student is working on (e.g., the correctness of the answer or the existence of certain mathematical characteristics in students’ submissions) (Gutierrez-Santos et al., 2012). Ley et al. (2023) considered model-based LA, which includes different levels of models offered to teachers. Model-based LA relates to how transparent the models of student learning are to teachers: a coupling between the knowledge models that underpin the intelligent system and the knowledge used by teachers in their classroom decision-making. Ley et al. (2023) observed a range from non-transparent to transparent approaches. While inspectable models represent a step toward transparency, truly transparent models necessitate meaningful language to convey insights about student learning. LA research has been criticized for lacking utility for teachers. The transparent model provides meaningful insights to the teacher, whereas the actionable model emphasizes practical action over theoretical description (Wise & Jung, 2019). LA insights can also be categorized into three levels of assistance (Van Leeuwen et al., 2019). The first level of assistance is described as mirroring: learners’ data are collected in the digital learning environment, analyzed, and displayed by environments such as DESMOS and STEP; any interpretation and decision concerning subsequent action is then left to the teacher (Tabach, 2021; Trgalová & Tabach, 2023).
The second level is alerting, where the teacher’s dashboard also provides alerts or classifications of data that require attention. ASSISTments, for example, provides hints for students and gives teachers fast and continuous feedback on student progress to help the teacher decide how to proceed (Koedinger et al., 2010; Schwarz et al., 2018). The third level is an advising dashboard that assists in interpreting the information that triggers the alert by providing explanations about the meaning of this information and suggesting a course of action for the teacher. The eXpresser grouping tool developed by the MiGen project (Chen, 2006; Mavrikis et al., 2019) is an example of this level of assistance.

The STEP environment, which is the formative assessment platform used in this study, supports the generation of examples. Students are asked to build an example that meets or contradicts certain conditions, after which the environment analyzes their answers according to mathematical characteristics. These are rich tasks, with more than one correct answer and many different representations, which aim to support conceptual learning (Olsher et al., 2016). These types of platforms provide teachers with visual LA about a variety of task-dependent (TD) characteristics that are also specific to the mathematical content area.

Teachers’ Formative Assessment Practices When Interacting with LA

Provided with relevant information, teachers need to be able to benefit from platform affordances (Trgalová & Tabach, 2023). Making students’ ideas visible to teachers creates a situation of interacting with evidence about students’ understanding that drives instructional decisions (Matuk et al., 2016) and demonstrates the role of the interactive environment in teachers’ FA practices (Abu-Raya & Olsher, 2021). For example, teachers can monitor students’ progress (Abu-Raya & Olsher, 2021; Aldon & Panero, 2020; Mavrikis et al., 2019), assist students who seem to experience difficulties, and check for common misconceptions manifested in students’ work (Stacey et al., 2018). In some cases, teachers present the students’ submissions to the entire class (Abu-Raya & Olsher, 2021; Aldon & Panero, 2020) and use interactive components of the dashboard to filter and display submissions that have particular characteristics or provide insight into issues related to the classroom or the individual student (Abu-Raya & Olsher, 2021). LA dashboards are also used to monitor students’ behavior and determine which students have completed all tasks and are ready for additional activities (Mavrikis et al., 2019). Dashboards enable teachers to monitor student progress by displaying information on individual students or workgroups, including the current task in the unit, the time spent on that task, their overall score for the unit, and the percentage of the unit completed (Matuk et al., 2016). Teachers can also use advising tools to help in decision-making, for example, in pairing students (Abdu et al., 2021; Mavrikis et al., 2019), or in advancing collaborative work by facilitating group discussions (Verbert et al., 2014). In summary, teachers find mirroring dashboards to be a helpful tool for enhancing their insight into student activities (for example, Wise & Jung, 2019). Such tools have been shown to improve teachers’ abilities to detect meaningful events (Casamayor et al., 2009). 
However, other studies show no improvement in this ability (Fahlgren et al., 2021; Van Leeuwen et al., 2015), explaining that teachers may draw different conclusions from the same information and therefore do not always prioritize the group most in need of support, provide help to students, or prepare for whole-class discussion accordingly.

Most of these studies investigated platforms that focused on group work and group-level progress and behavior during lessons. This study focuses on teachers’ implementation of LA in their actual classrooms when they are given TD LA visualizations for both the student and classroom levels.

Learner-generated examples elicited through interactive example-eliciting tasks (EETs), combined with TD LA on students’ responses, can reveal students’ understanding and a variety of answers beyond their initial examples (Watson & Mason, 2002; Zazkis & Leikin, 2008). EETs offer flexible tools, such as interactive diagrams, for algebraic problem-solving while clarifying possible misconceptions and reflecting students’ comprehension levels (Olsher et al., 2016). Thoroughly examining students’ EET submissions necessitates a detailed analysis of the mathematical characteristics, as these are essential for evaluating students’ understanding of the material (Olsher et al., 2016). Integration with TD LA allows for advanced analysis and provides teachers with information on comprehension levels, both for individual students and for the class as a whole. In this study, we aim to explore how this partnership between EETs and TD LA can enhance teaching practices when teachers interact with the automated analysis and visualizations on the teacher dashboard and respond to their students.

FaSMEd Framework

We explored how a technological environment influences teachers’ FA using the FaSMEd framework (Cusi et al., 2024). Cusi et al. (2024) described teachers’ practices based on three factors. The first is the set of key FA strategies (“Where”) (Black & Wiliam, 2009), which include sharing goals and criteria with learners, designing and implementing classroom discussions and other learning activities, enhancing the quality of feedback, and involving students in peer and self-assessment. The second is the timing of the process (“When”), which describes the three phases of the teaching process: preparation (planning), paration/meta-paration (enacting and sense-making), and post-paration (reflecting). The third comprises the functionalities of the technology used (“How”): communicating, analyzing, and adapting.

We described teachers’ use of technology during the different phases of the FA cycle, the three phases of lesson implementation, and the three functionalities of technology (communicating, analyzing, and adapting), which support the teacher’s role: initially, in communication and interaction with students and in the analysis of students’ work; after the lesson, in facilitating the interpretation of the analysis provided by the technology; and finally, in adapting the teaching for the next steps based on the insights provided by the technology.

Method

The present paper examines teachers’ interaction with LA and its effect on class discussions and on changes in the content of the teaching of functions using STEP. The lessons were conducted in middle schools (grades 8 and 9) during the 2020–2021 school year in Israel. Because these lessons took place during the pandemic, the first lesson was conducted in the physical classroom while the remaining lessons were conducted via Zoom. This study seeks to answer the following question:

Research question: What formative assessment processes and roles of functions offered by technology do teachers use when they interact with LA visualizations while evaluating student answers to example-eliciting tasks (EETs) in a digital environment?

Participants

Two 8th and 9th grade teachers with more than 15 years of teaching experience were the participants. One had an MSc in mathematics and the other had a BSc in mathematics education. The teachers attended a professional development program and used STEP in their classes, making them appropriate candidates for the study.

Research Setting

The teachers used STEP in their classrooms as part of the routine teaching sequence and used the STEP reports in the class discussions after the students completed the online activities. Each teacher assigned five activities across five separate lessons. At the end of each lesson, the teacher received the LA for the students’ work. The lessons were audio-recorded, and the teachers were interviewed to clarify their decisions and actions during the lesson. If the activity was not completed during the lesson, or the teacher decided to continue the same activity in the next lesson following the LA, the subsequent lesson was also recorded. At the end of each activity, the teachers completed a questionnaire about what they found in the students’ answers and how they connected it to the students’ knowledge. The STEP teacher dashboard produces six types of reports: tables, grids, histograms, Venn diagrams, perceptual landscapes, and bubble reports (Abu-Raya & Olsher, 2021). Each interactive report was designed to address a specific pedagogical need. The participating teachers were introduced to STEP in professional development programs, including its design principles and its use in the classroom. We focus on the reports that the teachers used in their lessons—the table, grid, histogram, and Venn diagram—and present the goals and mechanisms of each report separately. The remaining reports are specific to types of tasks that the teachers did not choose to use in their classes and are therefore not discussed.

Research Tools

The platform (STEP) has several types of reports (Abu-Raya & Olsher, 2021). Each report provides a specific LA visualization at the level of the whole class and the level of the student.

The table report contains a row for each student submission and a column for each characteristic, indicating which characteristics are evident and which are not for each student submission (Fig. 1a). The goal of this report was to provide elaborate information for each task for every student in the class. The table report enables the teacher to see at a glance which characteristics are prominent in the students’ submissions and which students submitted no answers.

Fig. 1
figure 1

STEP reports: a Table, b Histogram, c Grid, d Venn diagram

The histogram presents the distribution of characteristics across the submissions. For each characteristic, the report shows a column whose height corresponds to the number of submissions with that characteristic (Fig. 1b). The goal of the report was to provide a relative visual representation of all the characteristics of a given task, offering insight into the relative distribution of characteristics in student submissions. This interactive report allows the teacher to select a characteristic by clicking on it, which displays below the histogram a grid report filtered to the student submissions with this characteristic.

The grid presents a snapshot of all student submissions in a collage (Fig. 1c), which resembles other platforms where each submission is presented on the screen (e.g., TI-Nspire). The goal of the grid is to grant teachers access to all the students’ work in the classroom in a single report, allowing the teacher to filter student work according to predetermined characteristics. The filtering process can help the teacher identify different phenomena in the students’ work rather than simply going over all the snapshots trying to locate work that is relevant for the ensuing classroom discussion. Clicking on a picture loads the interactive diagram submitted by the student.

The Venn diagram presents the interrelations between the characteristics of students’ submissions (Fig. 1d). The goal of the Venn diagrams is to show the distribution of more complex phenomena in the submissions that could not be captured based on one characteristic but were reflected in the relationships between several characteristics. Clicking on part of the diagram displays a filtered grid report based on the selected region.

Fig. 2
figure 2

Task design

Data Sources and Analysis

The platform collected all students’ answers to a variety of tasks, including multiple-selection items, yes–no questions, and requests to construct three examples that fulfill a given set of conditions. The platform can filter student submissions based on various mathematical characteristics that are specific to the content area. For example, STEP indicates whether a submitted function is constant, increasing, or decreasing, and whether the student used a graphic representation or a numeric table of values (Abu-Raya & Olsher, 2021). The analysis of submitted examples provides information about the critical and non-critical features of a mathematical concept that can attest to a broad understanding of it, including the relationships between its attributes. Another type of information that may be analyzed concerns methods of construction, which provide evidence about the process of the students’ work, not only about its product (Popper & Yerushalmy, 2021). Thus, in addition to checking the correctness of answers, the platform assesses other mathematical characteristics.
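As a rough illustration of what such task-dependent analysis involves, consider the linear-function characteristics mentioned above. The following is a hypothetical sketch, not STEP’s actual implementation; the function name and characteristic labels are ours:

```python
# Hypothetical sketch of tagging a submitted linear function f(x) = m*x + b
# with task-dependent mathematical characteristics, in the spirit of the
# analysis described above (not STEP's actual code).
def characteristics(m: float, b: float) -> dict:
    """Return which characteristics hold for the submission f(x) = m*x + b."""
    return {
        "constant": m == 0,        # horizontal line
        "increasing": m > 0,       # rising function
        "decreasing": m < 0,       # descending function
        "through_origin": b == 0,  # graph passes through (0, 0)
    }

# A class-level report would aggregate these flags over all submissions.
print(characteristics(0.0, 4.0))
```

The dashboard reports can then be read as aggregations of such per-submission flags: a histogram column counts how many submissions carry a flag, and a Venn region counts submissions carrying a combination of flags.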

Data sources included student submissions for the tasks, interviews with every participating teacher, after-class discussions following the student submissions, and classroom observations. Additionally, the teachers captured screenshots when they used the LA.

In this study, we hypothesized that teachers’ practice is affected by the type and design of the data offered by LA visualizations, which means that the functionality of each report affects teachers’ formative assessment practices. We focused on one phase of the process: paration and meta-paration. During the paration phase, the teachers interact with the LA concerning students’ work in the classroom, reflect, give feedback, process information, and initiate further steps of the learning process. Technology was used to communicate, analyze the data, and make appropriate decisions in the classroom, which may strongly influence teachers’ practices in all four FA areas. The meta-paration phase, in which the teachers engage in metacognitive processes to reflect on their teaching during the lesson, is an important part of paration and crucial in all four FA areas. To answer the research question, we analyzed teachers’ practices when interacting with LA based on the FaSMEd framework. Initial coding indicated no evidence of two FA processes: sharing goals and criteria with learners and engaging students in peer and self-assessment.

To analyze each teacher’s interaction with LA and students, we coded each process and its components using particular evidence. For designing and implementing classroom discussions and other learning activities, we coded each participant’s interactions with LA and/or students according to the three aspects of this process. Teachers’ interactions were classified as monitoring students’ understanding when they included scanning for, looking for, or identifying interesting work in the students’ answers based on the teacher’s considerations. Teachers’ interactions were classified as scaffolding students’ understanding when the teacher discussed the students’ answers in the class discussion and displayed the dashboard to the entire class with the goal of developing partial or missing conceptions, discussing and correcting mistakes, highlighting original or rare solutions, and supporting the use of multiple representations based on the data obtained from the teacher’s interaction with LA. Interactions were coded as fostering students’ reflections when teachers asked students to explain, make inferences, generalize, or reflect on their work. Any teacher interaction that indicated a more advanced search of the analysis of students’ work was coded as enhancing the quality of feedback. After describing each teacher’s interaction with each report in enhancing the teacher’s FA practices, we compare and summarize how the different reports support and enhance each FA key. Since teachers’ interaction with LA includes moving between the different reports and drawing insights based on them, we also analyze whether a teacher’s use of a combination of the different reports can enhance FA practice.

Results

The results are presented from a double perspective. In the first section, we present the teachers’ use of the different reports in their lessons. We describe and analyze the role of the tool’s functionalities across all the reports and teachers’ FA practices in the four key areas (Table 1). Following this, we summarize, for each of the FA processes, how teachers’ use of the different reports enhances teachers’ FA practices.

Table 1 The role of the functionalities of the tool across all the reports and teachers’ FA practice in the four key areas

Examples of the Use of STEP Reports

The findings indicate that all reports proved useful for teachers designing and implementing classroom discussions and other learning activities in different ways. Only two reports (histogram and Venn diagram) showed their potential to enhance the quality of feedback through additional analysis of the data. The examples show how the design principles of each report served teachers in different ways and demonstrate the role of the interactive environment in the teacher’s FA.

The use of the grid was illustrated by a task assigned to an 8th grade mathematics lesson. The students were given the task: “Construct a linear function whose graph passes through two given points. If you believe this cannot be done, explain why.” The interactive diagram included multiple linked representations of a function (Fig. 2). The platform presented students with a choice of various pairs of “random” points selected by developers based on pedagogical considerations and activated by pressing the New Points button (Bagdadi, 2019).
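For reference, the mathematics underlying this task can be sketched as follows (the specific points are illustrative, not taken from the task or the data):

```latex
% Line through two given points (x_1, y_1) and (x_2, y_2), with x_1 \neq x_2:
m = \frac{y_2 - y_1}{x_2 - x_1}, \qquad y - y_1 = m\,(x - x_1).
% Illustrative example: through (1, 2) and (3, 6):
m = \frac{6 - 2}{3 - 1} = 2, \qquad y = 2(x - 1) + 2 = 2x.
```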

Initially, Asala (all teacher names are pseudonyms) scanned students’ submissions to identify recurring phenomena in their work. She checked whether the students used numeric representations in their solutions and discovered that only a few of them did so. She investigated the effect of this assistance tool (numeric representation) on their submissions by clicking on one of the grid examples. Using a unique feature of this report, the teacher accessed this submission and the interactive applet, and explained how the use of numeric representation can help them find an equation.

Asala continued to scan the students’ submissions looking for different solutions and mistakes. She indicated in the interview that most students submitted correct answers; therefore, she did not address this characteristic in the lesson.

She chose to address the characteristic “Two points with same Y value” (Fig. 3). Based on the analysis provided by the platform, she scanned the filters and, based on the number of submissions in each filtered grid report, indicated that “this characteristic describes a constant function and helps students solve this type of task efficiently, saving them time” (Asala, questionnaire). During the discussion, Asala presented examples of student submissions displaying this characteristic. She asked students questions like “What did these students do [referring to other students’ work]?” Asala scaffolded her students’ thinking by discussing the differences and similarities between the solutions (constant function): “What type of function do you get in this situation?” “Do you think it [a constant function] is easier [than a function with a non-zero slope]?” “How would you find the equation?” Asala thereby fostered her students’ reflection by asking them to make inferences and generalizations about their submissions. Asala also addressed student submissions in which the two points coincide, which enabled students to submit non-constant functions. In the interview, Asala indicated that this choice of points helped students understand that an equation can be written without calculating the slope and plugging in the X and Y values of the points. In other words, through this discussion, Asala pointed out the relation between the coordinates of the two points and the graphical representation of the resulting line by having the students zoom in on these submissions, revealing that all the points on this line had the same Y value.
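The shortcut behind this characteristic can be stated compactly (the values below are illustrative, not student submissions): when the two given points share a Y value, no slope computation is needed.

```latex
% Two points with the same Y value, (x_1, c) and (x_2, c) with x_1 \neq x_2:
m = \frac{c - c}{x_2 - x_1} = 0 \;\Rightarrow\; y = c \quad \text{(a constant function)}.
% e.g., through (1, 4) and (5, 4): \; y = 4.
```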

Fig. 3
figure 3

Filtered grid report

The grid report modifies the teacher’s role in communication and interaction with students, starting with the monitoring of students’ understanding. Snapshots of students’ work can be reviewed quickly and at a glance, either by sharing and communicating all submissions to the teacher or by filtering (categorizing and analyzing) them. When interpreting the analysis provided by STEP, teachers can decide which characteristics they want to discuss during a lesson. Asala mentioned in the final interview: “The Grid report is an elegant report with rich data that you can scan at a glance, seeing all the submission together at the same time make me and the students to examine differences between the solution.” She also added: “I used the Grid to encourage students talk about their solutions, comparing different solutions and make a generalization by using filtered Grid, I also used empty filtered Grid!” Asala’s engagement with the grid report underscores the significance of presenting the students’ submissions collectively, so that each student can view their own submission alongside those of their peers on the same dashboard. This arrangement encourages each student to analyze and compare their work with that of their counterparts, thereby fostering an environment of active participation and discussion.

The use of the table report was illustrated by a task assigned in an 8th grade mathematics lesson. The students were asked to “choose a point and construct expressions for several linear functions whose graph passes through the given point and intersects the positive part of the y-axis.” The given interactive diagram included multiple linked representations and a point-generating button similar to the task in Fig. 2. Asala began by scanning for students with fewer characteristics while displaying the report to the students. She identified characteristics that were missing from students’ submissions. Based on the report, she determined that none of the students paid attention to the table of values provided in this task (the numeric representation characteristic in the second column of Fig. 1a); in the final interview, she mentioned: “Table report inform me about which characteristics didn’t exist in the students’ answers and that also important data that I need to know.” Asala noted that the table of values helped students identify attributes of the function, and she considered it important to illustrate the added value of the numeric representation, connecting it to the function. Asala continued by showing the students how the table of values could assist them in verifying their solutions. Using the table report, she obtained an overview of the students’ work status, where it was easy to find the frequency or presence of any characteristic in the classroom submissions (Fig. 1a). Although the assignment did not specify the use of any additional representation beyond the algebraic expression, Asala proposed demonstrating the utility of this tool to the students and illustrating its application with a constant function. In doing so, Asala aimed to expand the students’ proficiency in employing multiple representations.

The use of the histogram report is illustrated with a task assigned in a 9th grade mathematics lesson. The task focused on identifying the extremum and calculating the distance between the extrema of two different quadratic functions. The task was formulated as follows: “Functions f(x), g(x) are from the family y = a(x − p)² + k. Claim: There is only one situation for the functions f(x) and g(x) in which the distance between the extremum points of the functions is five units. If you think this claim is true, provide an algebraic expression for each function. If not, use the interactive diagram to create five examples of different functions.”
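To illustrate the mathematics behind the task (the coefficients below are ours, not student submissions): the extremum of a function from this family is determined by its (p, k) parameters, so a vertical distance of five units can be produced in many ways, refuting the uniqueness claim by example.

```latex
% The extremum of y = a(x - p)^2 + k is the point (p, k).
% Vertical distance of 5 units between the extrema of f and g:
p_1 = p_2, \qquad |k_2 - k_1| = 5,
% e.g. (illustrative coefficients):
f(x) = (x - 1)^2 + 2, \qquad g(x) = (x - 1)^2 + 7.
```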

Ranya used the histogram to identify different phenomena in the class work by showing the wide range of work characteristics to the class and discussing them. She noticed that most of the students provided correct answers, and most of them indicated a distance that was not vertical, something she did not expect. Following this insight, she presented a filtered grid report for the characteristic “vertical distance” (Fig. 4), because she had expected that this would be her students’ first example, and then conducted a discussion to understand from the students’ talk why only one example in the students’ submissions included this (rare) characteristic. Next, she noticed that many submissions included a distance with a slope (i.e., neither vertical nor horizontal), which she found strange because the students had not yet encountered the distance formula (Fig. 5). Therefore, she used the filtered grid report of distances with a slope to share the students’ work. When she shared the filtered grid, she realized that the histogram did not provide the necessary information about the interaction between the characteristics, which could be further explored using the Venn diagram.

Fig. 4

Histogram report

Fig. 5

A portion of the filtered grid report of a particular characteristic

The histogram report provided an overview of the students’ work using statistical data, but the teacher was not satisfied because the results of the analysis were unexpected. On this issue, she mentioned: “Histogram report gives me important analysis about the students’ work, but it is not enough to make interpretation, so usually I moved to filtered grid report or Venn diagram.”

Ranya therefore used another representation of the data from her classroom to validate her thinking. The histogram provides statistical data and allows the teacher to access the filtered grid report of a given characteristic to conduct the discussion, whereas the Venn diagram provides an in-depth understanding of the relationship between various characteristics.

The use case for the Venn diagram follows from the previous case, involving the task carried out in a 9th grade mathematics lesson that asked students to identify the extremum and calculate the distance between the extrema of two different quadratic functions. After Ranya used the histogram report, she noticed that most of the students’ submissions were correct and that most of the submissions included a distance with a slope: “The histogram showed that most of the students gave examples with a non-zero slope, which was strange for me because they did not study the distance formula yet. So, I turned to the Venn diagram to see the intersection between the characteristics” (Ranya, interview). Using the Venn diagram, Ranya checked the relationship between the two characteristics (Fig. 6) and noticed that only four of the 11 submissions were both correct and had a non-zero slope. Ranya said: “I chose to look into this characteristic.”
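The cross-tabulation that the Venn diagram surfaces can be sketched as set intersections over tagged submissions. The data and characteristic labels below are hypothetical, chosen only to mirror the 4-of-11 pattern described above; they are not the platform’s actual records:

```python
# Hypothetical submission tags; labels are illustrative, not STEP's actual ones.
submissions = {
    1: {"correct", "non_zero_slope"},
    2: {"correct", "non_zero_slope"},
    3: {"correct", "non_zero_slope"},
    4: {"correct", "non_zero_slope"},
    5: {"correct"},
    6: {"correct"},
    7: {"correct"},
    8: {"correct"},
    9: {"non_zero_slope"},
    10: {"non_zero_slope"},
    11: {"non_zero_slope"},
}

def with_tag(tag):
    # All submission ids whose tag set contains the given characteristic.
    return {sid for sid, tags in submissions.items() if tag in tags}

correct = with_tag("correct")
sloped = with_tag("non_zero_slope")
print(len(correct & sloped))  # → 4: submissions carrying both characteristics
print(len(sloped - correct))  # → 3: sloped but incorrect submissions
```

Each region of the Venn diagram corresponds to one such set operation (intersection, difference), and clicking a region filters the grid to exactly that subset.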

Fig. 6

Venn diagram report

“Although there were many submissions with a [non-zero] slope, they were not correct, since they calculated the distance only according to the X values” (Ranya, interview). Ranya moved to the filtered grid report of this intersection by clicking on the intersection area in the report. During the discussion, one student reported using the Pythagorean theorem and calculating the distance based on the length of the hypotenuse. Following this comment, the teacher asked the student to show her this (rare) solution and used it to explain to the other students why it was not correct to calculate the distance the way they did, activating students as instructional resources for one another and as the owners of their own learning (Fig. 7). By asking this student to explain the rare characteristic, she fostered the students’ reflection, prompting them to consider the Pythagorean theorem for calculating a distance with a slope.
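The contrast between the students’ mistaken strategy and the Pythagorean one can be made concrete with a short sketch (the vertex coordinates here are hypothetical, not drawn from the classroom data):

```python
import math

def x_only_distance(v1, v2):
    # The mistaken strategy: distance computed from the x-coordinates alone.
    return abs(v2[0] - v1[0])

def pythagorean_distance(v1, v2):
    # The correct strategy: length of the hypotenuse (Pythagorean theorem).
    dx, dy = v2[0] - v1[0], v2[1] - v1[1]
    return math.sqrt(dx ** 2 + dy ** 2)

f_vertex, g_vertex = (0, 1), (6, 9)             # hypothetical extremum points
print(x_only_distance(f_vertex, g_vertex))       # → 6 (misses the vertical component)
print(pythagorean_distance(f_vertex, g_vertex))  # → 10.0 (6-8-10 right triangle)
```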

Fig. 7

Filtered grid report of the intersection between two characteristics in the foreground of the submission by one of the students

Based on the analysis and interactive reports, the teacher collected valuable information about the students’ work, which helped her manage an instructive discussion about a rare correct solution submitted by one of the students. In the course of the discussion, she provided insight to the whole class about the common mistake and refined the insight she had gained from the histogram report by using the Venn diagram. The process of exploration that Ranya conducted during the lesson indicated that she needed transparent LA to respond to the students. The use of different reports led her to a more detailed examination of the students’ submissions and to identifying the specific characteristics present in them. In other words, she detected the phenomenon, understood why it occurred, shared this with the students, and presented an interesting solution proposed by one of the students during the whole-class discussion.

Using the filtered grid report provided by the Venn diagram, she was able to enhance the quality of feedback.

Teachers’ FA Practices When Interacting with LA Visualization

Three main aspects of monitoring the students’ answers were observed: scanning, looking for, and identifying. The nature of each process is related to the features of the report. Initially, the teachers preferred to use the grid report, which indicated that their monitoring of students’ answers was limited to scanning and to looking for specific characteristics that interested them. Over time, the teachers explored other reports and came to realize their specific features, which elaborated their monitoring of students’ understanding to include identifying classroom phenomena in the form of characteristics that were suggested by the platform but did not exist in the students’ answers. In addition, they extended their analysis to include more advanced data, such as the relationship between different characteristics (the intersection of two or three characteristics in students’ submissions), which offered them a new and higher level of analysis.

The findings illustrate that the teachers used discussions of students’ answers as a scaffold for students’ learning. The different features of each report were reflected in the teachers’ investigations. Using the grid report, the teachers discussed the differences between submissions; the table report was used to discuss characteristics absent from their students’ answers; and with the histogram, they discussed rare characteristics or unexpected phenomena in the students’ answers. Finally, the teachers used the Venn diagram to investigate the relationships between different characteristics.

Different reports supported students’ reflection in different ways, and this unfolded along two paths: sometimes the process began with the grid report, while at other times it began with another report that allowed the teacher to move to the filtered grid. This indicates the important role of the grid report in fostering students’ reflection on their work, now more focused on the specific goal that the teacher identified after interpreting the data provided by the reports.

Teachers sought different insights that could enhance their feedback by moving between a variety of reports and drawing on the specific affordances of each. Usually, this movement occurred when teachers used the Venn diagram or histogram and found something interesting that was not, by itself, enough to make inferences about the students’ understanding; they then moved to the filtered grid of the specific characteristics in question.

Discussion and Conclusion

In this study, we described how two teachers responded to their students’ EET answers using LA visualizations related to a particular topic in the mathematics curriculum. The teachers’ examples of using EET and DT LA visualizations demonstrated ways in which these tools enhance teachers’ FA practices. Two important factors affected the teachers’ FA practices: first, EET provided windows into students’ ideas; second, LA facilitated in-depth examination of students’ answers and provided teachers with rich visual aids that depicted the entire classroom. Interactive diagrams, as flexible tools in algebraic problem-solving, allow for the extraction of rich data from students’ responses to the EET, enhance the clarity of students’ misconceptions, and reflect students’ comprehension levels (Olsher et al., 2016). When teachers interact with LA visualizations through their dashboards, they can respond appropriately to their students.

The current study distinguishes between the LA visualization provided by STEP, which monitors the existence of mathematical characteristics in student submissions, and other platforms designed mostly for monitoring the correctness of answers (Ley et al., 2023). We argue that it is valuable to focus on this type of LA because, from a pedagogical point of view, it provides complementary ways of engaging students and supporting their understanding of particular mathematical concepts by offering teachers accessible data about their students’ work. Communicating student ideas, analyzing the status of their learning, and gaining advanced insight into their students’ work enable teachers to make appropriate decisions in the classroom.

The analysis uncovered that the teachers did not concentrate solely on evaluating the correctness of responses, unlike other assessment tools that place a strong emphasis on this aspect. Instead, they took into account a broader range of factors, such as various representations, strategies, and unconventional solutions. The diverse visualizations of these factors facilitated a deeper understanding for the teachers. In the final interview, both teachers emphasized that without the availability of multiple characteristics in the platform and the array of reports showcasing diverse visualization functions, their discussions would have been restricted, as they would have had to rely on previous knowledge of their students’ work and on the range of answers from past assessment contexts.

When examining teachers’ practices in the use of technology, most studies, especially in mathematics education, use a lab setting as an alternative to the real classroom (e.g., Mavrikis et al., 2019; van Leeuwen et al., 2019). Only a few studies have examined teachers’ practices in the classroom as they interact with LA (Clark-Wilson, 2010; Martinez-Maldonado et al., 2014). Other studies have indicated that the information provided by dashboards affects teachers’ knowledge, lesson plans, and what they cover during class (Holstein et al., 2018; Xhakaj et al., 2017). In these studies, teachers paid more attention to their students’ behavior than to analyzing their submissions. The present study used a “real lab,” namely the teachers’ own classrooms, with students working on the standard school curriculum. The teachers in the present study used technology that provided them with rich tasks and analyzed students’ answers to these tasks. We stress the role of LA visualization in enhancing teachers’ practice by providing them with an advanced analysis of their students’ work concerning the specific content and level of the task at hand.

The findings demonstrate the potential of LA visualization to enhance teachers’ evaluation of their students’ work, including their misconceptions, strengths, and other characteristics that manifest in their submissions in a given content area. In general, the teachers used STEP reports quickly and thoughtfully. We can infer that the reports offered teachers two functions, depending on their features: from static reports (histograms and tables), teachers gained statistical insight that helped them make sense of the students’ work; dynamic reports (grid and Venn diagram) offered interactive tools that supported further analysis, such as displaying the filtered grid of certain characteristics based on substantiated choices, checking the relationship between various characteristics, and other manipulations.

The examples of report use indicated that when teachers were not convinced by the data they received from the LA, they exercised critical thinking and accessed other reports and interactive tools to validate their thinking. This finding highlights the differences between static and dynamic reports and the ability of dynamic reports to serve as advanced tools for exploring and interacting with the LA. Based on the teachers’ interviews, combining static and dynamic reports led to a more accurate comprehension of the students’ work. Through the use of the Venn diagram, teachers were able to create a transparent report by selecting which relationships between characteristics they wished to display. These findings support the significance of transparent LA in enhancing teachers’ decision-making and response to their students, as previously demonstrated by Ley et al. (2023). They are also consistent with previous research (Olsher & Raya, 2019) showing how teachers changed the order of characteristics in classroom discussions when provided with LA on the topic of quadratic functions.
From their experience with a particular class, teachers had knowledge about what their students generally could and could not do well at any given time. Working with LA visualization, they gained new insights into the performance and knowledge of their students. This enabled them to change their decisions regarding the importance of mathematical characteristics and the order in which they were discussed in class.

The key areas of FA that are supported when teachers interact with LA include designing and implementing classroom discussions and other learning activities and enhancing the quality of feedback (Table 1). Based on the teachers’ practices that we observed, we did not identify two out of the four key areas of FA (sharing goals and criteria with learners and involving students in peer and self-assessment).

Concluding Remarks and Future Research

In this study, we examined teachers’ interactions with LA visualizations. These interactions provided an indication of the teachers’ use of the reports, which was mediated by the accessibility of the report features and by the type of characteristics included in each task. The sample, which included two experienced teachers, neutralized the effect of skill level in operating the technological platform. The findings can assist platform developers in designing reports and interactive LA tools and in enhancing professional development content that focuses on teachers’ use of FA platforms. A carefully designed report functionality that considers teachers’ needs has the potential to “augment” teacher practice (Mavrikis et al., 2019). Future research should examine more extensively the relationship between report design and teachers’ noticing of their students’ work.