ZDM Mathematics Education, Volume 48, Issue 1, pp 69–82

Instructional reasoning about interpretations of student thinking that supports responsive teaching in secondary mathematics

Original Article

DOI: 10.1007/s11858-015-0740-1

Cite this article as:
Dyer, E. B., & Sherin, M. G. (2016). ZDM Mathematics Education, 48(1), 69–82. doi:10.1007/s11858-015-0740-1

Abstract

Basing instruction on the substance of student thinking, or responsive teaching, is a critical strategy for supporting student learning. Previous research has documented responsive teaching by identifying observable teaching practices in a broad range of disciplines and classrooms. However, this research has not provided access to the teacher thinking that gives rise to these responsive practices. In this study, we use an innovative methodology to explore the cognitive dimensions of responsive teaching as they play out in the moments of instruction. Data include 17 point-of-view observations in which two high school mathematics teachers saved video of important moments in real time and discussed the moments during an interview immediately following instruction. Both teachers were observed to use sophisticated responsive teaching practices, providing a window into consequential teacher thinking about student thinking. An open coding of teachers’ comments about the moments they selected to capture during instruction was used to identify three types of instructional reasoning about interpretations of student thinking used by the teachers: (a) making connections between multiple specific moments of student thinking, (b) considering the relation between the mathematics of student thinking and the structure of a mathematical task, and (c) developing tests of student thinking. We provide contextualized examples of each type of instructional reasoning and discuss how these findings relate to current theory on responsive teaching. We conclude by considering the implications of this study for the mathematical and pedagogical expertise required to engage in responsive teaching, and consider how this instructional reasoning can be supported through professional development.

Keywords

Responsive teaching · Teacher cognition · Teacher noticing · Teacher learning

1 Introduction

More and more, research finds that engaging with student thinking is critical for supporting student learning, particularly in science and mathematics (Black et al. 2003). In particular, research suggests that responsive teaching, in which the substance of students’ ideas is the basis for instruction, is supportive of student learning (Carpenter et al. 1989; Franke et al. 2009; Pierson 2008). However, responsive teaching is complex and can be difficult for teachers to enact (Hammer et al. 2012; Lampert 2001). A better understanding of responsive teaching is needed to help demystify this instructional approach and to support teachers in adopting responsive teaching practices.

Research on responsive teaching in mathematics and science classrooms has begun to elaborate what responsive teaching looks like, primarily through case studies and in-depth analyses of discourse practices (e.g. Lineback 2015; Pierson 2008). However, this research has not provided access to the teacher thinking that underlies the use of responsive practices. We address this gap by investigating the cognitive aspects of responsive teaching, that is, by exploring the ways that teachers reason about students’ ideas as they engage in responsive teaching. In particular, we investigate the following question: How do secondary mathematics teachers reason about student thinking during responsive teaching? Unpacking the cognitive underpinnings of responsive teaching will offer a more nuanced picture of what responsive teaching entails. In addition, this work informs professional development aimed at promoting responsive teaching practices by making visible the complex nature of the teacher thinking involved.

2 Teacher cognition about student thinking

Understanding students’ mathematical thinking has long been considered an important component of mathematics teaching expertise, and studies have long provided detailed accounts of what teachers understand about student thinking (Ball 2001). More recently, research has highlighted which aspects of the mathematics in students’ ideas teachers focus on. For example, teachers may focus on what they need to correct in student thinking, such as an error in a procedure (Crespo 2000; Empson and Jacobs 2008) or a deep-rooted misconception (Pierson 2008). In addition, teachers may focus on the way in which the mathematics makes sense to a student, such as identifying the specific strategy a student is using (Empson and Jacobs 2008), or the meaning of what a student said or wrote (Crespo 2000). We consider each of these different characterizations, which go beyond descriptions of what a student said or did, to be about the type of interpretation of student thinking a teacher makes.

In addition to various types of interpretations of student thinking, we believe that there are other critical dimensions of teachers’ thinking related to student ideas that go beyond interpretation. In other words, teachers likely engage in additional cognitive work in order to make their interpretations useful for what they do in the classroom. For example, teachers may synthesize and compare across interpretations of student thinking (Stein et al. 2008), hypothesize links between the classroom environment and interpretations of student thinking (Yeh and Santagata 2015), and consider how to respond to students’ ideas based on their interpretations (Jacobs et al. 2010). The point here is that teachers’ thinking about students’ ideas during instruction likely goes beyond interpreting a student’s idea.

To investigate this additional thinking, we introduce the term instructional reasoning about interpretations of student thinking. We use the word instructional to highlight that we believe this reasoning helps teachers make sense of student thinking in ways that are instructionally relevant. Furthermore, while we believe that instructional reasoning is done with an interpretation of student thinking in mind, we do not mean to suggest that a teacher first develops an interpretation of student thinking and then reasons about it. Instead we propose a more dynamic relationship between these processes. Teachers may develop a preliminary interpretation that is revised through instructional reasoning, or instructional reasoning may help teachers change the focus of their interpretations or fill in details that were initially poorly understood. Therefore, our model of the way teachers make sense of student thinking treats interpretations and instructional reasoning as working in conjunction with one another, each iteratively revised and used flexibly.

3 Characterizing responsive teaching practice

To investigate instructional reasoning about interpretations of student thinking, we examine teachers engaged in responsive teaching practices. Responsive teaching is characterized by teachers using the substance of student thinking to guide instructional decisions (Hammer et al. 2012). This type of teaching is seen as highly adaptive; the teacher shifts the direction of an interaction, lesson, or unit over time based on her in-process understanding of student thinking. This adaptation is similar to the more general ideas of teachers incorporating formative assessment practices (Black et al. 2003) and teachers attending to student thinking (Sherin 2001). Responsive teaching, however, explicitly incorporates the idea that teachers should respond to the substance of students’ thinking, not just the superficial or easily observable aspects of student thinking (Coffey et al. 2011). This emphasis on the substance of students’ thinking aligns with teachers focusing their interpretations of student thinking on how the mathematics makes sense to students rather than on how to fix students’ thinking.

Research has concentrated on understanding the directly observable aspects of responsive teaching. In particular, case studies in science and mathematics education have helped to provide an overall picture of responsive teaching in actual classrooms (e.g. Cohen 2004; Hammer et al. 2012; Hutchison and Hammer 2010; Lineback 2012; Maskiewicz and Winters 2012; Pierson 2008; Wood et al. 1991). For example, Pierson (2008) highlights examples of teachers responding to students by taking up their ideas, and Lineback (2012) details the ways that a teacher encourages students’ alternative explanations of natural phenomena. In other work, contrasting cases of teachers who engage with student thinking in less substantive ways provide examples of teaching that might look responsive at first glance, but is not. Examples of these types of teaching include having students share thinking in a show-and-tell format without in-depth discussion of ideas (Ball 2001; Stein et al. 2008) or consistently funneling students towards correct answers (Lineback 2012). Finally, recent studies have characterized the way particular teaching moves are enacted with varying levels of responsiveness, such as eliciting student thinking (Franke et al. 2009) and redirecting student thinking (Lineback 2015). Together, these studies provide a vision of what it means to base instructional decisions around the substance of student thinking, both in the overall characterization of teachers’ practice and in the specific teaching moves used.

We believe that moving beyond these observable aspects toward cognitive aspects of responsive teaching is critical for the field. In particular, to base instructional decisions on the substance of student thinking requires significant cognitive work on the part of the teacher, including interpreting students’ thinking. However, we believe teachers using responsive teaching practices will also rely heavily on instructional reasoning about interpretations of student thinking, making this a rich context for the current study. Furthermore, to uncover instructional reasoning that could be useful for making instructional decisions during teaching, we focus on teachers’ in-the-moment thinking—that is, the thinking that teachers do in the midst of instruction.

4 Modeling teacher in-the-moment thinking

To explore teachers’ in-the-moment instructional reasoning, we draw on two related frameworks for modeling teachers’ in-the-moment cognition: teacher decision making (Herbst and Chazan 2012; Schoenfeld 2011) and teacher noticing (Jacobs et al. 2010; Sherin et al. 2011a).

Research on teacher decision-making has worked to unpack the process through which teachers decide what to do during instruction. For example, Schoenfeld (2011) claims that teachers’ goals, knowledge, and orientations are central cognitive resources that teachers draw on in the moments of instruction. These aspects of teacher cognition interact dynamically in ways that result in the specific actions teachers take at specific times.

In a complementary body of work, teacher noticing highlights the ways that teachers filter through the many events happening simultaneously and make sense of the classroom environment. Sherin et al. (2011a) generalize a teacher noticing framework to include the following two processes: (a) attending to particular events (i.e. selective attention), and (b) making sense of events.

In this paper we draw on both frameworks as we investigate teachers’ in-the-moment thinking. We take a similar stance to Schoenfeld (2011), who highlights the consequential nature of teacher noticing for teacher decision making—that teachers can only act on that which they see. Jacobs et al. (2010) also highlight this link between teacher noticing and decision-making in the context of responsive teaching in mathematics. They found that expertise in attending to and interpreting students’ mathematical thinking serves as a foundational element in planning to respond to students’ ideas. Thus, here we study responsive teaching by characterizing teachers’ in-the-moment instructional reasoning about interpretations of student thinking. We believe instructional reasoning influences the specific actions teachers take, including enacting responsive teaching practices. While we do not explore this link empirically in this paper, we discuss potential connections in Sect. 7.1.

5 Methods

This project was part of a larger study on the development of responsive teaching practices among secondary mathematics teachers. The larger study set out to investigate the ways that teachers use feedback from their own classrooms as a site for learning to become responsive to student thinking. Here we focus on a subset of the data, which includes teachers who exhibit consistently responsive practices.

5.1 Participants

Two secondary mathematics teachers who had collaborated with us previously in a series of projects over the past 10 years were selected as the focus of this study. Both teachers shared and discussed video excerpts of their teaching with peers in several video clubs that we organized. They also participated in a district-wide program for teachers applying for National Board certification that we observed and studied (both teachers were awarded National Board Certification in 2004) (Brantlinger et al. 2011). As part of our work together we had observed and videotaped in both teachers’ classrooms on multiple occasions since 2003. Based on these experiences, we suspected that both teachers consistently used responsive teaching practices.

5.2 Data sources

Two main data sources were used in this study: videotaped classroom observations and interviews from point-of-view classroom observations. First, in order to establish whether or not the teachers’ practices were responsive, video from the teachers’ classrooms at two different points in time was examined. For each teacher, we selected one videotaped observation of a lesson from several years ago (2009–2010 for Mary and 2006–2007 for Rachel) and a second from the 2013–2014 school year. Both teachers worked at the same school during this time period, an urban selective enrollment high school in a large Midwestern city. The school uses block scheduling and all classes were 100 min long.

To look at teacher cognition in the midst of instruction, interviews with the teachers during the 2013–2014 school year were used. Each teacher completed a series of point-of-view (POV) observations (Sherin et al. 2011b) in which the teacher wore a small camera on the side of her head that captured video from her perspective. Furthermore, the camera allowed the teacher to select moments to save on video immediately after they occurred. For this study, the teachers were asked to press a button on a remote worn on the wrist to save video clips whenever an event took place that they “wanted to reflect on or think about later.” This prompt was chosen to provide access to moments that were significant to teachers in ways that would be consequential for their instructional decision-making. The prompt allows us to examine general patterns of teacher thinking during moments that were significant to teachers, rather than the thinking that underlies particular responsive teaching practices; investigating the latter would require data on teacher thinking each time a teacher used a particular responsive practice.

After each lesson, the teacher met with a researcher to discuss the saved clips in order to access teachers’ in-the-moment thinking during each of the moments saved. The teacher was asked to discuss the following about each clip: (a) the reason she had saved the clip, (b) what she noticed or what caught her attention in the moment, (c) how she interpreted what was happening in the moment, and (d) what implications the moment had for her future plans. At the end of the interview, the teacher was also asked if there were any moments that she wished she had captured, but did not. A similar discussion then took place concerning the non-captured moments.

This point-of-view observation methodology provides better access to teachers’ in-the-moment thinking than typical post-lesson interviews with teachers (Sherin et al. 2008, 2011b). First, teachers only watched the video clips saved up to the point that they recognized the moment, which frequently happened at the very beginning of the clip. This ease of recognition with little re-watching of the video suggests that teachers used the video as a cue to help them recall their thinking about the specific moment. Second, both teachers felt comfortable telling us when they did not remember why they captured a moment or what they were thinking rather than constructing an explanation after watching the video. Finally, each interview with the teachers took place immediately after the lesson had occurred, leaving little time for additional thinking or reflection between the time the clip was saved and the interview. For these reasons, we believe that teachers were not creating ad hoc or retrospective accounts of their noticing, and were instead providing access to their in-the-moment thinking.

Across the two teachers, 17 POV observations were completed, seven with Mary and ten with Rachel. A total of 100 captured moments were recorded by the teachers and an additional eight non-captured moments were identified by the teachers during the interviews. In addition, the teachers stated that they did not remember why they had saved 13 of the captured moments, thus those clips were not discussed during the interviews. This resulted in a total of 95 discussed moments (Table 1).

Table 1  Types of moments captured and discussed across observations

                          Mary                           Rachel                                      Total
Observation number         1   2   3   4   5   6   7     1   2   3   4   5   6   7   8   9  10
Captured moments          18  21  14   5   8   4   1     6   3   2   4   3   1   1   4   4   1      100
Non-captured moments       1   0   0   0   0   0   1     1   0   0   0   0   0   3   0   0   2        8
Moments not remembered     4   4   1   0   0   0   0     0   0   0   1   1   0   0   1   0   1       13
Total moments discussed   15  17  13   5   8   4   2     7   3   2   3   2   1   4   3   4   2       95

5.3 Analysis

5.3.1 Responsive teaching practices

Because the goal of our research was to examine the nature of teacher cognition during responsive teaching practices, we first needed a way to examine whether the teachers in our sample were in fact teaching in a way that aligned with responsive teaching. While there are many different aspects of teaching that support the use of responsive teaching practices, including the mathematical tasks (Stein et al. 1996), teachers’ elicitation of student thinking (Franke et al. 2009), and classroom norms (Yackel and Cobb 1996), we focused on the teaching practices related to responding to students’ thinking. While these other aspects of instruction are certainly important for enacting responsive teaching practices, they often constrain the way teachers respond to student thinking (Stein et al. 1996). Additionally, they are often present in less-substantive engagement with student thinking. In other words, having access to the substance of students’ thinking is necessary, but not sufficient, for responsive teaching. Therefore, focusing on teacher moves in response to student thinking provides a more restrictive measure of teachers’ enactment of responsive teaching practices.

To analyze teachers’ responses to student thinking, we synthesized recent literature and identified three teaching practices that are considered to be central components of responsive teaching (Table 2). Because we chose to focus on the use of particular teaching practices rather than general characterizations of responsive teaching in mathematics, the work of Pierson (2008) and Lineback (2015) was particularly influential.

Table 2  Coding categories of responsive teaching practices

Substantive probe of student idea
  Definition: Teacher asks the student speaker to elaborate on his or her thinking; often a request for explanation or justification
  Examples: “What does your x represent?” “Why did you choose that approach?”

Invitation for student comment
  Definition: Teacher asks one or more students to comment on another student’s idea
  Examples: “Can someone restate Nate’s idea?” “Who wants to respond to Nora?”

Teacher uptake of student idea
  Definition: Teacher substantively pursues a student’s idea
  Examples: “Does that work if x is negative?” “So Jill asked about the slope. Can we use that idea to look at it graphically?”

All three practices build on the notion of responsive teaching as situating student thinking at the center of the intellectual work of the classroom. First, a range of research highlights the need for teachers to not just elicit students’ ideas and strategies but to probe and press on these ideas (Franke et al. 2009; Stein et al. 2008) in order to advance student learning. In line with Pierson (2008), we use the term substantive probe to capture this practice in which teachers call for students to elaborate their thinking or to explain the reasoning behind a strategy or idea. Second, research focuses on the value of using student ideas to move the content of a lesson forward productively. We understand this to happen in two ways: (a) by having students comment on each other’s ideas, what we refer to as invitation for student comment, and (b) by having the teacher substantively address a student’s idea for the class, what we refer to as teacher uptake. Asking students to comment and build on each other’s thinking (Sherin et al. 2000) continues to place student ideas at the center of the classroom work. Similarly, teacher uptake reflects the idea of the teacher “taking up” a student’s idea as a lesson proceeds. In this case, rather than ask students to respond to their peer’s ideas, the teacher offers a substantive comment or critique. For example, the teacher may explicitly challenge or counter a student’s claim (Pierson 2008). In other cases, the teacher may use a student’s comment as an opportunity to shift the activity in which the class is engaged and/or to shift the focus of the class’s attention (Lineback 2015).

With this understanding of responsive teaching in mind, analysis proceeded through several stages. First, we identified those portions of the four lessons in which the class was engaged in whole-class discussion. We focused on whole-class discussion because we suspected that, if responsive teaching practices were being used, they would be evident during those portions of instruction. Furthermore, previous case study research on teacher growth has shown that using responsive practices in whole-class discussions is likely to be incorporated in the final stages of growth (Steinberg et al. 2004; Wood et al. 1991). Preliminary analysis has found similar use of these practices in small-group interactions as well. In the second stage of analysis, one researcher viewed the whole-class discussions from all four lessons and coded each teacher turn as responsive, neutral, or not responsive. Teacher turns were coded as not responsive when the teacher put her own ideas at the center of discussion rather than the students’ ideas. In other words, these were instances in which the teacher’s own thinking about the content was the focus of the class discussion. Teacher turns were coded as neutral when they either did not address a student idea or did so in a way that did not offer a substantive reaction; such turns included “Okay” and “Interesting.” Teacher turns coded as responsive were categorized into one of the three types of responsive teaching practices described above. This analysis allowed us to keep track of the extent to which a teacher’s practice was responsive or not responsive, as well as the extent to which the teachers applied different responsive teaching practices.

5.3.2 Instructional reasoning about interpretations of student thinking

Our purpose in identifying instructional reasoning about interpretations of student thinking was to investigate dimensions of teacher thinking related to student thinking that go beyond interpretations of student thinking. Therefore, in the analysis we did not attempt to characterize the different types of interpretations teachers make about student thinking. Instead, we set out to identify different ways that teachers reason about their interpretations of student thinking. The instructional reasoning we identified was guided by previous work that looked at teachers’ sense-making strategies, as well as research on teacher knowledge around student thinking and accounts of responsive teaching. For example, we looked for teachers making different types of comparisons and generalizations concerning their interpretations of students’ thinking (Colestock and Sherin 2009), considering relationships between curriculum and students’ ideas (Ball et al. 2008), anticipating student thinking (Stein et al. 2008), and considering the order in which to discuss students’ ideas in class (Stein et al. 2008).

The analysis was completed in two main stages with teachers’ discussions of captured and non-captured moments in the interviews serving as the units of analysis. The goal of the first stage of analysis was to identify those discussions that might exhibit instructional reasoning about interpretations of student thinking. To do so, we first removed those discussions in which the teachers did not mention students from further analysis. Next we examined the remaining discussions and excluded those in which the teachers did not make interpretations of student thinking. We defined interpretations as statements in which the teachers went beyond simply describing what a student did or said. The discussions that met these criteria (see Table 3) were included in further analysis.

Table 3  Discussions with interpretations of student thinking across observations

                             Mary                           Rachel                                      Total
Observation number            1   2   3   4   5   6   7     1   2   3   4   5   6   7   8   9  10
Total moments discussed      15  17  13   5   8   4   2     7   3   2   3   2   1   4   3   4   2       95
Students mentioned           15  16  13   5   7   4   2     7   3   2   3   2   1   4   3   4   2       93
Interpretation of thinking   15  13  13   5   7   3   1     7   3   2   3   2   1   4   3   3   2       87

In the second stage, an open coding of the remaining 87 moments was completed to identify similar instructional reasoning teachers used about interpretations of student thinking. The open coding resulted in a list of several types of instructional reasoning and their defining characteristics, which we then coded for in the final phase of analysis. Because teachers’ thinking could be quite complex during a single moment, we allowed a single moment to be coded for multiple types of instructional reasoning. In this paper we discuss three types of instructional reasoning (see Table 4), focusing on the three for which we found evidence of use by both teachers; an additional type of reasoning, developing tests for how student thinking is influenced by classroom context, is discussed in Dyer (2015).

Table 4  Description and indicators of types of instructional reasoning

Connecting specific moments
  Description: Teacher connects interpretations of student thinking from multiple specific moments in time
  Indicators: Teacher connects multiple interpretations of student thinking, either in terms of similarity or contrast. Teacher references specific moments with students in the interpretations; a specific moment includes describing an event or action that happened, or reproducing what a student said or wrote

Relating to task structure
  Description: Teacher considers interpretations of student thinking in light of the way a mathematical task was structured, either in terms of the numbers, wording, or representations used
  Indicators: Teacher interprets the mathematical aspects of the task or of the context, including the wording of a task, the numbers used, or the representations of mathematics. Teacher makes a connection between the task and the student thinking in terms of the mathematics or the task context

Developing tests
  Description: Teacher develops a context that could be used to test preliminary interpretations of student thinking
  Indicators: Teacher identifies a new context (or test) that will verify her interpretation of student thinking through experimentation. These contexts could be an aspect of a current mathematical task (perhaps overlooked by students) or a new mathematical task, and are explicitly related to her interpretation of student thinking. The testing may be completed by the teacher in the lesson, or planned for a future lesson

6 Findings

In this section we report the findings from our analysis, first examining the responsive teaching practices of the two teachers, and then showing the types of instructional reasoning about interpretations of student thinking used by the teachers.

6.1 Responsive teaching practices

Coding for responsive teaching practices revealed that both teachers engaged in a high degree of responsive teaching (Tables 5, 6). Across all four lessons, turns that reflected responsive teaching practices comprised over 60 % of the teachers’ turns. Furthermore, the majority of the remaining turns were neutral. The number of turns indicating non-responsive practices was quite small, at most 7 % of a teacher’s total turns. This suggests that both teachers did in fact engage in primarily responsive teaching practices in a fairly consistent manner across a lesson. Moreover, the fact that these lessons reflect teaching separated by a number of school years indicates some stability in these teachers’ responsive practices. However, the manner in which each teacher’s practices are responsive is somewhat different.

Table 5  Extent of teacher responsiveness

                                         Mary                         Rachel
                                         2008–2009    2013–2014       2006–2007    2013–2014
Minutes of whole-class discussion        55 min       54 min          26 min       25 min
Total teacher turns                      204          156             64           59
Teacher turns per minute of discussion   3.7          2.9             2.5          2.4
Types of turns
  Not responsive turns                   5 (2 %)      8 (5 %)         0 (0 %)      4 (7 %)
  Neutral turns                          75 (35 %)    54 (35 %)       14 (23 %)    16 (27 %)
  Responsive turns                       124 (61 %)   94 (60 %)       50 (78 %)    39 (66 %)

Table 6  Frequency of responsive turns

                                     Mary                         Rachel
                                     2008–2009    2013–2014       2006–2007    2013–2014
Number of responsive turns           124          94              50           39
Substantive probe of student idea    38 (31 %)    33 (35 %)       18 (36 %)    11 (28 %)
Invitation for student comment       28 (23 %)    16 (17 %)       14 (28 %)    8 (20 %)
Teacher uptake of student idea       58 (47 %)    45 (48 %)       18 (36 %)    20 (51 %)

In Mary’s classroom, teacher talk is a central feature of whole-class discussion. She also has a fairly high percentage of neutral turns, many of which serve to let students know that she has heard them without providing substantive feedback. In addition, many of Mary’s neutral turns are initial requests for students to share their ideas. In terms of her responsiveness, Mary engages in all three practices that were examined, though teacher uptake of student ideas is the most common of her responsive turns across both lessons.

Rachel’s classes generally spent less time overall in whole-class discussion; much of class time was instead spent in group work. Furthermore, there is somewhat less frequent teacher talk in Rachel’s discussions than in Mary’s. In between the teacher’s turns, students often talked to each other without intervening turns from the teacher. Thus, Rachel had fewer neutral turns and a higher percentage of responsive turns. Rachel appears to use all three types of responsive teaching practices; in her early lesson she relies equally on substantive probes of student ideas and teacher uptake of student ideas, with only a few invitations for student comment. In the 2013–2014 lesson, we see her primary responsive practice to be uptake of student ideas. Because of the small number of lessons analyzed, it is unclear whether this change represents a true shift in her practice over time.

6.2 Instructional reasoning about interpretations of student thinking

Analysis of the teacher interviews revealed a variety of instructional reasoning about student thinking, three of which we discuss in this paper: (a) connecting specific moments of student thinking, (b) relating student thinking to the structure of a given task, and (c) developing tests of student thinking. This instructional reasoning is not only used in conjunction with interpretations of student thinking, but also includes reasoning that extends beyond making those interpretations. Both teachers applied all three types of instructional reasoning at least once, with connecting specific moments of student thinking occurring in almost every lesson. The other types of reasoning were used less frequently (Table 7).

Table 7  Distribution of instructional reasoning use across lessons

                              Mary                           Rachel                                      Total
Observation number             1   2   3   4   5   6   7     1   2   3   4   5   6   7   8   9  10
Connecting specific moments    2   3   2   1   3   1   0     2   1   2   1   2   1   2   2   4   1       30
Relating to task structure     0   0   0   0   2   0   0     2   1   1   0   0   0   1   2   0   0        9
Developing tests               0   0   1   0   0   0   0     0   0   0   0   0   1   0   1   1   0        4

In the following sections we illustrate each type of instructional reasoning with an example and discuss how the reasoning relates to prior research.

6.2.1 Connecting specific moments of student thinking

By far the instructional reasoning used most frequently by the teachers was to connect their interpretations of student thinking across different points in time. The first feature of the use of this instructional reasoning is that teachers interpret two or more student ideas in relation to one another, either because the ideas are similar or different. The second important feature of this reasoning is that teachers reference specific moments or events that happened. That is, rather than mention a student’s idea at a general point in time (“Jenna seemed to know that cos(90) is zero”), teachers relate a student’s ideas to a particular moment in time (“When Jenna went up to the board yesterday, she knew that cos(90) is zero.”).

Consider the following example in which Mary noted different students’ use of notation to represent the antiderivative in her calculus class. Students were working on a problem in which they were asked to evaluate \(\int_{3}^{7} 2x\,dx\). Traditionally, F(x) is used to represent the antiderivative, which in this case is \(x^{2}\). Evaluating this antiderivative at the limits of integration and taking the difference yields \(7^{2} - 3^{2} = 40\) as the answer.
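
For reference, a minimal write-up of this evaluation via the Fundamental Theorem of Calculus (our restatement of the computation described above, not text from the lesson) is:
$$\int_{3}^{7} 2x\,dx = F(7) - F(3) = 7^{2} - 3^{2} = 49 - 9 = 40, \qquad \text{where } F(x) = x^{2}.$$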

In class that day, Mary captured a moment in which she was talking to a student, Jonah, about the notation he had written down on his paper. Mary described her interaction with Jonah, saying, “[Instead of] big F(x), [he] had like this stretched out F(x) squared from a to b…from 3 to 7. And I was [thinking], ‘What was going on there?’” She explained that “[Jonah] wasn’t distinguishing…big F(x) as notation for whatever your antiderivative is…He was…putting in…another made-up notation that he thought he needed.”

Mary then connected Jonah’s notation to the work of other students. She continued, “this came up somewhere else,” and proceeded to describe other ways that students had represented the antiderivative in class that day. As she discussed what she had seen, Mary reproduced the students’ notation on the board:
$$\int_{3}^{7} 2x\,dx \qquad F(x) = x^{2}$$
$$\begin{aligned} & F(b) - F(a) \\ & F(7) - F(3) \end{aligned}$$

Mary emphasized that, to her, the students were “kind of overdoing it… It wasn’t like it was wrong. It was just a lot of extra stuff… Just do \(7^{2} - 3^{2}\)!”

In this example, Mary references several moments that she observed in class that day when she claims that students used notation in ways that she considered to be student-invented (as in the first instance) or unnecessary (as in the second instance), but not wrong. She frames these moments as similar because they illustrate ways in which students were not using the notation of “big F(x)” effectively, with Jonah inventing additional notation and other students using the “big F notation” excessively. Furthermore, Mary considers how notation, something that is often considered to simply be a procedural aspect of mathematics, provides her with an indicator of the way that students have connected (or not connected) the antiderivative to definite integrals.

Both teachers used this type of instructional reasoning regularly, during 30 different moments in 16 of the 17 lessons (see Table 7). This reasoning can involve making connections among the ideas of two or more students, as shown in this example, or between the ideas of an individual student at different points in time. In most cases, the teachers connected specific moments from within the same lesson, but teachers also connected moments across different lessons. Additionally, in some instances teachers referenced more than two moments, including one instance that connected five distinct moments together. We believe that this instructional reasoning illustrates a level of complexity that exists in teachers’ thinking about students’ ideas. Not only do teachers hold multiple claims about student thinking in mind, but each of these claims corresponds to specific events that took place during instruction.

To be clear, the idea that teachers connect or make comparisons among the thinking of their students during instruction is not particularly new (e.g. Stein et al. 2008; Wood et al. 1991). Our data suggest that teachers may compare students’ ideas even when they are not making these comparisons explicit for the class, and may do so rather frequently. Furthermore, in the case of Mary and Rachel, doing so represents what they believe to be an important aspect of their teaching, important enough for them to want to “reflect on these events later.”

We also find it notable that teachers referenced specific moments or events in their comparisons. Teachers’ recollections of students’ ideas are closely tied to the moments in which these ideas appeared in class. That is, Mary and Rachel often discussed their students’ thinking and reasoning situated in the context in which that thinking appeared. Furthermore, when discussing the ideas that students raised in class, the teachers reflected on these ideas as a collection of moments, rather than just a collection of interpretations. We believe this distinction is worth noting. Responsive teaching may involve remembering the moment that underlies the interpretations that teachers make about student thinking, and being able to easily reference these specific moments when connecting student thinking.

6.2.2 Relating student thinking to task structure

A second type of instructional reasoning involved connecting a student’s ideas to the structure of the mathematical task that the student was exploring. This connection generally focused on the ways in which specific features of the task appeared to influence the student’s work on the problem. Sometimes the teachers focused on the relationship between the representations given in a problem and a student’s solution approach. In other cases, the teachers highlighted that the particular values students were given in a problem might have prompted a particular kind of reasoning about the task. What is key is that the teacher relates the mathematical thinking on the part of the student to some aspect of the mathematics as it is represented in the task.

One example of this instructional reasoning occurred when Rachel considered how the values of the polar coordinates she selected for students to graph related to the approach students used to find the x and y coordinates for the points. Students were asked to graph the point (4, 60°) and to find a way to convert (4, 60°) to Cartesian coordinates. In a previous lesson with different students, the same question was asked, but with the coordinate pair (2, 30°). The polar coordinate (2, 30°) refers to the point that is a distance of 2 units from the origin at an angle of 30° from the x-axis (see Fig. 1). A right triangle drawn using this coordinate has side lengths equal to the x-coordinate and the y-coordinate of the point.

Fig. 1  Graphs of (2, 30°) and (4, 60°)
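
For reference, the standard polar-to-Cartesian conversion \(x = r\cos\theta\), \(y = r\sin\theta\) (our addition, summarizing the computation at stake in the task) gives:
$$(2, 30^{\circ}) \rightarrow (2\cos 30^{\circ},\ 2\sin 30^{\circ}) \approx (1.73,\ 1), \qquad (4, 60^{\circ}) \rightarrow (4\cos 60^{\circ},\ 4\sin 60^{\circ}) = (2,\ 2\sqrt{3}) \approx (2,\ 3.46).$$
With the 30° angle, the x-coordinate (≈1.73) is close to the radius of 2, which is consistent with Rachel’s account of why students conflated the two lengths; with the 60° angle, the x-coordinate (2) is clearly shorter than the radius of 4.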

Rachel connects the way students were thinking to the numbers used in the problem in several ways. First, Rachel explained that during the previous lesson students thought “that this length [the hypotenuse] was 2 and basically that that [the x-coordinate] was 2, but not really realizing that that’s what they were doing because the angle was so small.” In this instance, Rachel claims that the students did not find the similar lengths problematic, and attributes that thinking to the small angle used, 30°. As a result of this conclusion, Rachel changed the point to (4, 60°). Rachel explicitly connects the change in numbers between the two lessons to how she hoped student thinking would change, explaining, “I wanted it to make it more clear that those two [lengths] weren’t the same…by making that angle bigger.” During the interview, Rachel reflects that this change was not completely effective because a student still made the same mistake: “he still thought [the two lengths were] the same, so obviously it was still an issue.” However, she also mentions a student who was able to notice a difference, saying “one student over here actually had drawn that really well so he actually drew the traced out part so he was clear…[thinking] this is definitely shorter because it started out over here so he could see that [difference].” In this case, Rachel explains how the clearer diagram allowed this student to see a qualitative difference in the lengths, which would have been facilitated by the increased angle. Each of these instances highlights the link Rachel proposes between the size of the angle and the similarity of the lengths of the two sides of the triangle.

Interestingly, a common element among several of the relating to task structure moments was that the student thinking was unexpected. Specifically, in several of the moments, the teachers commented that the student thinking was new or surprising to them. This association suggests that one way teachers attempt to make sense of unanticipated student thinking may be by looking for influences within the tasks themselves. Furthermore, this correspondence could indicate that this instructional reasoning would be found more often in teachers with less experience listening to and interpreting student thinking. If that is the case, it might explain why this reasoning was used relatively infrequently (see Table 7), as compared to the connecting specific moments reasoning, given that Mary and Rachel had been engaged in responsive teaching for several years.

This type of instructional reasoning has much in common with conceptions and assessments of teacher content knowledge that look at how teachers make sense of and enact tasks or problems that are given to students (Charalambous and Hill 2012; Wilhelm 2014). This literature suggests that an important component of teacher knowledge is an understanding of how to select problems and tasks that are appropriate for students, promote specific learning goals, and provide information to the teacher about how students are thinking. We believe that teachers’ use of this instructional reasoning provides evidence of precisely this kind of thinking on the part of Mary and Rachel—moments when they wrestled with the relationship between student thinking and mathematical tasks while in the midst of teaching. In particular, we find it notable that this reasoning is used in the moment by teachers, not just when they are engaged in instructional planning or selection and modification of tasks. Moreover, we suspect that this type of teacher knowledge may develop through practice in the case of teachers who are responsive to student thinking. Because this instructional reasoning was often used to make sense of student thinking that was new or surprising to the teacher, this reasoning could prompt teachers to look closely at the relationship between the details of students’ ideas and the details of tasks. In fact, this reasoning may be particularly important for teachers beginning to develop responsive teaching practices by helping them isolate key aspects of tasks that are consequential for student thinking.

6.2.3 Developing tests of student thinking

The final type of instructional reasoning we discuss is one in which teachers develop contexts or situations to test their interpretations of student thinking during instruction. Specifically, teachers make an interpretation of student thinking and develop a way to test the accuracy of that interpretation. Such tests could involve asking follow-up questions about a student’s idea or proposing a new situation for students to consider. Teachers’ purpose in using this instructional reasoning is to become more confident about their interpretation of student thinking, particularly with respect to the contexts in which a student’s thinking is likely to be used. This could include tests of the situations in which students use similar thinking, or of the points at which students start to use alternative ways of thinking. This reasoning differs from instances in which a teacher simply probes student thinking (i.e., asking students to elaborate their thinking) because the teacher attempts to either prove or disprove an existing hypothesis by creating a “test” in the moment of instruction rather than simply gathering more details about an existing interpretation.

We also want to emphasize that those discussions coded as developing tests were not moments in which teachers stated that they were trying to shift students’ thinking, or were developing a different problem or context that might cause a student to think differently. Thus we did not include instances in which teachers experimented with ways to prompt students’ thinking to change by asking particular questions or providing a new problem context. Our focus here is on teachers identifying or developing tests to understand student thinking rather than tests of how to change student thinking.

An example of developing tests of student thinking comes from one of Rachel’s lessons in which students were using symmetry in a graph to find multiple solutions to equations with trigonometric functions. Students were asked to find the solutions to the equation \(-15 = 20\cos(30x)\) between −6 and 18 with angles expressed in degrees. One solution can be determined using the inverse cosine function on a calculator (\(x \approx 4.62\)). Students were also given a graph of the function \(f(x) = 20\cos(30x)\) (Fig. 2). The graph shows three additional points at which the value of the function is −15. The remaining solutions can be found using symmetry around the maximum and minimum values of the graph. For instance, symmetry around the minimum at x = 6 can be used to find the solution of \(x \approx 7.38\) and symmetry around the y-axis can be used to find the solution of \(x \approx - 4.62\).

Fig. 2  Graph of \(f(x) = 20\cos(30x)\)
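
As a point of reference, the arithmetic behind the solutions mentioned above (our reconstruction, assuming the calculator is set to degrees) is:
$$\cos(30x) = -\tfrac{15}{20} = -0.75 \;\Rightarrow\; 30x \approx 138.59^{\circ} \;\Rightarrow\; x \approx 4.62,$$
$$x \approx 6 + (6 - 4.62) = 7.38 \;\; \text{(symmetry about the minimum at } x = 6\text{)}, \qquad x \approx -4.62 \;\; \text{(symmetry about the } y\text{-axis)}.$$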

Rachel recalled that a student, Vicki, previously had trouble with the process of using graphical symmetry to find solutions. However, in class that day, Rachel noticed that Vicki and her partner were able to use symmetry in the given problem to find the additional solutions when certain reference line segments were drawn on the graph (see Fig. 3):

Fig. 3  Cosine graph with initial reference segments drawn

I just happened to draw something on the graph and they like both [saw the symmetry]… I said something about well how long is this side [b] and they knew because it was the same as this [4.62], and then I said okay if we want to get here [B]… they saw that if I did that same rectangle [I] with this same length as that one [II] then I got to the place I wanted.

In this description, Rachel suggests that the students saw the symmetry of the graph using rectangles, and as a result were able to find an additional solution.

Rachel then tests her hypothesis that students could use rectangles to examine symmetry in the graph. Specifically, she tests whether students can use similar rectangles to see the symmetry around x = 6. To do so, Rachel drew another set of rectangles on the graph about the line x = 6 (Fig. 4). She describes the uncertainty she had about how students would think, explaining she thought, “Oh maybe they’ll [see the symmetry] this other way.” She then describes the result of her test, saying, “They didn’t get that symmetry [around x = 6], but they did get the other one [around the y-axis].” In this moment, Rachel was testing the boundaries of when students would use a certain way of thinking (i.e., under which contexts students could use similar rectangles to reason about the symmetry). The test she developed consisted of a second set of rectangles that used a slightly different symmetry than the first set, and it seemed to be developed directly from Rachel’s interpretation of how students were seeing the symmetry in the first set of rectangles. From this test Rachel learned something additional about the way these students were thinking, namely that they are able to see the first type of symmetry, around the y-axis, but not the second type, around x = 6. This new assessment of the students led Rachel to suggest “let’s stick with the first one [symmetry about the y-axis]” when working with these students in the future.

Fig. 4  Cosine graph with tested reference segments drawn

This instructional reasoning of developing a test was used relatively infrequently compared to the other types of reasoning; across the two teachers, only four moments were identified in which this reasoning was used. The rarity of this reasoning might be a consequence of the teachers typically feeling confident in their assessment of how students are thinking. Alternatively, because this reasoning seems to take additional instructional time, it might be used only when the teacher feels that the information would be consequential for her instructional decisions. Additionally, these teachers have used responsive teaching practices for many years. It seems likely that they have become quicker and more effective at diagnosing student thinking, which would lead to fewer times when they saw the benefit of developing specific tests of student thinking. Finally, these teachers may consider developing tests of student thinking to be routine or not notable. Because the teachers were asked to save moments they wanted to reflect on or think about later, moments that were not notable may not have been included among those they saved.

We believe that this type of instructional reasoning highlights an important role of the teacher in responsive teaching. In order to help students construct knowledge, it can be helpful for the teacher to thoroughly diagnose how students are reasoning through testing. This role may be particularly important for teachers who are becoming more responsive to student thinking because they are just beginning to unpack how students are thinking in their classrooms. In fact, developing tests of student thinking may be a key tool for uncovering the nuances that hide behind the surface-level student thinking most teachers gain access to. Finally, it should be noted that the instances of teachers developing tests of student thinking would have been difficult to identify observationally. Without the POV observation methodology, it often would have been unclear whether teachers used new contexts or situations to test how a student was thinking rather than to extend or change that thinking, or whether the tests were developed in the moment.

7 Discussion and conclusion

In the preceding sections we presented our analysis showing that the two teachers used responsive teaching practices that were stable over time. We also identified three common types of instructional reasoning about interpretations of student thinking used by the teachers: making connections between specific moments of student thinking, considering the relationship between the mathematics of student thinking and the structure of a mathematical task, and developing tests of student thinking. To be clear, our goal was not to produce a comprehensive list of all of the types of instructional reasoning that teachers use to make sense of student thinking. Rather, in presenting these three types of reasoning we take a valuable step towards characterizing some of the different ways that teachers who use responsive practices make sense of student thinking in the moment. Additionally, the frequencies with which these types of reasoning were identified likely underestimate their use; studies that ask teachers to capture moments whenever they use a particular type of reasoning would capture their frequency of use more accurately.

7.1 Connection between instructional reasoning and responsive teaching practices

In addition to characterizing some of the ways that responsive teachers reason about their interpretations of student thinking, here we make an additional claim. Specifically, we propose that the three types of instructional reasoning introduced here likely support teachers’ use of responsive teaching practices. That is, we did not simply identify reasoning that is part of the pedagogical repertoire of teachers who typically engage in responsive practices. Rather, we hypothesize that these types of instructional reasoning strategies actually enable and promote the use of responsive teaching practices. To elaborate this claim, in the paragraphs that follow, we discuss potential responsive teaching moves that could be supported by each type of instructional reasoning. One avenue for future research could be to investigate this claim empirically by examining the types of instructional reasoning used during specific responsive teaching practices, including the categories of responsive teaching practices discussed in Sect. 5.3.1.

The connecting specific moments reasoning is likely to be crucial for helping teachers to collectively advance students’ thinking in the context of instruction. One challenge with responsive teaching is that teachers are expected to unpack the substance of individual students’ thinking for all students in a class. To do so, teachers need to develop efficient ways of attending to both the breadth and substantive details of student thinking (Steinberg et al. 2004). By connecting specific moments together, teachers can find the similarities and differences in how students are thinking and plan whole-class discussions or even new tasks that directly address and build on multiple students’ thinking. Moreover, by referencing specific student actions or comments, teachers still have a link to the substance of student thinking, which could likely guide how teachers facilitate discussions or structure tasks.

Similarly, the relating to task structure reasoning supports responsive teaching practices. When tasks are designed in ways that allow students to actively build upon previous ideas, tasks can take some of the burden away from the teacher to guide students through this process. By looking at the specific features of a task rather than something more general (e.g. the “challenge” of a task), teachers are able to see how tasks influence students to reason in particular ways. Teachers can then strategically choose or structure tasks to help students develop particular mathematical understandings (Tyminski et al. 2014).

We find that developing tests is perhaps one of the types of instructional reasoning most obviously related to the needs of responsive teaching because it enables teachers to gain more accurate and nuanced understandings of students’ reasoning and thinking. This access allows teachers’ responses to be more targeted and aligned to how students are actually thinking. Additionally, the development of tests encourages teachers to develop hypotheses about the way that the new question or situation is related to the particular way that students are reasoning.

7.2 Implications for research and teacher education

The three types of instructional reasoning about interpretations of student thinking that we identified also have important implications for research on teacher cognition. The connections that teachers made among specific moments are an interesting contrast to the literature on teacher knowledge. While the research base on teacher knowledge tends to focus on the general understandings teachers have related to common patterns of student thinking (Ball et al. 2008), the connecting specific moments reasoning highlights the importance for teachers of the specifics of particular moments or contexts. While our methodology certainly makes it likely that teachers will comment on the particular moments that were captured, we found the frequency with which teachers mentioned additional specific moments in relation to the captured moment particularly striking. This type of instructional reasoning suggests that specific evidence or experiences are important sources of knowledge for teachers in addition to more general knowledge.

Second, we found it interesting that teachers were considering the relationship between task structure and student thinking in the midst of teaching. This in-the-moment thinking is not commonly referenced in the literature about the mathematical tasks that teachers use. More frequently, researchers have treated teachers’ thinking about tasks and their structure as something that happens while teachers are planning or reflecting on instruction, not something that happens in the moment (Stein et al. 1996).

Third, we suggest that developing tests of student thinking may point to another way for teachers to investigate the details of student thinking beyond the work of probing students’ thinking that is often discussed in the literature (Franke et al. 2009; Stein et al. 2008). In both testing and probing students’ thinking, teachers must identify ambiguous student statements or written work that are critical to making a hypothesis about student thinking. When testing student thinking, teachers also need to consider new situations or contexts in which that ambiguity can be tested. In contrast, probing student thinking often involves simply asking students for clarification or further explanation. We believe this skill of developing tests, questions, or new situations on-the-fly is very difficult for teachers to master and requires a considerable level of subject matter expertise. In particular, we note similarities between this instructional reasoning and the practice of cognitive clinical interviews (Ginsburg 1997), which requires considerable expertise. Research that attempts to better understand this particular skill of developing tests in response to student thinking would be a great advance in our field. Additionally, research could work to uncover teachers’ general or specific knowledge of tests or contexts related to student thinking.

These results also have implications for teacher education and professional development. The types of instructional reasoning represent cognitive aspects of responsive teaching that may be important skills to target in addition to more observable teaching practices (e.g. probing student thinking) during professional development focused on responsive teaching. Given the usefulness of video in supporting change in teacher noticing (van Es and Sherin 2010), video analysis could be a way to develop this noticing-related instructional reasoning. Additionally, it is possible that these types of instructional reasoning not only help teachers enact responsive teaching practices, but may also support the development of responsive teaching practices. For example, considering the relationship between the structure of mathematical tasks and student thinking may lead teachers to identify which tasks support productive student thinking or build on existing student reasoning. This understanding of tasks could gradually lead teachers to select and sequence tasks in a way that is responsive to student thinking and its development. Further research could investigate whether these types of instructional reasoning act as mechanisms that support teacher improvement by studying teachers undergoing the process of growth toward responsive teaching.

Acknowledgments

This material is based upon work supported by the Arthur Vining Davis Foundations, the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-0824162, and the NAEd/Spencer Dissertation Fellowship Program.

Copyright information

© FIZ Karlsruhe 2015

Authors and Affiliations

  1. Northwestern University, Evanston, USA
