A Theoretical Approach to Ensuring Instructional and Curricular Coherence in the Flipped Classroom Model of a Differential Equations Course

  • Jenna Tague
  • Jennifer Czocher

Abstract

Over the past two decades, the flipped classroom model has gained traction in post-secondary educational settings. However, preliminary studies indicate that students perceive a disconnect between out-of-class components and in-class components (Bowers and Zazkis 2012), which may be amplified by the flipped classroom model. The purpose of this paper was to present students’ perceptions of instructional and curricular coherence in a flipped version of a Differential Equations course for 80 undergraduate engineering students. The course was designed to address cognitive obstacles (Herscovics 1989) and to reduce the perceived disconnect between in-class and out-of-class activities. Students’ perceptions of the course suggested that our model for flipped classroom design did circumvent many of the instructional problems reported in prior studies of flipped classrooms.

Keywords

Flipped classroom model · Cognition · Post-secondary education

Introduction

During the 1990s, instructors began to use video, presentation software, and Internet capabilities to teach with a flipped classroom model (then called “inverted classroom”). The philosophy was that students would go through lecture materials at home (e.g., watch a video) and class time would be used for what was traditionally at-home work (e.g., solving problems) (Baker 2000). In this way, students would be ready to engage with the most difficult parts of the content during class time while the instructor was present. Unsurprisingly, as inquiry-based and problem-solving instructional approaches have begun to take hold, flipped classroom models have gained traction in post-secondary mathematics courses.

However, preliminary reports on flipped classrooms are troubling. Students perceive a disconnect between out-of-class components and in-class components (Bowers and Zazkis 2012), which may be amplified by the flipped classroom model. For example, the structure of the classroom may be ineffective in orienting students to the learning tasks in the course (Strayer 2007). Specifically, in-class activities may fail to address student misconceptions (either from previous courses or generated by out-of-class activities), out-of-class activities may require only low-level recall, or course materials may not be conceptually coherent (Andrews et al. 2011).

Taken together, this growing body of research into the efficacy of flipped classroom models suggests that there is a need to create a theoretical understructure to support the implementation of the model. Thus, we take the stance that flipped classroom models could be leveraged to help students coordinate mathematical knowledge within the course and we focus our theoretical efforts on the idea of promoting coherence. We propose two types of coherence that impact flipped classroom models: instructional coherence (cohesion and coordination among instructional materials) and curricular coherence (the extent to which mathematics content is logically, cognitively, and epistemologically sequenced).

The goals of this paper are a first step toward aligning course structure with a cognitive view of course content. Here, we elaborate on a synthesis of theories of mathematical thinking that allowed us to articulate a particular flipped classroom model. Specifically, we designed the out-of-class materials and mathematical content with theoretical grounding in the constructs cognitive obstacles (Herscovics 1989) and conceptual analysis (Thompson 2008).

We used these constructs to develop theoretical grounding for designing a flipped classroom model that would address needs reported in the literature. We then applied our newly developed theoretical grounding in a differential equations course. In this paper, we focus on the out-of-class modules and how they related to the content during in-class instruction, rather than the pedagogy. The scope of this paper is a report of students’ perceptions of the instructional coherence of our modified differential equations course using a flipped classroom model. Specifically, we addressed the following research questions:
  • What are students’ perceptions of instructional coherence prior to the start of and throughout a differential equations course that uses a flipped classroom model?

  • Is it promising to use cognitive obstacles as a theoretical grounding to reduce instructional incoherence?

This research differs from other studies of flipped classrooms because we report on whether our theoretical grounding to design a flipped classroom mitigates shortcomings reported in other flipped classroom models, rather than focusing on measuring effectiveness of the flipped model itself or studying student satisfaction. We argue that such an approach does not do the same damage to instructional coherence as ad hoc models of the flipped classroom. The main contributions are a theoretically-based model for designing flipped classroom instruction and extension of conceptual analysis techniques (Thompson 2002, 2008) to the differential equations domain and to blended learning environments. As such, this study contributes to the growing literature on best practices for designing blended instruction.

Existing Flipped Classroom Models

There are multiple models of the flipped classroom (Tucker 2012) indicating that a theory of implementation of the flipped classroom is currently in flux. Some have reported using a combination of video recordings, educational videos, and comprehension or readiness quizzes (Bowers and Zazkis 2012) for students to interact with as at-home work.

Some of the flipped classroom literature has demonstrated positive effects on students’ perceptions of the flipped versions of courses. In the first documented flipped classroom study, Lage et al. (2000) flipped an economics course by posting videos and PowerPoint slides for students to view before class. The in-class time was spent answering questions, followed by a 10-minute mini-lecture, and then an experiment or lab. The students reported that they felt positive about the course; in particular, they enjoyed the in-class group work and felt more comfortable asking questions than in a traditional classroom setting (Lage et al. 2000).

Baker (2000) coined the term “flipped classroom” and flipped two courses, both of which were upper level undergraduate graphic design courses (Graphic Design for Interactive Multimedia and Communication in the Information Age). He used online discussion boards and lecture notes as the out-of-class activities. He carried out an action research project where he noted the flipped model increased interactions between students and also between students and him. The students felt they had more control over what they learned and when they learned it and reported they experienced more critical thinking in the course.

More recently, Love et al. (2014) carried out a comparison study of a flipped version of an applied linear algebra course versus a traditional version of the same course. Their out-of-class materials included screencasts, readings from the textbook, or readings from the instructor’s notes. They used exam grades as a comparison of student understanding and used an end-of-course survey to measure students’ perceptions of the course. The flipped classroom students performed better on the three exams during the semester, but similarly to the traditional classroom students on the final exam. The flipped classroom students had a more positive perception of the course and reported greater willingness to talk to classmates about mathematics than the traditional classroom students did. Additionally, advocates of the flipped classroom model argue that it allows for more individualized instruction and for students to set their own educational pace (Baker 2000).

Other flipped classroom research has demonstrated that the flipped classroom can cause a disconnect between the out-of-class materials and in-class materials. Bowers and Zazkis (2012) studied a flipped version of calculus. The students were asked to watch 10-minute videos before class and then come to class to work on problems. In the study, the instructor did not interact much with the students during class, but teaching assistants worked with the students. The students, although reporting enthusiasm about the course, did not score well on the final exam, and the researchers interpreted this result as a disconnect between the out-of-class videos and the in-class problems. Strayer (2012) conducted a comparison study between a traditional statistics course and a flipped version of the same course. He used a mixed methods study including quantitative survey methods and qualitative interviews as well as field notes to compare the students’ perceptions of the two learning environments. The out-of-class activities for the flipped version were given through the intelligent tutoring system Assessment and Learning in Knowledge Spaces (ALEKS). Strayer’s (2012) results indicated that the flipped classroom students were more likely to engage in cooperative problem solving during class, but they were also more likely to misunderstand the connections between the materials in the course.

In a review of research on blended learning environments, which included flipped classrooms, Bliuc et al. (2007) found that there is a need for research to provide a structure for designing blended environments and to develop research methods to determine the blended learning constructs that are most useful to student learning. “A strong theme in our reflection on the literature is that research on blended learning needs to pay much more careful attention to issues of integration, and to choose conceptual tools and methods that will help us all arrive at a better working knowledge of how to help students integrate the various learning experiences that come their way” (Bliuc et al. 2007, p. 232). Consequently, the systematic investigation of the impact of flipped classrooms on student learning is still necessary and important.

The mixed results reported above motivated the present study in two ways. First, students tend to react well to the flipped classroom in terms of enthusiasm and likability. Second, the flipped classroom seems to be a source of incoherence in students’ understanding of content. There is such a long history of students’ misconnections and missed connections in mathematics that we should take seriously any potential pitfalls a particular instructional paradigm might introduce. We therefore reasoned that there was a need to create a theory that would inform the design of a flipped classroom model so as to reduce incoherence.

Coherence in the Mathematics Classroom

The National Mathematics Advisory Panel (2008) described a coherent curriculum as one that is “marked by effective, logical progressions from earlier, less sophisticated topics into later, more sophisticated ones” (p. xvii). While the overarching undergraduate mathematics curriculum proceeds in such a fashion, from precalculus to calculus to differential equations or analysis, the logic follows an axiomatic structure. For example, the Fundamental Theorem of Calculus cannot be articulated or proven without first stating and proving the formal definitions of limits, derivatives, and integrals.

This logical structure at times undervalues, or even ignores, conceptual structure and how students’ thinking progresses. When we speak of coherence in the mathematics classroom, it has more components than just the correct axiomatic progression. We agree with Thompson (2008), who stated that “effectiveness might be a consequence of coherence, but it cannot define it” (p. 46), because progression in terms of mathematical sophistication need not be the same as progression in epistemology or in individuals’ cognition. A particular challenge in using flipped classroom models is that they potentially create instructional incoherence. For example, Bowers and Zazkis (2012) noted that in a flipped version of calculus, “asking students to view short, 10-minute videos and then only do problems during class time negates the ability for the instructor to help the students make critical connections” (p. 850).

Thompson (2008) offered conceptual analysis as a way to consider mathematical concepts in terms of their connections to and dependence upon other mathematical concepts, with reference to how those prerequisite concepts must be known in order to be useful in understanding the target concepts. We considered developing an effective model for the flipped classroom as a theoretical problem and used conceptual analysis and the notion of coherence to approach it. We created two constructs, curricular coherence and instructional coherence, in order to approach the problem of incoherence introduced by the structure of the flipped classroom. We define curricular coherence as the logical sequencing of mathematical content. We define instructional coherence as alignment among in-class materials, out-of-class materials, and target content. We then utilized two theoretical tools, conceptual analysis (Thompson 2008) and mathematics-in-use (Czocher et al. 2013) to design out-of-class materials to support in-class instruction in order to reduce the disconnect between the two. Both constructs are described in further detail below and an example of materials design and implementation is also provided below.

Curricular Coherence

Curricular coherence is the extent to which mathematics content is logically, cognitively, and epistemologically sequenced. A typical curricular trajectory (such as pre-calculus, calculus, analysis) is arranged to ensure logical coherence but may introduce conflicting cognitive or epistemological views of mathematics concepts (Raman 2004).

For example, student difficulty with rate of change as related to quantities is documented throughout mathematics learning (e.g., Thompson 2008), but it is a critical concept in differential equations because of its relation to the derivative. To complicate matters, the derivative concept was also identified as being used differently in differential equations than it was expected to be known at the end of calculus due to its dependence on conceptions of rate of change of physical quantities (Czocher et al. 2013). Related mismatches have been reported between how engineering faculty and mathematics faculty teach and use the concept of derivative (Bingolbali et al. 2007). Other lines of inquiry into student conceptualization of rate of change have revealed that students have difficulty isolating rate of change as a quantity of interest over time (Monk 1992), have a poor understanding of covariation and derivative (Thompson 1994), and refer to derivative as the tangent line (rather than slope of the tangent line) despite being able to define the derivative symbolically (Zandieh 1997).

Herscovics (1989) introduced the construct cognitive obstacle to describe a manner of thinking about a mathematical object or structure that is appropriate in one case, but inappropriate in another. Cognitive obstacles may arise from or contribute to incomplete concept images. For example, many times in differential equations, the curve describing the quantity in terms of time is unknown. So thinking about derivative as the slope of a tangent line is unproductive when developing a (differential) equation to describe how a quantity changes over time. In such a case, it is perhaps more productive to think about a derivative as a limit of difference quotients. Here, we consider cognitive obstacles as a symptom of lack of coherence in the curriculum.
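The contrast can be made concrete: when a quantity is known only through tabular data, with no closed-form curve, the limit-of-difference-quotients view yields a procedure where the tangent-line image does not. A minimal sketch, with hypothetical data values of our own (not taken from the article):

```python
# Hypothetical tabular data for a quantity Q measured over time
# (illustrative values only, not the article's data).
times = [0.0, 1.0, 2.0, 3.0]                # e.g., years
amounts = [100.0, 114.0, 129.96, 148.15]    # Q at each time

# With no formula for Q, "slope of the tangent line" gives no procedure,
# but difference quotients estimate dQ/dt directly from the table.
rates = [(amounts[i + 1] - amounts[i]) / (times[i + 1] - times[i])
         for i in range(len(times) - 1)]
print(rates)
```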

Instructional Coherence

We define instructional coherence as the mathematical connections among materials within a course (i.e. the matching of content or important conceptual ideas from out-of-class to in-class). It is important to consider when initiating a flipped classroom paradigm because the purpose and role of various instructional modes, such as lectures, videos, and group work, change. Without explicit attention to the new role each of these instructional materials plays, instructional incoherence can rise markedly.

For most instructors, using a combination of video recordings, quizzes, and educational videos (e.g., TED talks) to move lecture out-of-class has not been challenging. How to spend class time is more problematic. In many flipped classroom settings, instructors lecture during the in-class time too. Other difficulties in implementing a flipped classroom model have been reported, such as: (1) failure to address student misconceptions, (2) overuse of low cognitive-level activities that required only recall of facts, and (3) an emerging disconnect between lecture materials and active-learning in-class components (Andrews et al. 2011). For the first, if student misconceptions are not addressed, students may fail to identify the necessary relationships among essential concepts (NRC 2000), which leads to incoherence from one course to the next. Secondly, continual use of low cognitive-level activities, particularly in mathematics, allows students to achieve at a high level in their courses without ever connecting mathematical content (Sternberg 1996). Lastly, the disconnect between lecture materials and active-learning components has also been identified as a point of instructional incoherence in past implementations of the flipped classroom model (Bowers and Zazkis 2012). In our use of the flipped classroom paradigm, we explicitly focused on exposing and addressing cognitive obstacles in order to surmount the difficulties in connecting in-class and out-of-class learning components reported by Andrews et al. (2011) and Bowers and Zazkis (2012).

The Conceptual Framework

Conceptual Analysis

Thompson (2008) identified three issues he saw in mathematics education: (1) all mathematics education research is carried out to help students learn meaningful mathematics, (2) most students in the U.S. have little contact with core concepts that carry throughout their education (e.g., base ten numeration), and (3) mathematics teaching focuses too much on moving students toward “translucent symbolism” (p. 46) and away from the core ideas of mathematics concepts. Thompson combined these three issues and used them to address coherence in curriculum. He did this through the construction of the conceptual analysis framework, which was a way of, “analyzing the coherence, or fit, of various ways of understanding a body of ideas” (p. 59).

Thompson (2008) suggested that conceptual analysis could be applied for four purposes: (1) examining what students know and understand given a particular context and mathematical concept, (2) deciding when students’ knowledge was helpful for learning a mathematical concept, (3) deciding when students’ knowledge might be damaging for learning a mathematical concept, and (4) analyzing “the coherence…of various ways of understanding a body of ideas” and then arranging the ways of understanding for “mutual compatibility and mutual support” (p. 59). In this present study, we mapped which concepts satisfied (2) and (3) in order to accomplish (4). We then used this map to design out-of-class instructional materials.

We summarize an example used by Thompson (2008) of how to introduce exponential functions because we used a similar analysis to motivate the connections between rate of change of a quantity and exponential growth.

Thompson (2008) provided an example of how introducing exponential functions to students might be done through conceptual analysis. He noted that the most important property of exponential functions was that the rate of change at a particular independent value was proportional to the dependent value of the function at the independent value. That is, given an exponential function f(x) = Ae^(Bx) and a value x_0, then the rate of change at x_0 is ABe^(Bx_0), which is proportional to f(x_0) = Ae^(Bx_0). Although this proportional property is essential to future uses of exponential functions, it is rarely discussed in lessons on exponential functions, in part because it has been noted to be difficult to cultivate in students’ understandings (Thompson 2008; Confrey 1994).
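The proportionality property can be checked numerically. In this sketch (our illustration; the constants A and B are arbitrary), the ratio of the rate of change to the function’s value is always B:

```python
import math

# For f(x) = A*e^(B*x), the rate of change at x0 is A*B*e^(B*x0),
# which equals B * f(x0): proportional to the function's value.
A, B = 3.0, 0.5     # arbitrary illustrative constants

def f(x):
    return A * math.exp(B * x)

def f_prime(x):
    # Exact derivative of f
    return A * B * math.exp(B * x)

for x0 in [0.0, 1.0, 2.5]:
    ratio = f_prime(x0) / f(x0)
    assert abs(ratio - B) < 1e-12   # the ratio is B, independent of x0
```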

Thompson (2008) hypothesized that exponential functions might be productively introduced through the context of simple interest because, under simple interest, if the rate is 8 % per year, then the total value of the account after t years is V(t) = P + 0.08Pt, which explicitly shows the proportionality to the original value. Then instead of the usual route of examining what the interest would be after an entire time period (i.e. 1 year), he suggested the creation of a piecewise function shown in Fig. 1. Using v(x), where x is the number of years including portions of years (e.g., 2.3 years), the students can see the interest function as being able to measure the total amount at any moment in time.
Fig. 1

Piece-wise function that demonstrates interest gained over a fixed number of compounding periods

More to the point, though, the graphical representation of the piecewise function showed that the rate of change within each period was proportional to the initial value of that same period. For example, between years 1 and 2, the rate of change P(1.08)(0.08) is proportional to P(1.08), the value at the start of that period. “The intent is that students come to see that, for very large n (a very large number of annual compounding periods and thus a very small amount of time), the function’s value at the beginning of each period is ‘nearly equal to’ the function’s value at every point within the period” (Thompson 2008, p. 55). Thompson (2008) concluded that this realization was the crux of connecting exponential growth to rate of change.
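A piecewise function of this kind can be sketched as follows (our reconstruction, assuming annual compounding at 8 % and an arbitrary principal; Thompson’s v(x) may differ in detail):

```python
import math

P, r = 1000.0, 0.08   # illustrative principal and annual rate

def v(x):
    """Value after x years: simple interest within each year,
    computed on that year's starting value."""
    k = math.floor(x)                 # completed compounding periods
    start = P * (1 + r) ** k          # value at the start of period k
    return start * (1 + r * (x - k))  # linear growth within the period

# Within each period the slope is start * r, i.e., proportional to the
# period's initial value -- the property the text highlights.
slope_year_2 = (v(1.5) - v(1.0)) / 0.5
assert abs(slope_year_2 - (P * 1.08) * r) < 1e-9
```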

Mathematics in Use

Mathematics-in-use was developed to explore from an epistemological and cognitive standpoint, and with the support of extant mathematics education literature, how mathematical concepts and procedures might come together to address a mathematical problem. It has roots in APOS theory (Dubinsky and McDonald 2001), conceptual analysis (Thompson 2002, 2008), and didactical phenomenology (Freudenthal 1983). The technique provides “a descriptive mapping of how the variety of concepts, techniques and understandings converge within a particular mathematical setting (example, task, segment of exposition)” (Czocher et al. 2013, p. 4). It can provide a view of how the prerequisite mathematics concepts work together and are applied to solve a given problem. It is carried out by examining one problem or task in detail from the perspective of how a student might go about solving a task. The analysis focuses on what mathematical concepts and ways of thinking about those concepts she might undertake in order to obtain a solution. In this study, we used those identified ways of thinking to create out-of-class materials that would support productive ways of thinking about major concepts in ways that would in turn support learning the target differential equations topics. The mathematics-in-use analytic technique operationalizes Thompson’s (2008) four stated goals of conceptual analysis within the context of canonical differential equations problems thereby suggesting a roadmap for how to build new mathematical understandings on the basis of conceptual knowledge of prerequisite mathematics.

Figure 2 shows how the theoretical constructs and analytic techniques are linked. We drew from the literature and our experiences on what cognitive obstacles were likely to be present given a particular mathematical concept. At the same time, we used the analytic technique mathematics-in-use to determine, given a task to be introduced during the in class time, what mathematics students might need to reach a solution. Combining these two sources of information, we then examined the coherence of the mathematical content through instructional coherence (coherence between the in-class and out-of-class content) and curricular coherence (coherence across the mathematical content within the course and connecting to past and future mathematical content). In the next section, we provide an example of how the conceptual framework informed design of out-of-class instructional materials to support in-class activities.
Fig. 2

Conceptual framework informing design of out-of-class instructional materials

Applying the Conceptual Framework to Design Instructional Materials

We found that in focusing on coherence, it was necessary to examine two aspects: instructional coherence and curricular coherence. To apply the conceptual framework to design out-of-class instructional materials that would reduce curricular and instructional incoherence for a particular differential equations topic, we first used the mathematics-in-use analytic technique to examine what content could be drawn upon and to identify potential cognitive obstacles. We then examined the literature on related cognitive obstacles and coupled that with our experiences of students’ difficulties in learning the material. This stage was aimed at addressing curricular coherence. We then compiled an exhaustive list of background content that might be needed and used conceptual analysis of that content, with its focus on coherence, to structure the out-of-class materials so that they might support in-class content. This stage was aimed at reducing instructional incoherence. The net result informed the design of out-of-class materials that took into account how calculus, precalculus, and pervasive concepts would need to be thought about and used in order to promote understanding of the target differential equations concept.

Here, we offer an example of how the in-class and out-of-class instructional materials were designed to work together to promote curricular and instructional coherence and how we drew from our theoretical frameworks in designing the content for the course. The first class was designed to address one basic idea supporting the interpretation of rate-of-change as a quantity: what are some ways change in a quantity can be measured? The goal was to introduce the idea that there are many ways to quantify rate of change, and that many of these ways were already known by the students. The Data Analysis Task given on the first day during class asked the students to analyze the data shown in Fig. 3. Since the data grows (or decays) exponentially (Figs. 3 and 4), we needed to push the students to examine the rate of change of the data proportionally (Thompson 2008).
Fig. 3

Original data from Baker (n.d.) (p. 2)

Fig. 4

Normalized data from Baker (n.d.) (p. 2)

In designing the activities for the first day of class (in-class and out-of-class), we first sought what cognitive obstacles might be present in students’ solving of the Data Analysis Task. The literature documents numerous cognitive obstacles associated with students’ conceptions of rate of change; we summarized these into three main cognitive obstacles.

First, many students after calculus tend to define rate of change as a procedural rule (i.e., rate of change is taking the derivative via the power rule) (Zandieh 2000). This is a cognitive obstacle because taking the derivative via the power rule is sometimes an appropriate method for finding rate of change, but it is inappropriate in many cases, especially when reasoning from discrete tabular data. This cognitive obstacle was also confirmed by our experiences facilitating discussions about analyzing change from tabular representations, in which students would initially explain that the way to determine the rate of change was to “take the derivative”.

Second, Hackworth (1995) noted that students’ conceptions of rate of change were weakened from the beginning to the end of a calculus course. She reported that students’ conceptions of rate of change did not include two quantities changing simultaneously, but rather a focus on change in one quantity. This is a cognitive obstacle for reasoning about rate of change as a quantity because change in a quantity is measured with respect to change in another quantity. For example, we can talk about change in volume, but the rate at which that volume changes must be measured with respect to some other variable, such as time. Thus, while it might sometimes be appropriate to consider the change in a single variable, in most cases it is not appropriate to examine rate of change in terms of one variable alone. Confrey and Smith (1994) noted that without an understanding of covariation, individuals could not develop the ability to reason meaningfully about rate of change.

Third, studies have shown that individuals (including teachers) tend to view the various representations of rate of change (symbolic, graphical, tabular, etc.) as separate isolated entities (Coe 2007; Lobato 2006). This is a cognitive obstacle because again, while each representation could be appropriate, it may not always be appropriate or useful. For example, if an individual were given tabular data and asked to find an equation that best represents that data, it might not be appropriate to only consider the tabular representation. A graph may be the most productive means for making sense of patterns in the table. Considering the table as isolated from its graph would impede finding a solution to the task.

Having identified these cognitive obstacles, we then used mathematics-in-use to determine what mathematics students might draw on when approaching the Data Analysis Task. We reasoned students would begin by taking sequential differences. Sequential differences were noted as the main mode of analysis that students of all ages initially engage in when confronted with tabular data (Tague 2015). The difficulty with the data in Fig. 3 is that, since each series begins at a different initial value and each grows exponentially, the sequential differences provide little insight into what pattern might guide the growth of each series. A different analytic approach must be introduced. The student must first decide to normalize the data by dividing the elements of each column by the initial value. This transformation results in Fig. 4.
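The normalization step can be sketched as follows (hypothetical series values of our own, not the article’s data, chosen so that the normalized entries match the 1.14 and 1.30 quoted in the text):

```python
# Two series with different initial values but the same growth pattern.
raw = {
    "series_a": [50.0, 57.0, 65.0],
    "series_b": [200.0, 228.0, 260.0],
}

# Divide each column by its initial value so every series starts at 1.0.
normalized = {name: [v / vals[0] for v in vals] for name, vals in raw.items()}

# Sequential differences of the raw data differ (7 vs. 28), but the
# normalized series are directly comparable.
assert normalized["series_a"] == normalized["series_b"]
```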

At this point, a student might want to view the graphical representation of the normalized data to examine any trends (Fig. 5). She could notice that the shapes of the graphs resemble exponential functions. Then the student must make the large conceptual step of examining the proportional difference (what we call the relative change) by dividing each sequential difference by the value at the start of that increment (Thompson 2008). For example, to determine the relative change, she must compute (1.14 - 1)/1 = 0.14, (1.30 - 1.14)/1.14 = 0.14, and so on. This examination produces a consistent relative change of 0.14 in that column, indicating that the data can be modeled using an exponential function because of the proportional nature of the rate of change.
Fig. 5

Graph of three sets of normalized data from Fig. 4
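The relative-change computation can be carried out directly (a minimal sketch using the normalized values quoted in the text):

```python
# First entries of one normalized column (values quoted in the text).
normalized = [1.0, 1.14, 1.30]

# Relative change: each sequential difference divided by the value at
# the start of that increment.
relative_changes = [(normalized[i + 1] - normalized[i]) / normalized[i]
                    for i in range(len(normalized) - 1)]
print([round(rc, 2) for rc in relative_changes])   # -> [0.14, 0.14]

# A constant relative change signals exponential growth: the rate of
# change is proportional to the current value.
```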

Based on the documented and experienced cognitive obstacles combined with the mathematics-in-use exploration of how students might solve the in class Data Analysis Task, we considered both curricular and instructional coherence when designing out-of-class and in-class materials. The isolation of the various representations means that when designing instruction connections should be made explicitly so that students can connect the representations of the same mathematical object.

The out-of-class activity invited students to examine data derived from a loan repayment schedule based on an exponential model using both absolute change and relative change. A summary of all text from the slides in the first out-of-class activity is shown in Table 1. The students were first shown a table of values without context, which is shown in Fig. 6. The goal in designing the first slide was to allow them to draw on past experience to make sense of the data without moving directly into sequential differences. This was specifically designed to address the cognitive obstacle of students’ wanting to take the derivative of the data. It also allowed us to address the mathematics-in-use aspect of the activity for the next day where the students would have to examine and discuss the rate of change of three sources of data (Fig. 3).
Table 1

Text from the first out-of-class activity

Slide 1: Take a moment and look at the table to the right. What are some ways we talk about changes in quantities in English?

Slide 2: What do you think is happening based on the data given in the table? What trends do you notice and how did you notice them?

Slide 3: Suppose the amount owed is based on a loan taken out by a law student over the course of 3 years of law school. The student is now out of school and paying off $880.87 per month, but interest is still accruing on the remaining principal. How might the situation be described mathematically (in words and symbols)?

Slide 4: The table and symbolic notation are supposed to represent the same data set. Do they? Explain why or why not?

Slide 5: Two students from last year’s class were arguing about what the data in the table might mean. The first student, Anna, said she measured the change in amount owed by taking the calculations shown here. Another student, Brian, told Anna that before calculating change in amounts owed, he needed to take each amount owed and divide by the original amount to get the following table. Do you agree with Anna or Brian? Who is more convincing or do you disagree with both? Explain why?

Slide 6: Anna’s version of change shown below again is called absolute change and is calculated by taking the difference between two subsequent amounts owed. Brian’s version of change requires first calculating the amounts owed relative to the initial loan principal. Then he calculated the difference between relative amounts and divided by the previous amount to figure out the relative change. Note that the chosen variable is different below because it is now relative amount owed. These are the kind of variable changes that can be helpful to you too!

Slide 7: Anna’s version of change shown below (left) is called absolute change. Brian’s version of change is called relative change and is shown below (right). Brian’s measurement can be seen as a modification on Anna’s measurement. How?

Slide 8: Anna’s version of change shown below (left) is called absolute change. Brian’s version of change is called relative change and is shown below (right). When might it be more beneficial to examine data with absolute change and when might it be helpful to use relative change? Think of some real life situations for when each kind of measurement might be helpful.

Slide 9: How does representing the data graphically help to analyze the data from the amount owed column?

Slide 10: What are the strengths and weaknesses of using graphical, tabular, and mathematical representations in explaining change in data?

Slide 11: In class, we will talk more about how to describe change in data and why we might want to talk about change. Please think about how you have described and written about change in previous math classes as well as in science and engineering classes. Be ready to suggest some ideas of how to use mathematics to describe change and which display is most helpful in describing the change. See you soon!

Fig. 6

First mathematics content slide of the first out-of-class activity

During the out-of-class activity, the students were then asked to draw on their past experiences with measuring change to make sense of trends in the table, in particular trends over time. Asking students to examine the change over various amounts of time has been shown to begin to address the cognitive obstacle of examining the rate of change with respect to only one variable and to move students toward covariational reasoning (Monk 1992). Next they were given a context to go along with the table and asked to describe change in the data set in words or symbols. In the following slide, we asked them to consider whether the symbolic notation and the table represented the same data mathematically. In this way, the students could examine and, we hoped, connect their English description with the mathematical symbols, addressing the cognitive obstacle of the isolation between representations of rate.

Next, the students were given a situation where two students were having a disagreement over which way to represent the change in the table: sequential differences or proportional change. We defined sequential differences as absolute change and proportional change as relative change because that was the vocabulary used in the class materials. We asked them to examine the connections between the two students’ methods of calculating change and to think about contexts where relative or absolute change might be the more appropriate measure of change over time. In designing the argument between the two students, we contrasted the students’ typical analysis of rate in a tabular setting (sequential differences) with the proportional rate of change that we hoped students would generate the next day in class.
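The contrast between Anna’s absolute change and Brian’s relative change can be illustrated with a short simulation of a loan balance. The monthly payment ($880.87) comes from the slide text, but the principal and interest rate below are hypothetical assumptions, since the slides shown here do not state them:

```python
# Assumed loan parameters (only the payment amount comes from the slides).
principal = 80_000.00   # hypothetical initial amount owed
monthly_rate = 0.005    # hypothetical monthly interest rate (6 % APR)
payment = 880.87        # monthly payment given in the slide text

# Amount owed each month: interest accrues, then the payment is applied.
owed = [principal]
for _ in range(6):
    owed.append(owed[-1] * (1 + monthly_rate) - payment)

# Anna's measure: absolute change (sequential differences).
absolute = [b - a for a, b in zip(owed, owed[1:])]

# Brian's measure: relative change (difference divided by the earlier value).
relative = [(b - a) / a for a, b in zip(owed, owed[1:])]
```

With these assumed parameters the payment exceeds the accrued interest, so both measures are negative each month and the balance decreases.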

Last, we introduced multiple representations of the data set including sequence notation and graphical representations. Open-response questions were shaped around helping the students articulate the idea that different representations highlight different attributes of the data to again address the isolation amongst the representations.

In class, the students were given the data shown in Fig. 3. The table recorded growth of enzymes in the same medium at different temperatures. The professor opened the first class meeting of the course by asking, “What is happening?” Students volunteered that the first two columns were increasing and the last was decreasing. With this initial look at the data completed, the instructor moved on to asking, “How do the data change?” Because the out-of-class activity had primed students with vocabulary and ways of reasoning, they suggested that relative densities might be a way to compare the cultures across the temperatures. They also volunteered that the data might need to be normalized by dividing each column by the initial value for that column. Both of these actions were analogous to tasks students completed in the out-of-class activity. The corresponding transformation produced the table shown in Fig. 4. After the students examined the proportional rate of change, a pattern emerged leading to an exponential function. Thus, supported by extant research, the out-of-class modules were designed to explicitly address underlying cognitive obstacles and misconceptions relevant to the in-class materials.
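The normalization step the students proposed can be sketched as follows; the densities are invented for illustration (the actual enzyme data appear in Fig. 3) and are chosen here to grow by a factor of 1.15 per step:

```python
# One column of hypothetical growth data (each entry 1.15 times the last).
column = [20.0, 23.0, 26.45, 30.4175]

# Normalize by dividing each entry by the column's initial value.
normalized = [x / column[0] for x in column]

# A constant relative change indicates an exponential pattern.
rel_change = [(b - a) / a for a, b in zip(normalized, normalized[1:])]
```

Each normalized column starts at 1, so the three temperatures can be compared on a common scale, and a constant relative change in each column points to an exponential model.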

Methods

Classroom Context

The differential equations course was a class of 80 second-year engineering and science majors. The classroom was large, with a central podium and clusters of four to five desks with a computer on each desk. The students enrolled were in general highly motivated, since the course was initially designed as part of the Honors program, although enrollment was not limited to Honors students. Many students reported having signed up for the course because a friend, teacher, or advisor had recommended it as more helpful than the non-flipped, non-mathematical-modeling-based differential equations course.

The course used a mathematical modeling approach to differential equations. By this we mean that new mathematics was introduced through solving canonical engineering problems, such as predicting the concentration of pollution in a dam. In general, the engineering problems took multiple class sessions to solve. The course progressed not through algebraic or computational difficulty, but instead through increasing complexity of the engineering situations. For example, when approaching the problem of predicting the concentration of pollution in a dam, we assumed that the water was well mixed both in space and in time. Then, as the course progressed, simplifying assumptions were removed in order to represent more general situations. In the case of the dam, by the end of the course, a partial differential equation (PDE) was used to model the level of contamination of water in the dam.
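The well-mixed pollution problem is canonically modeled by a balance equation of the form V dC/dt = Q(C_in − C). As a minimal sketch under assumed parameter values (none of these numbers come from the course materials), a forward-Euler simulation shows the concentration relaxing toward the inflow concentration:

```python
# All parameter values below are illustrative assumptions.
V = 1_000_000.0   # dam volume (m^3)
Q = 5_000.0       # inflow = outflow rate (m^3/day), well-mixed assumption
C_in = 2.0        # pollutant concentration of the inflow (mg/L)

def step(C, dt):
    """One forward-Euler step of V dC/dt = Q * (C_in - C)."""
    return C + dt * (Q / V) * (C_in - C)

# Starting from clean water, C(t) rises toward C_in as t grows.
C, dt = 0.0, 1.0  # initial concentration (mg/L) and time step (days)
for _ in range(2000):
    C = step(C, dt)
```

Removing the well-mixed assumption, as the course did near its end, replaces this ODE with a PDE in which the concentration also varies in space.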

There were 47 class meetings, and a total of 32 out-of-class modules were created and assigned. In general, an out-of-class module was assigned before each class meeting. If a particularly challenging problem spanned several class sessions but no new cognitive obstacles were addressed, then no out-of-class module was assigned. For example, near the end of the course, the engineering situations were complex and so took several in-class periods to reach a solution, but no new mathematics was introduced and so no new out-of-class activities were assigned. The modules were administered through the course webpage, and the students worked through them at their own pace and as many times as they wanted. The modules were only available for the 12 hours before the next class meeting and closed at midnight. The modules had “next” buttons that were only active after the student responded to the task on the current slide. Multiple-choice and multiple-select questions were graded automatically and sent to the online course grade book, so students had automatic feedback on those questions. The open-response questions were graded as complete/incomplete, and the instructor used them to decide whether there were concerns to be addressed the following day during class, or whether there was a particular way he might introduce the in-class activity that would build on students’ responses.

Class time was driven by students choosing and defining the variables and parameters important to the engineering situations. If students could not come to a conclusion or determine the next step toward a solution at the end of class, then class ended without a resolution. In these situations, the following class began by asking the students to resolve whatever issue was necessary to move forward with the solution. The out-of-class activity was only referenced when the connection arose. For example, on the first day of class, the out-of-class material was brought up near the beginning of class because it was used to set up how students approached the first in-class activity.

Creation of Out-of-Class Materials

Out-of-class instructional materials were online interactive modules created using Articulate Storyline software and embedded in the course website. Each module addressed an aspect of a single cognitive obstacle identified as problematic for the upcoming differential equations content. In the extended example provided above, the interactive module provoked student thinking about ways to measure change and rates of change. The modules included both long- and short-answer questions as well as some multiple-choice questions designed to encourage critical thinking about the cognitive obstacle. If a procedure was the focus (e.g., implicit differentiation), the module asked questions that would highlight how the procedure was used in the precalculus or calculus course and how it would be used in the differential equations course. Several modules were more complex, having branches where the next question depended on the response to the previous question. Excerpts from one of the modules are displayed in Table 1 and Fig. 6.

Data Collection and Analysis

The study was carried out using survey methods adapted from an existing instrument (Powers et al. 2010). Powers et al. (2010) introduced pencasts in a chemistry course and used survey methods to examine student feedback on lecture capture through pencasts (flash videos where handwriting and audio are synched) in the context of a pharmaceutical computations course. The researchers designed 10 survey questions to examine: (1) if the students used the lecture capture pencasts; (2) if they found the pencasts helpful; (3) if they thought the pencasts “enhanced learning the material presented in class” (p. 147); (4) how they used the pencasts; and (5) what could be improved. In past semesters, we examined students’ perceptions of pencasts of difficult homework problems (Tague et al. 2014) and lecture capture as part of the development of the differential equations course (Tague et al. 2013). We therefore sought to use perception surveys that were similar to those used in studies of classroom technology but would also provide us with feedback on the course components.

In the present study, there were two types of surveys: a large pre-/post-course survey administered online and four smaller surveys administered in class. The two large surveys will be referred to as Initial Perception and Final Perception. The four smaller in-class surveys will be referred to as Check Ins 1, 2, 3, and 4. We adapted questions similar to those developed by Powers et al. (2010), but found that requesting specific examples of what students find helpful or not helpful allowed us to make more informed adaptations of technology in our class. For the current study, we adapted our past surveys to determine, instead of helpfulness, whether students found the course materials related (and thus coherent).

The online Initial and Final Perception surveys addressed a variety of aspects of the adaptation of instructional technology in several mathematics courses, three of which were related to instructional coherence. The Initial/Final Perception surveys were part of a larger project examining blended or flipped courses across the mathematics department. The purpose of these surveys was to build a profile of students’ past technology experiences as well as how technology as a part of mathematics courses may impact students’ beliefs about mathematics and the mathematics department. For this study we selected three questions that were relevant to instructional coherence in a flipped classroom. The questions from the Initial/Final Perception surveys are displayed in Table 2.
Table 2

Survey questions

Initial/final perception survey (Likert scale):

1. I expect out-of-class materials to prepare me to participate in class activities (group discussion, problem solving, etc.)
2. As a result of the out-of-class material, I expect to be confident in my understanding of the concepts that each module covered.
3. I expect the in class activities to be clearly coordinated with the out-of-class material.

Check in survey (open response):

1. Did you complete the out-of-class activities this week? (Circle one) All Some None
2. Were the in class activities related to the out-of-class activities this week?
3. If they were related, give an example of something from an out-of-class activity that you felt was useful in class.

The Check In surveys were designed to provide ongoing feedback about students’ perception of the coherence of the course that we could take into account when designing future out-of-class materials. Four were administered, providing feedback at intervals so that we could make any necessary adjustments. The survey questions are displayed in Table 2.

Students were asked to respond to the Initial/Final Perception surveys on a 5-point Likert scale, and the verb tense was changed from the Initial to the Final Perception survey. Students were also given the opportunity to provide open-ended feedback on the Initial/Final Perception surveys. After the three Initial/Final Perceptions questions shown in Table 2, there was an optional follow-up question that stated: “Please provide any additional feedback from your experience with the out-of-class materials.” This question was analyzed by sorting the students’ responses into the categories helpful/not helpful/neutral. Lastly, there was one question on the Final Perceptions survey meant to gauge the students’ overall thoughts on the course. This question was rated on a 5-point Likert scale and stated, “Overall this course was designed and taught: much better than I expected/better than I expected/similar to what I expected/worse than I expected/much worse than I expected.”

Students’ responses were entered into Excel to facilitate analysis. The quantitative data generated by the Initial/Final Perception surveys were analyzed with Qualtrics descriptive statistics and with SPSS to generate the t-statistic for the difference in means between the Initial/Final Perception surveys in order to address the first research question. Assumptions for the t-test were met.
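As an illustration of the comparison reported later in Table 4, the pooled two-sample t statistic can be recomputed from the summary statistics for the “Participate” item. The sample sizes 58 and 20 are the reported response counts; since the paper reports df = 77, the exact ns used in SPSS may have differed slightly, so the result below is approximate:

```python
import math

# Summary statistics from Table 3 for the "Participate" question.
m1, s1, n1 = 3.33, 1.033, 58   # Initial Perceptions: mean, SD, n
m2, s2, n2 = 3.67, 0.856, 20   # Final Perceptions: mean, SD, n

# Pooled variance for the classical independent-samples t-test.
sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
df = n1 + n2 - 2  # 76 here; the paper reports 77
```

The resulting t is close to the reported −1.345 and well inside the non-rejection region at the 0.05 level, consistent with the finding of no significant difference.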

Responses to the first two Check In Survey questions were coded numerically. For the first question, students could choose from three discrete choices all/some/none, which were coded as 2, 1, and 0, respectively. For the second question, if the students perceived in-class material to be related to the out-of-class material the response was coded as 1. Otherwise, it was coded as 0. Results from all surveys were used to address the second research question. Descriptive statistics for the Check In Survey questions were generated using spreadsheets.
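A minimal sketch of this coding scheme, using invented responses for illustration:

```python
# Numeric codes for Check In Question 1 (all/some/none -> 2/1/0).
completion_code = {"all": 2, "some": 1, "none": 0}

# Hypothetical responses; the real surveys had 50-62 respondents each.
q1_responses = ["all", "all", "some", "none", "all"]
q2_related = [True, True, False, True, True]   # Question 2: related? (1/0)

q1_codes = [completion_code[r] for r in q1_responses]
q2_codes = [1 if r else 0 for r in q2_related]

# Descriptive percentages like those reported in Table 6.
pct_completed_all = 100 * q1_codes.count(2) / len(q1_codes)
pct_related = 100 * sum(q2_codes) / len(q2_codes)
```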

Results

Initial/Final Perceptions Results

The Initial and Final Perceptions surveys were completed by 58 (72.5 %) and 20 (25 %) students, respectively. The surveys were voluntary and anonymous. The low response rate on the Final Perceptions survey is likely due to its administration close to final examinations. Table 3 shows student responses to the Initial/Final Perceptions surveys. Table 4 shows significance tests for the difference in mean response for each question on the Initial/Final Perceptions surveys. There were no significant differences for any of the three questions. This could be partly because how students expect to participate in class tends to be how they actually participate. Of the 20 individuals who took the Final Perceptions survey, 65 % thought the class was better or much better than they expected, 30 % thought it was the same as what they expected, and 5 % thought the class was worse than expected.
Table 3

Descriptive statistics from Initial/Final Perceptions surveys

Question (5-point Likert scale): Initial Mean (SD); Final Mean (SD)

Participate: I expect out-of-class materials to prepare me to participate in class activities (group discussion, problem solving, etc.). Initial: 3.33 (1.033); Final: 3.67 (.856)

Confident: As a result of the out-of-class material, I expect to be confident in my understanding of the concepts that each module covered. Initial: 3.10 (1.054); Final: 3.14 (.964)

Coordinated: I expect the in class activities to be clearly coordinated with the out-of-class materials. Initial: 3.91 (.884); Final: 3.90 (.944)

Table 4

Statistical significance from the Initial/Final Perceptions surveys

 

Participate: t = −1.345, df = 77, p = 0.183
Confident: t = −0.150, df = 77, p = 0.881
Coordinated: t = 0.039, df = 77, p = 0.969

Responses to the optional follow-up question on the Initial/Final Perceptions surveys were coded according to the categories helpful/not helpful/neutral. The results of the coding are shown in Table 5. Two students on the Initial Perceptions survey wrote “none” and so were not included in the other codes. Some examples from the Helpful category include: “the [out-of-class materials] helped to remind me of things I forgot how to do”, “I expect they will help me, however I doubt that they will completely prepare me every single time”, and “They are a good introduction to the material being presented.” Some examples from the Not Helpful category include: “I expect to be lost while doing pre-class material”, “I don’t expect the out-of-class materials to be especially useful. I will read the book as I’m assigned it, but I find videos rather boring usually and not a lot of help when I’m trying to grasp concepts”, and “Most of the stuff on [out-of-class materials] seemed like a waste of time. I know they were meant to ‘get our minds ready for class’ but I didn’t really need to review most of the concepts covered in the [materials].” The Neutral category consisted of responses that were skeptical of the technology aspect of the course or that stated neither positive nor negative aspects of the out-of-class materials. For example: “I expect that the out-of-class materials will both help and hinder my success and understanding in the class. With online pre-class material, I expect that, while the questions won’t be difficult, the syntax used to input the correct answer will be difficult and irritating.” and “I hope that the [out-of-class materials] will not be too in depth to the topic we will be looking at in the next lecture”.

The proportion of Helpful responses increased from the Initial to the Final Perceptions survey, but so did the proportion of Not Helpful responses. However, all four individuals who wrote that the out-of-class materials were not helpful also mentioned a syntax or administration issue that frustrated them; for example, they were frustrated that the out-of-class materials were only available for the 12 hours before class began instead of for several days in advance. There was also only one student on the Final Perceptions survey who provided a neutral response, indicating that the responses were more bimodal than on the Initial Perceptions survey.
Table 5

Qualitative codes from the follow-up question on the Initial/Final Perceptions surveys: Please provide any additional feedback from your experience with the out-of-class materials

 

Code: Initial perceptions (n = 34); Final perceptions (n = 11)

Helpful: 38 %; 55 %
Not Helpful: 18 %; 36 %
Neutral: 38 %; 9 %

Check In Results

Table 6 shows the quantitative results from the first two questions on Check In surveys 1, 2, 3, and 4. The number of students taking the Check In surveys was consistent throughout the semester because the surveys were given in class and on paper, suggesting that completion rates were similar to attendance rates.
Table 6

Quantitative results from the four Check In Surveys

 

Item: Check In 1 (n = 59); Check In 2 (n = 62); Check In 3 (n = 53); Check In 4 (n = 55)

Completed all the out-of-class activities: 85 %; 82 %; 72 %; 81 %
Out-of-class activity was related to the in-class activity: 85 %; 74 %; 85 %; 96 %

Of those who completed the Check In surveys, more than 72 % reported having completed all of the out-of-class activities.

For three of the surveys (Check Ins 1, 3, and 4), at least 85 % of the students reported that the out-of-class materials were related to the in-class materials. The results from Check In 2 were anomalous because a student asked a question that prompted the instructor to back up and address the student’s conceptual concern. It led to a rich discussion in class, but caused that class to be out of sync with the out-of-class activity from the previous night. This alteration of in-class activities caused the slight drop (74 %) in students’ ratings of whether the out-of-class and in-class activities were related.

On Question 3 of the Check In surveys, students were asked to give an example of something that was useful from the out-of-class materials. We coded the responses as directly stating a mathematical concept or connection; stating that there was no connection or that they did not remember; or not applicable, if the students made a vague reference to remembering but could not state something specific. The percent of each type of response is shown in Table 7. On Check Ins 1, 2, and 3, at least 82 % of the students who responded could state a direct connection between the out-of-class and in-class activities and offer a specific example. We took this as evidence that the students were perceiving cohesiveness between the out-of-class and in-class activities. For Check In 4, 78 % of the respondents could make a connection between the out-of-class and in-class activities, but a larger number of students could not remember than on the other three surveys. This might have been because there were only two out-of-class modules during that week and none was assigned the day before Check In 4 was administered. Because of the delay, fewer students could recall a direct connection.
Table 7

Qualitative coding on Question 3 from the four Check In surveys

 

Code: Check In 1 (n = 50); Check In 2 (n = 54); Check In 3 (n = 57); Check In 4 (n = 41)

Connection: 82 %; 91 %; 90 %; 78 %
Not Related/Don’t Remember: 4 %; 0 %; 5 %; 10 %
Not Applicable: 14 %; 9 %; 5 %; 12 %

Some responses that illustrate the “Connection” category included: “I found the second-order introduction to be very useful before lecture covered it”, “Refreshers on balancing equations and derivatives were useful”, and “The [out-of-class activity] about implicit differentiation really helped with variable coefficients.” From the “Not Applicable” category, some responses were: “differential equations practice”, “Review of concepts from past classes”, and “Learning.” These were coded as Not Applicable because of their lack of specificity. From the “Not Related” category, some representative responses were: “We didn’t really use cosine and sine transformations, but they were in the [out-of-class activity]”, “I can’t remember”, and “None of them really felt useful to me.”

Across all of the Check In surveys, 10 % of respondents or fewer found no relation between the in-class and out-of-class materials or indicated that they did not remember any specific mathematical content from the out-of-class materials. We also received no consistent feedback of a complete disconnect between in-class and out-of-class materials, as has been reported in previous studies of the flipped classroom (Andrews et al. 2011; Bowers and Zazkis 2012). Compared to previous studies of flipped classrooms, our results are encouraging in that students seemed to see the connections among the course activities.

Discussion and Conclusions

The purpose of this study was to investigate differential equations students’ perceptions of a flipped classroom model. The flipped classroom model was meaningfully rooted in theory of mathematics teaching and learning in order to reduce instructional incoherence reported by previous studies. Analyzing the in class activities for possible cognitive obstacles (Herscovics 1989) coupled with using mathematics-in-use (Czocher et al. 2013) to determine the mathematical resources a student would need while engaged in the in class activity allowed us to design course materials for curricular and instructional coherence.

Students’ perceptions of instructional coherence prior to the start of and at the conclusion of a differential equations course using a flipped classroom model were not significantly different, as shown in the Initial/Final Perceptions survey results. Students’ perceptions of instructional coherence before and after the course were between 3 and 4 on a 5-point scale, indicating a slightly positive perception of instructional coherence. Additionally, although the response rate was quite low on the Final Perceptions survey, responses indicated that the students who did respond were likely the most involved in the course. Their responses suggested that either they perceived the materials as helpful or they disliked structural aspects of the materials (i.e., when they were due). On the Check In surveys, at least 85 % of students indicated that the out-of-class materials were related to the in-class materials, except in the case when the instructor purposefully altered his in-class plans to address a student concern; in that case, 74 % of the students found the out-of-class and in-class materials related. Even more telling, nearly 80 % of the students were able to restate a direct connection between the out-of-class and in-class materials on each of the four Check In surveys. Not only did students report that the materials were related, they could also provide examples of the relation, which we took as evidence of their perception of instructional coherence.

While many current studies of flipped classrooms show that students enjoy them (Lage et al. 2000; Love et al. 2014), other studies point to the flipped classroom as a source of incoherence (Bowers and Zazkis 2012; Strayer 2012). Indeed, even when an instructor is aware of the current literature and designs the in-class and out-of-class components purposefully, students in a flipped statistics classroom still expressed that they felt incoherence among the course components compared to peers in an unflipped classroom (Strayer 2012). Our study took the design of out-of-class materials further by basing them in a cogent conceptual framework derived from theories of teaching and learning. Results show that it is promising to use cognitive obstacles and mathematics-in-use as a theoretical grounding for aligning in-class and out-of-class materials. More generally, it is promising to ground blended instructional methods in discipline-specific knowledge of how that content is learned.

Although our design philosophy centered on addressing cognitive obstacles was intended to reduce instructional incoherence, it may also help improve curricular coherence because of our execution of the analytic technique, mathematics-in-use. Since we adapted mathematics-in-use from APOS (Dubinsky and McDonald 2001), Freudenthal’s (1983) structural analysis, and Thompson’s (2008) conceptual analysis, it allowed us to map a task onto the mathematics necessary to solve it successfully while simultaneously considering what the current literature suggests about students’ cognition of that same mathematics. As such, it allows for the incorporation of students’ mathematical thinking into instructional design. Although this approach might be useful beyond flipped classroom settings, it is especially useful there because it provides a way to coordinate out-of-class and in-class materials, previously reported as a source of instructional incoherence for flipped classrooms (e.g., Strayer 2012).

The current study is limited in a few important ways. The small sample included only advanced mathematics students in a specific content area. However, these limitations are not unique to our study and are indeed common among flipped classroom studies (Bluic et al. 2007). Since our focus was on coordination between course elements and the mathematics content, we sought to measure students’ perceptions of this aspect and not their perceptions of how the in-class time was spent. We hypothesize that the instructor and his in-class interactions with the students also contributed to the students’ perceptions of the course materials. Future studies might also examine the effectiveness of the theory-based approach in reducing perceived incoherence.

Even with these limitations in mind, our design philosophy fills a void in the literature by providing a first step toward designing flipped classrooms for instructional and curricular coherence. Furthermore, our results show that our design of a flipped classroom did circumvent the previously documented disconnects between out-of-class and in-class content. While our design was created specifically to address incoherence fostered by flipped classrooms, it could be used to examine coherence in any mathematics classroom. However, more research is needed to develop techniques that might be used to address instructional and curricular coherence. Our study used mathematics-in-use and conceptual analysis to address curricular coherence in the structure of the content, but we primarily focused on measuring students’ perceptions of instructional coherence for two reasons: (1) it was a persistent issue in flipped classroom models, and (2) no methods were available for measuring students’ perceptions of curricular coherence. Thus, while more work is necessary to develop reliable and valid research techniques to ensure and measure instructional and curricular coherence, it is essential that future flipped classrooms be designed with coherence kept at the fore.

References

  1. Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. Cell Biology Education—Life Sciences Edition, 10, 394–405.
  2. Baker, J. W. (2000). The "classroom flip": Using web course management tools to become the guide by the side. Paper presented at the 11th International Conference on College Teaching and Learning, Jacksonville, FL.
  3. Baker, G. (n.d.). An introduction to differential equations for scientists and engineers. Columbus, OH.
  4. Bingolbali, E., Monaghan, J., & Roper, T. (2007). Engineering students' conceptions of the derivative and some implications for their mathematical education. International Journal of Mathematical Education in Science and Technology, 38(6), 763–777.
  5. Bluic, M., Goodyear, P., & Ellis, R. A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. Internet and Higher Education, 10, 231–244.
  6. Bowers, J., & Zazkis, D. (2012). Do students flip over the flipped classroom model for learning college calculus? In Proceedings of the 34th annual meeting of the North American chapter of the International Group for the Psychology of Mathematics Education. Kalamazoo, MI: Western Michigan University.
  7. Coe, E. (2007). Modeling teachers' ways of thinking about rate of change. Unpublished doctoral dissertation, Arizona State University, Tempe, AZ.
  8. Confrey, J. (1994). Splitting, similarity, and rate of change: A new approach to multiplication and exponential functions. In G. Harel & J. Confrey (Eds.), The development of multiplicative reasoning in the learning of mathematics (pp. 293–330). Albany, NY: SUNY Press.
  9. Confrey, J., & Smith, E. (1994). Exponential functions, rates of change, and the multiplicative unit. Educational Studies in Mathematics, 26, 135–164.
  10. Czocher, J. A., Tague, J., & Baker, G. (2013). Where does the calculus go? An investigation of how calculus ideas are used in later coursework. International Journal of Mathematical Education in Science and Technology, 44(5), 673–684.
  11. Dubinsky, E., & McDonald, M. (2001). APOS: A constructivist theory of learning in undergraduate mathematics education research. Dordrecht, The Netherlands: Springer.
  12. Freudenthal, H. (1983). Didactical phenomenology of mathematical structures. Dordrecht, The Netherlands: D. Reidel Publishing Company.
  13. Hackworth, J. A. (1995). Calculus students' understanding of rate. Unpublished master's thesis, San Diego State University, Department of Mathematical Sciences. Available at http://pat-thompson.net/PDFversions/1994Hackworth.pdf.
  14. Herscovics, N. (1989). Cognitive obstacles encountered in the learning of algebra. In S. Wagner & C. Kieran (Eds.), Research issues in the learning and teaching of algebra (pp. 60–86). Reston, VA: National Council of Teachers of Mathematics.
  15. Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. Journal of Economic Education, 31, 30–43.
  16. Lobato, J. (2006). Alternative perspectives on the transfer of learning: History, issues, and challenges for future research. Journal of the Learning Sciences, 15(4), 431–449.
  17. Love, B., Hodge, A., Grandgenett, N., & Swift, A. W. (2014). Student learning and perceptions in a flipped linear algebra course. International Journal of Mathematical Education in Science and Technology, 45(3), 317–324.
  18. Monk, G. (1992). Students' understanding of a function given by a physical model. In G. Harel & E. Dubinsky (Eds.), The concept of function: Aspects of epistemology and pedagogy (MAA Notes, Vol. 25). Washington, DC: Mathematical Association of America.
  19. National Mathematics Advisory Panel. (2008). Foundations for success: Final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education. Available at http://www.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf.
  20. National Research Council. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academies Press.
  21. Powers, M. F., Bright, D. R., & Bugaj, P. S. (2010). A brief report on the use of paper-based computing to supplement a pharmaceutical calculations course. Currents in Pharmacy Teaching and Learning, 2, 144–148.
  22. Raman, M. (2004). Epistemological messages conveyed by three high-school and college mathematics textbooks. The Journal of Mathematical Behavior, 23, 389–404.
  23. Sternberg, R. J. (1996). What is mathematical thinking? In R. J. Sternberg & T. Ben-Zeev (Eds.), The nature of mathematical thinking (pp. 303–318). Mahwah, NJ: Lawrence Erlbaum Associates.
  24. Strayer, J. F. (2007). The effects of the classroom flip on the learning environment: A comparison of learning activity in a traditional classroom and a flip classroom that used an intelligent tutoring system (Doctoral dissertation).
  25. Strayer, J. F. (2012). How learning in an inverted classroom influences cooperation, innovation and task orientation. Learning Environments Research, 15, 171–193.
  26. Tague, J. (2015). Conceptions of rate of change: A cross analysis of modes of knowing and usage among middle, high school and undergraduate students (Doctoral dissertation).
  27. Tague, J., Czocher, J., Baker, G., & Roble, A. (2013). Choosing and adapting technology in a mathematics course for engineers. In Proceedings of the American Society for Engineering Education. Atlanta, GA.
  28. Tague, J., Czocher, J., Roble, A., & Baker, G. (2014). Pencasts as exemplars in differential equations. In Proceedings of the Conference on Research in Undergraduate Mathematics Education. Denver, CO.
  29. Thompson, P. W. (1994). Concept image and concept definition in mathematics with particular reference to limits and continuity. Educational Studies in Mathematics, 12, 151–169.
  30. Thompson, P. W. (2002). Didactic objects and didactic models in radical constructivism. In K. Gravemeijer, R. Lehrer, B. van Oers, & L. Verschaffel (Eds.), Symbolizing, modeling, and tool use in mathematics education (pp. 191–212). Dordrecht, The Netherlands: Kluwer.
  31. Thompson, P. W. (2008). Conceptual analysis of mathematical ideas: Some spadework at the foundations of mathematics education. In O. Figueras, J. L. Cortina, S. Alatorre, T. Rojano, & A. Sépulveda (Eds.), Plenary paper presented at the annual meeting of the International Group for the Psychology of Mathematics Education (Vol. 1, pp. 45–64). Morélia, Mexico: PME.
  32. Tucker, B. (2012). The flipped classroom. Education Next, 12, 82–83.
  33. Zandieh, M. (1997). The evolution of student understanding of the concept of derivative (Doctoral dissertation).
  34. Zandieh, M. (2000). A theoretical framework for analyzing student understanding of the concept of derivative. In E. Dubinsky, A. Schoenfeld, & J. Kaput (Eds.), Research in collegiate mathematics education, IV (Vol. 8, pp. 103–127). Providence, RI: American Mathematical Society.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. California State University, Fresno, USA
  2. Texas State University, San Marcos, USA
