Research in Science Education, Volume 42, Issue 5, pp 943–965

An Explanation for the Difficulty of Leading Conceptual Change Using a Counterintuitive Demonstration: The Relationship Between Cognitive Conflict and Responses

Authors

  • G. Lee
    • Department of Physics Education, College of Education, Seoul National University
  • Taejin Byun
    • Department of Physics Education, College of Education, Seoul National University

DOI: 10.1007/s11165-011-9234-5

Cite this article as:
Lee, G. & Byun, T. Res Sci Educ (2012) 42: 943. doi:10.1007/s11165-011-9234-5

Abstract

Bringing successful teaching approaches for stimulating conceptual change to normal classrooms has been a major challenge not only for teachers but also for researchers. In this study, we focused on the relationship between cognitive conflict and responses to anomalous data when students are confronted with a counterintuitive demonstration in the form of a discrepant event. The participants in this study were 96 secondary school students (9th grade) from S. Korea. We investigated students’ preconceptions of motion by administering a written test. After the exam, we presented a demonstration that may have conflicted with the ideas held by students. We then investigated the relationship between students’ cognitive conflict and responses to anomalous data by using a Cognitive Conflict Level Test (CCLT). Results showed that cognitive conflict initiated the first step in the process of conceptual change. Anxiety was an especially crucial component of cognitive conflict, affecting the relationship between cognitive conflict and students’ responses. In addition, superficial conceptual change was found to be the most common response.

Keywords

Cognitive conflict · Conceptual change · Responses to anomalous data

Introduction

Over the past several decades, science educators have agreed that students begin lessons on most science topics with pre-instructional conceptions that differ from scientific conceptions. These conceptions, so-called alternative conceptions or misconceptions, are robust and resistant to change. The resistant nature of preconceptions may discourage teaching efforts, and researchers have identified the characteristics of desirable conceptual change (Hennessey 1999). For example, Hewson (1981) uses the terms conceptual capture and conceptual exchange to characterize changes in the overall content of a conception. Posner et al. (1982) characterize these changes as accommodation and assimilation. Carey (1985) characterizes conceptual changes as weak knowledge restructuring or strong knowledge restructuring to indicate the degree to which a student holds a preconception. White and Gunstone (1989) describe conceptual change as a principle or belief change—a change in a metaphysical belief, for example. In addition, the initiating factor for conceptual change is known as disequilibrium, dissatisfaction, or cognitive conflict (Hennessey 1999). Although these terms are not identical, they are used analogously by researchers (Damon and Killen 1982; Murray 1983; Murray et al. 1977; Posner et al. 1982).

Festinger (1957) suggested that the perception of inconsistency among an individual’s cognitions generates psychological discomfort (namely, cognitive dissonance) and that this aversive state motivates individuals to attempt to resolve the dissonance. Festinger’s cognitive dissonance resembles Piaget’s disequilibrium (Misiti and Shrigley 1994; Smedslund 1961). This means that cognitive dissonance would be similar to cognitive conflict (Lee et al. 2003).

A considerable number of researchers argue that cognitive conflict is a necessary condition of the type of learning that is described as conceptual change (Hashweh 1986; Kwon 1989, 1997; Niaz 2006; Posner et al. 1982; Treagust and Duit 2008). In light of this idea, many teaching strategies have been designed to facilitate conceptual change. Many of these use cognitive conflict as a means of facilitating a change in students’ conceptions (Chan et al. 1997; Druyan 1997; Hewson and Hewson 1984; Mason 2000; Mortimer and Machado 2000; Niaz 1995; Thorley and Treagust 1987; Tsai 2000).

This area is not, however, without debate. Other researchers argue that cognitive conflict strategies do not consistently lead to conceptual change. Even when students’ ideas are confronted with contradictory information through instruction, frequently such contradictory information does not result in meaningful conflict for a learner (Alvermann and Hague 1989; Champagne et al. 1985; Dekkers and Thijs 1998; Dreyfus et al. 1990; Hewson and Thorley 1989; West and Pines 1985). Elizabeth and Galloway (1996) argue, for instance, that a learner who feels helpless would consider a cognitive conflict strategy as a cognitive attack.

To understand this controversy in more detail, we explored how cognitive conflict and conceptual change are related to one another. In 1998, Chinn and Brewer conducted research to understand students’ conceptual changes after they were confronted with contradictory information in the form of anomalous data. They found that students often respond to anomalous data by discounting the data rather than changing their conceptions. These researchers proposed a taxonomy of students’ responses to anomalous data to explain how students respond to anomalous data in ways other than changing their conceptions. However, previous research offers little evidence on the reasons students respond as they do. In particular, there has been little research on how students’ cognitive conflicts, caused by anomalous data, affect the students’ conceptual changes. To help fill this gap, this study attempted to bring together research on cognitive conflict with research on the responses of students confronted with anomalous data.

Theoretical Bases

The Cognitive Conflict Process Model

Cognitive conflict has been loosely defined in a number of ways: as an “awareness of a momentary disequilibrium” in the system of schemas (Mischel 1971, p. 331); as cognitive disequilibrium or conflict induced by awareness of contradictory, discrepant information (Bodrakova 1988); as a condition created when one’s expectations and predictions, based on one’s current reasoning, are not confirmed, resulting in disequilibrium (Wadsworth 1996); as a conflict between cognitive structure (i.e., an organized knowledge structure in the brain) and an environment (e.g., an experiment, a demonstration, a peer’s opinion, or a book); or as a conflict between conceptions within the cognitive structure (Kwon 1989). Here, cognitive structure means, as Langfield-Smith (1994) describes it, any mental representation used to organize knowledge, beliefs, values, or other data, whether hypothetical or neurological, and mental representation means a mental model: a dynamic representation that integrates recognized external information with individual knowledge (Lee et al. 2005; Vosniadou and Ioannides 1998). Taking what we find to be the most salient and common points in these definitions and adding specificity, we define cognitive conflict as a perceptual state of discrepancy between one’s mental model and recognized external information (internal-external conflict), or between different mental models within one’s cognitive structure (internal conflict).

Many researchers have described how cognitive conflict arises. For example, Strauss (1972) describes two kinds of cognitive conflict (in his words, disequilibrium). One is an external, adaptational disequilibrium by means of a prediction-outcome conflict. The other is an internal, organizational disequilibrium through structural-mixture conflict. Sigel (1979) describes three different kinds of cognitive conflict (his word, discrepancy): internal cognitive conflict (between two competing ideas), external social conflict (between two external events or sources of information), and internal-external conflict (between an internal and external event). Kwon (1989) presents three kinds of cognitive conflict, using Hashweh’s (1986) analysis on metacognitive conflict.

Based on literature review and case studies, Lee et al. (2003) have proposed the cognitive conflict process model (see Fig. 1). According to this model, cognitive conflict requires that a student both has a preconception and believes that he or she is being confronted with an anomalous situation. If either the preconception or the anomalous situation is lacking, there is no cognitive conflict. In this model, cognitive conflict is considered to be a psychological state generated when a learner is confronted with an anomalous situation. In this state, the learner (1) recognizes an anomalous situation, (2) expresses interest and/or anxiety in resolving the cognitive conflict, and (3) engages in a cognitive reappraisal of the situation to resolve this conflict. Thus, this model assumes four psychological constructs in cognitive conflict: recognition of an anomalous situation, interest, anxiety, and cognitive reappraisal.
Fig. 1 Cognitive conflict process model

When a learner recognizes that a situation is incongruous with his or her conception, he or she should become interested in and/or anxious about resolving the incongruity. Simultaneously, the learner will try to resolve the cognitive conflict in any way possible, which requires a cognitive reappraisal of the conflict situation.

This model assumes that the stronger this psychological state, the higher the levels of cognitive conflict experienced by the learner. Different students might experience different cognitive conflict types. For example, some students might experience cognitive conflict with strong feelings of both interest and anxiety, while other students might experience cognitive conflict with strong feelings of anxiety but little interest.

Results of the attempts to resolve cognitive conflict are expressed here as response behaviors. There are a large number of studies on students’ responses to anomalous data. We review these studies in the next section. The taxonomy of students’ responses to anomalous data proposed by Chinn and Brewer (1998) is a representative example of response behaviors. We initially designed our research according to this model to investigate the relationships between cognitive conflict and students’ responses to anomalous data.

Students’ Responses to Anomalous Data

Anomalous data play an important role in science learning and have been used widely in science teaching for promoting conceptual change (Lin 2007). Many researchers have reported examples of students’ responses to anomalous data. In some cases, they refer to the diverse student responses to anomalous data as response types. For example, Posner et al. (1982) found that an individual who is faced with an anomaly has several alternatives: (a) rejection, (b) lack of concern with experimental findings, (c) compartmentalization of knowledge.

Chinn and Brewer (1993) developed a theoretical framework for understanding how people respond to anomalous data, proposing seven psychological responses among which a scientist or student would choose when confronted with anomalous data: ignoring, rejection, exclusion, abeyance, reinterpretation, peripheral theory change, and theory change. In 1998, they tested a taxonomy of the seven possible responses to anomalous data that they had proposed in 1993. Trumper (1997) studied the implications of instructional strategies that attempted to create cognitive conflict in order to produce conceptual change. In his study, he observed different responses to anomalous situations by students.

Chan et al. (1997) examined how individuals and peers process scientific information that contradicts what they believe, linking this activity to conceptual change. In interviewing students, they found that knowledge-building activities included sub-assimilation, direct assimilation, surface-constructive activity, implicit knowledge building, and explicit knowledge building. They found that the level of knowledge-processing activity exerted a direct effect on conceptual change and that these activities mediated the effects of conflict.

Niaz (2001) investigated cognitive conflict resolution strategies (namely, responses to contradiction) used by students in solving problems of chemical equilibrium. Participants were freshman students enrolled in Chemistry II for science majors at a university in Venezuela. In this study, Niaz found that students using conflict resolution strategies do accept and explain the anomalous data but may still preserve the central hypotheses of their alternative conceptions. This finding coincides with the assimilation strategy (Lee and Kwon 1999) and with peripheral theory change (Chinn and Brewer 1993).

Kang et al. (2004) operationalized cognitive conflict as the degree of dissatisfaction a student exhibits with his/her existing conception after being presented with a discrepant event. However, they did not test the degree of cognitive conflict directly. Instead, they supposed that responses to a discrepant event might be one of the indicators for quantifying cognitive conflict. Accordingly, they tried to test cognitive conflict indirectly by testing responses to anomalous data. For instance, a cognitive conflict score of “0” was assigned to responses such as rejection and reinterpretation, and a score of “2” was assigned to responses such as peripheral belief change.

Lin (2007) investigated the possible responses to anomalous data obtained from experiments that are repeatable by carrying out additional or alternative experiments in the laboratory. Two hundred undergraduate students participated in the study. Experiments were carried out by groups of students, each of which consisted of two to six members. Lin describes nine categories of responses, including one she calls uncertainty about interpretation of the data. Lin also assesses two categories of processing: anomalous data-final response (A-R process) and anomalous data-[mediators with/without intermediate response]-final response (A-M-R process).

In reviewing the literature, it is clear that there is a substantial body of research on students’ responses to anomaly, and there is a good deal of agreement among the research findings; cognitive conflict is clearly one of the main factors influencing how people respond to anomalous data. However, to date no study has directly investigated the relationship between cognitive conflict and responses to anomalous data. Thus, the purpose of the present study was to identify the relationship between cognitive conflict and responses to anomalous data. Specifically, we investigated the relationship between the cognitive/affective aspects of cognitive conflict and the diverse types of responses that are introduced in the cognitive conflict process model.

In this study, the topic of ‘force and motion in mechanics’ was chosen for several reasons. First, the field of mechanics is a paradigm for the ‘alternative framework’ viewpoint (Mildenhall and Williams 2001). In particular, force and motion in mechanics provides the clearest case of an alternative and coherent characteristic of student conceptions, which is mostly pre-Newtonian. Many researchers have found a considerable number of conflicts between student conceptions and scientific understanding of ‘force and motion’ (Lee and Kwon 1999; McDermott et al. 1994; Rosenquist and McDermott 1987; Stinner 1994; Trowbridge and McDermott 1981). Thus, ‘force and motion’ provides a good topic for exploring how students’ cognitive conflicts are related to their responses to anomalous data on the motion of an object.

Methods and Procedure

As we presented in the previous section, the purpose of this research was to examine the relationship between cognitive conflict and students’ responses to anomalous data in the conceptual change process. To fulfill this purpose, our study consisted of three phases (see Fig. 2). In the first phase, we investigated students’ preconceptions by administering a written test involving a track problem (viz., the brachistochrone problem). After this phase, we presented a demonstration that might contradict the ideas of students who answered the problem incorrectly. In the third phase, we investigated students’ cognitive conflict levels and responses with a written Cognitive Conflict Level Test (CCLT). Students were asked to indicate whether they agreed or disagreed with each statement on the Cognitive Conflict Level Test (Lee et al. 2003) on a 5-point scale. After these three phases, we conducted individual interviews with four students (two males and two females) who participated in the interviews voluntarily.
Fig. 2 Research procedure

Participants

Participants were 96 ninth-grade students in a large, comprehensive secondary school in a major metropolitan city in South Korea. There were 53 males and 43 females in the study. The student population is heterogeneous, consisting of students from low, middle, and middle-high socioeconomic backgrounds. All had studied the basic scientific concepts of force and motion at school. However, for analysis, we used only the data from students who had the possibility of experiencing cognitive conflict, the 46 students who answered the question on the preconception test incorrectly and then took part in a demonstration of anomalous data.

Demonstration

A demonstration that is effective for creating cognitive conflict should be sufficiently simple that students can easily understand the situation and the question related to it. On the other hand, an effective demonstration must challenge students’ ideas once they engage in inquiring about it. To meet these conditions, a demonstration should present a new context while activating students’ intuitive ideas rather than ideas memorized or recalled from previous rote learning.

As we presented in the Theoretical Bases section, the history of studies on students’ conceptions of ‘force and motion’ goes back more than 30 years, and there are many reports and learning materials on ‘force and motion’. Thus, it is not easy to find a new physics problem involving ‘force and motion’. After searching for relevant problems in the field of mechanics, we found that the brachistochrone problem (namely, the “two track problem”) would be suitable for our demonstration. The brachistochrone problem is one of a number of counterintuitive results that indicate the gulf between commonsense intuitions or judgements and those of science (Matthew 1999).

This particular problem, to find the curve joining two points along which a frictionless bead will descend in minimal time, is typically introduced at the college level. However, the statement of this problem is easily understood, even by secondary school students, when phrased in a more familiar context (Haws and Kiser 1995). Even at the college level, an engaging way to present this topic is to begin with students’ intuition, because many students have intuitive ideas on this subject regardless of their academic performance. Haws and Kiser (1995) showed that the brachistochrone problem can sometimes result in unexpected and amusing responses from students. Even though secondary students could not fully solve it, they were able to share in the stimulating experiences generated by the study of this particular problem, which marked the starting point of the ‘least action principle’ developed by great physicists and mathematicians.

We presented students with tracks with balls at the top (see Fig. 3). A ball was held at the top of each track. We asked the students to predict which track would be faster if the balls were released at the same time (the balls would be released, not pushed). Immediately after the students answered the question, we completed the demonstration.
Fig. 3 Track problem
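
To illustrate why the demonstration result is counterintuitive yet physically expected, the following sketch (our own illustration, not part of the original study; the track geometry and function names are assumptions) compares descent times on two idealized, frictionless tracks built from straight segments: a single straight ramp and a track that dips steeply at first. By energy conservation, the dipping track gives the ball a higher speed over most of its length, so it arrives first even though its path is longer.

```python
import math

G = 9.8  # gravitational acceleration (m/s^2)

def descent_time(points):
    """Time for a ball released from rest to slide, without friction, along a
    piecewise-linear track given as a list of (x, y) points. The speed is
    assumed to carry over unchanged at each joint between segments."""
    v = 0.0            # current speed along the track
    total_time = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        a = G * (y0 - y1) / length           # acceleration along the segment
        if abs(a) < 1e-12:                   # level segment: constant speed
            t = length / v
        else:                                # solve length = v*t + a*t^2/2
            t = (-v + math.sqrt(v * v + 2.0 * a * length)) / a
        v += a * t
        total_time += t
    return total_time

# Both tracks start at (0 m, 1 m) and end at (2 m, 0 m).
straight_ramp = [(0.0, 1.0), (2.0, 0.0)]
dipping_track = [(0.0, 1.0), (0.5, 0.1), (2.0, 0.0)]  # steep drop, then nearly flat

print(f"straight ramp : {descent_time(straight_ramp):.3f} s")
print(f"dipping track : {descent_time(dipping_track):.3f} s")  # shorter time
```

With these example numbers, the dipping track takes roughly 0.84 s versus about 1.01 s for the straight ramp, the kind of result that contradicts the common intuition that the “shorter” straight path must be faster.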

Data Collection and Analysis

Cognitive Conflict Level Test

The Cognitive Conflict Level Test (CCLT) is an instrument that was developed based on the Cognitive Conflict Process Model and that is used to measure the degree of cognitive conflict of a student confronted with anomalous data. The validity and reliability of CCLT were assessed in a previous study (Lee et al. 2003). The validity of the CCLT was supported by the results of a factor analysis in which the four main factors completely coincided with the four measurement components proposed as the constructs of cognitive conflict and explained 72.31% to 77.73% of the total variance.

Content validity was assessed by six experts in science education, who rated the validity of each item on a 5-point Likert scale. The content validity coefficients among the experts ranged from .85 to .97, with a mean of .93. The reliability of the instrument was assessed by calculating internal consistency (Cronbach’s alpha) for each subset (α = .69 to .87) and for the total test (α = .86 to .91).

In its early version, the CCLT consisted only of a cognitive conflict test (12 items) with no statements about students’ response types. Table 1 shows the 12 items: three items for each of the four components of cognitive conflict. Each item is rated on a 5-point Likert scale (0 = “not at all true,” 4 = “very true”). Thus, using the 12 items, we could compute not only the total cognitive conflict score (maximum 48: 4 points × 12 items) but also the score of each component (maximum 12: 4 points × 3 items), which was used for classifying cognitive conflict types.
Table 1

Components and items of the Cognitive Conflict Level Test (CCLT)

Measurement Components

Test Items

Recognition of contradiction

1. When I saw the result, I had doubts about the reasons.

2. When I saw the result, I was surprised by it.

3. The difference between the result and my expectation made me feel strange.

Interest

4. The result of the demonstration is interesting.

5. Since I saw the result, I have been curious about it.

6. The result of the demonstration attracts my attention.

Anxiety

7. The result of the demonstration confuses me.

8. Because I cannot solve the problem, I am uncomfortable.

9. Because I cannot understand the reason for the result, I feel depressed.

Cognitive reappraisal of the situation

10. I would like to ascertain further whether my idea is incorrect.

11. I need to think about the reason for the result a little longer.

12. I need to find a proper basis for explaining the result.
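
For concreteness, here is a minimal sketch of the scoring arithmetic described above: each of the 12 items is rated 0 to 4, the three items of a component are summed (maximum 12 per component), and the four component scores are summed to give the total (maximum 48). The data layout and function name are our own illustrative assumptions, not part of the CCLT instrument.

```python
COMPONENTS = ["recognition", "interest", "anxiety", "reappraisal"]

# Items 1-3 -> recognition, 4-6 -> interest, 7-9 -> anxiety, 10-12 -> reappraisal
def score_cclt(ratings):
    """ratings: list of 12 integers (0-4), one per CCLT item, in item order."""
    assert len(ratings) == 12 and all(0 <= r <= 4 for r in ratings)
    component_scores = {
        name: sum(ratings[3 * i: 3 * i + 3])      # 0-12 per component
        for i, name in enumerate(COMPONENTS)
    }
    total = sum(component_scores.values())        # 0-48 overall
    return component_scores, total

# Example: a student who is surprised and curious but not anxious.
components, total = score_cclt([3, 4, 3, 4, 3, 4, 1, 0, 1, 3, 4, 3])
print(components, total)   # recognition 10, interest 11, anxiety 2, reappraisal 10; total 33
```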

We assumed there could be diverse cognitive conflict types because cognitive conflict has four different components (recognition, interest, anxiety, reappraisal) and the type of cognitive conflict could be different depending on the way that each component arose. According to the cognitive conflict process model, we devised a criterion (the strength of each component in the cognitive conflict) and classified the types of cognitive conflict. For example, if the score of a component, which is the total score of three items in a component of CCLT, was over 4.9 (the average score of a component in the cognitive conflict test), we defined the strength of the component as strong; otherwise, it was considered weak. According to this classification, arithmetically, there were 16 possible cognitive conflict types. In other words, we used the average score (4.9) as the cutting score to define the strength of a component of CCLT as strong or weak. The average score was calculated by first dividing the overall CCLT score by the number of participants and then dividing that figure by the number of components (4). Because the median is often used as the recommended standard for the cutting score (Cizek and Burg 2006; p. 250), we also tried the median score (4.5) as the cutting score to define the strength of a component of CCLT as strong or weak. However, regardless of our choice of cutting score (4.9 or 4.5), the number of possible cognitive conflict types and their distribution (see Fig. 5) are the same because the score of a component can only be an integer from 0 to 12.
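
A sketch of the type classification just described: a component is labelled strong if its score exceeds the cut-off (4.9, i.e., 5 or more for integer scores; the median, 4.5, gives the same split), and the resulting strong/weak pattern is one of the 16 arithmetically possible types. The mapping of patterns to the letter labels A to F follows Table 3; every other pattern is grouped as Type G. Function names and data structures are our own illustrative assumptions.

```python
COMPONENTS = ["recognition", "interest", "anxiety", "reappraisal"]
CUTOFF = 4.9   # mean component score used as the cut-off; the median (4.5) gives the same split

def conflict_pattern(component_scores, cutoff=CUTOFF):
    """Strong/weak pattern (True = strong) in the order recognition, interest, anxiety, reappraisal."""
    return tuple(component_scores[name] > cutoff for name in COMPONENTS)

# Strong/weak patterns reported as Types A-F in Table 3; every other pattern
# occurred in fewer than three cases and was grouped as Type G.
TYPE_LABELS = {
    (True,  True,  False, True ): "A",
    (True,  True,  True,  True ): "B",
    (False, False, False, False): "C",
    (True,  True,  False, False): "D",
    (False, False, False, True ): "E",
    (True,  False, False, False): "F",
}

def conflict_type(component_scores):
    return TYPE_LABELS.get(conflict_pattern(component_scores), "G")

# Example: strong recognition, interest and reappraisal, weak anxiety -> Type A
print(conflict_type({"recognition": 10, "interest": 11, "anxiety": 2, "reappraisal": 10}))
```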

In this study, we also added three items to the CCLT instrument in order to test the response types of students who were confronted with an anomalous situation. These three items are as follows:
  • I accept the result of the demonstration as valid.

  • I can offer an explanation for the result of the demonstration.

  • I have changed my previous idea (conceptions).

These three items were developed based on the questions in Table 2, the taxonomy of responses to anomalous data adapted from Chinn and Brewer (1998).
Table 2

Taxonomy of responses to anomalous data

Response type | Does the individual accept the data as valid? | Does the individual offer an explanation for the data? | Does the individual alter their current theory?
Ignoring | No | No | No
Rejection | No | Yes | No
Uncertainty | Undecided | No | No
Exclusion | Yes or no | No | No
Abeyance | Yes | Undecided | No
Reinterpretation | Yes | Yes | No
Peripheral theory change | Yes | Yes | Yes, partially
Theory change | Yes | Yes | Yes, completely

Adapted from Chinn and Brewer (1998: 646)
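
To make the decision procedure concrete, the following Python sketch (our own illustration; the function names and the exact lookup structure are assumptions) combines the three added items with the Table 2 taxonomy, using the rating thresholds described in the next paragraph (0-1 = no, 2 = undecided, 3-4 = yes).

```python
def rating_to_answer(rating):
    """Map a 0-4 rating on one of the three items to a categorical answer
    (0-1 = no, 2 = undecided, 3-4 = yes)."""
    return "no" if rating <= 1 else ("undecided" if rating == 2 else "yes")

# Decision table following Table 2 (Chinn and Brewer 1998):
# (data accepted as valid?, data explained?, theory changed?) -> response type.
# "Exclusion" is listed as "yes or no" for the first question in the original;
# only the "yes" case is keyed here, since ("no", "no", "no") is already "ignoring".
CHINN_BREWER = {
    ("no",        "no",        "no"):  "ignoring",
    ("no",        "yes",       "no"):  "rejection",
    ("undecided", "no",        "no"):  "uncertainty",
    ("yes",       "no",        "no"):  "exclusion",
    ("yes",       "undecided", "no"):  "abeyance",
    ("yes",       "yes",       "no"):  "reinterpretation",
    ("yes",       "yes",       "yes"): "theory change (peripheral or complete)",
}

def classify_response(valid, explain, change):
    """valid, explain, change: the 0-4 ratings on the three added CCLT items."""
    key = tuple(rating_to_answer(r) for r in (valid, explain, change))
    # Combinations outside Table 2 (e.g. yes / no / yes) correspond to the new
    # patterns reported in this study, such as superficial theory change (Table 4).
    return CHINN_BREWER.get(key, f"new pattern {key}")

print(classify_response(4, 4, 1))   # -> reinterpretation
print(classify_response(4, 0, 4))   # -> new pattern ('yes', 'no', 'yes'), i.e. superficial theory change
```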

Participants rated their belief in each item on a 0 to 4 scale. Their ratings were divided into three categories: no (0 to 1 point), undecided (2 points), and yes (3 to 4 points). According to these scores on the three items (acceptance of the anomalous data as valid, ability to offer an explanation for the data, and theory change), students’ response types were decided. In doing so, we first followed the definitions of the response codes in Table 2, developed by Chinn and Brewer (1998), as follows:
  1. Ignoring means that the learner does not believe the data and does not try to explain the data away. This response results in no theory change for the learner.
  2. Rejection means denying that the data are valid. Here, the learner may offer an explanation for why the data are not valid, appealing to methodological flaws or other procedural errors.
  3. Uncertainty means raising a question about whether one should believe the data or not.
  4. Exclusion means that the learner excludes the data; he or she feels no need to explain the data because the learner considers the data to be irrelevant to his or her theory.
  5. Abeyance means that the learner accepts the data as valid but holds the explanation in abeyance, believing that his or her theory ought eventually to be able to explain the data.
  6. Reinterpretation means that the learner explains the data within the current theoretical framework without changing his or her current theory.
  7. Peripheral theory change means that the learner accepts the data as valid and explains the data by making minor changes to his or her theory without giving up any core components of the theory.
  8. Theory change means that the learner abandons his or her current belief completely.

Individual interview

After we conducted the CCLT with 96 students, we interviewed four students who were volunteers (two males and two females) to more deeply understand students’ cognitive conflicts and their responses to the demonstration. The interview questions were based on the CCLT questions. We asked additional questions to clarify students’ answers. For instance, “Why did you answer this way?”, “Could you explain your reasons for your answer in more detail, please?”, and “What were the criteria for your choice?”

To maintain the reliability of our data analysis, the authors (the two raters) independently classified a randomly selected subset of the CCLT data (20% of all data) by type of cognitive conflict and type of response and then identified the relationships between them. Discrepancies between the raters were discussed and resolved. The remaining data were then analyzed by the second author, and during this analysis the raters met to establish consensus as needed.

Findings and Discussion

Cognitive Conflict Scores

We measured cognitive conflict scores using the CCLT and found that all students who gave incorrect answers experienced cognitive conflict. However, their cognitive conflict scores were diverse, ranging from 4 to 46. This indicates that anyone who is confronted with a discrepant event may experience cognitive conflict, even though the strength and state of that conflict may differ. The distribution of students’ cognitive conflict scores is shown in Fig. 4. The average cognitive conflict score is 19.0 (SD 10.9), below the midpoint of the scale (24). Thus, students experienced cognitive conflict very differently despite observing the same demonstration result. What accounts for the differences among individual cognitive conflict scores? Various factors are involved; as Limón (2001) indicated, these may include learners’ prior knowledge, motivation, beliefs, and factors related to social context. One student explained the reason for her low score on the CCLT by saying, “I am not good at science. Biology would be OK but I cannot understand physics…Because I am not interested in science, I am not confused by the demonstration result (which is inconsistent with my previous idea).”
Fig. 4 The distribution of students’ cognitive conflict scores

Cognitive Conflict Types

In this study, we found seven types of cognitive conflict (see Table 3). Among these seven types, Type A indicates that three of the four components of cognitive conflict are strong and only anxiety is weak. Type B indicates that all four components are strong, and Type C indicates that all four components are weak. In Type E, only reappraisal is strong. Type G groups the remaining patterns, each of which occurred in very few cases (fewer than three).
Table 3

Criteria for classifying the cognitive conflict types

Cognitive conflict type | Recognition | Interest | Anxiety | Reappraisal
A | Strong | Strong | Weak | Strong
B | Strong | Strong | Strong | Strong
C | Weak | Weak | Weak | Weak
D | Strong | Strong | Weak | Weak
E | Weak | Weak | Weak | Strong
F | Strong | Weak | Weak | Weak
G | – | – | – | –

Strong means a component score of 5 or above (i.e., above the 4.9 cut-off); weak means a score from 0 to 4.

Type G groups the remaining patterns, each with very few cases (fewer than three).

Figure 5 shows the distribution of cognitive conflict types. Each type accounts for less than 20% of the total, and the most prevalent type is Type C (19%). Students in Type A experienced cognitive conflict without anxiety. Type B is the type in which all components of cognitive conflict are consistently strong. Thirty percent of students were in either Type A or Type B; these students experienced cognitive conflict more strongly than others. The percentages of Types D, E, and F are 11%, 7%, and 13%, respectively. Across the cognitive conflict types, recognition is the most prominent of the four components, while anxiety is the least.
Fig. 5 The distribution of cognitive conflict types. Note: O = strong, X = weak; e.g., Type A: recognition (O), interest (O), anxiety (X), reappraisal (O). G: very few cases (fewer than three)

A Taxonomy of Students’ Responses

We illustrate the relationship between cognitive conflict and responses within the cognitive conflict process model. Students’ response types are distinct from the cognitive conflict types, since responses follow the experience of cognitive conflict. Chinn and Brewer (1998) proposed eight possible responses to anomalous data. Based on her research on undergraduate experiments, Lin (2007) added a new response, uncertainty about interpretation of the data.

In our study, we used a demonstration, which differs from the form of anomalous data used in previous studies. Figure 6 shows the distribution of students’ response types. There are ten (possible) types of student responses. Among these, nine were proposed in previous studies (Chinn and Brewer 1998; Lin 2007). The remaining type, identified in our research, is superficial theory change, a new response to anomalous data.
Fig. 6 The percentages of students’ response types

As shown in Fig. 6, superficial theory change stands out among the response types. Many students could not explain the result of the demonstration, but they were willing to accept it and, in response, altered their earlier ideas. They thought that results presented by the teacher were always right. This phenomenon can be seen in the following dialogue.

Interviewer: Please explain what the criterion was when you decided if the data were valid.

Student 1: If I listened to the explanations from my teacher or textbooks, I could decide.

Interviewer: What about experiment? Do you usually accept the data of an experiment in the lab?

Student 1: Yes [I do]. I have not often thought about this [issue], [but] I think I accept what my teacher shows as valid.

He added, “Though I do not know and cannot explain the reason for the [demonstration] result, I believe that the result shown by the teacher will always be correct.” We found similar ideas in other students’ answers: “Because I observed the result with my own eyes, I changed (my initial idea)” and “I should believe because that was the result of the experiment.” Here, it is worth clarifying what ‘theory change’ means in the ‘superficial theory change’ response. As outlined in the “Methods and Procedure” section, we classified students’ theory changes based on how they rated their ideas about the question, “Is the initial theory changed?” according to the following scale: no (0 to 1 point), undecided (2 points), and yes (3 to 4 points). In other words, theory change was assessed by students’ self-reports (their ratings and reasons), as in previous studies (Chinn and Brewer 1998; Lin 2007). This student-side sense of ‘theory change’ may differ from the researcher-side sense of theory change, which is commonly framed in terms of the well-known conditions for conceptual change: dissatisfaction with the existing conception and the intelligibility, plausibility, and fruitfulness of the new one.

How does a student who has accepted the demonstration result as valid but who cannot offer an explanation for it alter his/her current theory? Mason (2001) found that acceptance of anomalous data is related to students’ beliefs about certain knowledge handed down by authority, and this finding is consistent with our results. As other previous studies (e.g., Elby and Hammer 2001; Hammer 2000) have found, students have different beliefs (namely, epistemological beliefs) about learning. For instance, some students may believe learning consists of memorizing facts, data, and formulas provided by the teachers. These students accept knowledge from (external) authority rather than constructing their own knowledge. In our study, students in the superficial theory change category are assumed to be those with naive beliefs about learning, easily accepting anomalous data and changing their previous theory without critical consideration. Conversely, epistemologically sophisticated students may try to apply new knowledge and modify their own understandings.

Thus, there is a difference between superficial theory change and other responses, such as peripheral theory change, uncertainty, and abeyance. In the case of peripheral theory change, we assume that students accept the result as valid, offer their own explanation for the result, and alter their current theory (partially). In the cases of uncertainty and abeyance, students answer ‘no’ or ‘undecided’ about theory change. Students in these two cases may need more time to resolve their uncertainty or to explain the data completely before they decide to change their initial theories.

Students in superficial theory change, however, always answer ‘yes’ about theory change. Despite accepting the result as valid and changing their current theories, they could not offer their own explanations for the result; they simply had strong confidence in the result because the demonstration was conducted by a teacher whom they viewed as possessing high authority. In some cases of superficial theory change, even though students changed their previous ideas, they did not indicate whether they accepted the data as valid or whether they could explain the data.

Thus, the biggest difference between superficial theory change and other responses, such as peripheral theory change, uncertainty, and abeyance, is that superficial theory change occurs without sufficient conditions for the change: the ability to validate and explain anomalous data. This would seem to be an irrational response. However, under certain circumstances, theory change may not follow a logical pattern of reasoning (Niaz 2006). This may be one reason for the difficulty in successfully guiding students’ conceptual changes.

In addition to the superficial change, we found specific examples that had not been proposed by other researchers. For instance, as seen in Table 4, many students’ responses included “uncertainty (undecided).” Some students did not accept the demonstration result as valid and did not offer an explanation for the data; however, they did not determine whether their current ideas should change. These students are classified in the “ignoring” category. We have not identified similar responses in previous studies (Chinn and Brewer 1998; Lin 2007). Researchers and instructors should understand and address the fact that many students have some degree of uncertainty not only in their ideas about the problem but also in their thought processes and responses that follow the demonstration.
Table 4

A taxonomy of responses to anomalous data presented by demonstrations

Category of response | Are the data accepted as valid? | Are the data explained? | Are the initial concepts changed?
Ignoring | No | No | No
Ignoring | No | No | Undecided
Rejection | No | Yes | No
Uncertainty of validity | Undecided | No | No
Uncertainty of interpretation | Undecided | Yes | Undecided
Uncertainty of interpretation | Undecided | No | Undecided
Uncertainty of interpretation | Undecided | Undecided | No
Exclusion | Yes | No | No
Abeyance | Yes | Undecided | No
Abeyance | Yes | No | Undecided
Reinterpretation | Yes | Yes | No
Peripheral theory change | Yes | Yes | Yes, partially
Theory change | Yes | Yes | Yes, completely
Superficial theory change | Yes | No | Yes
Superficial theory change | Undecided | Undecided | Yes
Superficial theory change | Yes | Undecided | Yes
Superficial theory change | No | No | Yes

Bold letters: new type or new examples that are different from previous studies.

As shown in Fig. 6, superficial theory change was the most common response. The next most frequent response was ignoring. Abeyance and uncertainty of validity/interpretation follow. The students with these types of responses hesitated to make a decision about their responses to the demonstration result. The exclusion response accounted for 8% of the total, and theory change, the response in which students’ ideas were changed, accounted for 6%. Rejection, reinterpretation, and peripheral theory change were not found in this study.

In summary, our results concerning students’ responses to anomalous data expand on previous research (Chinn and Brewer 1998; Lin 2007). We found a new response to anomalous situations, superficial theory change, and some examples that can be classified as “undecided” in some response types. There may be several reasons for this inconsistency. For example, differences in culture, teachers’ self-efficacy or esteem, age of subjects, methods of presenting anomalous data (articles, experiments, demonstrations), and testing methods may explain why students’ responses in the present study were not consistent with previous research.

The method of presenting anomalous data, in particular, appears to account for the greatest difference between the present study and previous research. For instance, one previous study (Chinn and Brewer 1998) presented students with two opposing science articles that contained enough explanation to be reasonable to the students, while in our study, students watched demonstrations without any text. In this latter situation, many students commented on the demonstration result as having authority. In our study, one student said, “I cannot doubt this result because my idea is my own but the result (that my teacher presented) is true.” In the previous study, the two opposing science articles may have had equal authority to many students, while in our study, the demonstration results had higher authority than students’ own ideas. Thus, students' different epistemological beliefs affect the conceptual change process initiated by anomalous data in different ways.

Based on these results, we explored the specific relationship between cognitive conflict and responses to anomalous data. Our findings are presented in the next section.

Cognitive Conflict Scores and Response Type

Figure 7 shows the cognitive conflict scores by response type. Ignoring, abeyance, and exclusion have lower cognitive conflict scores, whereas the uncertainty group (uncertainty of validity/interpretation) has higher scores. A possible explanation is that, at the beginning of the learning process, students may need guidance that helps them understand the data, interpret the result, and develop their previous ideas more clearly. The change group (theory change and superficial theory change) has the next highest cognitive conflict score. However, a one-way ANOVA with post-hoc comparisons showed no statistically significant difference among the scores of the three groups (the uncertainty group, the theory change group, and the other group). Here, we can only suspect that cognitive conflict may have both constructive (e.g., theory change or uncertainty) and destructive (e.g., superficial theory change) effects on student learning. To understand the relationship between cognitive conflict and responses, researchers and instructors must understand cognitive conflict in more detail, beyond its scores. We consider the types of cognitive conflict and their relationships to the responses in the next section.
Fig. 7 Cognitive conflict scores according to students’ response types

Cognitive Conflict Types and Responses

To understand the results given in Fig. 7 more clearly, we considered cognitive conflict types, which depend on the strength of each component of cognitive conflict, rather than the cognitive conflict scores. Figure 8 shows the percentage of students’ responses by cognitive conflict type. The response patterns differ across cognitive conflict types. The response patterns of Types B and C, which have the highest and the lowest cognitive conflict scores respectively, are similar. Conversely, the response pattern of Type A is very different from that of Type B, even though the only difference between them is that Type A indicates weaker anxiety. Type A includes theory change (it is the only type with theory change), less superficial theory change, and an absence of ignoring. This means that higher cognitive conflict with less anxiety had a more constructive effect on responses to anomalous data.
Fig. 8 The percentage of students’ responses by cognitive conflict type

As shown in Fig. 8, the percentage of ignoring increases from Type A through Type F. Among the three types with the lowest cognitive conflict (D, E, F), Type E has the highest percentage of abeyance and no superficial theory change. Type E has weak recognition, interest, and anxiety but strong reappraisal, while Types D and F have weak reappraisal. This suggests that reappraisal may help students think more critically rather than merely intuitively or superficially.

In summary, Type B represents a high level of cognitive conflict because all of its components are strong. The response pattern of students in this group is similar to that of students in Type C, which represents a low level of cognitive conflict. Thus, the effects of cognitive conflict in the high and low groups are similar (as shown in Fig. 8). However, the effect of cognitive conflict is more constructive when it arises at an appropriate level. These results appear consistent with the Yerkes-Dodson law (Yerkes and Dodson 1908). Gagne et al. (1993) summarized this law as follows:

There is a curvilinear relationship between arousal and performance such that, starting from a very low level of arousal and going to a moderate level, performance increases. Then any further increases in arousal cause performance decrements. Thus there is an optimal level of arousal. (p. 430)

Why is there an optimal level of arousal (as we found for cognitive conflict) in the Yerkes-Dodson law (1908)? Our findings may provide an explanation.

If cognitive conflict was not aroused at an appropriate level, students either were not motivated or felt strong anxiety. In particular, anxiety was a cause of inordinately strong cognitive conflict, leading to a destructive effect, as shown in the relationship between Type B and its responses in Fig. 8. This psychological state, anxiety, would negatively affect students’ learning. Thus, we can assume that anxiety, as a component of cognitive conflict, is a key to determining whether the effects of cognitive conflict on responses to anomalous data are constructive or destructive.

From these results, we found that the desired response (theory change) occurred only when cognitive conflict was adequately aroused. Thus, we must consider the importance of the affective aspects of cognitive conflict (e.g., anxiety) beyond the cognitive aspects that the cognitive conflict process model emphasizes.

In this study, we considered both the inner structure of cognitive conflict and the outer range of its effect. As we proposed in the cognitive conflict process model (Lee et al. 2003), cognitive conflict has an inner structure consisting of recognition, interest, anxiety, and reappraisal. Thus, we believed that the role of cognitive conflict should be interpreted based primarily on understanding its inner structure. Second, because cognitive conflict is an event that is aroused at the beginning of learning, we need to consider its effect at close range (the range at which cognitive conflict has a direct effect), in the initial stage of learning triggered by cognitive conflict. In other words, it is reasonable to focus on the first step of the process of conceptual change rather than the final results when we investigate and discuss the effects of cognitive conflict.

Conclusions

Many researchers have reported that merely presenting a discrepant event to students does not necessarily induce cognitive conflict (Chinn and Brewer 1998; Gorsky and Finegold 1994; Kang et al. 2005; Lin 2007). The present study may help develop the discussion on this issue in more detail.

Overall, our research findings can be summarized as follows:
  • First, all of the students who gave incorrect answers experienced cognitive conflict even though the ranges of their cognitive conflict scores are very diverse.

  • Second, we found seven response types to anomalous data. The superficial theory change was new and was the most common response.

  • Third, the cognitive conflict scores were related to the students’ responses to the anomalous data. The primary responses in the lower scoring groups included ignoring, exclusion, and abeyance. Conversely, the higher scoring groups showed responses such as uncertainty and (superficial) theory change.

  • Fourth, seven types of cognitive conflict were identified according to the four components of cognitive conflict (recognition, interest, anxiety, and reappraisal). Among these, Type A, in which all components but anxiety are strong, was the only type that showed theory change and showed less superficial theory change than the other types.

From these results, we can tentatively conclude that cognitive conflict has affective and cognitive features, and the features of cognitive conflict affect students’ responses to anomalous situations, responses that occur as a result of decision making or efforts to resolve conflict. There is an appropriate level of cognitive conflict that has constructive potential; if students experience too low or too high a level of cognitive conflict, the conflict will negatively affect students’ learning. In addition, anxiety is an important component of cognitive conflict for enhancing its effect on conceptual change.

These results appear consistent with Yerkes-Dodson’s Law (Yerkes and Dodson 1908). Gagne et al. (1993) described this law as follows:

The attempts would not work if they generated too much arousal. This might happen, for example, with a student who was very anxious (aroused) about performance in front of a group. A question that produced cognitive conflict (her word, conceptual conflict) would increase this student’s arousal level to a non-productive point. (p. 430)

Finally, our results support the assumption that, although cognitive conflict is no longer viewed as an ultimate teaching method, it is still valuable for initiating the learning process (Zohar and Aharon-Kravetsky 2005).

Further Research

Since Piaget’s work (1963, 1985), many researchers have believed that students are motivated to restructure their conceptions when they experience cognitive conflict. However, as we noted at the beginning of this paper, the role of cognitive conflict in conceptual change is an issue of ongoing debate. Our findings may help expand researchers’ views in ways that promote relevant and comprehensive discussions on the role of cognitive conflict in conceptual change. Since many factors affect student learning, we cannot expect a single factor to control the whole learning process and its result. Of particular note, under certain circumstances the resolution of a conflict may not follow a logical pattern of reasoning but rather a slow process (based on motivational, intuitive, and affective factors) in which the hard-core belief slowly crumbles (Niaz 2006). Further research efforts should expand their scope to encompass these findings.

In addition, we need to pay attention to Limón’s (2001) criticism of the current cognitive conflict paradigm as an instructional strategy: it focuses only on the cognitive aspects of student learning, neglecting many other variables that influence students’ learning in the school setting. She writes, “to induce a meaningful [i.e., essentially constructive] cognitive conflict, students should be motivated and interested in the topic, activate their prior knowledge, have certain epistemological beliefs and adequate reasoning abilities to apply” (p. 374). She also argues that many of the difficulties found in applying the cognitive conflict strategy in the classroom are closely related to the complexity of such factors as motivation, learning strategies, epistemological beliefs, attitude, reasoning abilities, and the teacher and his or her qualities. However, we do not yet know how these factors concretely affect the features of cognitive conflict, nor what other kinds of factors contribute to constructive cognitive conflict. In addition, questions remain, such as those Limón (2001) raised: “How long would it take to induce a constructive cognitive conflict? Under the real conditions of the school setting, to what extent can the cognitive conflict strategy or other conceptual change instructional strategies be applied successfully?” To address these questions, we might need a new theoretical framework that identifies the strengths and weaknesses of existing partial models and integrates the diverse and difficult issues in conceptual change.

Quantifying cognitive conflict is another important issue that researchers need to address in light of the findings of this study. Since Zimmerman and Blom (1983) assessed cognitive conflict, many researchers have tried to quantify it, focusing on response latency (time) and degree of uncertainty (behaviour such as hesitation or surprise). Recently, Kang et al. (2004, 2010) criticized these methods because the relationship between students’ overt behaviours and cognitive conflict is not direct enough to regard such scores as the degree of cognitive conflict. Instead, they treated students’ responses to anomalous data as an indicator for quantifying cognitive conflict: of the seven types of responses, rejection, reinterpretation, and exclusion were rated as “0”; uncertainty was rated as “1”; peripheral belief change and belief decrease were rated as “2”; and belief change was rated as “3”. However, as can be seen in Fig. 8, the relation between students’ responses and cognitive conflict is not simple enough for us to quantify the degree of cognitive conflict on that basis. In addition, as we proposed in the cognitive conflict process model, cognitive conflict and responses are conceptually distinct states. Therefore, even though cognitive conflict and responses are related to each other, it is difficult to show that their relationship is directly reflected in cognitive conflict scores. Thus, we must further develop techniques for assessing cognitive conflict as our understanding of it expands.

Implications for Teachers/Teacher Education

Recently, bringing successful teaching approaches for stimulating conceptual change to normal classrooms has been a major challenge not only for teachers but also for researchers in the field of conceptual change. Treagust and Duit (2008) describe this challenge as follows, “It appears that the gap between what is necessary from the researchers’ perspective and what may be set into practice by normal teachers has increased. Maybe we have to address the paradox that in order to adequately model the teaching and learning process, research alienates the teachers and hence widens the theory-practice gap (p. 324).”

Where does this paradox come from? One of the major reasons may be that conceptual change is a very complex event, as many researchers have argued (Tyson et al. 1997; Venville and Treagust 1998), related to diverse aspects of science classes: epistemological, ontological, affective, and social. In this sense, teaching for conceptual change can be an overly complex affair for a teacher or researcher to manage. Thus, it might be very difficult for any single theory to explain the complex phenomena of conceptual change successfully.

How, then, can we address this paradox? There is a hint for solving this problem: the frameworks of student conceptual change, which have been the predominant focus of research so far, may also provide powerful frameworks for teacher change towards employing conceptual change ideas. For instance, one of our study’s findings, ‘superficial theory change’, can challenge teachers’ and researchers’ general expectations (or common beliefs) about conceptual change. Superficial theory change raises two important questions: (1) Is it possible for students to appear to change their initial ideas by accepting demonstration results without understanding the underlying science? (2) As a result, is it possible for a teacher (or a researcher) to be misled into thinking that students have been taught effectively?

Thus, encountering students’ cognitive conflict and responses challenges teachers and researchers to open their eyes and become more sensitive to students’ learning. The more sensitive a teacher or researcher is to students’ responses (or to other psychological states such as anxiety), the closer his or her knowledge can be to the reality of teaching and learning in classrooms. Finally, this kind of more realistic knowledge can bridge the gap between what is necessary from the researchers’ perspective and what may be set into practice by normal teachers.

When we consider that learning situations are filled with conflicts between students’ previous knowledge and the new information being learned (Johnson and Johnson 1979), teachers may encounter special chances to enhance students’ learning every day. However, this opportunity can be realized only when the teacher helps his or her students experience constructive cognitive conflict and conceptual change by leading them to recognize contradiction, feel interest, maintain relevant anxiety, reflect on their situation, and more. Otherwise, the opportunity may pass by without any positive result. Thus, it is clearly important for a teacher to understand and effectively manage students’ cognitive conflict in class with the help of a well-developed theoretical framework.

The better a teacher understands the features of students’ cognitive conflict and its effects on learning, the better he or she will be able to create and use such opportunities for constructive conceptual change.

On the other hand, students might benefit from discussing ‘cognitive conflict and response’ with researchers, since such discussion would encourage them to become reflective, responsible learners capable of evaluating and managing their own learning.

Copyright information

© Springer Science+Business Media B.V. 2011