Journal of Science Education and Technology, Volume 20, Issue 3, pp 258–269

Model Based Inquiry in the High School Physics Classroom: An Exploratory Study of Implementation and Outcomes

Authors

  • T. Campbell, Utah State University
  • Danhui Zhang, Beijing Normal University
  • Drew Neilson, Logan High School

DOI: 10.1007/s10956-010-9251-6

Cite this article as:
Campbell, T., Zhang, D. & Neilson, D. J Sci Educ Technol (2011) 20: 258. doi:10.1007/s10956-010-9251-6

Abstract

This study considers whether Model Based Inquiry (MBI) is a suitable mechanism for facilitating science as inquiry, one that allows students to develop deep understandings of difficult concepts while also gaining better understandings of science process and the nature of science. The manuscript also compares the time devoted to MBI with that devoted to more traditional lecture and demonstration (TLD) teaching methods, while also revealing the MBI strategy implemented in the physics classroom. Pre-, post-, and delayed-administrations of a revised version of the Physics, Attitudes, Skills, and Knowledge Survey (PASKS) were given to two groups of students: those taught a unit on buoyancy with a TLD instructional strategy (n = 26) and those taught the same buoyancy unit with the MBI instructional strategy (n = 28). The PASKS focuses on student achievement in terms of science content, science process/reasoning, nature of science, and student attitude toward science. Quantitative analyses revealed statistically significant differences across the pre-, post-, and delayed-measures, both overall and on each scale. This indicated improved achievement over time for both groups, overall and on each scale, with the exception of the attitude scale. Additionally, the findings revealed no statistical differences between groups (i.e., TLD & MBI).

Keywords

Model-based inquiry · Physics · Traditional lecture and demonstrations · Student science outcome measures

Introduction

It is an experience most teachers know: having to teach a topic they are not fully comfortable teaching. Why the discomfort? Usually, it stems from not fully grasping the relationships, the cause and effect, or the logical steps made in a proof; put more succinctly, it is the discomfort of having learned something only superficially at best. As we struggle to make sense of a concept, as we try to learn so we can teach, we employ strategies to help make sense of an idea. We engage deeply by considering the concept in multiple contexts; we develop simulations or tests, whether mentally, through mathematical calculations, or through physical experimentation, to see if data can be generated that support our understandings. Metaphorically, we walk around a concept, we lift it up and look under it, we look at it from the backside, we test it to see what it can and cannot do. Only after this do we feel more secure in our understanding, ready to make arguments founded on the personal experiences we have fashioned with the concept.

These are the same struggles and strategies that our students need to experience if they are to develop deep conceptual science understandings. These same struggles and strategies also lend themselves to helping students better understand science processes and the nature of science. To develop deep understandings of concepts, science process, and the nature of science, students need experiences that allow them to engage in the messiness of science. They need to be afforded space, time, and direction along the way for testing their own assumptions. Unfortunately, teachers often expect students to grasp relationships between variables simply because teachers tell them these relationships. They leave students' understanding of science as process tied to stories about great discoveries or a linear, unproblematic scientific method. Similarly, students are often left to pick up on implicit nuances of the nature of science, but from experiences that are far removed from the actual practice of 'doing' science.

What is most commonly found in science classrooms is teachers logically showing students how to arrive at an equation, drawing graphs, or perhaps even drawing a box around an explanation for emphasis. Yet this is the same strategy that was inadequate for us, as teachers, to develop deep understandings. To build our understandings, we looked in different texts and made our own graphs, our own analogies, our own logical steps, our own simple experiments, and finally our own conclusions. Only after this did we feel prepared to teach. Our students are similar, yet teachers often fail to recognize this process of nurtured understanding.

For most students, simply being told, whether by reading or lecture, seeing a demo, or by following a procedure outlined in a laboratory, is insufficient to develop much more than a superficial understanding, yet this is the experience of most students (Campbell and Bohn 2008; O’Sullivan and Weiss 1999; Windschitl 2003). Thorough understanding is tough. It is a recursive process that is somewhat uniquely crafted by each student, but it is this recursive, often iterative process that helps students develop deep understanding. These ideas are consistent with a constructivist framework, which in the broadest sense focuses on the nature of learning as “knowledge forms are said to be fashioned or constructed by learners” (Hruby 2002, p. 585).

When science education is considered within a constructivist framework, the focus of science instruction shifts “to involve students in doing rather than being told or only reading about science” (National Research Council [NRC] 2000, pp. 16–17). Inquiry is one central strategy for engaging students in doing science that is highlighted in the national standards documents and by leading science teaching organizations (American Association for the Advancement of Science [AAAS] 1989, 1993; NRC 1996; National Science Teachers Association [NSTA] 2007). Inquiry is defined as

[A] multifaceted activity that involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results (NRC 1996, p. 23).

But, while standards documents advocate inquiry as an instructional strategy, open inquiry is currently seen as problematic by many science teachers and has not been widely accepted or enacted (Campbell and Bohn 2008; O'Sullivan and Weiss 1999; Settlage 2007; Windschitl 2003). Some reasons cited for these problems are (1) teachers' inability or discomfort in directing or controlling student inquiry, (2) a perception that open inquiry is too time intensive, and (3) a lack of evidence for improved student outcomes (Settlage 2007).
This study emerged as the authors considered these cited problems: the third author, a high school science teacher, and the first author, a university science educator, discussed the concerns associated with inquiry and considered Windschitl and Thompson's (2006) ideas regarding Model Based Inquiry (MBI) as a possible mechanism for addressing them. The study was undertaken to consider whether MBI is a suitable mechanism for facilitating science as inquiry that allows students to develop deep understandings of difficult concepts, while also gaining better understandings of science process and the nature of science. In addition, this manuscript considers time devoted to MBI in comparison to more traditional lecture and demonstration teaching methods, while also revealing the MBI strategy that one practicing teacher implemented in the physics classroom. The following questions were used to guide this research:
  1. Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, and attitudes toward science) for physics classrooms facilitated with MBI when considering pre-, post-, and delayed-student outcome measures over time?

  2. Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, and attitudes toward science) for physics classrooms facilitated with Traditional Lecture and Demonstration (TLD) when considering pre-, post-, and delayed-student outcome measures over time?

  3. Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, and attitudes toward science) when comparing physics classrooms facilitated with differing instructional strategies (MBI and TLD) at the pre-, post-, and delayed-student outcome measure intervals?

Literature Base

Inquiry as an Instructional Strategy

While investigations into the effectiveness of inquiry as an instructional strategy have shown promise for increasing students' understanding of science (Chang and Mao 1999; Ertepinar and Geban 1996; Hakkarainen 2003) and the nature of science (Schwartz et al. 2004), and for increasing students' interest in and attitudes toward science (Cavallo and Laubach 2001; Chang and Mao 1999; Paris et al. 1998), debates remain, as well as problems of enactment. One example of these debates emerged as Settlage (2007) offered the following assertion regarding open inquiry:

Holding open inquiry as the purest form of classroom inquiry and suggesting it is an ideal for which science teachers should strive is a myth… It is impractical to expect teachers to implement open inquiry with any regularity and there is negligible evidence supporting a continued allegiance to a faith in open inquiry (p. 464).

Johnston (2008) challenges these ideas in a response to Settlage (2007) as he argues that

Settlage seemed to neglect that inquiry is not simply a teaching tool, but a teaching goal… It is a scientific endeavor in itself, allowing students to be themselves within a culture of scientific inquiry… The processes embraced by science that allow us to extract explanation from evidence are paramount to a citizen’s understanding of science… Alas, in an era of high-stakes testing in which much of science is stripped of its inquiry processes in favor of content factoids, it must be our obligation to make open inquiry a learning objective in our classrooms.

So, while there seems to be a unified call for inquiry as an instructional strategy coming from national standards documents (AAAS 1989, 1993; NRC 1996) and leading science education organizations (NSTA 2007), there remain those who feel that more research into the benefits of student engagement in inquiry, comparable to that being reported here, is needed.

In addition to the debates about inquiry as an instructional strategy, the following are some documented problems that teachers identify when seeking to employ inquiry as an instructional strategy: (1) lack of clarity with respect to what constitutes inquiry (Bybee et al. 2008), (2) lack of examples of how inquiry is facilitated as an instructional strategy in real classrooms (Settlage 2007), and (3) lack of an explicit association of inquiry with science content (Windschitl et al. 2008). Each of these problems is considered next.

Colburn (2000) argues that the science education community has embraced “no idea more widely than ‘inquiry’ or ‘inquiry-based instruction’… [but] perhaps the most confusing thing about inquiry is its definition” (p. 42). The National Science Education Standards (NRC 1996) offer some clarification by defining two types of inquiry: that which describes teaching and that which describes doing science.

Scientific inquiry refers to the diverse ways in which scientists study the natural world and propose explanations based on the evidence derived from their work. Inquiry also refers to the activities of students in which they develop knowledge and understanding of scientific ideas, as well as an understanding of how scientists study the natural world (p. 23).

Additionally, others have attempted to help resolve this ‘problem of clarity’ by working to define inquiry. As an example, Martin-Hansen (2002) offered insight by “describ[ing] the types of inquiry—open or full inquiry, guided inquiry, coupled inquiry, and structured inquiry—in order to develop an understanding of the different aspects of inquiry among teachers” (p. 34). While those working to clarify what constitutes inquiry should be commended, another step in this clarification, which is also connected to the second problem of enactment we identified, is an authentic example of what inquiry looks like as it is facilitated in real classrooms.

Settlage (2007) declares that while calls for inquiry are dominant, examples of what it looks like in the classroom are rare. He further emphasizes this by pointing out that “the examples provided within the National Science Education Standards of inquiry are fictionalized” (Settlage 2007, p. 465). Other examples of fictional depictions of inquiry can be found (Weld 2002) that, while meant more to offer a vision of what science instruction can look like, can be folded into the category of 'fictionalized inquiry'. Visions of inquiry are valuable insofar as they offer ideas for enactment, but at the same time there is a need for more non-fictional depictions emerging from real classrooms. Such is the case with the depiction of MBI carried out in this research.

The final problem of enactment that we identified was the lack of explicit association of inquiry with science content. Windschitl et al. (2008) best describe this problem:

This disconnect of inquiry from content is not only antithetical to real science; it can place inquiry on the margins of school curriculum, where accountability pressures for learning subject matter makes inquiry seem like a diversion from getting one's fill of 'real science' (p. 946).

Inquiry experiences disconnected from science content learning might involve a student investigating relationships without considering the possible causes of those relationships. An example could involve students investigating which diapers hold more liquid. If this is not connected to science content, students could walk away with a somewhat better understanding of the process of science, but that understanding would be incomplete or truncated at best. A deeper experience, which we see MBI fostering, connects inquiry to science content. If the students investigating diapers are also asked to articulate ideas or possible theoretical aspects of their investigations, they might then consider the absorbency of chemical compounds. Through purposeful investigation, they are then better positioned to couple science process with developing a deeper understanding of important factors surrounding chemical compound absorption.

The MBI facilitated as part of this research is valuable when considering each of these problems of enactment: (1) it is believed to offer some clarity with respect to defining inquiry, (2) it offers an example of how inquiry can be facilitated as an instructional strategy in an authentic context, and (3) it reveals how inquiry and science content learning can be juxtaposed to show a more realistic model of the processes of science. MBI and its theoretical importance are discussed next.

Model Based Inquiry

Gobert and Buckley (2000) best define Model Based Inquiry (MBI), in a broad sense, for the purposes of our work and research:

Model-based teaching is any implementation that brings together information resources, learning activities, and instructional strategies intended to facilitate mental model-building both in individuals and among groups of learners (p. 892).

As mentioned previously, our ideas about MBI are derived most directly from the work of Windschitl and Thompson (2006). Windschitl et al. (2008) emphasize modeling, or model building, as an iterative process whereby students engage in scientific inquiry as advocated in national science standards documents (AAAS 1989; NRC 1996), but in a manner that is coherently derived from, and informative of, the conceptual models students construct. This is best explained in the following:

In these inquiries, models are treated as subsets of larger, more comprehensive systems of explanation (i.e., theories) that provide crucial frames of reference to help generate hypotheses for testing, act as referents in interpreting observations, and are themselves targets of revision (Windschitl et al. 2008, p. 945).

While model based inquiry is being explored in this study because of these features, it is also seen as a mechanism for cohesively facilitating student learning about science content, science process, and the nature of science in a more holistic and relevant manner, so that none of these is overlooked as students learn in science classrooms.

Windschitl et al. (2008) report benefits arising from MBI as an instructional strategy that science educators have had success implementing with pre-service science teachers, but to date there remains little published research investigating the nuances and outcomes of this approach when implemented in secondary science classrooms. This research offers some insight into the outcomes of MBI implementation.

Methods

Context

This study investigated the effect of Model-Based Inquiry (MBI) as an instructional strategy, in comparison to a traditional lecture and demonstration (TLD) instructional strategy, on a variety of student outcomes (i.e., content, science reasoning/process, nature of science, and attitude toward science). Two high school physics classes taught by the third author were selected for comparison. The physics teacher/third author is a veteran with 14 years of science teaching experience who teaches Physics, Chemistry, and Advanced Placement Physics courses. Two classes with the same teacher were used to minimize the variables between groups. While the students in the classes were not randomly selected, as student schedules were determined by school administrators and students based on their class scheduling needs, which class would serve as the MBI class and which as the TLD class was assigned randomly. The MBI class consisted of 28 students (14 male/14 female) while the TLD class consisted of 26 students (19 male/7 female), for 54 total students. The TLD class served as the control population and was taught through traditional lecture and demonstrations. The MBI class served as the intervention, or experimental, population and was taught through model-based inquiry instructional strategies whereby students worked in groups of 2–3 to develop a model of their understanding of buoyancy and used class time to design and test mechanisms for further developing, extending, or refining their original models. The differences in instruction experienced by the two classes are depicted in Table 1, which is drawn from the teacher's daily instruction log.
Table 1

TLD and MBI instructional log with teacher reflections/comments regarding MBI implementation

Day 1
  TLD: Students created a list of factors important in determining liquid pressure (i.e., volume, surface area, depth, and density). Did Pascal's vases demonstration to show the effects of each factor.
  MBI: Briefly exposed students to modeling. Students went into the lab to create their own models of their understandings of buoyancy.

Day 2
  TLD: Reviewed results of yesterday's demonstration and derived P = ρgh. Using an electronic scale, students were asked to predict results: two equal-sized cubes (copper and aluminum) were placed in water and buoyant force was calculated from weight in air vs. water. Next, two objects of equal mass, also copper and aluminum, were placed in water. Buoyant force was also calculated.
  MBI: Clarified and revised student models. These models were first constructed on paper on day 1, before being shared with the whole class on day 2. Based on whole-class discussions, students then made additional revisions to their models.

Day 3
  TLD: Used a variety of demos to reinforce F_buoyant ∝ V_underwater. Also showed that the net upward force fluids exert comes from the differences in pressures on the top and bottom of the object. Assigned homework questions from the text.
  MBI: Students developed inquiry investigations to address questions that emerged through their model development on days 1 and 2. Students went into the lab to perform any test or experiment they wanted based on their inquiry designs. A variety of equipment was available for them, such as scales, clay, overflow cans, buckets, salt, sugar, thermometers, etc.

Day 4
  TLD: Answered questions from students on assigned homework. Using an overflow can and electronic balance, showed that the buoyant force was equal to the weight of the displaced fluid. Assigned homework questions from the text.
  MBI: Students spent an additional day in the lab, where they continued to test areas of their models through experimentation.

Day 5
  TLD: Explicitly taught about flotation (F_B = weight). Also reviewed Archimedes' principle and answered student questions from homework.
  MBI: Used the data and experimental design of one of the groups to discuss experimental design. One group tried to test the effect of adding salt to water. They had two cups, one with water and the other with salt water. They then used two somewhat similar blocks and placed them in each cup. We discussed as a class why the group had difficulty in reaching a conclusion. The class eventually agreed that they had more than one variable and suggested that they use one block rather than two. Also, it was decided that they ought to measure how high the block floated in each cup. For conclusions, it was decided (with much effort) that if the block floated differently in each cup, then fluid density affected buoyancy and more experiments would need to be done to show how.

Day 6
  TLD: Answered questions from homework and began lab on buoyant force.
  MBI: Students continued to work in the lab. I created an "experiment log" which is general enough, but perhaps will encourage more thought about experimental design.

Day 7
  TLD: Finished buoyant force lab.
  MBI: One last day in the lab to complete experimentation.

Day 8
  TLD: Went over buoyant force lab with students. Boat-in-aquarium demo: water level was marked and then the boat was filled with rocks. Water level was marked again. Students predicted the new water level when the rocks were thrown overboard. The process was repeated, but the boat was filled with wooden blocks for the second trial.
  MBI: Students summarized findings. Recreated a few student experiments. Students modified their models based on what they learned through experimentation in the lab.

Day 9
  TLD: Showed Mythbusters clip on pirate myths (walking underwater in an overturned boat).
  MBI: Explicitly showed Archimedes' principle (F_B = weight of water displaced) using Vernier™ probes.
As can be seen in Table 1, equal amounts of time were devoted to learning about buoyancy in both classes, the MBI and TLD.
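The unit's target relationships, liquid pressure P = ρgh and Archimedes' principle (the buoyant force equals the weight of the displaced fluid), can be captured in a short sketch. The values and function names below are illustrative, not drawn from the study's materials:

```python
# Illustrative sketch of the two relationships taught in the buoyancy unit:
# liquid pressure P = rho * g * h, and Archimedes' principle
# F_B = rho_fluid * g * V_displaced. All values are hypothetical.

RHO_WATER = 1000.0  # kg/m^3
G = 9.8             # m/s^2

def liquid_pressure(depth_m, rho=RHO_WATER):
    """Gauge pressure (Pa) at a given depth in a fluid of density rho."""
    return rho * G * depth_m

def buoyant_force(volume_submerged_m3, rho_fluid=RHO_WATER):
    """Buoyant force (N): the weight of the displaced fluid."""
    return rho_fluid * G * volume_submerged_m3

def floats(rho_object, rho_fluid=RHO_WATER):
    """An object floats when its average density is below the fluid's."""
    return rho_object < rho_fluid

# A fully submerged 10 cm cube (volume 0.001 m^3) at 0.5 m depth:
cube_volume = 0.1 ** 3
print(liquid_pressure(0.5))        # gauge pressure at 0.5 m: 4900 Pa
print(buoyant_force(cube_volume))  # ~9.8 N upward, regardless of material
print(floats(2700))                # aluminum (~2700 kg/m^3) sinks in water
```

The last line echoes the day-2 demonstration: the buoyant force depends only on the displaced volume and fluid density, not on whether the cube is copper or aluminum.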

Instrumentation

Both classes completed the same pre-, post-, and delayed-assessments. The delayed-assessment was given 1 month after the post-assessment, which was administered at the end of the unit. Two versions of the assessments were created, Version 1 (pre- & delayed-) and Version 2 (post-). All items used to measure students' understanding of science process/reasoning, nature of science, and attitudes toward science were drawn from the Physics, Attitudes, Skills, and Knowledge Survey (PASKS) created by Lawson (n.d.) and used extensively in the Arizona Collaborative for Excellence in the Preparation of Teachers (ACEPT) Project (Piburn et al. 2000). Initial reliability and validity work with the PASKS was established in the ACEPT project. Subsequent studies supported by measures taken with the PASKS were then reported in research emerging from the project (Adamson et al. 2003; Sawada et al. 2002). The content section of the PASKS was the only section that was revised. Because the original version of the PASKS was designed to measure content learning stemming from a yearlong course in physics and our intention was only to measure content learning in a specific unit (buoyancy), content questions specific to this topic were developed by the teacher (third author) and the first author, using original PASKS items as a template for item construction. The revised PASKS measures are referred to as rPASKS. The following is a brief description of each section:
  • Science process/reasoning: measures students' abilities to reason and employ science processes.

  • Nature of science: measures students' understanding of the epistemology of science.

  • Attitudes toward science: measures students' affective feelings toward science, their valuation of science, and their engagement in science-related behaviors inside and outside of school.

  • Content: measures students' conceptual understanding.

An example rPASKS content item is included in Fig. 1 followed by an example original PASKS item that was replaced. Table 2 shows the number of items in the final revised PASKS used to measure student outcomes for students understanding of (1) science content, (2) science process/reasoning, (3) nature of science, and (4) attitudes toward science.
Fig. 1

Sample content items (constructed rPASKS & original PASKS items)

Table 2

Number of items used to measure each student outcome category and examples of items for each category

Category                   Pre-rPASKS  Post-rPASKS  Delayed-rPASKS^a
Science content            11          11           11
Science process/reasoning  6           6            6
Nature of science          5           6            5
Attitudes toward science   2           2            2

Example items:
  Science content: The pressure in a liquid would be greatest in which of the following situations?
  Science process/reasoning: Farmer Brown was observing the mice that live in his field. He discovered that all of them were either fat or thin. Also, all of them had either black tails or white tails. This made him wonder if there might be a link between the size of the mice and the color of their tails. So he captured all of the mice in one part of his field and observed them. Below are the mice that he captured (picture not included here)… Do you think there is a link between the size of the mice and the color of their tails?
  Nature of science: Scientists think atoms exist primarily because they have seen them through powerful microscopes.
  Attitudes toward science: If given a choice, I would not study physics (negative item).

^a Pre-rPASKS and Delayed-rPASKS were the same instrument. Only the stems of multiple-choice items are included in the table.

Because changes were made to the science content items in the PASKS, the reliability of the rPASKS instruments was determined using either the Kuder–Richardson Formula 20 (KR20) reliability coefficient or the Cronbach's alpha reliability coefficient for each domain of the instrument. These measures indicate how consistently subjects answered the questions within each domain, with KR20 coefficients used for domains with dichotomously scored items and Cronbach's alpha coefficients used for domains with Likert-type scored items. The coefficients are reported in Table 3.
Table 3

Reliability coefficients for rPASKS domains

Domain                     Type of coefficient  Reliability statistic  Level of consistency
Science content            KR20                 .529                   Moderate
Science process/reasoning  KR20                 .416                   Moderate
Nature of science          Cronbach's alpha     .107                   Low
Attitudes toward science   Cronbach's alpha     .903                   High

Based on the reliability coefficients reported in Table 3, all domains except for the Nature of Science (NOS) were characterized as either moderately or highly consistent (Bland and Altman 1997), meaning that consistency was found in how subjects performed on the items in all domains except NOS. Given this, care was taken in interpreting findings considering the low consistency of the NOS Domain.
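The two coefficient types in Table 3 are closely related: KR20 is Cronbach's alpha specialized to dichotomously scored (0/1) items, where each item's variance reduces to p(1 − p). A minimal sketch of the computation on hypothetical response data (not the study's data):

```python
# Cronbach's alpha on a subjects-by-items score matrix. For 0/1 items this
# same formula reproduces KR20. The response data below are hypothetical.

def cronbach_alpha(scores):
    """scores: list of subjects, each a list of item scores."""
    k = len(scores[0])  # number of items
    n = len(scores)     # number of subjects

    def variance(xs):   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([s[i] for s in scores]) for i in range(k)]
    total_var = variance([sum(s) for s in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four hypothetical subjects answering four dichotomous items:
dichotomous = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(round(cronbach_alpha(dichotomous), 3))  # 0.667
```

Higher values mean item responses covary strongly, which is why the near-zero alpha for the NOS domain warranted the interpretive caution noted above.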

Analysis

Descriptive statistics were determined for the MBI and TLD classrooms on each of the three assessments (i.e., pre-, post-, & delayed-) before repeated measures analyses of variance were completed for each domain of the rPASKS. These analyses allowed for the determination of whether statistically significant differences were found with respect to time for each domain, and of whether statistically significant differences were found when comparing the two groups (i.e., MBI & TLD) for each domain.
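The within-subjects "time" test at the heart of these analyses can be sketched as a one-way repeated measures ANOVA. The scores below are hypothetical, and this sketch omits the between-groups factor that the full analyses included:

```python
# One-way repeated measures ANOVA for a within-subjects "time" effect,
# in pure Python. The pre-/post-/delayed- scores below are hypothetical.

def repeated_measures_anova(data):
    """data: list of subjects, each a list of scores at k time-points.
    Returns (F, df_time, df_error) for the within-subjects time effect."""
    n = len(data)      # subjects
    k = len(data[0])   # time-points (e.g., pre-, post-, delayed-)
    grand = sum(sum(row) for row in data) / (n * k)

    time_means = [sum(row[t] for row in data) / n for t in range(k)]
    subj_means = [sum(row) / k for row in data]

    # Partition total variability: time effect, subject effect, residual.
    ss_time = n * sum((m - grand) ** 2 for m in time_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_error = ss_total - ss_time - ss_subj

    df_time, df_error = k - 1, (k - 1) * (n - 1)
    f = (ss_time / df_time) / (ss_error / df_error)
    return f, df_time, df_error

# Five hypothetical students measured at three time-points:
scores = [[4, 6, 8], [5, 6, 7], [3, 7, 8], [4, 5, 9], [5, 8, 8]]
f, df1, df2 = repeated_measures_anova(scores)
print(f"F({df1}, {df2}) = {f:.2f}")
```

A large F here means scores changed more across time-points than subjects varied around their own trends, which is the sense in which "time" was influential in the results that follow.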

Findings

Descriptive Statistics

Table 4 reveals the descriptive statistics for the MBI and TLD Groups on pre-, post- and delayed-measures.
Table 4

Descriptive statistics for MBI and TLD

                       MBI (N = 28)   TLD (N = 26)
                       Mean (SD)      Mean (SD)
Pre-
  Total                7.18 (2.67)    8.13 (2.72)
  Content              4.25 (1.29)    4.54 (1.48)
  Process/reasoning    2.93 (1.96)    3.62 (1.86)
  NOS                  17.68 (2.61)   16.69 (2.54)
  Attitude             7.32 (1.68)    7.34 (1.47)
Post-
  Total                8.93 (2.94)    10.68 (2.75)
  Content              6.24 (1.77)    7.89 (1.81)
  Process/reasoning    2.69 (1.61)    2.80 (1.35)
  NOS                  19.93 (2.75)   19.80 (3.00)
  Attitude             7.93 (1.98)    8.20 (1.87)
Delayed-
  Total                11.93 (3.00)   12.50 (3.09)
  Content              7.75 (2.14)    8.38 (2.06)
  Process/reasoning    4.18 (1.47)    4.12 (1.53)
  NOS                  16.64 (2.30)   16.31 (2.94)
  Attitude             7.64 (1.45)    7.69 (1.35)

As can be seen in Table 4, the Traditional Lecture and Demonstration (TLD) class scored higher on most measures than the Model Based Inquiry (MBI) class on the pre-, post-, and delayed-rPASKS assessments. Because the TLD class scored higher on the pre-rPASKS, it was judged to be higher performing at the outset of the instructional unit, and this trend held through both the post- and delayed-rPASKS assessments for most domains. Given this difference prior to the instructional unit and throughout, looking at gains over time (i.e., the differences between pre- & post-, post- & delayed-, and pre- & delayed-measures) was thought to be a much more informative comparative measure.
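These gains can be read directly off the total-score means reported in Table 4:

```python
# Total-score gains over time, computed from the Table 4 means
# (MBI n = 28, TLD n = 26).
means = {
    "MBI": {"pre": 7.18, "post": 8.93, "delayed": 11.93},
    "TLD": {"pre": 8.13, "post": 10.68, "delayed": 12.50},
}

gains = {
    group: {
        "pre_to_post": round(m["post"] - m["pre"], 2),
        "post_to_delayed": round(m["delayed"] - m["post"], 2),
        "pre_to_delayed": round(m["delayed"] - m["pre"], 2),
    }
    for group, m in means.items()
}

for group, g in gains.items():
    print(group, g)
```

On total scores, the two groups' pre-to-delayed gains are similar (4.75 for MBI vs. 4.37 for TLD), consistent with the lack of significant between-group differences reported below.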

Results of Analyses

When considering the science content domain, the repeated measures analysis of variance showed that time was an influential factor affecting student science content learning, F(2, 57) = 78.81, p < .001. As can be seen in Fig. 2, both groups performed better at each time-point (from pre-rPASKS to post-rPASKS to delayed-rPASKS). However, no significant difference was found between groups. As can be seen in Fig. 2, the control group achieved higher than the treatment group at all three time-points (pre-, post-, & delayed-measures), with the difference greatest at the second time-point (post-rPASKS); this difference was not significant and had faded by the third time-point (delayed-rPASKS).
Fig. 2

Science content domain results

The repeated measures analysis of variance for the science process/reasoning domain revealed that time was an influential factor affecting student learning of science process/reasoning, F(2, 57) = 25.43, p < .001. However, as with the science content domain, no significant difference was found between groups. Figure 3 indicates that the control group performed better than the treatment group at the first time-point; both groups performed lower on the second measure and were similar at the second (post-rPASKS) and third (delayed-rPASKS) time-points. Both groups performed better at the third time-point than at time-points one (pre-rPASKS) and two (post-rPASKS).
Fig. 3

Science process/reasoning domain results

When considering the nature of science (NOS) domain, the repeated measures ANOVA indicated that time was a significant factor affecting student NOS learning, F(2, 57) = 20.92, p < .001. As can be seen in Fig. 4, both groups performed better at the second time point (post-rPASKS) but lower at the third (delayed-rPASKS), with student performance on the first (pre-rPASKS) and third (delayed-rPASKS) measures being very similar. The treatment group scored higher than the control group at the first time point (pre-rPASKS), but the groups were similar at the second (post-rPASKS) and third (delayed-rPASKS) time points; as with the other domains, no statistically significant differences were found between the groups.
Fig. 4 Nature of science domain results

Finally, the repeated measures ANOVA for the attitude domain revealed that time was a significant factor affecting students’ attitudes toward science, F(2, 57) = 9.90, p < .001. However, as with each of the other domains, no significant difference was found between groups. Figure 5 indicates that the treatment group held more positive attitudes toward science than the control group at the first time point (pre-rPASKS). This was reversed at the second time point (post-rPASKS), with the control group holding more positive attitudes toward science, but both groups were about the same at the third time point (delayed-rPASKS), with attitudes in both groups declining, though not to the levels recorded at time point one (pre-rPASKS).
Fig. 5 Attitude toward science domain results
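To illustrate the kind of within-subject "time" analysis reported for each domain above, the following stdlib-only Python sketch computes a one-way repeated-measures ANOVA F statistic by hand. The data are synthetic pre/post/delayed scores invented for the example, not the study's data, and the sketch omits the between-group factor of the authors' actual design.

```python
# Minimal one-way repeated-measures ANOVA on a within-subject "time"
# factor (pre, post, delayed). The scores below are hypothetical and
# serve only to illustrate the analysis type, not the study's data.

def repeated_measures_anova(scores):
    """scores: one list per subject, one value per time point.
    Returns (F, df_time, df_error) for the within-subject time effect."""
    n = len(scores)            # number of subjects
    k = len(scores[0])         # number of time points
    grand = sum(sum(row) for row in scores) / (n * k)
    time_means = [sum(row[t] for row in scores) / n for t in range(k)]
    subj_means = [sum(row) / k for row in scores]

    ss_time = n * sum((m - grand) ** 2 for m in time_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_time - ss_subj   # residual after removing
                                              # time and subject effects
    df_time = k - 1
    df_error = (n - 1) * (k - 1)
    f_stat = (ss_time / df_time) / (ss_error / df_error)
    return f_stat, df_time, df_error

# Synthetic pre/post/delayed scores for five subjects
data = [
    [10, 14, 15],
    [12, 15, 17],
    [ 9, 13, 14],
    [11, 16, 18],
    [10, 15, 16],
]
F, df1, df2 = repeated_measures_anova(data)
print(f"F({df1}, {df2}) = {F:.2f}")
```

A large F here, as in the study, indicates that mean scores changed reliably across the three time points relative to the within-subject residual variation.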

Discussion and Implications

This section is framed by the research questions; research questions 1 and 2 are discussed together first, followed by research question 3.
  1. Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, and attitudes toward science) for physics classrooms facilitated with MBI when considering pre-, post-, and delayed-student outcome measures over time?

  2. Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, and attitudes toward science) for physics classrooms facilitated with Traditional Lecture and Demonstration (TDL) when considering pre-, post-, and delayed-student outcome measures over time?

Based on the findings shared, the answer to both of these research questions is yes. Time was found to be a significant factor influencing student performance in both groups. This was found across each domain, but these findings did not necessarily indicate improvement at each subsequent time point (i.e., pre-rPASKS, post-rPASKS, delayed-rPASKS). Both groups did make significant improvements with respect to the content domain, indicating that both groups learned science content, whether instruction was facilitated through a Traditional Lecture and Demonstration (TLD) or a Model Based Inquiry (MBI) instructional strategy.

When considering the other domains, time was always a significant factor, meaning that significant changes occurred when comparing time-points, but this did not always mean that learning or attitudinal gains were occurring. As an example, in the science process/reasoning domain there was a decrease in performance for both groups, before a final increase beyond the pre-instructional unit assessment level. This seemed to suggest that the instructional units actually detracted from students’ understanding of science process/reasoning as measured by the post-instructional unit assessment. This finding was reversed at the delayed-measure. In the end, both groups did show gains from pre- to delayed-measures.

Science process/reasoning is considered an important outcome in science learning that is often neglected (Campbell and Bohn 2008). Additionally, science process/reasoning learning takes time and effort on the part of the teacher to cultivate these habits of mind, along with space, time, and reflection on the part of the student (NRC 2007). It is recognized here that this was the first experience most of these students had had with MBI instruction. Given that this was a 2 week unit, it is likely that this was not enough time to develop such habits of mind, especially as measured at the second time point. Gains were found at the final time point, 1 month after the instruction, for MBI and TLD students alike. One possible explanation for this could lie in the subsequent instruction that occurred during the interim before the delayed-measure. The third author/classroom teacher offered the following outline of the common experiences students from both classes had during this interim:

After finishing the unit on buoyancy, both classes built hot air balloons out of tissue paper. The students worked in groups of two and were given the goal of building a balloon that would stay aloft the longest. The teacher did not explicitly teach how to build the best balloon. In fact, the teacher purposefully avoided giving unsolicited information to see if they applied the lessons and information they had learned in the unit on buoyancy. The only information given to the students was a sheet stating the rules of the competition and the following statement: “Buoyancy in air is the same as in water.”

The building and subsequent launching lasted 1 week. Many students applied knowledge of buoyancy to their balloon. Most groups realized that the buoyant force is related to the volume of fluid displaced, so they tried to build the biggest balloon possible with the sheets of paper available to them. Some also applied previous knowledge of Newton’s laws and sought to maximize volume while minimizing weight. After building and launching the balloons, we shifted gears in the class and moved from mechanics into atomic structure briefly and then into waves (sound followed by light).
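The reasoning the students applied can be made concrete with a quick Archimedes' principle estimate: net lift is the weight of displaced ambient air minus the weight of the heated air inside and of the envelope. All densities and dimensions below are illustrative assumptions, not data from the students' balloons.

```python
# Hypothetical hot air balloon lift estimate via Archimedes' principle:
# net lift = (rho_cool - rho_hot) * V * g - envelope weight.
# All numbers are illustrative assumptions, not the students' data.

G = 9.81          # m/s^2
RHO_COOL = 1.20   # kg/m^3, ambient air (assumed ~20 C)
RHO_HOT = 1.00    # kg/m^3, heated air inside (assumed ~80 C)

def net_lift(volume_m3, envelope_mass_kg):
    """Net upward force (N) on the balloon; positive means it rises."""
    buoyant = RHO_COOL * volume_m3 * G          # weight of displaced air
    hot_air_weight = RHO_HOT * volume_m3 * G    # weight of air inside
    envelope_weight = envelope_mass_kg * G
    return buoyant - hot_air_weight - envelope_weight

# An assumed tissue-paper balloon: 0.5 m^3 envelope massing 60 g
lift = net_lift(0.5, 0.060)
print(f"net lift: {lift:.2f} N")
```

The calculation shows why the students' strategy wins: the buoyant term grows with volume, while tissue-paper mass grows only with surface area, so the biggest, lightest balloon stays aloft longest.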

Our research design was shaped to isolate instructional strategy as the independent variable under consideration. But the teacher was the same for both classes, and we suspect that his recognition of the value of MBI, or of certain facets of MBI (e.g., connecting learning to evidence), could have played a role in the final results. It is possible that our findings are confounded by ideas and practices that the teacher gained and explored with both classes during this interim.

When considering the attitude toward science and nature of science (NOS) domains, the findings were similar. Attitudinal and NOS gains were found initially at the post-instructional assessment measure, but both decreased significantly at the delayed-measure 1 month later. The difference between the two domains is that, while an attitudinal decline did occur, the final attitude measure for both groups was still higher than the pre-assessment measure. This was not the case for NOS, which declined to approximately pre-assessment levels at the final delayed-measure.

As is the case with science process/reasoning learning, NOS learning is also a process. Research suggests that NOS learning occurs as teachers use explicit reflective instructional approaches that alternate between engaging students in doing science and explicitly prompting reflection and discussion about this process (Akindehin 1988; Lederman 1998; Ogunniyi 1983). As can be seen in Table 1, and as was confirmed with the teacher, no explicit discussion targeting NOS learning was planned for students. In addition, as was noted earlier, the NOS domain was the domain that, based on the KR20 coefficient reported in Table 3, measured student responses with the lowest level of consistency. So, while these findings are interesting, they are considered with caution.
  • Research Question 3: Are there measurable differences in student outcomes (science content, scientific process/reasoning, nature of science, & attitudes toward science) when comparing physics classrooms facilitated with differing instructional strategies (MBI and TDL) at the pre-, post-, and delayed-student outcome measures intervals?

The repeated measures ANOVA analyses revealed no significant differences in student outcomes for any of the domains when comparing physics classrooms facilitated with differing instructional strategies. Given these findings, we cautiously approach the discussion of the trends that were identified because, while differences in outcomes were found, they were not significant. As can be seen in Fig. 2, the control (TLD) group scored higher on average on each measure (pre-, post-, delayed-). The gap increased between the pre- and post-measures, but faded again 1 month after instruction. This suggests that the TLD instructional strategy may be slightly better for increasing students’ science content learning initially, if measured at the end of a unit or module, as typically happens in most classrooms. But the content learning gain for the TLD group in comparison to the MBI group was minimal in the longer term (i.e., as measured 1 month later). This finding may go far in explaining the reluctance of teachers to teach using inquiry as an instructional strategy. If they perceive that direct instruction is more effective for learning content and are exclusively focused on this as a learning goal in science, it would make sense that they would maintain this stance, especially if delayed-measures are not common, which is the case in most science classes.

When comparing groups with respect to the science process/reasoning domain, the TLD group performed higher initially, but this difference between groups was no longer found at the post- or delayed-assessment. This could suggest that MBI was more effective at reducing the science process/reasoning differences that were initially present.

Comparing group learning outcomes for the NOS domain revealed the opposite of what was found with the science process/reasoning domain. Here, the MBI group performed higher initially, but this difference between groups was no longer found at the post- or delayed-assessment. This might suggest that the TLD instructional strategy was more effective than MBI with respect to NOS learning, but we are cautious with these findings given the lack of an explicit documented NOS focus in either group and the low consistency of this domain in our initial KR20 analysis.

Finally, there was a difference, although not a significant one, in the attitudes of the two groups. The TLD group presented a lower affinity for science prior to the unit, but a higher affinity after the instructional unit; the differences were negligible at the final delayed-measure. This result is contrary to our expectation that the student autonomy and direction more prevalent in MBI instruction would lead to improved attitudes toward science. As stated earlier, we remain cautious when comparing the groups on any of these domains, as none were found to be significantly different, and as such each of these discussions is based on trends rather than statistical differences.

Conclusions

The following concerns were identified earlier with respect to open inquiry: (1) teachers’ inability or discomfort directing or controlling student inquiry, (2) a perception that open inquiry is too time intensive, and (3) a lack of evidence for improved student outcomes (Settlage 2007). In concluding this initial research, we would like to consider these in the context of what was reported here.

Instances were found where the teacher experienced struggles and setbacks directing student inquiries; an example can be found in the following reflection shared by the teacher:

I saw some potential issues. Some students were doing experiments—they thought. But really they were just playing. One was floating different sized blocks of different masses. When I asked what they were investigating, they reported trying to learn about shapes [through this example, the teacher realized that students were not being purposeful and systematic enough by controlling variables or thinking through what it might take to make conclusions based on data they might collect].

In addition, the teacher expressed feeling that MBI was a bit “nebulous” in that he, as well as the collaborating university science educator, needed to make sure the instructional approach was easily articulated and understood before others could adopt and implement it. This need for clear articulation is seen as an important step in widespread adoption of MBI. Our concern focused on clearly understanding the role of the teacher and students, while at the same time not predetermining the students’ experiences to the extent that the messiness of science, manifested as opportunities to learn more about science process and the nature of science, would be absent. We believe that our documented approach here lends some guidance to address how teachers can direct inquiry, but we also see this as important work that we will continue to pursue.

Teachers have also expressed concerns about inquiry being too time intensive. As can be seen in Table 1, the MBI instructional unit described here was taught alongside a TLD instructional unit; both started and ended at the same time, and the post-assessment measure was administered on the same day for both classes studied. We are confident that examples can be found where an inquiry instructional unit runs considerably longer than a traditional lecture and demonstration unit, but one of our goals in this research was to manage the time of the MBI unit so that it was not a concern. In the end, we believe that learning is time intensive and that deciding how much time to allot is important, but such decisions should be based on the depth of learning desired for students, instead of on breadth or coverage that, if excessive, may not lead to enduring understandings.

The final concern that we identified was what Settlage (2007) referred to as a lack of evidence for improved student outcomes associated with inquiry instruction. Our findings do not provide evidence for improved student outcomes, at least not at a greater level than could be achieved with TDL instructional strategies. Johnston (2008) declared that

[I]nquiry is not simply a teaching tool, but a teaching goal. This is not a new-fangled idea… That is, the process of inquiry is not simply something that we use to get learners to understand buoyancy. It is a scientific endeavor in itself, allowing students to be themselves within a culture of scientific inquiry… The processes embraced by science that allow us to extract explanation from evidence (pp. 11–12).

Similar to this declaration, we see science as inquiry as a foundational instructional goal for students, inasmuch as conceptual understandings of Newton’s Laws are foundational instructional goals in physics. Therefore, we see value in this research demonstrating no differences in student achievement between TDL and MBI classes as a beginning stage of legitimizing inquiry instructional strategies in classrooms where conceptual understanding has dominated as the sole focus for too long. This is accomplished (1) because no conceptual understanding differences were detected and (2) because we have begun to take measures that will allow us to detect growth in student understanding and abilities with inquiry in the future.

We expect to continue our implementation and research into the effectiveness of MBI as an instructional strategy because we believe, like Johnston (2008), that inquiry should be an instructional goal. Through focused future attention we expect to better understand the nuances of these findings, improve our implementation strategies, and through these efforts find gains across the holistic range of student science learning outcomes we value.

Copyright information

© Springer Science+Business Media, LLC 2010