Introduction

Success in mathematics courses is essential for all science, technology, engineering, and mathematics (STEM) fields (Seymour and Hewitt 1997), and indeed mathematics serves as a gateway to completing any college degree (Stigler et al. 2010). Numerous studies have linked students’ persistence in STEM majors to differences in instructional methods (Freeman et al. 2014; Seymour and Hewitt 1997). For example, in calculus courses, there is a positive association between students’ persistence in STEM majors and their report of the frequency with which their calculus instructor uses student-centered techniques (Ellis et al. 2014).

Inquiry-based learning (IBL) is a form of active, student-centered instruction in mathematics that helps students develop critical thinking by exploring loosely structured problems and by constructing and evaluating mathematical arguments (Prince and Felder 2007; Savin-Baden and Major 2004). The particular tradition of IBL that we have described elsewhere (Laursen et al. 2014) is rooted in the teaching practices of mathematician R.L. Moore (1882–1974) (Mahavier 1999). Mahavier (1999) argues that the important features of the “Moore Method” are “regular interaction with the students so that the instructor knows how well the students understand the material” (p. 340) and the use of challenging problems. In this way, IBL practice rests not on strict procedures for students, but on commonalities in the learning environments and student outcomes that instructors aim to create (Coppin et al. 2009): classrooms where students become creators rather than recipients of knowledge by discovering, presenting, and debating mathematics. Similarly, Yoshinobu and Jones (2013) explain that the term inquiry-based learning encompasses the “Moore Method” and other approaches that share the spirit of student inquiry through two core features: (1) deep engagement with mathematics and (2) collaboration with peers. Consistently across these definitions, there are no rigid prescriptions for content or pedagogy, but rather guiding principles that shape instructors’ curricular and pedagogical choices. Together, these features support students’ deep learning of mathematical concepts (McCann et al. 2004; Moon 2004) and their development of positive attitudes, beliefs, and capacities that support learning and problem-solving in mathematics (Hassi and Laursen 2015; Kogan and Laursen 2014; Laursen et al. 2014).

Even though content and pedagogy are not strictly prescribed, IBL classrooms share similarities with each other and are markedly different from non-IBL classrooms. Whereas non-IBL classrooms are characterized by students frequently listening to instructors talk, in IBL classrooms much of class time is spent on student-centered activities such as working in small groups, student presentations, and discussions (Kogan and Laursen 2014; Laursen 2013; Laursen et al. 2014). Interaction among students, and with the mathematical ideas, is fostered in two main ways: students individually present problems or proofs they have worked on before class, followed by whole-group discussion of those presentations; or students collaborate in small groups on deep mathematical activities, interspersed with small- and whole-group discussions. Yoshinobu and Jones (2012, 2013) provide rich examples, and explain that the

“core idea is that students are engaged in an apprenticeship into the practice of mathematics. Students actively participate in contributing their mathematical ideas to solve problems, rather than applying teacher-demonstrated techniques to similar exercises. …Students do mathematics like research mathematicians do mathematics” (2012, p. 307).

Despite variation in the extent and types of instructional activities employed, the use of IBL strategies in college mathematics courses is associated with affective gains and greater persistence in mathematics majors among women students, and with improved grades for low-performing students (Kogan and Laursen 2014; Laursen et al. 2014).

While IBL in mathematics is associated with positive student outcomes, a major challenge for educational reform lies in getting large numbers of faculty to use research-supported teaching methods that yield such outcomes (Fairweather 2008; Henderson and Dancy 2007, 2008, 2011). Instructors who begin to teach with IBL or similar approaches must often make a transition from traditional lecture methods to more student-centered teaching [for an example, see Retsek (2013)]. Making such instructional changes can be difficult, but professional development workshops are one way to support instructors who aim to do so. Workshops are the preferred method of National Science Foundation (NSF) program directors for encouraging faculty uptake of student-centered teaching methods, especially when workshops are “in-depth, multi-day, immersive experiences with follow-up interaction with the PI as participants implement the new strategy in their own institutional circumstances” (Khatri et al. 2013, p. 1). There is evidence to support this view: Lattuca et al. (2014) found that, among six different types of professional development activities for engineering faculty, attending a workshop was the activity most strongly correlated with use of student-centered pedagogies.

In this paper, we report on instructor uptake of inquiry-based learning instructional methods following intensive, week-long IBL-focused workshops. In our role as evaluators for these workshops, we also conducted research to explore factors that affected uptake of IBL methods by the workshop participants. In this paper, we focus on broadly applicable findings related to central concerns for instructors who are making a transition to IBL teaching, and on workshop features that help to make this process smoother. We frame these findings with a three-stage model of instructor change developed by Paulsen and Feldman (1995), based on Lewin’s (1947) theory of change in human systems.

Theoretical Framework

In their instructor change model, Paulsen and Feldman (1995) describe three stages: (1) unfreezing, (2) changing, and (3) refreezing. During unfreezing, instructors gain motivation to change by experiencing incongruence between their goals and the outcomes of their teaching practices. Key to this initial stage is “psychological safety,” achieved through “envisioning ways to change that will produce results that reestablish his or her positive self-image without feeling any loss of integrity or identity” (Paulsen and Feldman 1995, p. 12). In the next stage, changing, instructors learn, apply, and reflect on new teaching strategies to help align their behaviors with desired outcomes. Teaching strategies may remain fluid throughout this stage; in the final stage, refreezing, the new strategies are either confirmed through positive feedback and solidified, or the instructor returns to his or her original strategies.

While all three stages are important, stage two, changing, is perhaps the most studied (Connolly and Millar 2006). For example, the literature on K-12 professional development offers extensive evidence on features of workshops that help instructors to gain knowledge and skills, and to make changes in their classrooms (e.g., Garet et al. 2001; Wei et al. 2010). There is far less literature on features of effective workshops within higher education (Connolly and Millar 2006; Council of Scientific Society Presidents 2012). Most of what does exist consists of descriptions of faculty development workshops associated with high rates of uptake, and most such reports do not measure outcomes beyond participant satisfaction with the workshops (Council of Scientific Society Presidents 2012). In this paper, we discuss the quality of the workshops and the elements of successful professional development that facilitate changing. However, our main focus is on how these workshops supported instructors through the less-studied stages of unfreezing and refreezing, and on the implications for broadening the adoption of IBL and similar student-centered strategies in college mathematics and other disciplines. We present survey data that show participants have gone through the change process, and use interview and open-ended survey responses to help explain how the workshops supported instructors through each of the three stages.

Context for the Study

Data were collected from three workshops held in the summers of 2010, 2011, and 2012, each 4 or 5 days long. The workshops were part of a jointly funded project at several ‘IBL Centers’ (Laursen et al. 2014) around the country. Each workshop was organized and run independently by experienced IBL instructors at one of the centers. As a result, the workshops shared some common elements, but each engaged participants in its own blend of activities, such as watching and analyzing videos of IBL classes, reading and discussing research articles, listening to plenary talks, participating in panel discussions with experienced IBL instructors, and developing IBL materials in small groups. In general, the first and third workshops featured more hands-on, active sessions, while the second, slightly larger workshop was organized in more of a conference style with formal talks. All three workshops exemplified characteristics of effective research-based professional development identified in previous literature on K-12 teacher development (Cormas and Barufaldi 2011; Garet et al. 2001), including active participation, engaging participants in discussions of their students’ learning, and promoting participant self-reflection. While findings from the K-12 setting also recommend that teacher professional development interweave a strong focus on important disciplinary ideas with effective teaching strategies to help students engage with those ideas, the workshops we studied focused more strongly on pedagogy than on specific content. In her synthesis of the literature, Austin (2011) argues that this approach may be appropriate for college-level instructors, who have studied their discipline deeply and are often eager to share that content knowledge, but may be unsure how to do so effectively. These workshops focused on helping instructors to plan and teach courses that successfully engage students with mathematical content in an inquiry-based style. More detailed descriptions of the workshops are available in a report on the workshop evaluation (Hayward and Laursen 2014).

Methods

Participants in each workshop were invited to complete online pre-workshop surveys and in-person post-workshop surveys on paper. We conducted online follow-up surveys approximately 15 months after each workshop, so that participants could report on their teaching in the intervening academic year. Survey instruments are available as an appendix to the workshop evaluation report (Hayward and Laursen 2014). From the 139 attendees at the three workshops, we received 124 pre-workshop surveys (89 %) and 125 post-workshop surveys (90 %). For the 1-year follow-up surveys, 96 individuals (69 %) responded. We use the words “participants” or “attendees” to refer to all 139 individuals, whereas “respondents” refers only to those who completed the surveys. All surveys included unique, anonymous identifiers that allowed us to match pre-, post-, and follow-up survey responses for each individual. Because we could match surveys but not identify individuals, one person who attended two workshops may have been included twice. Due to incomplete responses on the identifier questions, not all surveys were successfully matched, but overall rates of response matching were high, as presented in Table 1. These high response rates mean that responses can reasonably be generalized to the workshop population, and thus the findings are unlikely to be strongly biased by sub-groups such as adopters versus non-adopters.
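To make the matching step concrete, the sketch below joins survey waves on the anonymous identifier. It is a minimal re-creation in Python with pandas, not the study’s actual pipeline; the file names and the anon_id column are hypothetical.

```python
# Minimal sketch of matching survey waves on an anonymous identifier.
# File names and the "anon_id" column are hypothetical.
import pandas as pd

pre = pd.read_csv("pre_workshop.csv")   # 124 responses in the study
follow = pd.read_csv("followup.csv")    # 96 responses in the study

# Surveys with incomplete identifier questions cannot be matched.
pre = pre.dropna(subset=["anon_id"])
follow = follow.dropna(subset=["anon_id"])

# An inner join keeps only individuals who answered both waves,
# mirroring the matched-pairs analyses reported below.
matched = pre.merge(follow, on="anon_id", suffixes=("_pre", "_fu"))
print(f"Matched pre-to-follow-up surveys: {len(matched)}")  # 69 in the study
```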

Table 1 Survey response rates and matching rates, by workshop cohort

The surveys included quantitative items and open-ended questions aimed at both evaluating workshop delivery and understanding the impact the workshops had on the participants’ teaching. Items were developed to monitor participants’ self-reported knowledge, skills, and beliefs about inquiry-based learning, as well as their motivation to use inquiry methods and their perceptions of the overall quality of the workshop. For example, on all three surveys, participants assessed their current knowledge of IBL on a scale of 1 to 4 (1=None, 2=A little, 3=Some, and 4=A lot).

To measure the impact of the workshops on participants’ subsequent teaching, we asked them to report both directly and indirectly whether they had implemented IBL. We measured implementation directly through a multiple-choice question on the follow-up survey asking participants if they had implemented no IBL methods, some IBL methods, one full-IBL class, or more than one full-IBL class. We measured IBL implementation indirectly by comparing changes in instructors’ reported frequencies of use of specific teaching practices that were probed on both pre-workshop and follow-up surveys. Available research indicates that self-report is most accurate when it is retrospective over a clearly defined time frame, when it is confidential, and when it is behavioral rather than evaluative (Desimone 2009). Therefore, we designed these indirect measures of teaching practice to ask participants to anonymously report their frequency of use of eleven behaviors in a class that they had taught recently. Participants selected a response for each of the eleven practices to indicate whether they used the practice ‘never,’ ‘about once a month,’ ‘about twice a month,’ ‘weekly,’ or ‘every class.’ The eleven behaviors included some that are consistent with inquiry-based learning as presented at the workshops, others that are characteristic of other forms of active learning but not necessarily IBL, and some that are characteristic of lecture-based instruction. The behaviors and the expected changes in these behaviors upon implementing IBL are detailed in Table 2.
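For analysis, responses on this five-point frequency scale can be treated as ordinal codes. The sketch below shows one way to do so; the label strings match the survey options, but the function name and the handling of missing data are our own illustrative choices, not the study’s code.

```python
# Hedged sketch: coding the five frequency labels as ordinal values so
# that pre-workshop and follow-up reports can be compared numerically.
from typing import Optional

FREQ_SCALE = {
    "never": 0,
    "about once a month": 1,
    "about twice a month": 2,
    "weekly": 3,
    "every class": 4,
}

def code_frequency(response: Optional[str]) -> Optional[int]:
    """Return the ordinal code for a frequency response, or None if missing."""
    if not response:
        return None
    return FREQ_SCALE.get(response.strip().lower())

# e.g., the change score for one behavior on matched surveys:
# change = code_frequency(follow_up_answer) - code_frequency(pre_answer)
```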

Table 2 Eleven behaviors used to indirectly measure implementation of IBL and their expected changes

Some practices were considered ‘core IBL’ practices because they should characterize all variations of IBL communicated in these workshops. ‘Core IBL’ includes decreased use of instructor activities like lecture and solving problems on the board, and increased use of student activities like presentations and student discussion.

Other items were classified as ‘preference IBL’ practices, which were consistent with the set of IBL approaches presented in the workshops, but different IBL instructors might emphasize them to varying degrees. For example, instructors may vary in how active a role they take in leading discussions; some instructors use in-class group work with group presentations, while others have students individually present problems or proofs worked on outside of class.

We also asked participants about their use of other forms of active learning that are not necessarily characteristic of IBL as presented at the workshops. In this study, these items functioned as “distractors,” reflecting teaching practices that may be incorporated in student-centered classrooms but were not specifically addressed in any of the workshops.

Open-ended questions addressed the perceived costs and benefits of using inquiry strategies and participants’ impressions and learning from the workshop, which helped to provide more detail and deeper understanding of the factors that affected their use of IBL practices. Additionally, participants reported personal and professional demographic information such as career stage, institution type, gender, race, and ethnicity, so that we could test for possible differences in results among groups.

The 139 participants were all college mathematics instructors who voluntarily attended a workshop and who came from at least 117 separate institutions around the United States and Canada. Full demographics are presented in Table 3. Overall, the participants were diverse in career status and teaching experience, and moderately diverse by gender and by ethnicity relative to the mathematics faculty overall (National Science Foundation 2008a, b). Only 13 % of participants reported teaching at a minority-serving institution. Interestingly, 24 % of participants had taken an IBL-style class as a student, and 45 % said they had previously incorporated IBL techniques in their teaching strategies. However, 46 % of participants reported no prior experience with IBL as either an instructor or a student.

Table 3 Workshop participant demographics

In addition to the survey data, 16 interviews were conducted with a small subset of participants after they had completed the follow-up survey. On follow-up surveys, 16 individuals from the first workshop and 11 individuals from the second workshop expressed interest in taking part in a telephone interview. All were invited to participate; ultimately, seven participants from the first workshop were interviewed by the second author and nine from the second workshop were interviewed by the first author. All interviews were conducted by telephone and lasted approximately one hour. Available demographic information for interviewees is included in Table 3. In general, the interviewees were representative of the larger group of workshop participants. The interview participants self-identified as implementing more than one full-IBL course (25 %), one full-IBL course (19 %), some IBL methods (50 %), or no IBL (6 %).

During these interviews, we asked questions to gain a deeper understanding of participants’ development as instructors, their views on teaching and learning, and more detail about their classroom activities and the factors that affected whether or not they implemented IBL. Interviews were semi-structured so that participants could reveal their own perspectives instead of fitting their responses into categories introduced by researchers. Thus we did not ask questions in the same order or with the same wording in every interview. Some topics arose spontaneously and thus were not represented in every interview. Prior to data collection, all survey instruments and interview protocols were reviewed and approved by the Institutional Review Board at the University of Colorado Boulder.

Data Analysis

Survey data were entered into and analyzed with SPSS v. 21 (IBM Corp 2012). We calculated descriptive statistics for all variables, and inferential statistics as appropriate. Open-ended responses were entered into Microsoft Excel (Microsoft 2011) and coded for common themes.

We audio-recorded interviews, transcribed them verbatim, and entered them into NVivo v. 9 (QSR International Pty Ltd 2010). We carefully read through and identified segments of the transcripts that related to specific topics and assigned them a code to identify that topic. If an individual passage covered multiple topics, we assigned multiple codes. We coded topics each time participants discussed them, so we sometimes used a code multiple times over the course of an interview. We organized groups of codes that shared similar themes into domains (Spradley 1980). For example, the domain of “implementation barriers” included nine codes representing specific barriers. Analysis was an iterative process, as each transcript brought new detail that might warrant the development of more specific codes which we then reapplied to earlier transcripts. The frequencies of use of specific codes and domains give an approximation of the relative importance of the topics to the respondents. We counted codes and domains both in terms of the number of interview participants who mentioned a specific topic and the number of separate comments they made.

While an interview participant might revisit an idea multiple times throughout an interview, responses to the open-ended survey items were short and less detailed. Therefore, in open-ended responses, we coded each theme only once as either present or absent. This is different from interviews, in which we counted the total number of separate comments made about a particular theme. The first author coded and the third author reviewed both interview and open-ended responses.

Results

We first report results that indicate the nature and quality of participant learning and other immediate outcomes from the workshop. Then, we report results on the impact of the workshop on participants’ subsequent teaching activities. These two sections provide evidence that participants felt the workshops were well designed and delivered, and that participants’ teaching practices changed after the workshop. We then report qualitative results from open-ended survey items and interviews that describe issues for implementation. In the Discussion section, we use the qualitative results to explain the quantitative results in terms of Paulsen and Feldman’s (1995) framework.

Indicators of Workshop Quality and Participant Learning

As a way to monitor the quality of the workshops, we asked participants to rate their motivation to use IBL, their knowledge of IBL, their level of IBL skill, and their belief in the effectiveness of IBL on all three surveys using 4-point scales, as reported in Table 4. We tested change over time by comparing responses on each pair of surveys (pre to post, post to follow-up, and pre to follow-up) with Wilcoxon Signed Rank tests. Respondents reported high motivation to use IBL even before the workshops: 29 % felt ‘somewhat motivated’ and 68 % felt ‘highly motivated.’ On average, their motivation to use IBL rose significantly post-workshop, but then returned to pre-workshop levels by the 1-year follow-up.
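For readers who wish to reproduce this kind of paired comparison, the sketch below runs a Wilcoxon signed-rank test in Python with SciPy. The study’s analyses were run in SPSS; this is an illustrative equivalent, and the data-frame and column names are hypothetical.

```python
# Illustrative paired comparison of one 4-point item across two matched
# survey waves, using SciPy's Wilcoxon signed-rank test.
import pandas as pd
from scipy.stats import wilcoxon

def paired_wilcoxon(matched: pd.DataFrame, item: str,
                    wave_a: str, wave_b: str) -> None:
    """Test for change in `item` between two waves, e.g. 'pre' and 'post'."""
    pairs = matched[[f"{item}_{wave_a}", f"{item}_{wave_b}"]].dropna()
    stat, p = wilcoxon(pairs[f"{item}_{wave_a}"], pairs[f"{item}_{wave_b}"])
    print(f"{item} ({wave_a} vs. {wave_b}): "
          f"W = {stat:.1f}, p = {p:.4f}, n = {len(pairs)}")

# e.g., for the motivation item on matched pre/post surveys:
# paired_wilcoxon(matched, "motivation", "pre", "post")
```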

Table 4 Changes over time in knowledge, skills, belief in effectiveness of IBL, and motivation to use IBL

On average, respondents’ knowledge rose significantly following the workshop, and then remained the same at the 1-year follow-up. Their average rating of their own skill in inquiry-based teaching rose significantly following the workshop, and rose again by the 1-year follow-up.

As with motivation, respondents started with strong beliefs in the effectiveness of IBL. These rose significantly following the workshop, and then decreased by the 1-year follow-up, though they were still higher than pre-workshop levels. Overall, we interpret these results to mean that the workshops offered participants a high-quality professional development experience that yielded cognitive and attitudinal changes of the types the workshop facilitators sought.

Impact on Teaching Practice

To assess the impact of the workshops on teaching practice, we asked participants to report their implementation of IBL methods in two different ways. First, we measured IBL implementation through one direct item on the follow-up survey. In total, 58 % of the 139 workshop participants reported implementing at least some IBL methods in the year following the workshop they had attended, with 29 % reporting implementing “some IBL methods,” 14 % reporting “one full-IBL course,” and 15 % reporting “more than one full-IBL course.” Only 8 % of participants reported implementing no IBL methods, while the remaining 34 % did not respond to this question.

Differences in implementation rates between workshops were significant (χ²(6, N = 91) = 13.87, p < 0.05). Pairwise comparisons revealed that implementation rates for the third workshop were significantly higher than those for the first and second workshops, which did not themselves differ significantly. Figure 1 shows implementation levels for only those who responded to this question, not for all participants (response rates were similar for each of the three workshops, as noted in the figure). We found no differences in implementation by any other demographic characteristic.
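The reported degrees of freedom (6) correspond to a 3 (workshop) × 4 (implementation level) contingency table with N = 91 respondents. The sketch below shows how such a test can be computed; the cell counts are placeholders chosen only to sum to N = 91, not the study’s data.

```python
# Hedged sketch: chi-square test of implementation level by workshop
# cohort. The 3 x 4 table below is a PLACEHOLDER chosen only to sum
# to N = 91; it is not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: workshops 1-3. Columns: no IBL, some methods,
# one full-IBL course, more than one full-IBL course.
counts = np.array([
    [5, 12, 6, 5],
    [4, 14, 5, 6],
    [2, 10, 8, 14],
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}, N = {counts.sum()}) = {chi2:.2f}, p = {p:.3f}")
```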

Fig. 1 Implementation of IBL methods by respondents* to 1-year follow-up survey, by workshop cohort

As a check on respondents’ direct, self-reported “IBL” teaching, we also measured implementation indirectly by comparing pre-workshop teaching practices with teaching practices 1 year after the workshop. As noted in the Methods section, the design of these items was consistent with best practice for self-reporting behaviors. Participants reported the frequency with which they used certain teaching practices on both pre-workshop and 1-year follow-up surveys. By comparing matched surveys, we were able to assess changes in teaching practices that were consistent with the inquiry-based practices presented in the workshops, and compare these to reports of other forms of active learning that could be considered controls. Of the 139 participants, 96 (69 %) responded to the follow-up survey and we matched 69 of those responses (72 %) to pre-workshop survey responses. This resulted in a net response rate of 50 %.

In Fig. 2, we compare reports of pre-workshop and 1-year follow-up teaching practices for the 69 respondents with matched surveys. Asterisks indicate significant changes in these frequencies. Upward arrows indicate increased frequency of the practice and downward arrows indicate decreased frequency. We tested for differences in the change in individuals’ teaching practices using Wilcoxon Signed Ranks tests, and significant changes are detailed in Table 5. The use of other practices did not differ significantly from pre-workshop to 1-year follow-up; they are shown in Fig. 2 but not in Table 5.

Fig. 2 Reported frequencies of pre-workshop and 1-year follow-up teaching practices, matched survey responses only

Table 5 Significant changes in frequencies of reported teaching practices from pre-workshop to 1-year follow-up

Issues for Implementation

We asked participants on pre- and post-workshop surveys to comment on their concerns about implementing IBL. These comments help to reveal whether workshops met participants’ needs and also what post-workshop concerns may have influenced instructors’ decisions to implement IBL or not. In comparing individuals’ matched pre- and post-workshop concerns, we categorized each participant’s concern as ‘dispelled’ if it was mentioned pre-workshop but not mentioned post-workshop, ‘lingering’ if mentioned on both, or ‘raised’ if mentioned after but not prior to the workshop. Results are presented in Table 6.
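This categorization is, per participant, a simple comparison of the sets of concern codes mentioned before and after the workshop, as the sketch below illustrates. The function name and example codes are ours, for illustration only.

```python
# Sketch of the concern-categorization logic: given the sets of concern
# codes a participant mentioned before and after the workshop, classify
# each code as dispelled, lingering, or raised. Names are illustrative.
def categorize_concerns(pre: set, post: set) -> dict:
    status = {}
    for concern in pre | post:
        if concern in pre and concern in post:
            status[concern] = "lingering"
        elif concern in pre:
            status[concern] = "dispelled"
        else:
            status[concern] = "raised"
    return status

# A participant who worried about coverage beforehand but only about
# student resistance afterward:
print(categorize_concerns({"coverage"}, {"student resistance"}))
# -> {'coverage': 'dispelled', 'student resistance': 'raised'}
```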

Table 6 Comparisons of pre- and post-workshop concerns from open-ended comments

Prior to the workshops, coverage of material when using IBL methods was the most common concern (46 participants). Coverage is an issue with IBL since instructors spend more class time involving students in deep inquiry, and typically cover less breadth of content than they are accustomed to doing in lecture courses (Yoshinobu and Jones 2012). Participants mentioned feeling pressure to cover certain topics based on collegial expectations, standardized tests, and subsequent course requirements. They also felt pressure to expose science and engineering students to a large set of computational techniques, rather than seeing the learning goal for these students as conceptual understanding. A large number of coverage concerns were dispelled (31 participants), while smaller numbers were raised (14 participants) or lingering (15 participants). Organizers and participants often discussed coverage at the workshops, yet it remained a concern for many participants. Concerns about student resistance were also common on both surveys (29 raised, 21 dispelled, 10 lingering). The third most common concern was participants’ own lack of skill to implement IBL (23 raised, 8 dispelled, 7 lingering).

The interview data echoed these themes but also provided more detail about the factors that affected participants’ actual implementation of IBL, rather than just difficulties they worried they might face. We organized these interview comments into three groups: supports, barriers, and areas of alertness. We differentiated ‘barriers’ and ‘areas of alertness’ by whether the factors actively discouraged instructors as they tried to implement IBL (barriers) or were merely issues about which they wanted to be conscientious while implementing (areas of alertness); some factors were included in both groups based on how participants discussed the issue.

Interview participants made 96 comments about barriers to implementing IBL in their classrooms. The most common were student resistance (12 interviewees, 22 comments), instructors’ fears (9 interviewees, 18 comments), and tenure/evaluation concerns (8 interviewees, 11 comments). Some instructors commented that student resistance was lower when they made an effort at the beginning of the term to ‘market’ the course, that is, to inform students what an inquiry-based course would be like and why the instructor had chosen teaching methods that, for many students, differed from the more familiar lecture-based courses. Instructors shared their own fears, such as “IBL is hard,” or being scared of “relinquishing control” of their classroom.

Implementation supports (16 interviewees, 126 comments) were mentioned slightly more often than barriers. The most commonly discussed was departmental support (16 interviewees, 45 comments), but additional professional development (15 interviewees, 38 comments) and IBL mentors or colleagues (12 interviewees, 34 comments) were also viewed as helpful by instructors for implementing IBL. Participants perceived departmental support in different ways. Some felt a general freedom or openness to innovation within their departments, while others described specific supports, such as other instructors using IBL at their institution. This helped because students were already used to, and expected, IBL techniques in their mathematics classes, so they were less likely to resist. On the follow-up surveys, 85 to 90 % of participants reported at least moderate support from each of three distinct but important groups: departmental colleagues, department chairs, and provosts or deans.

The most common factors cited by interviewees as shaping their implementation were not necessarily barriers or supports, but simply things they had learned to be alert to when implementing IBL. All 16 participants commented on these factors, making a total of 417 comments. These included topics such as finding or creating IBL-appropriate materials (16 interviewees, 62 comments) and the different role of the instructor in an IBL class as compared to that in a lecture class (9 interviewees, 43 comments). Situational considerations included deciding how to implement IBL depending on the level (15 interviewees, 39 comments) or size of the class (15 interviewees, 28 comments). These factors highlight aspects of IBL teaching that differ from lecture-based teaching and thus require extra thought and consideration.

Although workshop organizers had proposed to follow up with participants, only the organizers of the third workshop actively did so, formally engaging participants through an email listserv for 1 year following this workshop. Participants and organizers both contributed to discussion on the group list, sending a total of 191 messages. Of these messages, 19 were sent by the workshop organizers specifically to prompt participants to contribute. These prompts were often followed by flurries of listserv activity, as participants used the list to check in and cheer each other on, share ideas, and pose and respond to difficulties individual instructors were facing with implementing IBL in their own classrooms. By the end of 1 year of formal email follow-up, 62 % of workshop attendees had sent at least one message to the list. Workshop organizers also provided individual consultation and feedback to a number of participants, but we did not track this activity.

Overall, almost half (48 %) of respondents from this third cohort reported that the email list was either a ‘great help’ (32 %) or ‘much help’ (16 %). Only 12 % of respondents felt the email list was not helpful. Many respondents (74 %) from this workshop also said they kept in touch with other participants, a proportion that was slightly higher (but not significantly higher) than the same proportions for the first two cohorts, 62 and 61 % of respondents, respectively. We also compared directly-reported implementation levels across all three cohorts (none, some methods, one course, more than one course) with survey items related to follow-up activities and support from colleagues, but found no significant differences among workshop cohorts.

Discussion

To help frame and interpret the results, we use Paulsen and Feldman’s (1995) three-stage theory of instructor change. It helps to explain the process by which instructors made a transition from more traditional, lecture-based approaches to a more inquiry-based approach through the stages of (1) unfreezing, (2) changing, and (3) refreezing.

Unfreezing

Broad Definitions and Impact on Teaching Practice

Overall, these workshops were effective in helping instructors through this transition as they adopted IBL teaching practices. Among workshop participants, there was a high rate of uptake of IBL approaches, reported directly (at least 58 % of attendees), and indirectly through changes in teaching practices from pre-workshop surveys to 1-year follow-up surveys. While IBL may involve both pedagogical and curricular changes, we focused on pedagogical changes for two reasons. First, pedagogical changes were the focus of the workshops, and second, curricular changes may develop more slowly as instructors select and reframe topics to highlight in students’ inquiry work, so they cannot be measured as effectively in a short timeframe.

‘Core IBL’ practices are found in all variations of IBL that were communicated in these workshops, and indeed these showed significant changes in instructor use. These included decreased use of instructor-led activities of lecturing and solving problems on the board, and increased use of student-led activities including whole-class discussions, small group discussions, and student presentations of problems or proofs. The frequencies of use of ‘preference IBL’ practices, including instructor-led discussions and students working in groups, showed non-significant increases, suggesting that a minority of instructors made use of these in implementing IBL.

Instructor-reported frequencies of other forms of active learning that are not necessarily characteristic of IBL remained quite consistent from pre-workshop to 1-year follow-up. These included instructors asking conceptual questions leading to generalizations, students solving problems alone, students writing in class, and students using computers. The lack of change in instructors’ use of these methods suggests that respondents are not making general or broad claims about their use of student-centered learning approaches that they may perceive as socially desirable, but are instead selectively reporting specific practices they actually used.

The distinctions between types of teaching practices are important in light of Paulsen and Feldman’s theory of instructor change. Their theory suggests that during the unfreezing stage, instructors gain motivation to change when certain criteria are met, notably, psychological “safety.” This occurs when instructors can envision ways to change that achieve their desired outcomes in a manner consistent with their self-image (Paulsen and Feldman 1995). While changes in ‘core IBL’ practices were common for most participants, the freedom to choose whether and how to incorporate ‘preference IBL’ practices may be important to meeting this safety criterion. Comments from the interviews supported the importance of choice. For example, one participant was struck by “how enthusiastic everyone [at the workshop] was about teaching and helping other people learn what IBL is about and how to integrate it into your classroom.” However, he “tuned out” one presenter who he found “aggressive” in communicating that “this is the only way to go, and that if you don’t do this, then it somehow diminishes your classroom.” Another participant explained that seeing IBL as a spectrum of related practices “was kind of a big moment for me because it made it seem less scary. …Feeling like I can pick and choose aspects of it, and find something on the spectrum that I feel comfortable with, was empowering.”

These findings suggest that portraying IBL as a broad, inclusive set of practices, rather than a rigid and prescriptive method, may be essential for helping new instructors during the unfreezing stage, as it helped them to envision a way to change their teaching that was consistent with their own self-image and thus felt safe. This also gave participants the freedom to use a “hybrid” style whereby they incorporated some IBL strategies into a more traditional class, offering a more feasible and less daunting entry into IBL that could then lead to “full IBL.” Biology education researchers have called this process “phased inquiry” and suggest that it is “an important step toward expanding adoption of inquiry practices in college science courses” (Yarnall and Fusco 2014, p. 56). However, further longitudinal research is needed to explore how teaching practices change after instructors take these initial steps to incorporate “hybrid” IBL.

Diverse Viewpoints and Context of Implementation

In addition to portraying IBL as a broad, inclusive set of practices, it was also important for workshop attendees to see IBL being used in a variety of settings. Interview participants described a number of situational factors that led them to vary the IBL strategies they used, depending on the level (first-year, sophomore, etc.), size, or audience (mathematics majors, pre-service teachers, etc.) of their class. As one interview participant explained, seeing a diversity of IBL practices, practitioners, and situations was important because it was “frustrating” when one presenter “had so many resources at their disposal that the rest of us didn’t have, …how many graders and TAs they have and how they keep the class size small. These were things that just don’t apply to most universities.” Other participants made positive comments about the variety of opinions and viewpoints shared in the workshop, such as one who identified the best aspect of the workshop as offering,

“a good diversity of ideas and approaches which I feel that I can adapt to my own teaching. As an inexperienced IBL user, I was very interested in learning from experts, but I was also interested in meeting people in my situation who I can identify with and hearing how they have worked through the same problems that I have.”

Another participant commented that the workshop “gave me more ways and more tools to introduce IBL into [lower-level and pre-service teacher courses].” As a result, he was able to incorporate IBL methods into classes he had previously thought could not be taught with IBL.

From their studies of physics education reform, Henderson and Dancy (2008) recommend providing instructors with easily modifiable curricular materials, so that individual instructors may use their expertise to adapt the materials to their own local environments. While their recommendation applies to reforms focused on curricular materials, our findings suggest that this feature of easy portability may also be important for sharing primarily pedagogical strategies such as IBL. Showing diverse examples of IBL may have helped participants to see how to customize IBL for their individual context and thus made implementation more likely.

The workshop leaders’ choice to present IBL as a variety of related approaches was inviting for participants, but it does raise the question of how faithfully their implementations preserved IBL. Studies in physics (Dancy and Henderson 2010) and biology (Yarnall and Fusco 2014) have reported that instructors often adapt and modify research-based teaching strategies, usually in ways that align more with traditional methods and reduce the amount of student inquiry. However, IBL in mathematics may be somewhat robust to variation, as student outcomes are improved over traditional courses despite notable variations in how IBL is implemented (Laursen et al. 2014). Portraying IBL as a spectrum of related practices may have helped participants by outlining ways in which they could modify IBL methods to fit their context while still maintaining its core features, including high levels of student inquiry.

Measures of fidelity of implementation may examine either ‘fidelity of structure,’ meaning adherence and duration of use, or ‘fidelity of process,’ including quality of delivery, and program differentiation – “whether critical features that distinguish the program from the comparison condition are present” (O’Donnell 2008, p. 34). For pedagogical innovations like IBL, fidelity of process may be more important than fidelity of structure. We suggest that professional development that communicates broad, inclusive definitions of IBL in mathematics not only increases instructor uptake, but may also help to maintain fidelity through outlining allowable modifications that preserve the core principles of the approach and protect fidelity of process. By explicitly recognizing acceptable variation in practice, the workshops may have reduced the likelihood that instructors would make modifications that would reduce the amount of student inquiry and veer back toward traditional methods. Rogers (2003) argues that this process of “reinvention” is an asset in spreading innovations, as it helps to reduce mistakes, fits the innovation to local contexts, and makes the innovation more responsive to changing conditions. Communicating broader and more inclusive definitions of student-centered strategies in other STEM disciplines may help instructors to adapt these methods to their classes while maintaining high levels of student inquiry.

Changing: Features of the Workshops and Participant Learning

The workshops supported instructors through the changing stage by providing participants with information about IBL, opportunities to view IBL in action, discussions with experienced practitioners, and, in some cases, collaborative work time for participants to develop their own IBL courses. The week-long duration of these workshops, as well as thoughtful scheduling of breaks and free time, were important features that supported participant learning by allowing ample time for participants to process and reflect, as well as to revisit topics throughout the week. Indeed, participants did report increased knowledge and skills following these workshops.

The three most common concerns participants mentioned in open-ended comments (Table 6) are areas where instructors new to IBL struggle and where they may need the most help during the changing stage. These concerns included lack of skill to implement IBL, content coverage, and student resistance. Efforts to encourage mathematics instructors to adopt IBL or similar strategies will need to address these three concerns and provide strategies to help manage them.

The large number of concerns about instructor skill that were raised (23) following the workshop may indicate learning rather than unmet needs. Some comments on the pre-survey support this interpretation, such as “[I’m] not familiar enough with it to have concerns.” But, as participants gained more familiarity with IBL during a workshop, they also learned more about the particular challenges that come along with its use.

In fact, for all but two of the topics mentioned by respondents, the number of new concerns raised was greater than or equal to the number of concerns dispelled or lingering. Moreover, of the concerns shared on the pre-workshop surveys, 72 % were dispelled and only 28 % lingered. That is, while many new concerns were raised on the post-workshop surveys, relatively few of the concerns already on instructors’ minds before the workshop persisted. Again, this suggests that instructors were learning rather than expressing needs the workshop had not addressed. The high rate of IBL implementation indicates that participants did not perceive the remaining concerns as great enough to deter them from using IBL.

Refreezing: Ongoing support

Ongoing support through the refreezing stage was especially challenging since these workshops served instructors from geographically diverse institutions, who could not easily reconnect in person. However, Fairweather (2008) states that “external networks of like-minded colleagues… can be important forces in promoting instructional reform” as they help instructors to find supportive colleagues (p. 27). Workshop organizers for the third cohort were able to effectively provide participants with ongoing support through a group email list. Participants from this workshop reported higher implementation rates than those from the other two workshops, and many reported that the email list was helpful. (Ongoing research with other workshops that incorporate structured follow-up support is testing whether this pattern continues and how participation may relate to high or low implementation.) Diversity in participants’ institutional origins may have served as a benefit, if participants felt more secure asking questions or sharing difficulties with outside colleagues than with departmental colleagues involved in their tenure or promotion process. This may have been especially relevant given the high proportion of pre-tenure faculty who attended these workshops (35 %) and the number of concerns shared about evaluation and tenure decisions. Support and encouragement from fellow workshop attendees may also have supplied the positive feedback essential to the refreezing stage.

Organizers also helped connect participants with other IBL colleagues through invitations to an annual IBL-focused conference. Participants supported each other directly, as some reported staying in contact after the workshops through email, or in one case, by forming a small, regional IBL group. While such face-to-face connections were not widespread, they do represent possible ways to support more individuals with refreezing after workshops are over and participants have returned to their home institutions. Other options may include hosting occasional online, themed discussions or involving teaching and learning centers at participants’ home institutions. The New Faculty Workshops for chemistry professors include these approaches as follow-up support thought to increase the impact of their workshops (Stains et al. 2015).

Limitations

While these findings are encouraging, they do come from self-reported data. Previous studies have questioned workshop participants’ ability to self-assess their levels of skill with and knowledge of the practices the workshops aimed to teach (D’Eon et al. 2008). One study found that most biology instructors reported using more student-centered techniques after attending a professional development workshop, but observations using an inquiry-focused protocol revealed that their classrooms were still largely lecture-based (Ebert-May et al. 2011). However, these researchers found that the extent of student-centered teaching was inversely related to both teaching experience and class size. In our sample, about half (47 %) of the teachers had 5 years or less of teaching experience, and 94 % of the classes they reported on enrolled 35 students or fewer, making them similar to the groups with higher implementation rates in the Ebert-May et al. study.

Our sample may also be comparable to a sample of instructors from workshops designed for physics and astronomy faculty members in their first few years of teaching (Henderson 2008). Following these workshops, only 1 % rated their teaching as “highly traditional,” while the majority (roughly 60 %) reported that their teaching was “mostly traditional with some alternative features.” These participants’ self-reported ratings were corroborated by ratings from their department chairs.

A recent high-level report (AAAS 2012) acknowledged that measuring undergraduate teaching practices remains a difficult endeavor, but also provided examples for how to avoid some common biases in self-reported data. Our survey design is consistent with these examples; for example, we selected a variety of teaching practices, i.e., behaviors, and assessed frequency of use, rather than asking for self-rating of skill or expertise with the practice. In addition, we used three approaches in order to triangulate participants’ reported implementation: direct report, indirect report, and interview; and we found high agreement among them. These three methods all rely on self-report by the same instructors; their reports are self-consistent but may nonetheless be limited in objectivity. Clearly further research is needed to address the validity of self-reported data about teaching strategies by comparison with observation or other external sources of data. Further research is also needed to identify factors that influence the level of implementation for workshop audiences of varied types.

Regardless of the method of data collection, measuring the effect of faculty development workshops on participants’ teaching practices is difficult. In fact, there are few observational studies in higher education, and many self-report-based evaluations do not measure anything beyond participants’ immediate satisfaction with the programs (Felder et al. 2011; Council of Scientific Society Presidents 2012). This study does not seek to capture the full complexity of teaching (Hora and Ferrare 2013) nor does it intend to assess participants’ degree of skill in implementing IBL approaches. Rather, we argue, while the process of becoming skilled with a new teaching style like IBL may take years, the first steps may be shifts in instructors’ choice of instructional strategies. [For examples of this process, see (Gonzalez 2013; Retsek 2013).] Specifically, if the workshops were effective in shifting instructors’ practice, we expected to see decreased use of instructor-centered activities and increased use of student-centered activities—whether or not these were yet implemented with high skill. Therefore, capturing participants’ initial efforts to incorporate IBL practices into their teaching offers a measure of the first type of change to instructors’ practice that may be anticipated as a result of professional development.

Moreover, changes in instructors’ choice of specific teaching strategies can be detected in the short term, after their initial efforts to implement. More nuanced measurement of skill levels and the effects on student learning may not be observable until participants have had more time to practice and hone their craft—well beyond the typical time frame of studies intended to evaluate impact of professional development. Future research should examine instructors’ adoption of inquiry-based learning over longer time frames and the role played by ongoing support from workshop leaders and colleagues.

Certainly one influence on the high implementation rate seen here is that these participants were volunteers who were already motivated to use IBL, and indeed, some had already tried it. Moreover, while, on average, participant motivation levels did not change in the long term, motivation did spike immediately following the workshop. This motivational spike may be instrumental in getting participants to start implementing new student-centered strategies in their own classrooms and to continue learning more on their own (Henderson 2008). Generating motivation may be a bigger challenge among instructors who are compelled to participate, for instance in K-12 settings where professional development is often required in order to comply with state and federal standards (Wei et al. 2010).

In addition to internal motivation, external resistance is often cited as a barrier to implementation of student-centered approaches. While some participants in this study worried about resistance to IBL within their departments, on follow-up surveys most reported they had supportive colleagues. Moreover, interview participants made more comments about supports for implementation than about barriers. The fact that all participants could commit a week to attend an IBL workshop suggests that most likely had some explicit or implicit support from their colleagues and departments, or at least worked in an environment open to innovation. Participants may have experienced resistance more indirectly through tenure processes that dissuade innovation in teaching (Brownell and Tanner 2012), or through norms related to course content and those implicit in shared course syllabi (Hora and Anderson 2012).

The participants at these workshops were already motivated to implement IBL and worked in supportive, open environments; that is not true for all mathematics instructors. Yet getting interested instructors to apply IBL teaching methods is an important first step toward wider uptake. Even at the K-12 level, there is little research on teacher professional development involving non-volunteers (Bobrowsky et al. 2001), so learning from the experiences of motivated, supported participants may provide valuable lessons for expanding the use of IBL among instructors who are initially less familiar with, less supported in, or less motivated to use these methods. Indeed, motivation is likely an even bigger challenge with non-volunteer participants, and the findings on unfreezing presented here may be especially relevant for these groups.

Implications

Due to the central role of mathematics in many college majors, improving mathematics instruction by fostering broader uptake of IBL and similar evidence-based teaching strategies will have positive ramifications for a very large number of students. Measuring instructor adoption of these teaching strategies is challenging, as self-report measures may be biased and observation is expensive and difficult. However, self-report measures can be carefully developed to reduce bias by focusing on retrospective, anonymous, and non-evaluative reporting of behaviors. Self-report measures are also well suited to allow researchers and evaluators to measure initial changes in behavior within the short timeframes and small budgets of many grant-funded projects when observation may not be feasible.

Our findings imply that future efforts to spread IBL must prepare instructors for dealing with three main concerns: lack of instructor skill to implement IBL, student resistance, and content coverage. Although these recommendations derive from workshops espousing IBL methods in mathematics and taking a particular form, they are applicable to other inquiry-based teaching strategies as well. For example, student resistance is a concern with any teaching strategy that diverges from the traditional lecture-based courses to which students are accustomed (Seymour 2005; Welch 2012), and content coverage has been identified as a major concern with inquiry-based instruction in biology (Yarnall and Fusco 2014).

Across disciplines, workshops are seen as an effective professional development strategy. In fact, NSF program directors interviewed by Khatri and colleagues (Khatri et al. 2013) regard “multi-day, immersive experiences with follow-up interaction with the PI as participants implement the new strategy” as the most effective propagation strategy for educational innovations, and a recent report on improving engineering education likewise lists faculty development as a critical strategy (Jamieson and Lohmann 2012). Our findings support this view with evidence that multi-day, immersive workshops contributed to high rates of implementation, especially when paired with strong and collegial follow-up support. Communicating broad, inclusive definitions from a diverse group of workshop facilitators is also important and clearly related to the workshops’ impact on participants’ adoption of IBL teaching approaches. Defining student-centered teaching in any discipline by its core features and desired outcomes rather than through rigid use of specific techniques may lead to increased adoption as well as help to maintain fidelity of implementation by providing options for how instructors can adapt the strategies to fit their own classes while maintaining high levels of student inquiry.