Introduction

Organizations face two kinds of challenges: technical and adaptive (Heifetz, 1994). Technical challenges are clearly defined, and an expert can readily implement a solution; adaptive challenges, by contrast, are difficult to define and have few readily implementable solutions (Heifetz, 1994). Many of the challenges facing today’s educational organizations are more adaptive than ever before (Duke, 2016), from redressing decades of inequitable learning opportunities (Louis & Khalifa, 2018) to ensuring students have the knowledge and skills needed to successfully meet the challenges of the next grade level (Williams et al., 2018). These two examples typify many of the extant adaptive challenges in education: they likely necessitate shifts in practice as well as in values and beliefs, require coordinated cross-organizational work, and often entail consistent and systematic experimentation over time to identify effective solutions (Heifetz & Laurie, 1997). In other words, there are few easy fixes.

In the United States (U.S.), the burgeoning complexity of the country’s educational issues exists within a policy environment that continues to emphasize regulatory compliance, high stakes standardized testing, and rapid improvement (Trujillo & Renée, 2015). Yet, the thinking, approach, and action steps needed to address increasingly adaptive challenges (e.g., Henriksen et al., 2017) do not comport with such a policy environment. Indeed, the policy environment continues to prompt numerous educators—teachers and administrators alike—to view change efforts as a series of technical challenges with quick fixes (e.g., Datnow & Park, 2018; Ho, 2008). As a result of this persistent misalignment, scores of U.S. policymakers and educators continue to reform local educational agencies (LEAs) (e.g., schools, districts), but those reforms have led to few sustained improvements, especially in urban contexts (e.g., Duke, 2016; James et al., 2016; see also Cohen & Mehta, 2017 and Payne, 2008).

A recent and expanding body of work from scholars and practitioners alike (e.g., Mintrop, 2016; Starr, 2019) argues that educational organizations should adopt design-based approaches to organizational change. With roots in organizations outside of education (e.g., product design and manufacturing firms), these kinds of approaches promote the pursuit of change through a process of identifying organizational members’ specific challenges and then understanding potential root causes, collaboratively generating solutions, and iteratively testing and refining solutions to address those challenges (e.g., Henriksen et al., 2017). Such an approach holds promise to develop people’s “adaptive capacity” (Heifetz, 2010, p. 73) to identify and address adaptive challenges.

While there are numerous strategies to implement a design-based approach to change (e.g., research-practice partnerships, plan-do-study-act [PDSA] cycles; see Bryk et al., 2015; Deming, 1986; Hinnant-Crawford, 2020), design thinking is one of the most prevalent. Design thinking offers a structured process for identifying and addressing organizational needs (Brown, 2009; Kelley & Kelley, 2013; Plattner et al., 2012). Despite increasing calls for educators to use design-based strategies like design thinking (e.g., Nash, 2019), little empirical research has investigated design thinking’s actual use in educational organizations generally and in change efforts specifically (e.g., Hubbard & Datnow, 2020).

The present study responds to this gap by leveraging an existing research-practice partnership between a U.S. state education agency (SEA) and higher education institution (HEI). The SEA launched a professional learning workshop focused on design thinking with the intention of prompting participating educators to consider both technical and adaptive challenges in their organizations and collaborate with a team of colleagues to identify and start to address one (ideally adaptive) challenge. This workshop was part of the SEA’s larger, statewide strategy to enhance educator capacities to improve their LEAs.

Grafting this research onto the SEA’s work, the broad purpose of the present study was to shed light on the understudied phenomenon of educators using design thinking for change efforts. The present study is not an evaluation of the workshop, but rather a rigorous, empirical investigation of participants’ experiences with the workshop and its topics. Specifically, we asked the following research questions:

  1. To what extent do educators engaged in change efforts ascribe value to design thinking?

  2. How, if at all, does design thinking influence educators’ individual and collective abilities to engage in change efforts?

This U.S.-based study is timely given the growing number of educational organizations around the world allocating resources towards design thinking (e.g., Sarooghi et al., 2019) and persistent calls to address deep-seated, structural challenges in education (e.g., Shirley, 2019). Our results suggest that design thinking presented a novel way to approach school and district change efforts and prompted shifts in participants’ mindsets about change efforts. Yet, many participants continued to identify technical rather than adaptive challenges and remained skeptical about whether design thinking would help sustain change efforts over time. These results, which provide insight into how educators interacted with a design-based strategy, bolster a thin literature on the use of design thinking in LEAs.

The rest of this paper proceeds as follows. To set the stage, we articulate our conceptual framework, which is rooted in Heifetz’s (1994) technical and adaptive challenges model and transformational learning experiences (Mezirow, 1997). We then review relevant literature on change in U.S. schools and districts, change and professional learning, and design thinking. A description of data collection methods and data analysis strategies follows along with a report of results. Finally, we close by discussing results and noting implications of the present study along with recommendations for much-needed future research.

Conceptual framework

To frame this U.S.-based study, we draw upon two concepts that pertain directly to educational change efforts: (a) technical and adaptive challenges, and (b) transformational learning experiences. As introduced above, the first concept stems from Heifetz’s (1994) work, which asserts that organizations contend with technical and/or adaptive challenges when seeking to change. A technical challenge is easy to define and address, even if it demands considerable resources to surmount (e.g., new learning, personnel, or tools). For example, a school’s paper-based master scheduling system may be error-prone, providing students with outdated information about progress towards graduation. To address this challenge, the school deploys a new online master scheduling system that offers students real-time access to course history and graduation requirements. This kind of solution aligns with what scholars term first-order change: incremental efforts that seek to improve what already exists (Watzlawick et al., 1974; see also Argyris & Schön, 1978).

An adaptive challenge, on the other hand, is difficult to define and address. For example, a school might have persistently high rates of chronic student absenteeism. To address this kind of challenge, educators need to gather and analyze data about why students are absent and why prior solutions related to parent communication and attendance incentives, for instance, are no longer working. This kind of approach calls for second-order change, which—unlike first-order change—necessitates thinking and acting in ways that challenge “the way we do things around here” (Deal & Kennedy, 1982, p. 231).

Second-order organizational change efforts—those needed to address adaptive challenges—require people to alter their “schemata,” which are the “organizing frameworks” they use to understand, interpret, and respond to the world around them (Bartunek & Moch, 1987, p. 484). Mezirow (1978) suggests that transformational learning can facilitate the alteration of people’s schemata and shift their ways of thinking about and interacting with the world. Clark (1991) describes this kind of learning as occurring along three dimensions: (a) the psychological, where people develop more nuanced understandings of themselves; (b) the convictional, where people adjust their belief systems based on interacting with and reflecting upon new ideas; and (c) the behavioral, where people revise how they act because of their new understandings and beliefs. Experiences that foster transformational learning are akin to cognitive rewiring (Hayek, 1945; Strong, 2013), meaning those experiences aim to shift mindsets first so that, later, people use that new mindset to shift their practices.

Applying these concepts to the present study, SEA officials theorized the design thinking workshop as a way to help participants better identify and address adaptive challenges in their LEAs in order to make change (Nash, 2019). Design thinking was intended to challenge—and, ideally, shift—participants’ schemata about organizational change efforts in schools and districts (Kelley & Kelley, 2013). To move participants towards this “perspective transformation” (Mezirow, 1978, p. 107), though, workshop programming needed to target Clark’s (1991) abovementioned dimensions of transformational learning: participants needed to reconsider themselves and their values before being able to take new actions based on those reconsiderations.

Yet, major structural and cultural aspects of educational systems—especially those in the U.S.—can stymie work intended to change educators’ schemata and resulting practices (Murphy, 2015). In the next section, we review relevant literature on several of these aspects, such as common practices and norms along with educator professional learning, and speculate how the design thinking process may enable educators to push back on these powerful aspects.

Review of relevant literature

Traditional educational change in the United States

We start this review by introducing what we term the traditional change paradigm. Underlying many U.S. educational change efforts are a series of practices and norms that are largely perceived to be central to the very nature of schooling. Tyack and Tobin (1994) refer to these practices and norms as the “grammar of schooling” (p. 454) because they shape, structure, and constrain schooling just as grammar shapes, structures, and constrains language. This grammar—and its constituent practices and norms—persists because it provides a framework in which educators can carry out their duties in ways that are expected. Moreover, Cohen and Mehta (2017) suggest that educational change efforts are most likely to be successful when they do not require educators to operate outside of this grammar of schooling—that is, when change efforts comprise incremental modifications to existing practices and norms rather than paradigmatic shifts in understandings, beliefs, and/or actions (i.e., Clark’s, 1991 three dimensions of transformational learning).

For the purposes of the present study, we highlight two practices and two norms that root the traditional change paradigm, especially in U.S. schools. The first practice centers on the use of student scores on standardized tests to measure school performance and drive educational change efforts. A number of school accountability, inspection, or performance policies around the world call for increases in student test scores, and schools that fall short may be labeled “inadequate” or “failing” or may be subject to closure (e.g., Saw et al., 2017). These policies presume that the threat of sanctions helps focus schools’ change efforts and leads to improved performance (Ehren et al., 2015). A second practice is schools creating a plan to develop and guide change efforts, which often takes the form of a school improvement plan (SIP) or school development plan (SDP). Plan development tends to follow the rational planning model, which posits that if a school drafts a plan identifying goals and implementing strategies oriented towards achieving those goals, then school performance (e.g., student test scores) will improve (Beach & Lindahl, 2007). As a result of these practices, educational change efforts tend to be focused on a limited set of narrow school performance measures.

In addition to using student test scores and planning, two norms also help root the traditional change paradigm. The first is the norm dividing teaching and administration, which asserts that the main job of teachers is to teach, and the main job of administrators is to lead (Murphy et al., 1987). Planning for change efforts tends to involve little actual teaching, as teaching is normally conceptualized (e.g., classroom instruction). Thus, the work of planning for change falls to administrators who devise plans that teachers are then expected to implement. A second norm is the managerial imperative, which asserts that any work in schools occurring outside classrooms is the domain of administrators rather than teachers (Cuban, 1988). While some change efforts occur in classrooms (e.g., implementing curriculum), the development of those efforts occurs outside classrooms (Duke et al., 2013). This arrangement reinforces the idea that administrators—not teachers—are responsible for planning change efforts. As a result of these norms, educational change efforts tend to be developed and driven mostly by administrators and “pushed down” (Lyons et al., 2013, p. 17) to teachers and other staff members, which can lead to job dissatisfaction, burnout, and eventual departure from the education profession (Ingersoll & May, 2011; Nguyen et al., 2020; see also Academic Development Institute, 2022, which reports findings from the annual North Carolina Teacher Working Conditions Survey).

Layered into these practices and norms is the increased expectation that education-focused governmental entities like SEAs provide technical assistance to individual schools and districts—a mandate evident in numerous U.S. federal educational policies (Egalite et al., 2017). One goal of technical assistance is to enhance the capacity of LEAs like schools and districts so that, eventually, LEAs exercise more autonomy and ultimately direct their own change efforts (Howley & Sturges, 2018). However, SEAs often lack the capacity necessary to augment the capacity of the LEAs they oversee, and the nature of SEA assistance tends to be isomorphic, even across varied geographic, socioeconomic, and resource contexts (Brown et al., 2011; Calkins et al., 2007). Specifically, SEAs tend to employ similar practices in service of educational change efforts, such as providing information on best practices, coordinating professional learning for school administrators and teachers, and creating guidance on working with external providers (e.g., McMurrer et al., 2011).

Taken together, these practices, norms, and expectations reveal a pattern in both the inherent challenges of schooling that require change and the typical characteristics of those change efforts. School contexts are increasingly complex in terms of need and capacity, as well as the myriad contextual factors that shape students’ readiness for learning. The modal teacher is also perhaps less experienced and less academically accomplished than at almost any point in the history of U.S. schooling (García & Weiss, 2019); yet, these teachers often bear more responsibility not only for student learning gains but also for tasks adjacent to traditional schooling, such as students’ social-emotional well-being (e.g., Bartlett, 2004; Ransford et al., 2009).

Returning to Heifetz’s (1994) framework, these realities come together to create a series of adaptive challenges with no clear, “silver bullet” solutions (Eisner, 1992, p. 722). While adaptive challenges of schooling may be similar across contexts—low student achievement, teacher burnout and turnover, community distrust of schools—the solutions needed to address these challenges tend not to be standardized. Indeed, an examination of the traditional change paradigm shows that, for decades, technical solutions have been applied to adaptive challenges that such solutions are unlikely to ameliorate (Hargreaves, 2004; Harris, 2013). This misalignment, which the grammar of schooling helps sustain (e.g., Cuban, 2020), is one of the reasons why scores of educational reform efforts in the U.S. have led to little sustained change (Duke, 2016; James et al., 2016). To foster the kind of activity needed to identify and address adaptive challenges, Heifetz (2010) argues that people need to develop their “adaptive capacity” (p. 73). One way to build this capacity—especially for practicing educators—is through professional learning, which the next section discusses.

Professional learning to build adaptive capacity

Adaptive challenges, by definition, have few, if any, readily implementable solutions. Organizations that can identify and address adaptive challenges have leaders who create the conditions for individuals to collaboratively brainstorm, craft, and implement possible solutions (e.g., Heifetz et al., 2009). Professional learning that equips educators to understand and address adaptive challenges is most likely to occur in LEA contexts where administrators (e.g., principals) commit to increasing their staff members’ social, emotional, and interpersonal capacities (Drago-Severson et al., 2014). That is, LEA administrators cultivate a workforce skilled in adaptive problem-solving because they recognize those skills as essential to organizational change efforts. Additionally, because adaptive challenges are unique to particular organizations, addressing them requires situated learning (Brown et al., 1989; Lave & Wenger, 1991). Educators who take on adaptive challenges do best when that process is collaborative, involves gathering data specific to the organization and its particular challenges, and uses resources already inherent to the organization (Tyre & Von Hippel, 1997). This kind of professional learning, then, exhibits the characteristics of efficacious professional learning that is most likely to result in successful and sustained organizational change (Dobbs et al., 2017).

Professional learning (or, in some contexts, professional development) can be characterized as an approach to organizational problem-solving (Eraut, 2012). In LEAs, professional learning typically helps to strengthen professional capacity, systematize processes, or undertake investigations of data for eventual instructional decision-making (e.g., Osmond-Johnson & Campbell, 2018). Much educator professional learning in the U.S., however, has long been toothless in the service of successful and sustained change efforts because it fails to meet teachers’ needs, tends to be decontextualized, and lacks ongoing implementation support (Dobbs et al., 2017; Hubbard & Datnow, 2020). Heifetz’s technical and adaptive challenges model is thus a helpful lens to (a) understand the nature of challenges inherent to today’s LEAs and the solutions needed to tackle those challenges, and (b) help organizations (e.g., SEAs) and administrators (e.g., principals) structure professional learning that creates the conditions for teams of educators to “confront existing values and norms” (Daly & Chrispeels, 2008, p. 33) and revise the “organizing frameworks” they use to develop and implement change efforts (Bartunek & Moch, 1987, p. 484). One tool that many organizations outside of education, in particular, have used to build adaptive capacity is design thinking.

Design thinking

Design thinking, which emerged from scholarship on collaborative problem-solving and engineering education (Dunne & Martin, 2006), is a process of discovery characterized by creative, innovative solutions to challenges (Wrigley & Straker, 2017). Early design-based developers realized that people’s schemata are powerful indicators of organizational problem-solving capacity, so design-based strategies like design thinking prime people to engage simultaneously in both creative and analytic processes (Razzouk & Shute, 2012).

The design thinking process consists of five stages: empathize, define, ideate, prototype, and test (Plattner et al., 2012; see Fig. 1). The first stage (empathize) involves acquisition of a rich understanding of the needs of organizational members. The second stage (define) requires collective sensemaking around the varying needs to identify a challenge and potential root causes of that challenge. The third stage (ideate) champions a “nothing is off the table” or “yes, and” stance where all strategies that might address the challenge, especially novel ideas, are considered. The fourth stage (prototype) is a call to action. With a context-specific challenge defined and an established set of potential strategies, something needs to be done—be it developing a program or shifting norms. This “something” is a prototype. In the final stage (test), a prototype is implemented in its context and organizational members gather feedback on implementation to examine the extent to which the prototype addresses the challenge.

Fig. 1

The design thinking process. Note. The five stages of the design thinking process, which typically proceeds linearly from Stage 1 to Stage 5. Dashed arrows, however, indicate the process’s flexibility depending on organizational and member needs. Developed from Plattner et al. (2012)

Definitionally, design thinking is iterative, solution-oriented, and evidence-based (Liu, 1996). The process contains both content and process factors—that is, procedural elements that combine sequential activities with sufficient flexibility for application in diverse organizational contexts (Owen, 2007). From its origins in engineering, design thinking is now commonly used in fields like information technologies, communications, and consumer services, often helping organizations identify and address challenges related to product development, knowledge transfer, and team creativity (Schmiedgen et al., 2016; see also Razzouk & Shute, 2012). Those engaging with the design thinking process have reported several benefits to their organizations, colleagues, and themselves, including greater inter-unit cohesion, more efficient decision-making, and more competitive marketplace positioning (Schmiedgen et al., 2016).

Design thinking’s use in education

Despite its prevalence in other fields, design thinking remains a reasonably new phenomenon in education (Hubbard & Datnow, 2020). Over the past decade, an increasing number of practitioners and researchers have called for educators to engage with design thinking to address a range of issues, such as breaking down the “research/practice barrier” (Fishman et al., 2013, p. 136) and fostering collaboration to implement change efforts (Denver Public Schools, n.d.; Nash, 2019). Yet, outside of a handful of case studies (Hubbard & Datnow, 2020; Phusavat et al., 2019; see also Sterrett et al., 2021), the peer-reviewed, empirical research base addressing the use and efficacy of design thinking in educational organizations is rather scarce.

While current insights from educational organizations are limited, the literature on design thinking suggests that the process holds promise to help educators build their adaptive capacity (Heifetz, 2010) for engagement in educational change efforts. The structure and execution of the design thinking process may disrupt the traditional change paradigm in three notable ways. First, design thinking’s emphasis on creativity when identifying challenges (Nash, 2019) may encourage a wider view of LEAs, students, and educators that involves more than narrow outcomes like student test scores. Second, the process front loads a deep consideration of a challenge’s root causes (Henriksen et al., 2017), which counters existing research that has found root cause analyses in plans for change efforts to be shallow, of low quality, and misaligned to organizational goals (Meyers & VanGronigen, 2021; VanGronigen & Meyers, 2020). Richer root cause analyses may better enable educators to develop and implement more effective change efforts. Third and finally, design thinking calls for a high degree of collaboration among varying organizational members (Wrigley & Straker, 2017), which blurs lines between teachers and administrators established by the norm dividing teaching and administration and the norm of the managerial imperative.

Heifetz and colleagues (2009) assert that organizational change is an adaptive—not technical—process. Yet, scores of education reforms over the past decades have been replete with technically-oriented perspectives and solutions, often prompted by the powerful influences of the grammar of schooling and the traditional change paradigm (e.g., Cohen & Mehta, 2017; Cuban, 2020). Design thinking, however, offers a way for educators to develop adaptive solutions that address persistent and seemingly intractable challenges. Extant gaps in the literature, though, necessitate more investigation of the process’s use in education and by educators, especially with respect to educational change efforts. The present study was intended to start filling some of these gaps and help substantiate a thin empirical research base.

Methodology

The purpose of the present study was to analyze educators’ interactions with and use of design thinking for change efforts. To accomplish this purpose and answer our two research questions, we employed a convergent mixed-methods design (Creswell & Plano Clark, 2011). This approach leverages “methodological triangulation” (Morse, 1991, p. 120), which calls for using at least two different types of methods to examine the same research questions and to enrich the interpretation of results.

Study context and sampling

The context for the present study was a year-long professional development workshop sponsored by a mid-Atlantic SEA in the U.S. The workshop’s purpose was two-fold: (a) teach participants about the design thinking process and how it could be used for organizational change, and (b) coach participants to identify a challenge in their context and use design thinking to start addressing the challenge. During the 2018–2019 academic year, the SEA released a call for applications from schools and school districts. Applications required two primary items: (a) a list of up to eight people who would constitute the design thinking workshop team members, including one person in an administrative role (e.g., assistant principal) to serve as the team’s “positional leader”; and (b) a description of a potential challenge in their school or district. SEA personnel reviewed applications and selected nine school-based teams and one district-based team for a total of 66 educators across 10 teams.

The SEA contracted with an external provider—the Leadership Collaborative (LC)—which assigned eight facilitators to lead the workshop. Participating teams met in-person for four full-day sessions and their work between sessions focused on applying learning from those sessions (e.g., learn about design thinking’s empathize stage and then conduct empathy interviews). Facilitators conducted two site visits with each team—one in the fall of 2019 and one in the spring of 2020. Site visits helped facilitators to see teams’ enacted work and to coach teams on refining implementation efforts.

To gather wide-ranging perspectives about the workshop, we used a stratified purposeful sampling strategy (Patton, 1990). This approach allowed the research team to capture variation and recognize that a “common core” (p. 174) may emerge within and across participant groups (e.g., team members, facilitators). Our final sample consisted of 50 educators across all 10 teams with seven school-level administrators (e.g., assistant principal), 35 teachers and school-level staff members (e.g., guidance counselor), four district administrators (e.g., associate superintendent), and four district staff members (e.g., curriculum specialist). We also sampled all eight LC facilitators. In sum, 58 participants provided data for the present study.

Data sources and data collection procedures

We collected data from three sources: surveys, interviews, and documents.

Surveys To collect data, we devised a series of three surveys consisting of short-answer items (Rea & Parker, 2005) about the design thinking process. Prior to the first workshop session, we administered a pre-workshop survey of 14 short-answer items regarding the 50 workshop participants’ expectations of the workshop and beliefs about the relevance of the first two design thinking stages to their context (N = 38; 76% response rate). We then administered two additional short-answer surveys: one after the third workshop session that focused on what participants learned about the last three design thinking stages (N = 35; 70% response rate), and a second after the final workshop session that asked about the relevance of those final stages (N = 35; 70% response rate). Separate from our research team, the LC also administered its own anonymous feedback surveys after each workshop session. These four surveys followed the same format and asked participants about the workshop’s quality and benefits along with suggestions for improving future workshops. Response rates were 72%, 72%, 52%, and 56%, respectively (N of 36, 36, 26, and 28, respectively). We also reviewed these survey results to enrich the analysis of the surveys we administered.

Interviews To triangulate survey data (Morse, 1991), two members of the research team conducted semi-structured interviews (Patton, 1990) with all willing participants in the spring of 2020: six workshop participants and seven LC facilitators (N = 13). Interviews with six additional participants were scheduled, but ultimately cancelled due to the COVID-19 pandemic. All interviews were completed via videoconference and ranged from 31 to 46 min (M = 41). Interview questions for each participant group centered on the following: for workshop participants, their experiences during workshop sessions and in implementing design thinking in their context; and for LC facilitators, their experiences leading sessions and conducting site visits. Interviews were audio recorded and transcribed using a third-party service.

Documents The SEA provided the research team with related documents from the 10 teams, which included applications and team member roles in their context (e.g., teacher) as well as a description of the challenge they planned to address through the design thinking process.

Data analysis

Since we leveraged only short-answer data from the various surveys, no scales required validation. Short-answer items were analyzed in NVivo 12 using methods akin to grounded theory, such as an open coding scheme and the constant comparative method to combine and collapse participant responses (Charmaz, 2014). Since only workshop participants completed surveys, we also examined short-answer items for similarities and differences within and across the 10 teams.

To analyze interview transcripts and documents, we used an integrated coding scheme consisting of both deductive and inductive codes (Bradley et al., 2007). Deductive codes were drawn from the conceptual framework, literature review, and research questions (e.g., adaptive challenges, design thinking process stages). Inductive codes were derived from open coding the transcripts and documents. One research team member began analysis by devising an initial deductive coding scheme and then two research team members randomly selected three transcripts to code separately using that deductive coding scheme. These team members then met to discuss preliminary “noticings” (Braun & Clarke, 2013, p. 204) and deductive coding results among the three transcripts. After this discussion, the two research team members separately inductively coded the initial three randomly-selected transcripts.

We then met a second time to discuss open coding results and potential areas that the deductive coding scheme might have missed (e.g., influence of accountability and inspection policies). As a result of this second discussion, we finalized the deductive coding scheme and agreed to inductively code all remaining transcripts and documents to preserve participant voice. Two research team members then randomly divided up the remaining 10 transcripts and documents and met regularly to review coding results and achieve consensus about coding disagreements (Saldaña, 2009). After completing coding, we engaged in the data reduction process to refine codes, remove redundancies, and merge deductive and open codes into larger axial codes (Miles & Huberman, 1994).

Methodological limitations

Three methodological limitations warrant mention. First, all participants opted into the SEA-sponsored workshop. Thus, results might be more positively skewed because of greater willingness among participants, on average, to seek out new professional learning opportunities. Second, we were unable to collect data from every workshop participant, most likely because of survey fatigue and a lack of prior relationships between some participants and the research team. Consequently, non-response bias might be an issue if participants who completed surveys and interviews, on average, were different from non-participants. Finally, we were nearing the end of conducting interviews at the onset of the COVID-19 pandemic; thus, we had to truncate interview data collection, which may have reduced some of the research team’s ability to triangulate data.

Results

In this section, we report results for our two research questions. Since the present study used a convergent mixed-methods design, the forthcoming sections integrate findings and results from all three data sources (see Johnson & Onwuegbuzie, 2004).

Research question 1: Ascribing value to design thinking for educational change

Our first research question asked about the extent to which workshop participants ascribed value to design thinking, particularly with respect to the process’s use in LEAs. As subsequent sections describe, participants offered competing ideas about the use of design thinking for change efforts in their LEAs. While the process prompted teams to challenge aspects of the traditional change paradigm, design thinking proved to be a difficult fit with several prevailing practices and norms of the education profession.

Challenging the Traditional Change Paradigm Despite some participants feeling that design thinking was not “new,” many others alluded to ways that the process challenged the traditional change paradigm. After leading two workshop sessions, one facilitator offered this reflection:

[You’re] trying to take people through a process that is not intuitive. It is a very different way of [...] getting to a solution. This process is much more iterative. And you really have to delve into folks’ understandings of schools—kind of their inner workings—to really draw out what the challenge is.

Other participants substantiated this reflection by asserting that their experiences with “normal” change approaches were often top-down and prescriptive. Design thinking’s emphasis on testing a prototype, gathering implementation feedback, and refining the prototype reduced the pressure to identify the “correct” solution on the first attempt. Collecting feedback presented an alternative way to think about change. As another facilitator said, “Part of what the process is trying to do is to teach you that it’s okay to be unsure if something will work. You can always go back.”

Design thinking presented participants with ways to reframe existing challenges and to identify new challenges. The workshop’s opening session introduced Heifetz’s (1994) technical and adaptive challenges model, stressing that technical challenges needed only easy fixes while adaptive challenges were more difficult to define and address. One district-level participant asserted that, “when you think about [a district], there’s very few technical problems at this level. Everything’s pretty complex. Everything becomes kind of adaptive.”

Using Heifetz’s model as a reflective tool, workshop participants lamented that many prior change efforts in their contexts were technical—not adaptive. The desire for “the quick fix” or “the silver bullet” prevented some participants from acquiring the perspective necessary to better understand their organization’s challenges before brainstorming and adopting solutions. Their challenges—such as low student engagement or the perceived perfunctory nature of meetings—were cast in a new light. One facilitator summarized a team’s efforts with a communication challenge:

To say that communication isn’t good, well, okay, what does that mean? [...] Is it the way we deliver communication? Is it how frequently you were communicating? Is it people aren’t understanding what’s being said? It just deserves so much more conversation than just, ‘Communication isn’t good.’ And this process really forced the teams to really think about, “What do we mean when we say, ‘That’s our challenge.’”

Drawing further on Heifetz’s work, one facilitator suggested that many challenges in education are often treated “as though it’s a technical problem when it really is an adaptive problem, and we need more of an adaptive solution for it.” Adaptive challenges were complex, one district-level workshop participant mentioned, and even more so the solutions needed to address them. Developing adaptive solutions required a better understanding of a challenge.

To develop this understanding, design thinking compelled workshop teams to “slow down” to identify potential root causes of those challenges. One teacher said, “We always talk about getting to the root cause, but I’m not sure we ever had [professional learning] on how to get to the root cause.” Some participants argued that educators rush to adopt solutions before developing a rich understanding of challenges. A facilitator explained: “[With design thinking,] you got be patient. You got to let it play out because a lot of people don’t. If you’re really task-oriented, you’re not loving this slower process.”

Other facilitators echoed the need to push some teams to generate “rich understandings” of the challenges they were discussing. To one facilitator, a unique advantage of design thinking saw teams “[sit] with not just the problem in the building, but how the problem in the building is really evidence of a broader thing.” This expanded lens enabled workshop participants to consider a more diverse set of root causes to help create more contextually appropriate change efforts.

To actually reconceptualize challenges as more adaptive than technical, workshop participants needed perspective from their colleagues. Another divergence from tradition centered on how design thinking broadened the number of voices involved in change efforts, typically through empathy interviews. The first stage of the design thinking process—empathize—asks participants to interview end users (often teachers) to understand their needs and challenges. As one facilitator put it, some teams discussed the power of the question, “What do […] people need?”

After speaking with numerous teachers, workshop participants felt they obtained a better understanding of their LEA’s challenges and, in turn, increased the likelihood of devising solutions that their colleagues would actually implement. A design thinking-based change approach became a responsive process that adapted to an LEA’s specific and evolving needs rather than one consisting of a static SIP that was only reviewed at year’s end (see Duke et al., 2013). To one facilitator, gathering others’ perspectives helped produce “much more nuanced” solutions that were more tailored to participants’ unique contexts.

Difficulty in Shifting Away from Tradition Several participants touted design thinking as a coherent process for implementing change efforts. Despite this perceived usefulness, facilitators and workshop participants wondered whether sustained change would really occur. One district-level participant offered the following reflection:

[The workshop was] challenging some of our decision-making, right? [...] You’re working with design thinking, and you’re wanting to ideate in a way that is innovative and creative. And then, as soon as you say something, someone says, ‘Well, but you can’t because of the teacher contract.’ Or, ‘You can’t because of this or that.’ ‘It’s the school calendar. There’s X number of days in the school calendar, and if we want to have more professional learning, we’ve got to pay people.’ So, I think we still get hung up on some of those real constraints or even perceived constraints.

While facilitators did their best to account for constraints—real or perceived—some felt that the workshop nevertheless cued participants to look critically at change processes in their contexts. One lamented that the harried nature of educator work forced workshop participants to focus on near-term “fires” rather than “grapple enough with the socio-cultural stuff that’s going on,” such as racial disproportionalities in student discipline. Another facilitator echoed, “[Depending on] the people you’re serving, you may have to change drastically the way you’re serving those folks,” and “the way we’ve always done things [may not be] the right way.”

Conceptualizing a new “right way” is difficult, this facilitator continued, because “educators, as a whole, are rule followers.” When prompted to discuss rule following, another facilitator shared this general assessment of educational systems more generally:

We have this model of education that’s been going on forever, and we have all these structures in place that we just keep them going. We never think about another possible structure of how things can be. [...] I just feel like we’re so used to, ‘This is what school is. This is how you do it.’ And we don’t ever get out of our own way to do things differently (Interviewer: The hamster wheel?) Yeah, and then when you want to do something differently, people question you. [...] We’re still not getting out of the box and thinking about how kids learn.

While the gravitational pull of the “wheel” of education may force educators to stay in a technically-oriented “box,” nearly every workshop facilitator and participant concluded their interviews by expressing appreciation for the workshop. As one facilitator said, “Once they say, ‘This is a challenge’, they’re going to have to be responsible for addressing it.”

Research question 2: Design thinking and individual and collective abilities for change

Our second research question asked how, if at all, design thinking influenced workshop participants’ individual and collective abilities to engage in organizational change efforts. The SEA, in particular, was keen on an answer to this question given its larger agenda for educational leader professional learning. Participants’ discussions of these abilities centered on two domains: mindset and practice.

Breaking the “Wheel” For approximately 90% of participants, this workshop represented their first exposure to design thinking as an approach to organizational change. The process’s novelty inspired marked mindset shifts in participants. Additionally, many commented on the value of a change approach that originated outside education. One middle school mathematics teacher told us that, “if you, right now, didn’t tell me this was not from schools, I would not have known that. […] I feel like it just should be the way people are thinking.” While participants responded to the workshop with both skepticism and enthusiasm, they largely agreed that the workshop prompted them to think differently about change. One participant said:

We get a lot of problems over and over and over that are the same problems over and over and over. And you get stuck in this wheel of, ‘When this happens, this is what we do. When this happens, this is what we do.’

Design thinking offered participants a way to start dismantling “this wheel.” For several workshop participants, gathering data from end users to complete a system map permitted teams to “truly understand versus just check the box that we did.” The map helped teams “see patterns” and build “that connection” across multiple challenges residing within a single context. Facilitators subsequently coached workshop participants to home in on one part of the system map for their initial change efforts. Breaking down the “big issue” of change into more manageable parts made one participant feel “a lot less gloom and doom.” Another non-administrator participant said the system map helped pinpoint “ways that we can affect change in my small kind of locus of control [as a teacher].” Many facilitators reported observing rejuvenated agency among participants to break “this wheel” that many felt, for years, had plagued their abilities to develop and implement change efforts.

As the workshop progressed and teams began implementing solutions in their contexts, participants saw their system maps vivified. One participant noted:

Every part of what you do is interconnected. When you change one thing, it’s not only changed in that area. There is a ripple effect that affects everything. If you don’t think through all of those pieces, you’re going to be surprised—and typically not pleasantly.

Despite facilitators’ encouragement that teams adopt a systems perspective and consider the “ripple effect” that follows solution implementation, some participants needed consistent reminders to use an adaptive lens because they were “still not getting out of the box.” One facilitator commented that teams identified an “overwhelming number of technical challenges” and progressed less on “getting into what would be the nitty gritty of an adaptive challenge”:

How is creating a [schoolwide] communication system going to change classroom teaching? People may be able to find things easier, but is it really going to change mindsets and how we do business? [...] [Are professional learning communities] [...] going to get them to think about and look at data differently?

When workshop teams presented their selected challenges to one another, one district-level participant reflected:

[I feel] like, as groups presented, we were still doing the things that schools traditionally have done. My wonder with that is, like, is it simply because all these teams are engaging in this process for the first time, and it’s just really hard to even break that cycle of thinking about issues and solutions in the standard box?

While the “wheel” or “standard box” exerted pervasive influences on participants’ work, facilitators expressed optimism about the workshop’s effects on participant mindsets and, in turn, future actions. Even if participants failed to identify adaptive challenges and develop adaptive solutions to those challenges, one facilitator asserted that “the data finding, the discussions, and the conversations will move the schools forward because discussions move mindsets, and mindsets move actions. I believe things will change regardless of whether or not they have [something] to show for it.”

Mindsets Move Actions The notion that “mindsets move actions” offered a useful way to frame participants’ summaries of design thinking’s influence on their practice. While change efforts were still in early stages (and the COVID-19 pandemic commenced shortly after the workshop’s conclusion), participants articulated several ways that design thinking altered their practice as educators. The process’s foundational focus on empathy proved to be a watershed experience for numerous participants’ practice. The charge to gather end user perspectives enabled administrator workshop participants, in particular, to receive—as one principal put it—“very candid feedback” about their leadership practice. A facilitator recounted one team’s processing of survey results about their principal with the principal present:

It’s a direct hit to a principal if you talk to them about school climate, right? That’s their responsibility. They know it. Hard for them to make excuses. Why aren’t their teachers buying in? [...] [The principal] actually said to me, ‘You know, I never looked at things this way before. It’s hitting pretty hard, but we can fix this.’

Another school’s leadership team distributed an anonymous survey and learned that their “intensity,” “rush,” and “excitement” about change efforts made teachers feel “dreadful” and overwhelmed, which neither the principal nor assistant principal “ever recognized.” As a result, that leadership team now uses “different” language and “introduce[s] things in a way that seems manageable and we build excitement together.”

These experiences with feedback collection via the design thinking process encouraged participants to create new data collection protocols to help keep a “pulse” on their contexts. One district-level participant, for instance, worked with a facilitator to design and host a series of focus groups asking teachers and administrators about a proposed curriculum framework. Rather than use a predetermined list of questions to perhaps seek confirmation, the participant posed one question—“What’s your initial reaction to this?”—and received feedback targeting “everything under the sun.” Another participant highlighted that creating and instituting these kinds of feedback-gathering mechanisms permitted administrators to “have [more] conversations upfront, [which] has saved us a lot of headaches on the back end.”

The design thinking process’s second stage—defining—induced another shift in participants’ practice: digging deeper. Collecting data from end users and constructing a system map allowed participants to better identify organizational challenges and potential root causes of those challenges. In a representative sentiment, one participant summarized:

When you see a problem, you’re seeing the effects of the problem. You have to dig a lot deeper to figure out what the problem actually is, and that took us a long time during this process. And we’ve been able to apply [design thinking] to other things we’ve seen. ‘Wow, we see this is a problem. What’s underneath it?’ That [approach is] sort of our first go-to now. [...] It’s kind of like a tree with roots. You see the tree, but you have to fix the roots.

Engaging in systematic, disciplined root cause analysis allowed teams to develop solutions targeting a challenge’s more covert roots (e.g., double-booking the same teachers for hallway duty and lunch duty) rather than its more visible symptoms (e.g., messy bathrooms).

Digging deeper, some workshop participants argued, incited another shift in practice: diversifying the voices involved in change efforts. While the empathy stage required participants to solicit feedback, it was the define stage in which participants applied that feedback. As the workshop progressed, teams tested their solutions, which provided their non-workshop colleagues (e.g., other staff members in their LEAs) with evidence that the feedback those colleagues provided earlier was actually reviewed and used to construct solutions. Participants from three of the nine schools shared that this overt display of staff member feedback use spurred their colleagues to offer feedback about other challenges. One participant said:

Now, our teachers are saying, ‘Oh, well, once we get this figured out, have you thought about this?’ People are actually bringing things to us. […] [They are] definitely more willing participants [in the change process].

This shift towards soliciting and incorporating colleagues’ feedback in change efforts held promise for making LEAs more responsive to more people, which participants felt increased the likelihood of buy-in and the sustainment of change efforts.

Discussion

Our results raise a number of points about design thinking’s use for educational change. First, we discuss design thinking in the context of educational change and then consider the process with respect to our conceptual framework on technical and adaptive challenges and transformational learning theory. We conclude with implications for practice and policy along with recommendations for future research.

Design thinking for educational change

The grammar of schooling and the traditional change paradigm represent “the way we do things around here” (Deal & Kennedy, 1982, p. 231) with respect to educational change efforts. Participants in the present study labeled the grammar and the paradigm as the “standard box” or a “wheel”—terms that implied rigidity and repeated activities that yield the same ineffective results. Our results suggest that design thinking, conversely, offered a different way for participants to think about and engage with change. The process contrasted with the traditional change paradigm in several ways, most notably by emphasizing collaboration and empathy.

Workshop teams consisted of both administrators and non-administrators, which countered both the norm dividing teaching and administration and the norm of the managerial imperative (Cuban, 1988; Murphy et al., 1987). Teachers, for instance, worked alongside assistant principals or assistant superintendents as teams went through the five stages of the design thinking process. In fact, our interview data indicated that a few positional leaders appreciated having non-administrators around the table because those people were often closer to the actual implementation of many change efforts. Our results suggest that design thinking helped democratize the process of planning for and implementing change, which aligns with other studies on design thinking in educational contexts (Hubbard & Datnow, 2020) and a considerable body of work that has long called for educators other than administrators to be involved in change efforts, especially teachers and teacher leaders (Bond, 2021; Hallinger, 2011; Muijs & Harris, 2006; Wenner & Campbell, 2017).

The design thinking process prioritized empathy as a foundation for change efforts, and this priority resonated with numerous participants. These perspectives suggest that the traditional change paradigm may not sufficiently attend to the human side of planning and implementing change (see Mintrop, 2016). Echoing the literature on action research methods (e.g., Dick & Greenwood, 2015), participants in the present study intimated that the traditional change paradigm promotes an approach of “done to, not done with”—meaning that others, such as policymakers or senior administrators, issue directives that subordinates (i.e., workshop participants) are then expected to implement. The first two stages of the design thinking process, however, prompted workshop participants to gather the perspectives of end users in order to learn about their specific needs and challenges. Because the traditional change paradigm tends to exclude non-administrators from leading change efforts, the act of sitting down with some of those very non-administrators (e.g., teachers) broadened and deepened workshop participants’ understanding of their LEAs. This feedback proved valuable in helping workshop teams devise and test prototypes that end users would perceive as both legitimate and worthwhile. Action research literature (e.g., Dick & Greenwood, 2015) asserts that this collaborative approach—“done with, not done to”—can lead to better change outcomes in organizations.

These two characteristics of a design thinking-based approach to educational change—collaboration and empathy—challenged the traditional change paradigm. Workshop participants arrived at richer understandings and developed more contextualized solutions because they were in teams, which aligns with other research on LEA improvement efforts both within (e.g., Olsen & Chrispeels, 2009) and outside (e.g., Bush & Glover, 2012) the U.S. educational system. Additionally, this kind of approach to educational change is likely better able to help educators around the world contend with the continued effects of the COVID-19 pandemic—which has surfaced numerous adaptive challenges (e.g., student well-being, resource inequities, hybrid teaching modalities)—on LEAs, students, and communities.

Design thinking as a transformational learning experience

In our conceptual framework, we posited that the design thinking workshop—as a professional learning experience—could create conditions for educators to engage in transformational learning. This kind of learning can induce a “perspective transformation” (Mezirow, 1978, p. 107), so this section considers our results using Clark’s (1991) three dimensions of transformational learning: psychological, convictional, and behavioral.

Psychological dimension. Our results showed that numerous participants reported developing more nuanced understandings of themselves as actors in educational change efforts. The workshop's various sessions offered participants considerable time physically away from their LEA buildings, which provided both literal and metaphorical separation from the day-to-day bustle of schooling. This separation allowed participants to stop, think, and reflect on the workshop's charge to identify and address a challenge in their contexts. Such "thinking slow" (Kahneman, 2011) helped people reflect and deliberately seek out new or missing information. Moreover, as workshop participants conducted interviews with colleagues during the empathy stage, they further refined their understandings of themselves and their colleagues.

Convictional dimension. Individuals' beliefs, when coupled with their understandings of themselves, come to life in their "schemata" (Bartunek & Moch, 1987, p. 484). Throughout surveys and interviews, participants implicitly and explicitly described their beliefs about change and the U.S. educational system, as well as how the workshop altered some of those beliefs. Design thinking prompted some participants to revise their beliefs about who can drive change, coming to see that "ordinary" teachers, not just administrators, could help their LEAs address a system-wide challenge (e.g., chronic absence). Others noted modified beliefs about the change process in LEAs, specifically that large-scale efforts could be broken down into smaller implementation cycles. While perhaps not novel to continuous improvement scholars (e.g., Tichnor-Wagner et al., 2017), the notion that some participants had viewed change efforts as one large, all-or-nothing endeavor speaks to the persistent influence of the grammar of schooling.

Behavioral dimension. Despite many participants' positive reactions to design thinking and reported shifts in understandings and beliefs, we found few overt instances of participants putting their new mindsets into practice. While workshop participants did share examples of using design thinking's empathy stage to gather perspectives on other challenges, many of their actions addressed technical challenges with technical solutions. Facilitators lamented having to repeatedly push participants' thinking away from more technically oriented challenges, such as creating a new communications system, and towards more adaptively oriented challenges. The challenges that most workshop teams identified were firmly first-order, not second-order, organizational changes. Consequently, the educators working in many participants' contexts would not need to think and act in new ways; they could continue to make small improvements to existing systems (see Argyris & Schön, 1978). This result, more than many others, raises concerns about design thinking's ability to challenge the grammar of schooling and the traditional change paradigm. It was one thing for participants to report changed understandings and beliefs, but quite another for them to act in new ways based upon those changes, and we saw few actions aligned with addressing adaptive challenges.

Implications for practice and policy

Based on our results and discussion, we offer implications for practice and policy. A first practical implication is that LEAs should consider a design-based strategy like design thinking as a mechanism for transformational learning and organizational change. While the present study did not find substantive change in participants' practice as educators, many participants did report shifts in their mindsets and indicated that, without the workshop, they would have had neither the time nor the space to consider their organizational challenges deeply. Thus, LEAs and other agencies (e.g., education ministries) could leverage a process like design thinking to help educators reflect upon their current assumptions about, and beliefs regarding, change. At base, this approach could stimulate the cognitive rewiring needed to counter the grammar of schooling's continued influence. Design thinking holds promise for shifting educator mindsets in tightly regulated educational contexts like the U.S. as well as in less regulated contexts.

A second implication for practice centers on fostering favorable conditions for change. Our results suggest that design thinking helped democratize the change process and decenter it from positional leaders. While this result is by no means novel, we consider it here in light of Duke and colleagues' (1980) distinction between involvement and influence with respect to LEA leadership and management. Many educational change efforts may involve those without positional authority, such as teachers or staff members, yet those educators may end up exerting little actual influence over the efforts (see Brezicha et al., 2020). In the present study, the design thinking workshop created an entirely new environment in which educators of all kinds gathered around a single table to identify challenges and develop solutions. Based on our results, the time and space offered by the workshop helped a wide array of educators be both involved in and influential over the process of defining organizational challenges and developing solutions. Educational leaders, the officials whom school accountability and inspection policies charge with leading change efforts, may see great benefit in using a design-based strategy like design thinking to reshape how their LEAs conceptualize and conduct the change process.

A first policy implication applies to school accountability and inspection policies. Some accountability contexts champion "rapid improvement" in LEAs, especially those labeled "underperforming" (e.g., Johnson, 2013). This policy environment promotes what Bryk (2015) calls "solutionitis," which occurs when educators "jump to implement a policy or programmatic change before fully understanding the exact [challenge]" (p. 468). Our results showed that the design thinking process compelled workshop participants to slow down when engaging with change efforts, which, in turn, encouraged deep, creative, and collaborative thinking. School accountability and inspection policymakers should ensure their policies create an environment that recognizes the "messy" (Harris & Jones, 2017, p. 638) work of change and incentivizes educators to spend more time understanding and defining challenges before addressing them.

A second policy implication relates to professional learning experiences facilitated or sponsored by agencies like SEAs and LEAs. Many participants indicated that some of the design thinking workshop's value derived from being physically away from their LEA buildings. SEAs and LEAs should establish policies allocating resources that enable more educators to participate in similar professional learning experiences, such as funding for substitute or relief educators. Murphy (2015) suggests that current structural aspects of schooling, especially in the U.S., do not encourage collaboration. If professional learning policies afforded educators time and space dedicated to collaboration, our results suggest they would be more likely to develop and implement solutions that lead to lasting change.

Recommendations for future research

Our results and discussion suggest a number of fruitful avenues for future research, and we highlight two here. A first line of inquiry could take a longitudinal approach to examining the use of design-based strategies for educational change. Presently, much of the extant research on the topic, including the present study, has taken place over a 7- to 14-month period (i.e., roughly one school year; see Hubbard & Datnow, 2020; Phusavat et al., 2019). Future work could, for instance, study how educators implement a design-based strategy across three or more school years. Given that we found limited evidence of changes in practice, such research could offer insight into design-based strategies' potential for sustaining change efforts over time.

A second line of inquiry stems from the team-based approach used in the present study's design thinking workshop. Future studies could specifically examine teams' readiness to engage with a design-based strategy for educational change. We hypothesize that existing team norms and dynamics in educators' respective LEA contexts would exert substantial influence over how those same educators interact with one another in a setting such as a design thinking workshop. Findings from this line of research could illuminate how team dynamics shape educators' abilities to implement change efforts through a design-based strategy.

Conclusion

The present study helps fill several gaps in the scant literature on the use of design thinking as a strategy for change in educational organizations. By extending prior work that studied single school contexts in the Philippines (Phusavat et al., 2019) and the U.S. (Hubbard & Datnow, 2020), we were able to examine the process's use among educators from 10 different LEA contexts. Our results suggested that while design thinking began to shift participants' mindsets about change, we found little evidence of those mindsets being put into practice. Similar to Hubbard and Datnow (2020), though, we remain encouraged that design-based strategies like design thinking can lead educators to reflect critically upon how they can push back against the grammar of schooling and the traditional change paradigm. While the present study occurred in the U.S., implications from our results are highly relevant to other contexts, particularly with respect to how agencies like LEAs frame the scope and goals of educational change and how they facilitate professional learning for educators to realize those goals.