Introduction

Teachers can use classroom-wide behavior management interventions to improve student behavior and academic engagement (Collier-Meek et al., 2019). These evidence-based interventions incorporate antecedent- and positive consequence-based approaches to manage classroom behavior. They can be applied universally (Tier 1) or targeted toward students requiring additional support, such as students who exhibit disruptive or inattentive classroom behavior (Tier 2). For example, at Tier 1, teachers can provide the class with clear behavioral expectations, remind students about these expectations, and deliver frequent behavior-specific praise when students meet expectations or show improvements (Cook et al., 2017; De Pry & Sugai, 2002). At Tier 2, teachers can use targeted interventions such as daily report cards (Owens et al., 2020a, 2020b) or check-in/check-out cards (Filter et al., 2022) to provide individualized goal setting and reinforcement systems. Behavioral classroom management interventions align with Positive Behavior Interventions and Supports (PBIS; Simonsen & Sugai, 2019; Staff et al., 2021), a widely adopted, school-wide, multitier service delivery framework that includes school systems (e.g., team meetings, data-based decision making) and positive behavior management practices (e.g., teaching positive behavioral expectations, continuum of procedures for encouraging expected behavior; Sugai & Horner, 2006). School-wide adoption of PBIS can set expectations for teachers to use particular Tier 1 and Tier 2 interventions (e.g., behavior-specific praise, completing check-in/check-out cards) in their classrooms.
However, despite the effectiveness of these behavioral classroom interventions, and even in schools that use the PBIS framework and support the use of these approaches (Reinke et al., 2013), teachers often do not use behavioral classroom management interventions with the recommended frequency or fidelity (Hart et al., 2017; Martinussen et al., 2011; Owens et al., 2020a, 2020b).

Implementation Strategies

School mental health researchers increasingly apply approaches from implementation science (Proctor et al., 2013) to enhance the adoption, implementation, and sustainment of evidence-based practices in schools (Lyon & Bruns, 2019a; Owens et al., 2014). Indeed, the fact that behavioral classroom interventions are well-supported by the evidence (e.g., Collier-Meek et al., 2019) but are underused (e.g., Owens et al., 2020a, 2020b) points to a need for implementation strategies, which are “methods or techniques used to enhance the adoption, implementation or sustainability of a clinical program or practice” (Proctor et al., 2013, p. 2), to support their use. Teacher coaching and consultation, which often include modeling, goal setting, performance feedback, and role-play, are often recommended for this purpose (Stormont et al., 2015). As commonly implemented, however, these approaches tend to be time- and resource-intensive (Pas et al., 2020; Stormont et al., 2015) and may not be sustainable with limited school and district resources. There is therefore a need for approaches that are feasible, flexible, and can be tailored to support individual teachers’ needs in implementing Tier 1 and Tier 2 behavioral classroom practices.

Implementation strategies are most likely to be effective when they target relevant malleable factors (Lewis et al., 2018), and theories of behavior change can be useful in identifying these factors. The Theory of Planned Behavior (TPB; Ajzen, 1991) is the most commonly used social-cognitive theory in the implementation science literature (Lewis et al., 2020). The TPB offers one model to understand teachers’ implementation behaviors as a function of two sets of factors: factors that promote their intentions to implement an intervention, and those that promote their ability to act on their intentions to implement the intervention. It also delineates specific psychological determinants of intentions (attitudes, perceived norms, self-efficacy) that comprise potential mechanisms for increasing implementation among teachers who report weak intentions. For teachers who do not implement a behavioral intervention despite strong intentions to do so, implementation strategies should target teachers’ ability to act on their intentions (Fishman et al., 2018), for example by providing reminders in the moment (Nilsen et al., 2012).

Our prior mixed-method work elucidated specific barriers and facilitators that teachers report to their use of behavioral classroom interventions (Lawson et al., 2022). We have also found that teachers’ intentions to use these interventions vary by specific practice (Lawson et al., 2023a), suggesting the need to tailor implementation strategies to each practice. Key barriers and facilitators to implementation include factors that interfere with execution in the moment (e.g., stress, forgetting due to competing demands, difficulty changing habits), as well as beliefs about behavioral classroom interventions that lead teachers to be more or less likely to intend to use them (Lawson et al., 2022). These barriers and facilitators varied somewhat between Tier 2 and Tier 1 interventions. For example, believing there is not enough time to use the intervention was one of the primary barriers teachers reported for Tier 2 interventions but not for Tier 1 interventions (Lawson et al., 2022). In contrast, barriers such as being distracted or forgetting, and feeling stressed, frustrated, or burned out, were salient to both Tier 1 and Tier 2 interventions. Together, these results provide a set of key barriers and facilitators that may serve as targets for implementation strategies to support teachers’ use of behavioral classroom interventions.

Community-Partnered Development Approaches

Grounded in models such as community-based participatory research (Israel et al., 2001) and community-academic partnerships (Drahota et al., 2016), there is widespread recognition that interventions and implementation strategies should be developed, selected, and refined in partnership with interested parties, including the individuals who would use them (Lyon & Bruns, 2019b; Pellecchia et al., 2018; Wallerstein & Duran, 2010). Teachers and other educators possess valuable practice-based expertise that is essential for developing implementation strategies that are contextually appropriate and feasible (Bearss et al., 2022). Involving educators as partners is particularly important for ensuring sustainability of school-based interventions and implementation strategies. Academic-community collaborations provide opportunities to ensure that implementation strategies align with key organizational priorities and culture, and that they are feasible given policy, funding, and staffing considerations (Aarons et al., 2011). Strategies that are developed without meaningful input from end users are likely to be too complex, burdensome, or unappealing (Lyon & Koerner, 2016), and therefore unsustainable (Beidas et al., 2022). It is critical to involve practitioners equitably and meaningfully from the earliest stages to develop implementation strategies that will ultimately be scaled, used, and sustained (Beidas et al., 2022; Kone et al., 2000; Sibbald et al., 2014). Two common strategies for practitioner involvement are the use of community advisory boards (Lyon et al., 2019a, 2019b; Pellecchia et al., 2018) and iterative development or redesign processes in which interventions or implementation strategies are refined based on data (Bearss et al., 2022; Kern et al., 2011).

Positive Behavior Management Toolkit

In this paper, we describe the process that we used to develop a toolkit of implementation resources to support teachers in using behavioral classroom interventions (i.e., the “Positive Behavior Management Toolkit” [PBMT]). Informed by the Theory of Planned Behavior, the PBMT is designed to target both teachers’ intentions to use key behavioral classroom interventions and their ability to act on intentions, and to target salient barriers and facilitators identified in an earlier study (see Lawson et al., 2022). The PBMT is intended to support teachers in addressing the individual-level implementation barriers that can exist even within the context of a supportive school environment. To promote its flexibility and usability, the PBMT uses a modular approach to support four evidence-based teacher-delivered Tier 1 and Tier 2 practices. The PBMT has three core components: 1) a library of resources, including written materials, videos, and tangible resources (e.g., custom sticky notes); 2) optional text message reminders with personalized tips and reminders; and 3) four 15-to-20-min, bi-weekly meetings with a consultant to assist the teacher with goal setting, identifying relevant resources, building motivation, problem solving, and reflecting on progress. The resources in the PBMT library are available on paper and digitally. These resources are organized into six modules. Four modules support four evidence-based teacher-delivered practices: 1) behavior-specific praise, 2) precorrections, 3) calm behavior-specific corrections, and 4) daily report cards. Two additional modules support teachers in 5) strengthening student–teacher relationships and 6) attending to their own wellness, because these factors had been identified as key implementation barriers and facilitators (Lawson et al., 2022).
Resources in each module target mechanisms from the TPB and related theories of behavior change (i.e., attitudes, norms, self-efficacy, habits) as well as specific barriers and facilitators we have identified as salient in our prior research. Teachers engage with the PBMT for 8 weeks. Each teacher is assigned a consultant, who meets with the teacher four times for 15-to-20 min to support their engagement with the PBMT resources. Consultants and teachers work collaboratively to identify PBMT resources that are relevant for teachers’ goals. Teachers are encouraged to engage with these resources during their regular instruction (e.g., viewing tangible resources) as well as during planning time (e.g., completing planning guides). The PBMT is unique compared to existing teacher coaching models in at least three ways: 1) the packaged, flexible, and modular approach with strategies focused on strengthening intentions (e.g., perceived norms) and acting on intentions (e.g., tangible reminders); 2) relatively low time burden; and 3) the participatory, iterative development approach, as described here.

This paper describes the processes we used to develop the final version of the PBMT. Our iterative development process had two key aspects. First, akin to an advisory board (Cook et al., 2019), we established a Program Development Team (PDT) composed of school- and district-employed partners and held monthly meetings with the PDT before and during the development process to guide the development of the implementation resources. Second, we conducted a series of tryouts in which teachers used a version of the PBMT in their classroom. Following each tryout, teachers provided quantitative and qualitative feedback on the PBMT’s acceptability, appropriateness, and feasibility. We used this feedback to revise the PBMT, making it more appealing, usable, and contextually appropriate. In this paper, we describe processes for engaging the PDT and conducting iterative tryouts. We present results to address the following research question: How did quantitative and qualitative data from teachers inform modifications to enhance acceptability, feasibility, and contextual appropriateness? In doing so, we share quantitative and qualitative results regarding the acceptability, feasibility, and contextual appropriateness of each version of the PBMT.

Method

The method section is organized to describe two sets of processes used during the iterative, community-partnered development of the PBMT: 1) the processes by which we formed and engaged the PDT; and 2) procedures, participants, and methods for data collection and analyses related to the teacher tryouts of PBMT iterations.

Setting

The implementation strategy development processes described occurred in partnership with a large urban school district in the Mid-Atlantic USA. The district’s student body is racially and ethnically diverse, with approximately 86% of students from racially/ethnically minoritized backgrounds. Approximately 80% of district students live in households that are income-eligible for free or reduced-price meals.

Program Development Team

We formed one Program Development Team (PDT) composed of school- and district-employed partners to guide the development of the PBMT. We considered our engagement with this workgroup to be an implementation strategy akin to a community advisory board (see Cook et al., 2019; Pellecchia et al., 2018). The PDT served as a formal workgroup to provide input and advice related to the development of the PBMT and its iterative improvements. Specifically, the PDT provided input on PBMT goals and format, helped interpret the data from the tryouts, and provided feedback on the iterations of the PBMT. PDT members were considered members of the research team, not research participants. The PDT meetings occurred monthly; meetings began prior to developing the first iteration of the PBMT and were scheduled through the end of the second tryout. Figure 1 displays a schematic overview of this process. Two members of the research team, the Principal Investigator (PI) and clinical research coordinator, also participated in these meetings as members of the PDT.

Fig. 1

Schematic overview of Program Development Team (PDT) meetings, Positive Behavior Management Toolkit (PBMT) development and revisions, and teacher tryouts

PDT Recruitment

When forming the PDT, we recruited educators with diverse roles (e.g., teachers, administrative leadership, PBIS coaches, counselors), backgrounds, and experience, and did not have inclusion criteria related to years of experience in education. We recruited educators who worked in more than one school building in the district (i.e., PBIS coaches), and those who worked in only one school (i.e., teachers, administrative leadership, counselors). PDT members were not required to work in a school that was participating in the tryouts.

To recruit PDT members, we developed a one-page information sheet that described the project, the goal of the Program Development Team (i.e., “…to provide expert input regarding the resource package by participating in a series of team meetings”), and a description of what participation in the PDT would entail (i.e., “attend approximately 8–10 meetings between June 2021 and April 2022 to offer your feedback, expertise, and ideas on the development of the resource package”). We distributed this information sheet to school district partners, including the PBIS Director, building principals, and teachers with whom we had previously engaged in a prior study. The information sheet invited interested individuals to contact the study PI. The PI also presented about the project and the PDT opportunity to the team of PBIS coaches employed by the partnering school district. The research staff (i.e., PI and clinical research coordinator) met individually or in groups with individuals who expressed interest in joining the PDT to describe the role, answer questions, and invite them to join prior to the first PDT meeting.

The PDT consisted of five teachers, two school administrators, one counselor, and four PBIS coaches (i.e., professionals who provide schools technical assistance in implementing the school-wide PBIS framework). These members were drawn from three schools, two of which also had at least one teacher participate in the tryouts. PDT members were compensated for their time participating in team meetings via electronic gift cards when that role was outside the scope of their regular job duties ($30/hour). The PI and clinical research coordinator also participated in the PDT meetings as members.

Meeting Structure and Content

Team meetings occurred monthly after school (the time preferred by team members) and virtually due to the COVID-19 pandemic and team member preference. We held nine meetings between June 2021 and May 2022. The group decided not to hold monthly meetings during August and September 2021, due to scheduling constraints at the beginning of the school year, and during January 2022, due to heightened levels of pandemic-related disruptions.

Team meetings were co-chaired by the PI and clinical research coordinator. To minimize burden on educator team members while also allowing them to have meaningful ownership over PDT processes, the research team members proposed an agenda for each meeting, distributed in advance and reviewed at the beginning of the meeting to request additions or revisions. Research team members sought input at the end of each meeting about future meeting agenda items. Table 1 summarizes the major topics of each meeting. Following the final PDT meeting, the research team members sent individualized thank you notes to each PDT member acknowledging their specific contributions, and gathered feedback from each member about whether they would like to be credited by name for their contributions in a range of possible products (e.g., named on an acknowledgements page within the PBMT, in presentations and publications, and in any dissemination through the school district). We also asked whether they would like to receive emails with updates about the project.

Table 1 Overview of Program Development Team (PDT) meeting topics

Teacher Tryouts of PBMT Iterations

Participants

Eight teachers participated across the two tryouts. All were female. Six teachers identified their race as White, one as Black, and one as more than one race. No teachers identified their ethnicity as Hispanic/Latino/Spanish. Most (75%) had a Master’s degree, and teachers had, on average, 15.5 years of teaching experience (SD = 8.3), and 7.9 years teaching at their current school (SD = 6.7). Grades taught ranged between kindergarten and 8th grade.

Procedures

All procedures were approved by the school district research board and the institutional review board of the Children’s Hospital of Philadelphia. The research team recruited schools for study participation in collaboration with the school district PBIS office. Specifically, the school district PBIS Director and the study PI asked district PBIS coaches to recommend schools in which they anticipated school leadership might be interested in participating in the project, with an effort to select schools with student demographics representative of the district’s. There were no other school-related inclusion criteria. All participating schools had adopted school-wide PBIS. The PI met with interested principals, and principals of four K-8 schools agreed for the study team to recruit teachers within their school for the tryouts.

We conducted two rapid-cycle tryouts to inform the iterative development of the PBMT. Due to COVID-19 pandemic-related restrictions, the research team recruited teachers for participation in both tryouts solely via email. For the first tryout, which took place between October and December 2021, nine teachers were invited, randomly drawing from a pool of teachers at participating schools who had previously expressed willingness to be contacted for research, and three teachers participated. For the second tryout, which took place between February and May 2022, 35 teachers were invited, drawing from the same pool of teachers and principal recommendations, and five new teachers participated. Across recruitment for both tryouts, three teachers actively declined participation, two teachers indicated that they were no longer eligible because they changed schools, and two teachers expressed interest but ultimately did not attend an informed consent meeting due to a leave of absence. The other teachers who were recruited but did not participate did not reply to the outreach attempts. No teachers declined consent after attending an informed consent meeting.

Participating teachers met with a member of the study team to provide informed consent, complete baseline surveys, and receive a version of the PBMT. At the end of their tryout, all teachers completed endpoint surveys and a semi-structured interview in which they provided their feedback about their engagement with the PBMT. Teachers were compensated with gift cards for completing baseline ($20) and endpoint ($20) measures and for the follow-up interview ($40).

Teachers who participated in the first tryout used PBMT Version 1 in their everyday work setting for four weeks. We selected a four-week period for the first tryout to enable more rapid iterations. During this time, teachers each attended two consultant meetings and participated in quantitative and qualitative data collection at the end of the period. In collaboration with the PDT, the study team revised the PBMT based on data from the first tryout.

Teachers who participated in the second tryout used the revised version (PBMT Version 2) in their everyday work setting for a period of eight weeks, in order to test the acceptability and feasibility of the planned time period for implementation. Each teacher received four consultant meetings during that eight-week period and participated in quantitative and qualitative data collection at the end of the period. The study team then collaborated with the PDT to make further revisions to the PBMT based on data from the second tryout.

Measures

Quantitative

At the beginning of their tryout, teachers provided demographic information. After each tryout, teachers provided quantitative feedback using the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017). The AIM consists of four items (e.g., “[intervention] is appealing to me”) rated on a 5-point Likert scale (from “completely disagree” to “completely agree”), tailored to assess the acceptability of the PBMT. The IAM consists of four items (e.g., “[intervention] seems applicable”) rated on the same 5-point Likert scale, tailored to assess the appropriateness or contextual fit of the PBMT (i.e., [intervention] populated with “The Positive Behavior Management Toolkit”). The FIM consists of four items (e.g., “[intervention] seems doable”) rated on the same 5-point Likert scale, tailored to assess the feasibility of the Tier 1 and Tier 2 behavioral classroom interventions supported by the PBMT (i.e., [intervention] populated with “The positive behavior management interventions discussed in the toolkit”). The mean item score on each of these measures (scale ranging from 1 to 5) was computed. These measures have shown acceptable inter-item consistency (α > 0.82) and test–retest reliability (r > 0.70).
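The scoring described above (a mean item score on a 1-to-5 scale for each four-item measure) can be sketched as follows. This is a minimal illustration, not the authors' scoring code; the item responses shown are hypothetical.

```python
def mean_item_score(item_responses):
    """Mean of item ratings for one respondent on a 5-point measure.

    Each rating runs from 1 ("completely disagree") to 5 ("completely
    agree"), so the resulting score also falls on the 1-to-5 scale.
    """
    assert all(1 <= r <= 5 for r in item_responses), "ratings must be 1-5"
    return sum(item_responses) / len(item_responses)

# Hypothetical AIM responses from one teacher to the four items:
aim_items = [5, 4, 4, 5]
print(mean_item_score(aim_items))  # 4.5
```

Scores for a sample are then summarized with means and standard deviations across respondents, as reported in the Results.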

Qualitative

After their tryout, teachers also participated in semi-structured interviews to provide more in-depth feedback. All interviews were conducted by the first author (GL), who did not work with the teachers to support their implementation. The semi-structured interviews were guided by a protocol with four sections. In the first section, participants were asked to describe how they used (or did not use) the PBMT and its components. Probes included estimating the time spent engaging with it, describing what using it “looked like for you,” and describing which components were used more and which less. The second section focused on the perceived acceptability, appropriateness, and feasibility of the PBMT and its components. In the third section, participants were asked about the extent to which they perceived the PBMT as having an impact on their own behavior or for their students (e.g., can you tell me about what happened when it [was/wasn’t] helpful?). The last section focused on participants’ recommendations to improve the PBMT (e.g., modalities, specific materials). Throughout the protocol, follow-up probes asked about specific components of the PBMT (i.e., support meetings, written resources, tangible resources, text message reminders). On average, the interviews lasted 36.7 min (SD = 12.9 min). Qualitative interviews were audio-recorded, transcribed, and de-identified, and de-identified transcripts were analyzed.

Analyses

Mixed-Method Analyses

We used the taxonomy of mixed-methods designs outlined by Palinkas and colleagues to guide mixed-methods analyses (Palinkas et al., 2011). We employed a sequential structure, beginning with the quantitative data, with an emphasis on the qualitative data (i.e., quan → QUAL; Palinkas et al., 2011). The function of the qualitative data was to expand on the quantitative findings by providing additional depth and richness to the quantitative information about acceptability, appropriateness, and feasibility. We merged quantitative and qualitative data at the data interpretation stage by the process of connecting the qualitative findings to the quantitative findings.

Quantitative

We examined descriptive statistics (means, standard deviations) for the AIM, IAM, and FIM after each tryout and compared them with values in the published literature.

Qualitative

We analyzed the qualitative interview data following each tryout. We used an integrated inductive and deductive analysis approach (Bradley et al., 2007) that included identifying a priori constructs of interest (e.g., acceptability and feasibility). We also used modified grounded theory (Glaser & Strauss, 1967) to identify emergent themes. The transcripts were coded in multiple stages (Saldaña, 2013). After an initial review of transcripts, we developed a codebook with definitions, key words, and example codes for “additional resources,” “fillable documents,” “format,” “goal setting,” “general,” “interventions,” “one pagers,” “reminder texts,” and “guide” (i.e., consultant). The codebook was refined through an iterative process in which two coders (GL, ST) applied the codebook to transcripts, identified and reconciled discrepancies through discussion, and updated the codebook to clarify definitions and decision rules. The same two coders then coded all eight transcripts in NVivo, with percent agreement above 0.80 for all codes. Any remaining discrepancies were reconciled through discussion, with consensus reached in 100% of cases, and the consensus code was retained (Hill et al., 2005).
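The percent-agreement criterion mentioned above (above 0.80 for all codes) can be computed as the proportion of coding units on which both coders made the same decision. The sketch below is illustrative only; the coding vectors are hypothetical, with 1/0 indicating whether each coder applied a given code to an excerpt.

```python
def percent_agreement(coder_a, coder_b):
    """Proportion of units on which two coders agree (0.0 to 1.0)."""
    assert len(coder_a) == len(coder_b), "coders must rate the same units"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical application of one code by two coders across ten excerpts:
coder_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
coder_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(percent_agreement(coder_a, coder_b))  # 0.8
```

Note that simple percent agreement does not correct for chance agreement, which is one reason disagreements here were additionally resolved to consensus through discussion.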

Members of the research team, led by the principal investigator (GL), then reviewed all transcript excerpts for each code and drafted an analytic memo for each code that at least one additional member vetted (Saldaña, 2013). The memos contained the salient themes within each code, ID numbers of participants whose comments fit within the theme, and illustrative quotes. The research team revised the analytic memos through an iterative process of discussion and revision (Saldaña, 2013) to synthesize the data regarding salient themes that were related to acceptability, contextual appropriateness, feasibility, and recommendations for improvement. Given the goal of using the qualitative data to inform revisions to the PBMT, the final-stage memos were organized into the overarching sections: “positive feedback” and “suggestions for improvement.”

PDT Involvement

The research team shared with the PDT brief summaries of the quantitative data and major themes from the qualitative data, organized by “positive feedback” and “suggestions for improvement” at one meeting following the first tryout and a second meeting following the second tryout. PDT members participated in discussion regarding interpretation, implications, and how to revise the PBMT in response to the feedback.

Results

The results are organized to illustrate how we refined the PBMT based on the tryouts. We first describe Version 1 of the PBMT to provide context for the revisions that were made; we then present the quantitative and qualitative results from each tryout and illustrate how those results informed revisions to each version.

Positive Behavior Management Toolkit (PBMT)

Description of Version 1

Version 1 included three core components: 1) a library of resources, organized into six modules; 2) optional text message reminders; and 3) brief (i.e., 15–20 min), bi-weekly meetings with a consultant. Version 1 was structured such that teachers, in collaboration with the consultant, were encouraged to set goals related to any of the modules in any order that they preferred (see Supplemental Fig. 1).

First Tryout

Quantitative results

After the first tryout, the mean score on the Acceptability of Intervention Measure (AIM) was 4.41 (SD = 0.52; range [4, 5]), the mean score on the Intervention Appropriateness Measure (IAM) was 4.25 (SD = 0.66; range [3.75, 5]), and the mean score on the Feasibility of Intervention Measure (FIM) was 4.41 (SD = 0.52; range [4, 5]). These scores indicate average responses between “agree” and “strongly agree” for all three constructs.

Qualitative results and revisions to the PBMT

Consistent with the relatively high quantitative ratings of acceptability, appropriateness, and feasibility, there were several positive themes regarding the first iteration of the PBMT (Table 2). Teachers perceived the PBMT as easy to use and accessible; they shared that it met an important need, filling a gap left by typical pre-service training. They also perceived it as effective. Teachers identified two components of the PBMT as especially acceptable and helpful: “consultant meetings,” during which a member of the research team met with them for 15–20 min to provide support about their engagement with the PBMT, and text message reminders. Illustrative quotes regarding each of these positive themes are displayed in Table 2. Based on these findings, and in consultation with the PDT, we did not change many core aspects of the PBMT between the first and second iteration. We retained the length and frequency of consultant meetings, the four focus teacher practices, many of the written resources, and the availability of text message reminders.

Table 2 Positive themes from qualitative interviews in first tryout

There were also several themes regarding aspects of the PBMT that teachers viewed as less acceptable or helpful, challenges they encountered, or additional resources they wanted after the first tryout. Table 3 displays these themes related to teachers’ feedback for improvement, and the changes that we made to the PBMT in response to each of them. First, teachers shared experiences that highlighted the importance of establishing a strong student–teacher relationship as a foundation before working on the focus practices. Relatedly, teachers shared challenges using the daily progress report intervention before strong relationships and Tier 1 practices were established. In response to this, we imposed additional structure on the PBMT. Unlike in Version 1, where teachers could start with any module (using supplemental resources about student–teacher relationships as appropriate), Version 2 began with a “road map” that instructed teachers to start with a goal regarding student–teacher relationships, then focus on Tier 1 practices, and finally to focus on daily progress reports. Supplemental Fig. 1 illustrates the changes that were made to this document during the iterative revision process. Teachers shared suggestions to make the PBMT more contextually appropriate, such as further differentiation by age, framing resources in terms of potential trauma exposure, and supporting teachers in considering students’ cultural backgrounds (see Supplemental Fig. 2). Teachers also wanted more concrete tools to make the focus practices easier; in response to this, and with consultation from the PDT, we included additional tangible resources to make focus practices easier (e.g., custom sticky notes that teachers can use to deliver praise or precorrections), step-by-step how-to guides, and documents with example sentence stems (“what it can sound like”; see Supplemental Fig. 3).
Finally, teachers reported a need for resources to support caregivers, and one teacher wanted the text message reminders to be organized in a single place, such as through an app or website (See Table 3).

Table 3 Feedback for improvement from qualitative interviews and changes to the PBMT after the first tryout

Second Tryout

Quantitative

After the second tryout, the mean score on the AIM was 4.80 (SD = 0.45, range 4–5), the mean score on the IAM was 4.80 (SD = 0.45, range 4–5), and the mean score on the FIM was 4.65 (SD = 0.49, range 4–5). Although there are no established benchmarks for what constitutes an acceptable score on these measures, these scores are similar to or higher than other scores on these measures reported in the published literature on school-based interventions (i.e., a mean score of 4.21 on the FIM regarding “school-wide PBIS,” Corbin et al., 2022; scores of 4.7 on the AIM, 4.4 on the IAM, and 4.3 on the FIM after the completion of an iterative redesign effort of a school-based intervention; Bearss et al., 2022).

Qualitative Results and Revisions to the PBMT

Consistent with the quantitative data, teachers’ qualitative feedback after the second tryout was very positive, and we identified many of the same themes as after the first tryout (e.g., PBMT as easy to use and feasible and meeting an important need; consultant meetings and text message reminders as particularly important). We also identified some unique positive themes after the second tryout (see Table 4). Specifically, teachers identified the PBMT as contextually appropriate, reported that it improved their relationships with students, and viewed the tangible resources as one of the most valuable components. In response to this feedback, we kept the changes that were made between the first and second versions and included additional tangible reminders, such as stickers that teachers could use as a visible reminder for themselves to deliver praise.

Table 4 Themes from qualitative interviews in second tryout and PBMT changes

Qualitative themes related to suggestions for improvement after the second tryout were minor (See Table 4). Some teachers shared that the appearance of the PBMT was overwhelming; in response, we removed resources that teachers did not use during the second tryout and reduced the number of extra copies of planning guides and other fillable documents. Additionally, one teacher reported that she wanted to use the self-monitoring guide, but it was physically too large and would be obvious to her students; we therefore replaced it with a smaller version.

Discussion

This paper advances the literature by providing a detailed description of the partnership-based, iterative development process used to develop the Positive Behavior Management Toolkit, a packaged teacher-facing implementation strategy to support teachers in using evidence-based Tier 1 and Tier 2 behavioral classroom interventions. We describe the approaches we used to engage our Program Development Team of school- and district-employed partners and to conduct iterative tryouts of versions of the PBMT with teachers. We illustrate how quantitative and qualitative feedback were used to inform revisions to the PBMT, in partnership with the Program Development Team. Although this paper focuses on the development of the PBMT, we believe the processes described here can be applied to develop or adapt a variety of school-based interventions and implementation strategies.

The findings also indicate that teachers perceived the PBMT as highly feasible, acceptable, and contextually appropriate, and that these perceptions improved across PBMT versions. Teacher-reported scores on quantitative measures of acceptability, appropriateness, and feasibility were similar to or higher than other scores on these measures that have been reported in the school mental health literature (e.g., Corbin et al., 2022). We also identified new positive qualitative themes related to contextual appropriateness, acceptability, and perceived effectiveness after the second tryout. We organize the discussion section to highlight three lessons learned from these processes and results that are broadly applicable to school-based mental health research.

Lessons Learned

Lesson 1: The Importance of Aligning Intervention and Implementation Strategies with School and District Priorities and Structures

Consistent with the recognized importance of tailoring implementation strategies to address needs within the local context (Powell et al., 2015), our experiences highlight the importance of aligning the implementation strategy in development with school priorities and existing structures. In this case, we designed the PBMT to fit within the school-wide PBIS framework, which made it relevant to district and school priorities of strengthening classroom-level PBIS implementation. We worked with the PDT and other school partners to ensure that the PBMT uses language consistent with district-level PBIS implementation supports. Our goal was for teachers to perceive it as a resource supporting practices they are already expected to use, rather than as a new practice. This was important for ensuring feasibility and contextual appropriateness. Quantitative and qualitative data suggest we achieved this goal, as many teachers across both tryouts commented that this alignment made the PBMT feel manageable and like a good fit.

Aligning the project with PBIS, specifically the priority of promoting classroom-level PBIS implementation, also helped us recruit schools and PDT members. This again highlights the importance of designing intervention development research projects to meet identified school- and district-level priorities. Given the many competing demands school and district personnel face, they are unlikely to invest time and energy into supporting research that does not address their own priorities. As such, school mental health intervention development projects should be designed from the beginning to be responsive to these needs and priorities, while also advancing the science (see Pellecchia et al., 2018). This is consistent with arguments that school mental health services should be conceptualized to address student outcomes that are of high priority to both schools and mental health clinicians (see Atkins et al., 2017).

Lesson 2: The Need to Integrate Theory, Data, and Partners’ Priorities and Feedback

We considered three sources of information in conceptualizing and developing the PBMT: 1) theories of behavior change; 2) previous research, including the evidence base for behavioral classroom interventions and studies identifying barriers and facilitators to teachers’ use of these interventions; and 3) educator partners’ priorities and feedback, including input from the PDT and teachers’ quantitative and qualitative feedback following tryouts. Doing so posed the challenge, also present in other intervention development projects, of how to integrate information from these sources, particularly in cases where they do not agree. One helpful strategy for navigating this challenge was to explicitly engage partners in discussion about the guiding theory and prior research. For example, in the first PDT meeting, we shared a general overview of the guiding theory and sought team members’ reactions and feedback. Similarly, in the second PDT meeting, we discussed the implications of existing findings regarding barriers and facilitators. Through these dialogues, we intentionally sought opportunities to align these three sources of information (e.g., develop PBMT resources that fit within the guiding theoretical framework, address barriers previously identified as salient, and are responsive to educators’ priorities and feedback). Based on this experience, we recommend that researchers developing school mental health interventions or implementation strategies be explicit about the sources of information they will consider during the development and revision process and their approaches to integrating them.

Lesson 3: The Importance of Collaboratively Developed and Fair Expectations, Compensation, and Credit in Community-Partnered Work

One key challenge in working with the PDT was balancing our goal of ensuring that PDT members were engaged and felt ownership over the project while not unduly burdening them. For example, we wanted all PDT members to provide input regarding meeting agendas and PBMT resources but did not expect them to plan activities or draft resources outside of the meetings themselves. We found it helpful to explicitly discuss this tension (e.g., “I want you to have ownership over this process, but also do not want to ask you to do my job for me”) when inviting members to join the PDT, and to talk with PDT members about the best way to navigate this tension. We also found it helpful to collaboratively discuss norms and roles for the research team members and school district team members at our first PDT meeting, and to check in on this periodically. These processes are consistent with recommendations for research-practice partnerships (Esposito et al., 2015; Henrick et al., 2023), and our experience highlights the utility of applying these principles when developing school-based interventions.

Similarly, it is important to fairly compensate and credit school and community partners for their project contributions. Challenges include the limited circumstances under which partners are permitted to accept compensation, the need to follow regulations to protect human subjects, co-authorship standards, and the identification of sources of credit that are meaningful to community partners. We found it helpful to discuss this with our partners and the PDT across project phases, by asking the PDT how they would like to be credited, and asking each team member individually if they would like to be credited by name.

Limitations

There are several important limitations to this work. First, consistent with other intervention development studies (e.g., Duong et al., 2020; Lyon et al., 2021), we conducted tryouts with relatively small samples of teachers. The percentage of invited teachers who agreed to participate was relatively low, which raises questions about the representativeness of the samples. This low response rate may reflect the recruitment and engagement challenges experienced by many school-based research teams in the context of the COVID-19 pandemic, during which school staff faced enormous stressors and disruptions (Robinson et al., 2023), and there were often no opportunities for face-to-face recruitment. However, we think that our results highlight the fact that meaningful information can be gained from in-depth tryouts even with small sample sizes, as indicated by the rich qualitative feedback, the high degree of consistency across teachers, and the apparent improvement in acceptability, feasibility, and appropriateness across iterations. Relatedly, teachers participated in the first tryout over an abbreviated period (i.e., 4 weeks rather than 8), which may have affected teachers’ feedback, although it had the advantage of allowing for a more rapid transition to the next iteration.

It is also important to note that we did not observe teachers using the behavioral practices or measure their effects on student behavior during this iterative development process. Although participating teachers reported qualitatively that they perceived the PBMT as effective, we do not yet have other sources of data to corroborate these impressions. This is a focus of our ongoing work through a randomized pilot study, in which we are collecting data about teacher outcomes and student outcomes (see Lawson et al., 2023b for the pilot study protocol). This staged approach, in which an intervention or implementation strategy is first iteratively designed with a focus on constructs such as feasibility or usability and subsequently tested for effectiveness, is consistent with many intervention development and adaptation models in the literature (e.g., Discover, Design/Build, and Test, see Lyon et al., 2019a, 2019b). However, developing methods to expedite these steps is an important future direction, given the importance of ensuring that research timelines align with school partner needs (see Beidas et al., 2022).

Finally, we describe here one process for community-partnered, iterative development, but we did not collect data to compare the outcomes of this process to alternative development processes. For example, we do not know how the PBMT would look if we had not engaged the PDT, or how it would differ if teachers had provided feedback in a “lab” setting as opposed to after in vivo tryouts. Rigorous research to elucidate the impact of these different development processes will be important to advance the field of intervention development research.

Conclusions

This paper provides a detailed description of the process that we used to develop a package of implementation resources to support teachers in using behavioral classroom interventions, with the goal of providing a model that can inform the development or adaptation of a variety of school-based interventions and implementation strategies. Our experiences highlighted the importance of alignment with school and district priorities and structures, particularly to design a contextually appropriate, feasible, and sustainable implementation strategy to be applied in the school setting. We also identified approaches that were helpful for integrating different sources of information (i.e., theory, data, educator feedback), and reflected on the importance of collaboratively developed expectations, compensation, and credit in community-partnered work. We argue that applying these key principles to the development of interventions and implementation strategies will ultimately lead to products that are more likely to be used and sustained.