Introduction

Disruptive behavior—that which interferes with academic instruction—is a common challenge faced by teachers (Aloe et al., 2014; DeShazer et al., 2023). These behaviors may manifest as a function of student mental health challenges and/or student stress related to inequities facing their communities, which have been exacerbated by the COVID-19 pandemic (Deng et al., 2023; Institute of Education Sciences, n.d.). Although teachers’ use of positive behavior supports can reduce disruptive classroom behavior (Korpershoek et al., 2016; Nisar et al., 2022), teacher preparation for and implementation of such supports varies (Owens et al., 2017). Barriers that teachers may face when implementing these supports include limited time and access to professional development (Collier-Meek et al., 2019), insufficient knowledge and skills related to strategy use (Owens et al., 2017; Sutherland et al., 2019), and inadequate implementation strategies and accountability (DeFouw et al., 2023; Long et al., 2016).

In addition, there is concern that teachers have received insufficient training on how to implement these supports equitably and with sensitivity to student culture (Gaias et al., 2019). We draw from Exner-Cortens et al. (2022) and define equitable implementation as implementing supports equally when there is equal need across students (i.e., horizontal equity) and with enhancements when there is greater need among some students (i.e., vertical equity). For example, horizontal equity occurs when all students receive a personalized greeting each day with their preferred name. Vertical equity occurs when certain students (e.g., those who need deeper connection to support their self-regulation and student–teacher relationship) receive a greeting plus a brief check-in about their emotions or goal for the day. Implementation of positive behavior supports without attention to equity or sensitivity to students and their culture is a concern, as students with marginalized identities experience higher rates of exclusionary discipline (Annamma et al., 2019; U.S. Department of Education Office of Civil Rights, 2016), less positive student–teacher relationships, and micro-aggressions that devalue their identities, families, and cultures (Butler-Barnes & Inniss-Thompson, 2020; Collins et al., 2023; Okoroji & Oka, 2021), as compared to their non-marginalized peers. These cumulative experiences of exclusion and isolation can ultimately contribute to students’ school disengagement (Voight et al., 2015) and pushout via exclusionary discipline (Novak, 2021). Notably, these experiences can occur even when traditional positive behavior supports (i.e., supports as typically used and that do not center equity) are implemented (e.g., Bradshaw et al., 2010, 2015).
Thus, research on implementation strategies that (1) facilitate the knowledge, skills, and attitudes necessary for equity-focused positive behavioral supports and (2) address common barriers to implementation (i.e., limited time and access to professional development; insufficient knowledge and skills to use the practices; inadequate implementation supports and accountability) is needed.

The Maximize Project uses a research-practice partnership to co-create implementation strategies to increase teachers’ use of equity-focused positive behavior supports (EF-PBS). EF-PBS represent an expansion of traditional positive behavior supports. EF-PBS are classroom practices that include equity features and are implemented within a specific equity/social justice framework, with the intention of making all students feel valued and able to thrive each day. We use Gorski and Swalwell’s (2015) Equity Literacy framework, as it is appropriate for designing practices that advance intersectional equity (i.e., equity across multiple intersections of marginalization). The Equity Literacy framework encourages educators to go beyond cultural competence by developing the skill and will to engage in critical actions, including recognizing when inequities are occurring, responding in the moment, and dismantling procedures that support inequity.

We hypothesized that we could (1) develop a core set of EF-PBS features using the Equity Literacy framework and then (2) address common barriers to educators’ implementation of EF-PBS by leveraging interactive technology and the organization’s social network via peer leaders (see Fig. 1). Emerging literature suggests that interactive technology can facilitate use of best practices for targeted interventions (e.g., Owens et al., 2022; Scheibel et al., 2023); yet it is unrealistic to assume that access to technology will lead to sustained implementation for all teachers without additional supports. Thus, by enhancing the capacity of peer leaders identified by mapping the school social network (Atkins et al., 2015; Neal et al., 2011), we may be able to develop a feasible, contextually and culturally relevant (i.e., in-house) mechanism to further the impact of the technology in facilitating adoption and implementation of EF-PBS, with the goal of ultimately enhancing equitable student outcomes.

Fig. 1

Theory of change for maximizing teachers’ use of equity-focused positive behavior supports (EF-PBS), professional development (PD)

The goal of this paper is to describe the collaborative research-practice co-creation of EF-PBS, as well as an interactive technology-based platform to support their use, referred to as the Maximize platform (we describe the peer leader component in a separate paper). We are following a co-creation process in the context of a research-practice partnership to ensure that diverse educator voices and perspectives are meaningfully represented and integrated into the products developed (Bammer, 2019; Pellecchia et al., 2018; Weisz et al., 2005). In this paper, we describe (1) how we engaged our community advisory board, which is comprised of educators representing diverse lived experiences, positions, student populations, and geographic locations, to co-design our EF-PBS practices and features, and the initial content and processes on the Maximize platform (Phase 1), (2) how we used mixed methods data from educators at three partnering elementary schools to co-produce the second iteration (Version 2) of the Maximize platform (Phase 2), and (3) lessons learned to inform future research.

The Maximize Platform: EF-PBS Practices and Implementation Strategies

EF-PBS Practices

To design our EF-PBS practices for the Maximize platform, we used systematic reviews of existing classroom management practices, positive behavioral supports, and social-emotional programs to identify and prioritize 10 positive behavior supports that have consistent evidence of effectiveness in improving student performance among the samples studied (McLeod et al., 2017; Nisar et al., 2022; Simonsen et al., 2008). However, given well-documented inequities in discipline referrals (Annamma et al., 2019; U.S. Department of Education Office of Civil Rights, 2016), and negative interpersonal experiences among marginalized students and educators (Butler-Barnes & Inniss-Thompson, 2020; Kohli & Solórzano, 2012), we acknowledged the need to modify the implementation of these 10 practices to be more equity-focused (Exner-Cortens et al., 2022). Thus, we co-developed equity-focused features for each practice using the Equity Literacy framework (www.equityliteracy.org) as a guide. For example, through the Maximize platform, we aim to help teachers (a) understand the difference between traditional approaches to establishing classroom expectations (e.g., teacher driven with the goal of obtaining student compliance) and equity-focused approaches that consider student voices when co-creating classroom expectations that focus on improving prosocial behavior via relationships and accountability to the peer group; (b) consider how to individualize praise based on student preference (e.g., public versus private compliments) and ensure that all students receive praise; and (c) recognize their own stress in the presence of challenging behavior, the biases this stress may invoke, and effective ways to respond to challenging behavior without excluding students from the classroom. See Supplemental Material for Versions 1 and 2 of all 10 practices and their equity-focused features.
Although efforts to improve teachers’ use of culturally responsive practices to reduce disproportionality in office discipline referrals are emerging (Bradshaw et al., 2018; McIntosh et al., 2021), there is still much to be learned about how to integrate this professional learning into the typical practice of schools.

Implementation Strategies

Implementation strategies are methods or techniques designed to facilitate the adoption, implementation, and sustainment of an evidence-based practice (Cook et al., 2019; Proctor et al., 2013). For the Maximize platform, we focused on implementer- (i.e., teacher) level strategies and prioritized ways to collect data via the platform to advance our understanding of teachers’ reaction to the implementation strategies in the moment. Below, we describe the components of the platform and note in parentheses how each is connected to an implementation strategy or behavior change technique (Carey et al., 2019; Cook et al., 2019; see Fig. 1).

To address barriers of access and time for professional development (Collier-Meek et al., 2019), the Maximize platform allows teachers to review content at their own time and pace. To enhance knowledge and skills, the platform includes interactive “wizards,” guided user journeys, and autonomy in the learning process. As a first step, teachers complete a self-assessment of their use of 10 priority EF-PBS: Personalized Greetings, Student Check-ins, Community Circles, Establishing Classroom Expectations, Acknowledging Positive Behavior/Praise, Corrective Feedback, Teaching Prosocial Skills, Classroom Routines, Effective Questioning, and Student Choice. Responses from this process create a personalized teacher profile, and teachers are encouraged to learn more about practices in their profile that were designated as an Area for Potential Growth (IS: Test-drive and select practices). To address limitations in teacher knowledge and skills (Owens et al., 2017; Reinke et al., 2011), the Learn More pages of the Maximize platform include the definition of each EF-PBS practice, equitable implementation features, and handouts and video models for implementation (IS: Develop an implementation glossary; Instruction on how to perform the behavior). Building on the benefits of the presentation of new perspectives and self-reflection (Okonofua et al., 2016; Sellars, 2012) and to enhance knowledge and attitudes (e.g., empathy, openness, curiosity), teachers are also directed to complete brief self-reflection activities focused on equity-related topics (e.g., implicit biases, equity versus equality). Given varied reactions to equity-focused initiatives (Muhammad, 2009), the platform provides a private space for learning and self-reflection.
In content development, we made continuous efforts with our advisory board to balance meeting educators “where they are,” being provocative enough to disrupt the status-quo, and being realistic about what new practices teachers could adopt given the stress they experience as part of their work.

We also developed several features for improving implementer buy-in and motivation (Carey et al., 2019; Cook et al., 2019). Following the self-assessment process, teachers are encouraged to set a goal for strategy improvement (IS: Goal setting). Given the promise of motivational ruler ratings (Owens et al., 2022), teachers are then asked to rate the importance of the goal and their confidence in carrying out their selected goal. Teachers are prompted to consider possible barriers to achieving their goal and identify actions to take if that barrier emerges (IS: Action planning). Following goal setting, teachers are prompted weekly to complete a goal review until they reach perceived mastery of the goal (IS: Self-monitoring; Prompts and cues). Teachers also receive email reminders to engage with the platform (IS: Reminders, nudges, prompts and cues). In the Method section, we describe the processes used with our community advisory board to co-create these platform components. In the Results section, we share data on platform acceptability, feasibility, and utility obtained from educators in our partner schools, and how we used these data to co-produce the platform's second iteration (Version 2).

To our knowledge, this is the first evaluation of an interactive platform with embedded implementation strategies designed to facilitate knowledge, skills, and attitudes aligned with EF-PBS. The focus of the current paper is to share the processes used to co-create the platform. We hope the data captured from the Maximize platform (e.g., how teachers rate their use of skills, the goals they select; their reactions to equity-related self-reflection activities) can provide valuable insights about ways to engage teachers in this work and to ultimately impact student outcomes.

Co-creation Approach

We adopted a co-creation approach to the platform design process (Craig et al., 2021; Greenhalgh et al., 2016; Moll et al., 2020; Nguyen et al., 2020), which centered the “collaborative generation of knowledge by academics working alongside stakeholders from other sectors” (Greenhalgh et al., 2016). The goal of this process is facilitating efficient and meaningful research-practice knowledge translation. Co-creation draws on principles from other design processes and knowledge mobilization approaches, including community-based participatory research (Greenhalgh et al., 2016), integrated knowledge translation (Jull et al., 2017), and user-centered design (Lyon & Koerner, 2016). Co-creation approaches also make space for multiple epistemologies, as all actors involved in the process can contribute knowledge that reflects their own lived and practice experiences. Per Greenhalgh et al. (2016), three processes are necessary for effective co-creation work: (1) a systems perspective, such that the process is non-linear, iterative, and adaptive to local context; (2) a creative endeavor, where the end product is not predetermined and depends on individual needs, experiences, and creative ideas; and (3) a focus on process, such that processes used to create knowledge (e.g., relationship building approaches, leadership style, and governance) are equal in importance to the ultimate knowledge product. Co-creation also involves critical reflexivity (Moll et al., 2020), as well as acknowledgement of power differentials among the decision-makers (i.e., researchers, community partners, end users) (Leask et al., 2019). In this project, and per Bammer’s stakeholder involvement spectrum (Bammer, 2019), our community advisory board members took on a co-design ‘collaborator’ role by providing expertise and advice on the development of the Maximize platform in Phase 1. 
Educators at participating schools (the end users) were in an ‘involved’ role, as their input directly influenced the iterative design of the Maximize platform and procedures through co-production processes in Phase 2. Our plan was to partner with two local elementary schools and an advisory board to co-develop the first version of the platform. However, due to COVID-19 pandemic-related school closings, we were unable to secure local partnering schools for Year 1. Thus, our Year 1/Phase 1 work described below was conducted with our advisory board members only and we included partnering schools in Year 2/Phase 2.

Positionality Statement & Context

Our investigative team is led by White, cisgender women who strive to be allies in equity-focused initiatives. We approach this work with humility and reflexivity and have been intentional to seek the opinions and insights of advisory board members from diverse backgrounds while developing this project and expanding our perspectives on this topic. We acknowledge and celebrate the long history of scholars of color doing this work, and we wish to build upon their prior work and continuing contributions as we engage in this work.

Method

In this section, we describe our Phase 1 work with the advisory board (recruitment, demographic information, and co-design procedures) and Phase 2 work with educators in partnering schools (recruitment, demographic information, implementation of the Maximize platform, and multi-method approach to data collection).

Phase 1: Co-Design with Community Advisory Board Members

Recruitment

In July of 2021 (Study Year 1), the principal investigators discussed which educators within their local and national networks to invite to the board. We considered individuals with expertise in equity, diversity, and inclusion initiatives in schools and/or positive behavioral supports, as well as persons representing diverse lived experiences, positions in schools, community types (rural, urban, suburban), and students served. We sent invitation letters to all potential advisory board members that (a) oriented them to the project (link to a video overview), (b) highlighted the expertise we thought they would bring, (c) described the responsibilities (a 10-hour commitment from October to June reviewing materials and/or participating in virtual meetings; one-year appointment with opportunity to renew), and (d) noted the potential benefits (compensation of $40/hour; their name listed on our website, conversations with educators, and the opportunity to list this experience on their resume). Following acceptance, we sent a demographic survey and a poll to schedule the first meeting.

Demographic Characteristics

In Study Year 1 (2021–2022), the advisory board consisted of 11 members. In Study Year 2 (2022–2023), 6 members recommitted for a second year and 2 new members joined; the members who did not recommit cited new work responsibilities. Over the two years, the membership included 3 teachers, 2 assistant principals, 2 school social workers, and 6 in other positions (e.g., director of diversity/equity/inclusion, teacher equity and diversity trainer, and instructional consultants). Advisory board members described primary job activities as focusing on teaching, coaching on culturally responsive teaching, co-leading equity-focused staff training, and providing social-emotional support in elementary schools. Across all members for both years (n = 13), 100% identified as a cisgender woman, and 23.1% identified as Hispanic/Latina. With regard to racial identity, 15.4% identified as Black or African American, 15.4% as multiracial, 7.7% as Middle Eastern, and 61.5% as White. The membership represented schools in urban, rural, and suburban communities in Ohio, Maryland, New York, and Alberta, Canada. Ten members were parents, and six reported that they were parents of children with special needs.

Study Year 1 (2021–2022) Procedures

Our goal was to collaborate with this advisory board to co-design the first version of the Maximize platform. All community advisory board meetings were virtual, given that members spanned three time zones. We used pre-work activities to maximize meeting time for exchange of ideas. In Table 1, we highlight the pre-work activities, content, and co-design processes used during advisory board meetings, as well as the products that resulted. To build relationships and trust, we opened each meeting with “a connector” or “ice breaker.” In alignment with Greenhalgh et al.’s (2016) recommendations for developing a shared vision through relationship building and trust, we discussed communication norms and procedures during our first meeting and revisited these norms in the next three meetings, asking for modifications as needed (few changes were requested via this format). Investigators communicated with advisory board members between meetings (e.g., via email or when we crossed paths as a function of other work) to process group dynamics and obtain perceptions of the climate. We used a variety of technologies to facilitate the work, including Qualtrics surveys, Google Docs, Jamboards, PowerPoint slides, chat boards, and Zoom break-out rooms. In break-out rooms, we designated a note taker and identified ways to signal agreement (e.g., green checkmark) with the content being written on the shared document or if an idea warranted more discussion (e.g., question mark). Relevant to this paper, collaborative meetings were spent (a) identifying the positive behavior supports to prioritize on the Maximize platform, (b) developing the key features for equitable implementation, and (c) developing the self-reflection activities for the platform. Periodically, we asked members to complete a survey to provide constructive meeting feedback.

Table 1 Advisory board activities during study year 1 (2021–22)

Although we initially focused on building trust, we also received early feedback from some members via individual communications that made us reflect on the extent to which the “space” felt too academic (e.g., based on language used during the meeting). With this feedback, we reduced our reliance on PowerPoint slides and academic terms and started using more break-out rooms to foster more connected conversations. In addition, as we met with advisory board members individually, we learned more about the expertise some members could provide outside of meetings. Thus, we started sharing requests for co-design via email between meetings and allowed board members to self-select the topics and activities they most preferred to collaborate on.

Study Year 2 (2022–2023) Procedures

As we obtained quantitative and qualitative data during the first year of Maximize implementation in schools (2022–2023), we shared data with advisory board members and co-created modifications for the next iteration (Version 2) of the platform (for 2023–2024). In spring 2023, the investigative team updated the equity-focused features based on data from the first iteration (Version 1). First, advisory board members rated each new key feature on Equity Impact, Feasibility, and Novelty (0 = Strongly Disagree; 4 = Strongly Agree). Then, we discussed the ratings, which informed refinement of (a) the key features and their order (higher impact first), (b) the self-assessment activities, and (c) the resources on the Learn More pages, all of which are now incorporated into the second iteration (Version 2) of the Maximize platform to be evaluated in 2023–2024. Some advisory board members also continued to be active between meetings, co-creating the next iteration of platform features (e.g., Goal Builder) and new self-reflection activities.

Phase 2: Feedback and Co-Production with Partnering Elementary Schools

After developing the Maximize platform with the advisory board, our goals were to partner with elementary schools, share Version 1 of the platform, observe use of the platform, assess feasibility, acceptability, and utility, and collaborate to co-produce Version 2 of the platform.

Recruitment

Investigators obtained study approval from their universities’ institutional review boards and participating school districts. To recruit schools to use the Maximize platform, investigators distributed project flyers during the winter of 2021 to elementary schools in districts in Central Ohio. We then met with interested principals to describe the project; principals from three elementary schools representing two districts consented to participate. In August 2022, we held a 2-hour project orientation meeting with all staff in each building; teachers were compensated for participation ($30/hour). During this meeting, we described the project goals and procedures, Equity Literacy framework, and rationale for focusing on EF-PBS. Teachers were given the opportunity to ask questions and consent to the project. We oriented teachers to all aspects of the Maximize platform and asked them to complete an initial self-assessment. We encouraged teachers to connect to the platform once each week for 10–15 min. All teachers could earn contact hours toward continuing education credits for completing platform activities that they then could submit to their local education authority, as relevant to their professional needs.

Demographic Characteristics

In Study Year 2 (2022–2023), 123 educators from the three schools consented to participate (84% response rate across schools). Participants included 57 general education teachers, 34 other teachers (i.e., special education, allied arts [music, art, PE], and English language learner teachers), and 32 other staff (e.g., principals, assistant principals, school counselors, school social workers, aides). Ninety-one participants provided demographic data (see Table 2). Our sample is relatively equally divided across all grade levels and is similar in racial and gender distribution to teachers across Ohio (Ohio Education by the Numbers, n.d.). In this project, we view general education teachers as distal end users of the Maximize platform. Although not discussed in this paper, we also used social network analyses to identify peer leaders in each building and provided consultation to these peer leaders to raise their capacity to provide implementation supports and facilitative coaching to general education teachers (See Fig. 1). Thus, peer leaders are also end users, but are more proximal to the study team than classroom teachers. These peer leaders are included in the total educators count listed above.

Table 2 Participant demographics

The students in participating schools are somewhat more racially diverse than student bodies across the state of Ohio (Ohio Education by the Numbers, n.d.): 18–25% White, 42–59% Black, 9–13% Hispanic/Latine, 2–14% Asian, and 10–14% multiracial. In addition, about 13–24% of students identify as English language learners, 14–17% have an identified disability, and 50–58% are eligible for free or reduced-price lunch.

Maximize Technology Platform Procedures

After viewing a brief platform orientation video, all users were routed to complete an initial self-assessment of the 10 practices (they are referred to as strategies on the platform), rating their frequency of use and interest in improving implementation of each (see Owens et al., 2023 for outcomes). Based on their ratings, teachers were presented with a personalized strategy profile, which displayed the practices in one of three columns: Areas of strength, Doing well enough, and Potential areas for growth (see Owens et al., 2023 for visual images). For each practice, teachers could explore the Learn More pages to find the definition of the practice and its equity-focused key features and resources for implementation (see Fig. 2 for a guided user journey on the platform). Teachers and peer leaders had access to the same technology platform.
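The sorting of self-assessment ratings into the three profile columns can be sketched as a simple mapping. The platform's actual rating scales and cutoffs are not reported in this paper, so the function name, 1–5 scales, and thresholds below are illustrative assumptions only.

```python
# Hypothetical sketch of sorting a practice's self-assessment ratings into a
# profile column. The Maximize platform's actual scale ranges and cutoffs are
# not published here; the 1-5 scales and thresholds below are assumptions.

def profile_column(frequency: int, interest: int) -> str:
    """Map frequency-of-use and interest-in-improving ratings (assumed 1-5)
    to one of the three profile columns."""
    if frequency >= 4 and interest <= 2:
        return "Areas of strength"
    if frequency >= 3:
        return "Doing well enough"
    return "Potential areas for growth"

# Build one teacher's personalized profile from example ratings.
ratings = {"Personalized Greetings": (5, 1),
           "Corrective Feedback": (2, 5),
           "Classroom Routines": (3, 3)}
profile = {practice: profile_column(f, i) for practice, (f, i) in ratings.items()}
```

Whatever the real cutoffs are, the key design point stands: the profile is derived entirely from the teacher's own ratings, which is what makes it personalized.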

Fig. 2

Initial Maximize platform user journey and key features. Note. Once the educator completes the initial self-assessment, they can navigate to all content in any order and re-take a self-assessment at any time

Based on their personalized profile, teachers were encouraged to set a goal using the Goal Builder, an interactive ‘wizard’ that first prompted teachers to select which strategy they wanted to improve, and then to rate their use or interest in improving their equitable implementation of it. The equity-focused features they were interested in improving auto-populated into a list, and they were asked to select one feature for their goal. The Goal Builder then created a specific goal statement for this feature. For example, if a teacher chose the Corrective Feedback strategy and selected the equity-focused feature for improvement to be Considering a wide range of effective responses to disruptive behavior, including offering choices, providing opportunities for skill development, and engaging in problem solving discussions or restorative justice conversations with peers, the Goal Builder would auto-populate the statement: I am improving my use of corrective feedback by using a wide range of effective responses to disruptive behavior. Teachers could accept this goal statement or edit it for individualization. Lastly, consistent with principles of motivating behavior change (Miller & Rollnick, 2012), educators were asked to complete Motivational Ruler ratings. For importance, they were asked, Among all other things you have to do, how important is this goal? and then asked to describe why they did not choose a lower number. For confidence, they were asked, How confident are you that you can improve this practice in the next week? and to name one thing that would raise their confidence. Responses are on a 10-point scale, from 1 = not at all important/confident to 10 = very important/confident. Once set, the goal was populated onto the teacher's primary dashboard on the platform.
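The auto-population step amounts to a one-line template combining the chosen strategy with the selected feature rephrased as an action. The sketch below is our reading of the example in the text; the function name is hypothetical, not the platform's code.

```python
# Sketch of the Goal Builder's goal-statement template, inferred from the
# Corrective Feedback example in the text; function name is illustrative.

def build_goal_statement(strategy: str, feature_phrase: str) -> str:
    """Combine the chosen strategy and the selected equity-focused feature
    (rephrased as an action) into a draft goal the teacher can accept or edit."""
    return f"I am improving my use of {strategy.lower()} by {feature_phrase}."

goal = build_goal_statement(
    "Corrective Feedback",
    "using a wide range of effective responses to disruptive behavior")
# -> "I am improving my use of corrective feedback by using a wide range of
#    effective responses to disruptive behavior."
```

Letting teachers edit the generated statement preserves autonomy while the template keeps goals specific and practice-linked.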

At the end of each week, teachers received a prompt on the platform to review their goal. In the Goal Review survey, teachers were first asked: How have you done with your goal since your last review date? Response options include (1) Oops I forgot, (2) I made a little progress, or (3) I made a lot of progress. If they selected Oops I forgot, the Goal Review wizard inquired about the barriers that got in the way and what the teacher might do to work on their goal in the next week. If they selected I made a lot of progress, the Goal Review wizard asked them to describe the progress they had made. Regardless of the first response, they were then asked, “What do you want to do with this goal?” Response options include: (1) Stop working on it, (2) Modify it and keep working on it, (3) Keep working on it with no changes, and (4) I’ve mastered this one! Let’s consider it an achievement. If they selected “I’ve mastered this one…” prior to a month’s time elapsing, they were encouraged to keep working on it for four weeks in order to make the practice a habit. If they selected this option after a month had passed and they had completed two goal reviews, they were then asked with whom they might share this success to promote collaboration and celebration. Outcomes of the initial Goal Builder and Goal Reviews are reported in Owens et al. (2023).
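The Goal Review branching just described can be summarized as a small decision function. Option strings follow the paper's wording; the function structure, names, and the four-week threshold encoding are our illustrative reconstruction, not the platform's implementation.

```python
# Illustrative reconstruction of the Goal Review wizard's branching as
# described in the text; not the platform's actual code.

def goal_review_prompts(progress: str, decision: str,
                        weeks_since_goal_set: int,
                        reviews_completed: int) -> list:
    """Return the follow-up prompts a teacher would see for a given
    progress response and goal decision."""
    prompts = []
    if progress == "Oops I forgot":
        prompts.append("What got in the way, and what might you do next week?")
    elif progress == "I made a lot of progress":
        prompts.append("Describe the progress you have made.")
    # Mastery branch: encourage a four-week habit, then prompt celebration.
    if decision.startswith("I've mastered this one"):
        if weeks_since_goal_set < 4:
            prompts.append("Keep working on it for four weeks to make it a habit.")
        elif reviews_completed >= 2:
            prompts.append("With whom might you share this success?")
    return prompts
```

The design couples self-monitoring (the weekly progress rating) with contingent prompts and cues, so the wizard surfaces barrier problem-solving or celebration only when relevant.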

Teachers were also prompted twice per month to complete self-reflection activities. For each activity, participants were first asked to watch a brief video or read a vignette and answer initial self-reflection questions. The next section, Learning Moment, provided information about the self-reflection topic, which was followed by some final self-reflection questions about how to move from knowledge to action. Self-reflection topics focused on Equity Literacy concepts (Gorski & Swalwell, 2015), as well as equity versus equality, student–teacher relationships, intersectionality, implicit bias when responding to challenging behavior, and navigating emotions during equity work. Descriptions of additions made for Version 2 of the Maximize platform are presented in the Results section.

On a monthly basis, the primary investigator (1st author) and other research team members met with each team of peer leaders (for training and capacity building purposes) and each principal (for project leadership purposes). Other research team members were in each school one-half day/week to meet with peer leaders (details related to capacity building for peer leaders are presented in a separate manuscript). To obtain contextual information to inform project development and data interpretation, we engaged in informal classroom walk-throughs and kept field notes about challenging behavior in the classroom and situations that produced concerns about equity. We used this information to inform the development of additional self-reflection activities for Version 2 of the platform. Research team members did not directly coach teachers, so as not to interfere with the peer coaching process.

Finally, in the fall, winter, and spring of the 2022–23 school year, teachers were asked to complete surveys via REDCap (Research Electronic Data Capture) (Harris et al., 2009). In the winter and spring, surveys included an assessment of teacher perceptions of feasibility, acceptability, appropriateness, and usability of the Maximize platform. In the spring, teachers (platform users and non-users) were randomly selected and invited to participate in a 30–60-min virtual key informant interview conducted by two of the authors.

Phase 2 Quantitative Measures

Maximize Platform Use

Via the backend of the platform, investigators tracked teacher completion of self-assessments, self-reflections, and goals by quarter, as well as goals reviewed and mastered for those who set goals.

Acceptability, Feasibility, and Usability

In January and April 2023, teachers were asked to complete items from the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017), as well as the Technology Acceptance Model (TAM; Chow et al., 2012); all were modified slightly for relevance to the Maximize platform. The Weiner et al. measures have evidence of acceptable internal consistency and construct validity (Weiner et al., 2017). The AIM, IAM, and FIM scales each consist of four items rated on a 5-point scale (from "completely disagree" to "completely agree"). Cronbach's alpha values for the current sample are 0.91 (AIM), 0.85 (IAM), and 0.68 (FIM).
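For readers less familiar with this index, the internal-consistency values reported here presumably follow the standard formulation of Cronbach's alpha for a k-item scale (an assumption on our part, as the computation is not stated in the text):

```latex
% Cronbach's alpha for a k-item scale (standard formulation)
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where \(\sigma^{2}_{Y_i}\) is the variance of item \(i\) and \(\sigma^{2}_{X}\) is the variance of the total scale score. The more modest FIM value (0.68) is consistent with the general difficulty of obtaining high alpha values from scales with only four items.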

The TAM scale consists of 16 items rated on a 7-point scale (from "strongly disagree" to "strongly agree"). It has four subscales: general self-efficacy for using technology (α = 0.55), perceived usefulness of the Maximize platform (α = 0.93), perceived ease of use of the platform (α = 0.88), and intention to use the platform (α = 0.93). Previous research shows that items on the TAM have acceptable internal consistency (all α's > 0.80) and evidence for predictive validity of acceptance and actual usage of technology (Venkatesh & Davis, 2000). Cronbach's alpha for the full TAM measure in the current sample is 0.91.

As another indicator of acceptability, at each time point, teachers were asked an anticipated use question: "How likely are you to log into the Maximize platform in the next week?" (1 = not at all likely to 10 = very likely). During the winter and spring time points, teachers were also given the opportunity to respond to two open-field questions: "Tell us what you like about the platform" and "Tell us what you think could be improved about the platform." Winter responses to the two open-field questions were used as qualitative data sources for this paper (see below).

Phase 2 Qualitative Measures

Self-Reflection Data

Within each self-reflection activity, educators were asked to respond to a set of self-reflection questions (e.g., How do you learn about your students’ identities? Can you think of any invisible barriers to learning that students face? Has there ever been a time where you sounded like the adults in the video? What do you think contributed to your behavior at that time?). At the end of each self-reflection, we also asked a micro-feedback question that stated: “We are open to learning and growing. Please provide any feedback you have about this activity (e.g., anything you particularly liked or did not like about it).” Data from the qualitative self-reflection questions and the micro-feedback question were used as qualitative data sources for this paper.

Interviews and Focus Groups

In spring 2023, two to four teachers per school, including platform users and non-users, and all peer leaders (n = 16) were invited to participate in an interview (teachers) or focus group (peer leaders) to discuss their experiences with the Maximize platform over the 2022–2023 school year. The recruitment emails described the interview or focus group topics, location, and length, and noted that participation was voluntary and confidential. Despite multiple follow-ups, only three teachers from two schools participated in the interviews (two users and one non-user), and eight peer leaders participated in four focus groups. All interviews and focus groups were conducted via Teams, audio-recorded, and transcribed verbatim. Participant responses to questions focused on the Maximize platform were used as qualitative data sources for this paper.

Analyses

With the platform use and survey data, we examined descriptive statistics (means, standard deviations, frequencies), Pearson correlations between platform use and survey data, and one-tailed (directional) t-tests, expecting those with higher acceptability/appropriateness/feasibility ratings to be higher users of the platform. Teacher self-report measures were collected in winter (January 2023) and spring (April 2023). However, for three reasons, we present only winter data. First, there were more respondents in winter (n = 79 for AIM/IAM/FIM; n = 77 for TAM) than in spring (n = 65 for AIM/IAM/FIM; n = 63 for TAM). Second, the responses and related conclusions are similar across time points. Third, most platform engagement occurred between August and December (see Results); thus, winter responses are more closely connected to the period of user engagement than spring responses. Differences in sample sizes between the AIM/IAM/FIM and TAM measures reflect some teachers completing only a subset of the measures in the survey battery.
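The correlations referenced here are, by standard convention, Pearson product-moment coefficients; for two variables x and y (e.g., anticipated use ratings and total platform engagement) observed on n teachers:

```latex
% Pearson product-moment correlation (standard definition)
r_{xy} = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
              {\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^{2}}\,\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^{2}}}
```

One-tailed p-values are appropriate for the t-tests given the directional hypothesis stated above (higher ratings associated with higher use).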

Qualitative data obtained via the winter acceptability open-field responses, self-reflection activities, micro-feedback question, interviews, and focus groups were cleaned and de-identified prior to analysis by the 7th author and three research assistants. We used codebook thematic analysis, as outlined by Braun and Clarke (2021), to analyze all qualitative data. This approach was chosen because it is a flexible qualitative method that enables exploration of meaningful patterns within a dataset, which fit the goals of this paper. To analyze the qualitative data, the 7th author developed two codebooks (one for the self-reflection, micro-feedback, and open-field data, and one for the interview and focus group data) from concepts based on a review of project goals. In thematic analysis, codes are the smallest units capturing interesting features of the data potentially relevant to the research question (Braun & Clarke, 2021).

The interviews and focus groups were coded using Dedoose, a web application for mixed-methods research. A draft codebook was first tested to ensure applicability, and additional codes were added as needed.

Once a final version of the codebook was established, the 7th author coded all interview and focus group transcripts. Before the research assistants initiated secondary coding of the interviews and focus groups, a coding application test was conducted in Dedoose to ensure good coding agreement among the qualitative research team. Dedoose calculates inter-rater reliability using Cohen's kappa statistic (Cohen, 1960), for which a value of 0.61–0.80 indicates good agreement (Landis & Koch, 1977). The coding application test resulted in a kappa value of 0.74, meeting the threshold for good agreement. Once all secondary coding was completed, the 7th author met with the research assistants to review their coding and discuss any areas of uncertainty or disagreement to reach consensus.
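Cohen's kappa, as conventionally defined (and, we assume, as implemented in Dedoose), adjusts the observed proportion of coder agreement, p_o, for the proportion of agreement expected by chance, p_e:

```latex
% Cohen's kappa: chance-corrected inter-rater agreement
\kappa = \frac{p_o - p_e}{1 - p_e}
```

Under this definition, the obtained κ = 0.74 indicates that the coders' agreement covered 74% of the gap between chance-level and perfect agreement.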

The self-reflection, micro-feedback, and open field data were coded in Excel. A research assistant did an initial round of coding for the self-reflection activity, micro-feedback, and open field data. Codes were then refined in the codebook in collaboration with the 7th author before research assistants completed and finalized the coding. The team—led by the 7th author—then created themes from all coded material related to Maximize users’ and non-users’ perceptions of acceptability, feasibility, and appropriateness of the platform.

Results

The result of our Phase 1 work with advisory board members was Version 1 of the Maximize platform, as described above. Below, we describe quantitative and qualitative data collected in Phase 2, as well as how the findings informed the second iteration of the platform (Version 2) which is being tested in the 2023–24 school year.

Maximize Platform Use: Self-Assessments and Self-Reflections

Platform use data showed that 109 educators (89%) completed a self-assessment in Quarter 1 (August–October 2022) and 52 (42%) completed a self-assessment in Quarter 2 (October–December 2022). Very few educators completed a self-assessment in Quarter 3 (January–March 2023) or Quarter 4 (March–May 2023; see Table 3). Regarding goal setting, 69 (56%) educators set a goal in Quarter 1, 31 (25%) set a goal in Quarter 2, and the numbers declined thereafter (see Table 3). Throughout the year, educators set a variety of goals: 20 participants focused on Effective Questioning, 20 focused on Student Choice, 17 on Teaching Prosocial Skills, 16 on Corrective Feedback, 16 on Student Check-ins, 11 on Acknowledging Positive Behavior, 8 on Personalized Greetings, 7 on Community Circles, 7 on Classroom Routines, and 4 on Establishing Classroom Expectations.

Table 3 Educators’ Maximize platform use by academic quarter: self-assessments and goals set

Regarding self-reflection activities, 67 (54%) participants completed at least one self-reflection during the 2022–23 school year (range: 1–12; M = 3.21, SD = 3.03; see Table 4 for completion by quarter). Qualitative data from the self-reflection activities revealed three key themes related to overall perceptions of these activities. The first theme reflected comments about the importance of self-reflection as a specific activity on the Maximize platform. Open-ended responses to self-reflection activities demonstrated how educators used the reflection process as a tool to build their understanding of a given topic. Indeed, many participants wrote about how the activities created new levels of awareness related specifically to their social identities. For example:

It makes me realize that I maybe do not align with my students very well as far as identity goes. And that means it is even more important for me to learn about them and the different assets that they have. [Teacher A]

I have never thought about all of those [social identities]. I feel like as a teacher we focus on gender and culture (more visible identities). It makes me think that I have a lot to discover about my students. [Teacher B]

Table 4 Educators’ Maximize platform use by academic quarter: goal reviews and self-reflections

A second theme found in the self-reflection data was that, overall, end users were generally satisfied with the content and activities. For example: "This was a great video and activity. It reminded me of how important it is to keep positive relationships with all of those around me…even on my tough days. That's when it's the most important." [Peer Leader A] To this end, several participants specifically referenced the impact of learning about their own power and privilege. As described by one end user, "I enjoyed the activity. The [power and privilege] wheel makes it easy to identify your privilege" [Teacher C].

The last theme emphasized a desire for more detailed content and tools to support end users in taking action following the learning. For example, one participant wrote that they "would like to learn more about these things through articles" [Teacher D], as well as have access to other sources of information to build their knowledge base. Another participant stated: "I have always felt like I do these things well. I am EXTREMELY passionate about this type of work. Having more resources and factual information of what actually [should] take place in the classroom would be beneficial." [Teacher E] Other participants indicated that more direct support would be needed to help them move from knowledge to action in the classroom: "I would love to see some examples of how teachers implement student-guided class expectations in their own classrooms." [Peer Leader B]

Overall, these themes suggest that participants valued the opportunities for self-reflection embedded within the Maximize platform and were generally satisfied with the content and activities. However, more concrete examples and tools are needed so that end users can move the ideas and concepts presented into action.

Additional feedback on Maximize self-reflection activities comes from the micro-feedback question on content presented in the self-reflection activities (i.e., what users liked/did not like about the activity). Substantive feedback was offered following six of the ten self-reflections, with 25 total users providing responses. The self-reflection activity on teacher and student identities resulted in the highest number of responses. Many users had positive feedback (e.g., stated the activity was “thought provoking” and “almost like therapy”). There were also a handful who suggested that some of the questions were rather personal and elicited guilt (e.g., “makes me feel bad for being a heterosexual, cisgender [person]”).

Overall Acceptability, Appropriateness, Feasibility, and Ease of Use

As an initial indicator of acceptability, in August, 94 educators (47 general education teachers; 47 other educators) rated the likelihood (between 0 and 10) that they would use the platform in the next week: 56.4% provided a response between 7 and 10, 36.2% between 4 and 6, and 7.5% between 0 and 3. The average across all educators was 6.74 (SD = 2.19). The anticipated use rating was correlated (r = 0.31, p = 0.003) with total engagement with the platform (i.e., the sum of completed self-assessments, self-reflections, goals set, and goals reviewed). In winter, 77 educators (45 general education teachers; 32 other educators) again rated the likelihood (on a scale of 0 to 10) that they would use the Maximize platform in the next week: 38.9% provided a response between 7 and 10, 29.9% between 4 and 6, and 24.7% between 0 and 3; the average score across educators was 5.22 (SD = 3.04). This anticipated use rating was also correlated (r = 0.44, p < 0.001) with total engagement with the platform. See Table 5.

Table 5 Percent of educators reporting low, medium, and high likelihood of Maximize platform use in fall and winter

In winter, 79 educators (46 general education teachers; 33 other educators) completed measures assessing overall acceptability, appropriateness, and feasibility of the Maximize platform; average scores on the AIM, IAM, and FIM (on a 5-point scale) were 3.63 (SD = 0.64), 3.72 (SD = 0.60), and 3.66 (SD = 0.55), respectively (see Fig. 3 for the distribution of responses to example acceptability, appropriateness, and feasibility items). For the TAM, 77 educators (45 general education teachers; 32 other educators) responded in the winter, and the average reported computer self-efficacy was 6.00 (SD = 0.92) on a 7-point scale. Computer self-efficacy was not related to the AIM or FIM measures (rs < 0.15), suggesting that limitations in acceptability and feasibility were not related to general computer self-efficacy; this gives us greater confidence that ratings reflect the Maximize platform itself rather than technology in general. The average perceived ease of use was 5.76 (SD = 1.02), aligning with a rating of somewhat agree to moderately agree. The average perceived usefulness was 4.79 (SD = 1.14), which aligns with a rating of neutral to somewhat agree. Lastly, the averages for behavioral intention to use and perceived influence were both 4.51 (SD = 1.35 and 1.38, respectively). Correlations between the TAM subscales (other than general computer self-efficacy) and winter measures of acceptability, appropriateness, and feasibility ranged from 0.23 to 0.79 (all ps < 0.05).

Fig. 3

Educator Perceptions of Acceptability, Appropriateness and Feasibility of the Maximize Platform

We also examined acceptability, appropriateness, and feasibility among high and low users (high use was defined by a total platform use score at the 75th percentile of the distribution). High and low users did not differ on ratings of acceptability or appropriateness; however, high users (M = 3.80, SD = 0.48) reported marginally higher feasibility scores than low users (M = 3.58, SD = 0.58), t(76) = −1.70, p = 0.05.

Open-field data from the winter acceptability measure indicated that, for the most part, users appreciated the convenience of the overall Maximize platform as well as the resources and classroom management strategies provided. For example, one user indicated, "I like that the platform is able to be utilized at our convenience." Another user wrote, "I can log on and use it when I have time." In particular, a few users appreciated the convenience of setting goals at their own pace, as explained by one user: "I like how you are [able] to reflect on your goals and it is at our pace, but you can get information for helping with your goal." Many users (about one-third) commented that they appreciated the resources, tools, and strategies found on the platform. As described by one user, "I enjoy watching the videos and seeing it done in 'real life'. The teachers are positive and it is easy to understand." Other users appreciated not having to hunt around the platform looking for tools and strategies. For example, one user explained, "I like that all the tips and strategies are in one place to easily find." The ease of finding resources was noted by another respondent, who stated, "I like that the platform is easy to navigate and provides a lot of valuable resources for teachers to use with students."

Qualitative data from interviews and focus groups also revealed constructive feedback about the Maximize platform. Some end users we spoke with found the platform to be somewhat confusing and not as user-friendly as it could be. As described by one educator, “it seems like so many steps being in, it seems like it’s kind of hard to navigate at first…you look at one day and the next day you are like ‘how did I get here?’” A peer leader described missing important elements of the Maximize platform due to their lack of visibility. The Toolkit, in particular, was a feature noted for its lack of visibility.

I don’t know. There’s something about I feel…for a long time I neglected the toolkit portion of it across the top. And I am not sure how to make that more pronounced. [Peer Leader]

Peer Leader: There is something about the toolkit portion that is chock full of good information…

Interviewer: But it’s not popping out…?

Peer Leader: Yeah…and I am not really sure why. I don’t know if it is because it’s the last tab…I don’t know what makes it so that people don’t notice it.

A second theme identified in the interview and focus group data was suggestions from end users on how to simplify the platform and thereby increase its use. Several participants suggested that requiring fewer "clicks" in the system to get to what they needed would be a useful change.

I think it [increased use] comes with the simplicity of the website, like it needs to be a little bit quicker…I think any easier route to get there is [beneficial] and eventually getting to the point of ‘hey, it’s only two clicks’. [Peer Leader]

But if you are looking for a specific strategy, you would have to click through all of those domains. You know what I mean? Like reflecting on biases doesn’t really help me when I am like ‘what do I do with this kid’? [Teacher]

In sum, many users appreciated the convenience offered by the platform as well as the helpfulness of the resources and strategies. However, other users suggested simplifying the platform and enhancing its functionality to help future users access and appreciate the benefits of what the Maximize platform has to offer.

Revisions to the Platform Based on End User Feedback: Second Iteration (Version 2)

Self-Assessment

In Version 1 of the platform, the self-assessment included traditional definitions of the 10 practices, and teachers were encouraged to visit the "Learn More" pages to learn about the equity-focused features for each. We chose this process for three reasons. First, we wanted to highlight the equity-focused features as an important expansion of traditional positive behavior supports. Second, we wanted the self-assessment to come early in the user journey to guide areas for learning and goal setting, and we wanted all users to be engaged with the technology so we could observe user choice as it related to goal setting. We were concerned that sharing the equity-focused features from the start might dissuade some users, and we wanted to maximize the number whom we 'called in' to this work. Third, we wanted to be able to compare our self-assessment data to previous work on teachers' use of traditional behavior supports and expand upon it (see Owens et al., 2023, for a review).

In hindsight, we regret including only traditional definitions of practices in the self-assessment process, as it did not allow us to assess all teachers' reported use of all equity-focused features; instead, this information was gathered during the Goal Builder process. Because teachers chose different goals, we obtained self-reports on the equity-focused features for any given practice from only a subset of teachers. In addition, by placing the equity-focused features on the "Learn More" pages and in the Goal Builder, we may have 'buried', rather than highlighted, these features, as suggested by the qualitative feedback (i.e., difficulty finding things). Thus, for Version 2, we included each equity-focused feature directly in the self-assessment process.

Equity-Focused Key Features

In Version 1 of the platform, each strategy had 5 to 13 key features, some of which were demonstrable (e.g., "Work to reduce the use of consequences that exclude students from the classroom environment") and some of which were not (e.g., "Teachers engage in ongoing self-reflection to become aware of situations that trigger their own biases"). See Supplemental Material. Although some teachers reported liking the resources provided on the platform, themes from the qualitative data revealed that teachers wanted resources that would help them move from knowledge to action and that were more tightly connected to the challenges they were facing or to the Maximize goals they were setting. Thus, for Version 2, we made two modifications. First, we reduced the number of equity-focused key features for each practice to range from 3 to 6, prioritizing action-oriented features, and integrated the less demonstrable features into the self-reflection activities. Advisory board members rated the equity impact, novelty, and feasibility of the revised features, which led to discussion around prioritizing features with the most potential for equity impact. Thus, key features in Version 2 of the platform are now organized to emphasize perceived equity impact. Second, we restructured our "Learn More" pages by connecting resources to each equity-focused feature. For example, for a teacher working on "Asking caregivers about student interests to facilitate meaningful check-ins," we now provide resources specific to actions for this feature, including (1) an editable letter that teachers can send home to caregivers, (2) a website with 21 questions teachers can ask students during check-ins, and (3) videos demonstrating check-ins.

Goal Builder

In Version 1, the Goal Builder contained five steps. To enhance feasibility, we streamlined the Goal Builder in Version 2 to three steps, and gave users more concrete ways to move from idea to action. Namely, in addition to auto-populating a goal statement, we also auto-populate SMARTIE (Specific, Measurable, Achievable, Realistic, Time-limited, Inclusive, and Equitable) steps to provide teachers with specific demonstrable steps to achieve their goal; both can be edited by the teacher for individualization.

Self-Reflections

Version 1 of the platform contained 10 self-reflection activities. Using qualitative feedback, informal observations, and conversations with advisory board members, we designed 5 new self-reflection activities for Version 2. These include topics such as alternatives to exclusionary discipline, addressing micro-aggressions, modifying deficit thinking, enhancing cultural sensitivity in responding to challenging behaviors, and reflecting on how misunderstandings emerge when we have differing experiences.

Discussion

In this paper, we describe how we leveraged the expertise and perspectives of a community advisory board to co-design Version 1 of the Maximize platform (Phase 1), and how we used quantitative and qualitative data from end users to co-produce Version 2 (Phase 2). Data on platform use revealed that educators used the platform most in Quarters 1 and 2 of the school year; however, there was very little platform use in Quarters 3 and 4. Relatedly, ratings of acceptability, appropriateness, and feasibility of the initial iteration (Version 1) of the platform were moderate, not strong. However, educators who completed nine or more activities on the platform reported significantly higher ratings of feasibility than those who completed fewer than nine activities. Further, ratings of likelihood of use in fall and winter were moderately correlated with actual platform use. Qualitative data revealed that teachers viewed self-reflection as a useful tool for learning about equity-focused content and appreciated both the self-reflection activities and the overall Maximize platform. However, they also wanted more concrete tools and resources to help translate this learning into action, and they wanted actionable resources to be easier to find on the platform. These data informed Version 2 of the platform, which will be evaluated in the 2023–2024 school year. Below, we expand upon lessons learned and highlight implications for co-design and co-production processes in school mental health.

Co-design with Advisory Board Members (Phase 1)

In following the principles of co-creation (Greenhalgh et al., 2016), we structured our advisory board meetings with the goal of developing group norms for communication, a shared vision for the work, and a supportive and stimulating environment to maximize creative idea sharing. Through these processes, we hoped to reduce power differentials between members and facilitate trust within the group. We believe our processes produced successful outcomes, as evidenced by positive relationships that emerged between investigators and advisory board members and among advisory board members, and the novel content and procedures generated that would not have been produced without the contribution of our advisory board members. Thus, our experience and outcomes align with the reported benefits of community-engaged research (Luger et al., 2020).

Along with these successes came rich learning opportunities. We hope the following lessons inform the work of other research-practice partnerships focused on educational equity. First, it is of great value to have diverse perspectives on the advisory board. For example, there were times when an innovative idea emerged from those in an administrative role; with further discussion, however, advisory board members in roles that held less power (e.g., teachers) were able to share concerns. Allowing time to dive more deeply into ideas, and actively valuing all contributions, was key to surfacing these different perspectives. We recognize that the time it takes to work in this way can conflict with the fast-paced timelines of typical education research; however, the perspectives and opinions that emerged led to rich discussions, learning experiences for all members, and the development of content and processes for the Maximize platform that balanced the needs and views of multiple educator groups.

Second, for diverse perspectives to emerge, group leaders must constantly attend to power relations. From our (White, academic) perspective, we felt we had created a space where all voices were equal. However, in a rich (and brave) one-on-one conversation, an advisory board member who is a self-identified person of color shared that she had considered leaving the board because the space still felt white-centric and academic, and thus "not her space" (e.g., given her practice background, she felt unsure within an environment led by university professors; she did not see many others who looked like her). Thus, it is important for investigators to continually acknowledge the power and privilege we hold. Even if we are not exercising that power personally or in the moment, in the eyes of new collaborators, such power may be ascribed to us because we represent those who hold (and have exercised) such power (Sue & Zane, 1987). Following conversations with this member (for which we are deeply appreciative, as we recognize the emotional labor required), the first author reflected on the intangibles she brought to the room as a function of her multiple identities (e.g., being a highly educated White person) and acknowledged that the language and processes used to facilitate the meetings did not align with the member's expectations of the experience.

Thus, as dominant-group investigators working in research-practice partnerships, it is critical to take time to develop meaningful relationships with our advisory board members, critically reflect on their feedback, and act on feedback from those whose voices are often silenced in educational spaces (Glass, 2001). Given feedback from our members, we recommend the following strategies to facilitate community members' comfort and confidence when serving on a research-practice advisory board: (1) orient members to procedures prior to the first meeting; (2) pair advisory board members with a mentor so they can ask questions in a safer space (our members suggested pairing persons of color with mentors of color, if possible); (3) check in with members between meetings; (4) weigh practice-based evidence as much as research-based evidence; (5) listen to and honor each other's experiences when there is dissatisfaction, confusion, or concern; (6) approach conversations with humility and non-defensiveness; and (7) if possible, have in-depth representation of various identities, as one member stated: "It is hard to be the only one, or one of just a few; it constantly tests you and creates doubt about belonging." We also witnessed benefits when members of the research team put themselves in the role of the learner. For example, the Ohio research team visited the school of one advisory board member in New York to learn about what she had created in her school that could inform our work. Thus, we encourage other dominant-group investigators to consistently acknowledge explicit and implicit power dynamics between research and practice communities, to make self-reflection about one's own words and actions an ongoing practice, and to take time to connect with advisory board members between meetings. When critical feedback is offered, it is important to view it as a transformational learning experience and an opportunity to grow in a way that will enhance the co-design process, the partnership, and the research.

Third, we note that we developed our advisory board through personal networks. Given strong racial segregation within social groups (Allport, 1954), it is perhaps not surprising that many of our advisory board members were White women. Although we focused on intersectional equity, which our members represented, given the entrenched racial disparities in educational outcomes, we recognize that it is especially critical to center the voices of racialized and colonized peoples. Thus, with each additional year, we intentionally expand our network by leveraging the networks of our advisory board members so we can further diversify perspectives on the board.

Lastly, we learned that different members preferred to engage in the work via different formats (e.g., verbal expression, collaborative efforts, with or without technology, thinking before meetings via pre-work, engaging in work in between meetings). Once we established connections among all group members, we learned there were benefits to holding smaller group meetings (e.g., break-out rooms, or smaller meetings when we could not find a time for the whole group), offering options for collaboration between meetings, and letting members decide what activities and what time frame best suit them. We perceive that this choice allowed members to engage in the work with which they had the most confidence, expertise, and excitement, and therefore to provide the richest contributions.

Co-production with Educators (Phase 2)

Quantitative and qualitative data from educators revealed moderate acceptability, appropriateness, and feasibility of the initial iteration (Version 1) of the Maximize platform. Given that such ratings were not correlated with general computer self-efficacy, we have greater confidence that ratings were related to the Maximize platform itself and not technology use in general. In addition, we recognize that equity-focused professional learning, particularly in the current social and political climate, can be challenging, and engagement in equity-focused initiatives is often variable across members in any organization (Goldmann, 2012; Romijn et al., 2021). Thus, the finding that about two thirds of educators agreed or strongly agreed (see Fig. 3) with statements about the platform’s acceptability and appropriateness is encouraging.

However, platform use data revealed a significant decline over time. This is critically important. It could suggest that users got what they needed from the platform and did not see a need to return. However, given the intent of the platform to encourage ongoing self-reflection and goal setting, these dwindling numbers suggest barriers to use, such as lack of time and other competing priorities, challenges commonly shared by our partnering educators. When we reflected on these declining usage numbers, we identified several lessons. First, we saw that the strongest use of the platform occurred when there was structured time to use it (e.g., protected time in the orientation meeting to complete the self-assessment; time during district-sanctioned professional learning days for platform exploration and collaborative reflections). Thus, although interactive technology that uses strategic design flows has the capacity to overcome some barriers and facilitate teachers’ use of evidence-based strategies (e.g., Owens et al., 2022), additional facilitators must be present. Important facilitators include principal support (Locke et al., 2019), teacher consultation (DeFouw et al., 2023; Nadeem et al., 2013), and nudges within the technology (Lawson et al., 2022). In our second year of implementation, we are working closely with principals to further enhance their promotion of platform use and to find more protected time for teachers to use the platform. In addition, as part of the larger project, we are examining the role peer leaders can play in supporting teacher platform engagement and implementation of EF-PBS. Similarly, given that the correlation between intended use and actual use was moderate, we will continue to look for ways that high platform users can champion the benefits of the platform.

Second, when considering the context of participating schools (high stress, staff turnover, student mobility), we acknowledge that engaging in self-paced voluntary learning about equity may be viewed as a luxury rather than a direct solution to the challenges educators are experiencing. Thus, although the platform offers guidance on practices that can improve student behavior, foster student–teacher relationships, and reduce teacher stress, perhaps it was not viewed that way by teachers. Accordingly, we made changes to the platform (e.g., streamlining wizards, better connecting equity features to actionable steps, offering initial SMARTIE steps) to make it more accessible and relevant to educators’ daily experiences, while maintaining our overall focus on the root causes that drive disruptive behavior and educational inequities. We argue that balancing practicality with the science of effective approaches should be a high priority for education researchers. It is encouraging that those who completed more activities on the platform reported higher ratings of feasibility than those who completed fewer activities. However, we cannot know the directionality of this relationship. On one hand, if high users find the platform feasible, we do not need to modify platform features that are working; instead, we need to think through approaches to get more teachers over the initial “hump” of engagement. On the other hand, we must acknowledge that we are not reaching a substantial portion of teachers; thus, continued modification to enhance perceived relevance and practicality may be needed.

Third, qualitative data revealed important themes that guided our co-production of Version 2 of the platform. Given that teachers generally appreciated the self-reflection activities, we added more self-reflection activities for the second iteration. To make these new activities relevant to participating teachers, the new self-reflections focus on topics that emerged in the qualitative data, during our informal observations in schools, and in conversations with peer leaders and principals. Additionally, the qualitative feedback indicated that we needed to streamline some aspects of the platform and provide more concrete tools and resources to help translate this learning into action. Thus, as described in the Results section, we made modifications to several components of the platform to address this feedback. Our insights on new activities also came from the time we spent simply being in the schools. Although spending time at schools is, again, in tension with standard research timelines, we learned the critical importance of “being present” in schools. Having a presence afforded us a richer understanding of the school context, which contributed meaningfully to our interpretations of the data. Consistent with recommendations in the literature (Cook et al., 2019), it also facilitated the development of trust with educators in the participating schools, which we found served to reduce power differentials and enhance the honesty with which they shared feedback.

Lastly, our data have implications for the measurement of acceptability, appropriateness, and feasibility in school mental health research. Although the gap between intentions and actions is well documented, our anticipated use rating was moderately correlated with actual platform use, suggesting that this rating of intent may be useful in other projects. For education technology endeavors, it may also be useful to assess general computer self-efficacy to confirm that it is not influencing ratings of acceptability, appropriateness, and feasibility of the intended content or processes. Additionally, it may be helpful to examine these constructs among high and low users, as feasibility may be best realized once users gain minimal proficiency with the system.

Limitations

We acknowledge that, although our completion rate for the quantitative surveys was respectable, the responses of teachers in this sample may not represent all teachers in the participating schools and do not represent teachers in other areas of the country. In addition, our response rate for teacher interviews was low, which may be due to the timing of the interviews at the end of the year or because the researchers who led qualitative data collection and analysis were not part of the local investigative team. We thought their removed position would offer enhanced confidentiality so educators would feel comfortable sharing candid feedback; however, teachers may have been more invested in the process if the invitations had come from the local team members who had built relationships with them. Lastly, we recognize that most feedback comes from teachers who are likely either highly interested or highly disinterested in learning about equity-focused practices. Thus, their feedback may not help us make modifications to better engage those who are only moderately interested in this work. Relatedly, teachers may have engaged in self-reflection outside of the platform (e.g., conversations with colleagues) that we did not capture with our data collection procedures.

Conclusions and Future Directions

The co-creation approach to the development of the Maximize platform that we review in this paper has led to a novel tool grounded in the lived experiences of educators that, in its initial iteration, was viewed as moderately feasible, acceptable, and useful. Although co-creation is a slower process than traditional intervention development, we believe our data highlight how the benefits of this approach outweigh the drawbacks, and how it created a richer end product than if the work had come from academia alone. We also acknowledge that co-creation is a learning journey, especially for White researchers who have been socialized to believe their Eurocentric, science-based perspective is the ‘right’ one. We continue to deconstruct our own biases in this work and encourage other dominant-group investigators to do the same. We are grateful to our advisory board for sharing their vast knowledge and hope these recommendations help others effectively elevate community members’ voices and confidence in the context of school mental health research-practice partnerships. We also hope this article adds to the body of work in school mental health calling for approaches to intervention design, development, and testing that center the needs of end users and actively engage with practitioners and others with lived experience to co-create interventions.

Feedback from our advisory board and school staff using the Maximize platform guided the co-production of Version 2, which aims to better support equitable educational experiences for all children. The work shared here represents the first step in a long process. We are currently engaging in another round of assessment with the three partnering schools, as well as one new elementary school, to observe responses to the second iteration of the platform. Because our qualitative data suggest that brief episodes of guided private self-reflection are a welcome tool for professional learning related to equity-focused topics, in our current evaluation we are assessing the relationship between completion of self-reflection activities and change in teacher perceptions of EF-PBS. We are also examining the extent to which goal setting and ongoing goal review relate to actual change in teacher practices and student outcomes, and we hope the enhanced prominence of the equity-focused features and the clear connection between those features and actionable implementation strategies will facilitate such change. Lastly, although interactive technology can help to overcome barriers to implementation, technology alone is likely insufficient. In the larger project, we are evaluating the impact of the combination of Version 2 of the platform and the use of peer leaders in the school to support teachers’ use of EF-PBS. We hope that in sharing our experiences and lessons learned, other researchers may better understand how to effectively engage in this type of community-oriented co-creation work in school mental health, toward moving the needle on teachers’ use of equity-focused positive behavioral supports.