Introduction

Children in rural settings are less likely to receive mental health services than their urban and suburban counterparts and even less likely to receive evidence-based care (Wagenfeld, 2003). Student mental health concerns can be addressed in rural schools by using multi-tiered systems of support (MTSS; Center on MTSS, 2022) to organize and adopt feasible and cost-effective mental health evidence-based practices (EBPs; Anderson et al., 2013; Herschell et al., 2021; Kelleher & Gardner, 2017; Wagenfeld, 2003). MTSS, such as positive behavioral interventions and supports (PBIS; Sugai & Horner, 2009), can be extended to integrate and implement mental health EBPs according to level of risk or problem severity (Olson et al., 2021). With adequate training and support, school-based mental health staff such as counselors, school psychologists, and social workers can deliver services to students and assist other school staff with implementing EBPs at the targeted and indicated levels. Such school professionals in rural schools are ideally positioned to address mental health problems because of their knowledge of mental health issues and their experience working with children (Berryhill et al., 2021; Foster, 2005). Unfortunately, most school professionals in rural settings are not trained on specific mental health EBPs (Siceloff et al., 2017) because of difficulty accessing training (Harmon et al., 2007).

The purpose of the present study is to (a) describe the development of an online training platform and implementation strategy for school professionals with some mental health background in rural schools, based on the user-centered design approach (Lyon & Koerner, 2016), and (b) examine the perceived feasibility, appropriateness, acceptability, and usability of the platform and implementation strategy.

Remote approaches have been used to train school personnel in rural schools. For example, remote coaching has been used for the implementation of PBIS (McDaniel & Bloomfield, 2020; McDaniel et al., 2020) and to enhance effective classroom behavior management by teachers (Bice-Urbach & Kratochwill, 2016; Fischer et al., 2016). School professionals can be trained remotely using synchronous (live/interactive) or asynchronous (previously recorded, non-interactive, accessed on-demand) approaches. Synchronous consultation has been used to support teachers who were experiencing difficulties addressing disruptive behavior in the classroom (Bice-Urbach & Kratochwill, 2016). Results showed that student disruptive behaviors decreased after individualized behavior support plans were implemented. Further, teachers found the remote consultation experience feasible and acceptable.

Rural School Context

Children in rural areas present with similar levels of mental health concerns as children in urban areas but experience more barriers in terms of accessing support than their urban counterparts (Bureau of Health Workforce, 2017; Kelleher & Gardner, 2017; Robinson et al., 2017). Mental health services in rural communities are marred by accessibility, availability, affordability, and acceptability challenges (Ezekiel et al., 2021; Wilson et al., 2015). For example, rural communities have fewer mental health professionals who have been trained on EBPs compared to urban and suburban communities (Larson et al., 2016). Children in rural communities are less likely to have adequate health insurance coverage than children in other locations, and many parents do not have the means to pay out of pocket for services (Newkirk & Damico, 2014). Transportation barriers affect families in rural communities significantly more than families in other settings (Arcury et al., 2005). Further, the stigma of mental health services is still a potent barrier among parents in rural locations (Owens et al., 2007; Polaha et al., 2015). Fortunately, providing services in the school setting can address these systemic and cultural barriers because those services would be widely available, offered in a normalized setting that minimizes stigma, and provided free or at subsidized cost (Kern et al., 2017; Owens et al., 2002; Stephan et al., 2007). Although rural schools are increasingly playing an important role in tending to the behavioral health of students (Hoover & Mayworm, 2017; Owens et al., 2013), they also face significant challenges. For example, rural schools have difficulty attracting trained mental health professionals (American Psychological Association, 2016), have large staff turnover (Lee et al., 2009), receive inadequate funding for mental health services (Slade, 2003), and have difficulty gaining access to quality professional development training (Harmon et al., 2007). Providing school professionals with appropriate implementation strategies (i.e., training) that are effective, available on demand, and built for the specific rural context might position rural schools to better serve student mental health needs while simultaneously contributing to narrowing service disparities (Moon et al., 2017; Paulson et al., 2015; Wilger, 2015).

The use of a participatory approach for the development of an implementation strategy for remote training might contribute to greater participation rates among school professionals and help to sustain the use of EBPs with students over time. User-centered design (UCD; Lyon & Koerner, 2016) is a useful framework for the development of a remote training strategy.

User-Centered Design

User-centered design (UCD) has been used extensively in the design of digital products (Abras et al., 2004), mental health interventions (Lyon & Koerner, 2016) and digital mental health interventions (Mohr et al., 2017). UCD is a process that bases the design of an innovation on information provided by constituents, or people who will use the innovation (Goodman et al., 2012; Hanington & Martin, 2012). The general development approach in UCD includes evaluating stakeholder needs in the context in which the product is going to be used, discussing design ideas with stakeholders, developing prototypes of those ideas at varying levels of “fidelity,” conducting initial evaluations with stakeholders, refining the prototypes, evaluating the prototypes to determine if they achieve their purpose, and implementing and evaluating the results (Lyon et al., 2020).

In the present study, the target users for the training platform are school professionals with some mental health training who work in rural schools. The target problems addressed via UCD are the professionals' training needs based on an examination of prior experiences with mental health training and context-specific implementation barriers (Dopp et al., 2018). The development of the "product" (i.e., platform components and implementation strategy) can be guided by the perceived feasibility, appropriateness, acceptability, and usability of the various prototypes (Lyon & Koerner, 2016). In implementation research, perceived appropriateness refers to the perceived fit, relevance, or compatibility of the innovation for a specific setting; feasibility refers to the extent to which an innovation can be successfully used in a particular setting; and acceptability refers to the perception among stakeholders as to whether the innovation is agreeable, palatable, or satisfactory (Lewis et al., 2015; Proctor et al., 2011). Finally, usability has been defined as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction (International Organization for Standardization, 1998).

Training Platform and Implementation Strategy

Initial training workshops and ongoing supervision are key strategies for implementing EBPs in schools. Multicomponent training strategies for mental health therapists, comprising an initial workshop followed by ongoing consultation, have been found to be more effective than a single workshop for enhancing therapist clinical skills and knowledge, treatment adherence, and clinical outcomes (Beidas & Kendall, 2010; Herschell et al., 2021).

Existing models of remote training are delivered synchronously via webinar lectures or interactive coaching, and to a lesser extent, asynchronously through previously recorded modules (e.g., Walker & Baird, 2019). Synchronous training has been found to be effective and acceptable to school professionals (Bice-Urbach & Kratochwill, 2016; Fischer et al., 2016). However, relying solely on synchronous training might not be feasible and the training might not be sufficiently potent for school professionals. Finding sufficient time for training has been an important barrier to training busy school professionals (Moore et al., 2022). Given that learning how to implement a new intervention can be difficult and time intensive, offering additional asynchronous training to professionals in rural schools seems both appropriate and likely necessary (King et al., 2021). Asynchronous training would allow busy school professionals to learn EBP implementation at their own pace, review specific session activities right before meeting students for group or individual sessions, and reinforce previously learned material by reviewing modules or consulting manuals online.

The original implementation strategy for this study was developed based on the existing research training literature (Beidas & Kendall, 2010; Herschell et al., 2010; Sholomskas et al., 2005) and our previous work in under-served schools (Eiraldi et al., 2020; Eiraldi et al., 2018; Eiraldi et al., 2019; Eiraldi et al., 2016). The implementation strategy consisted of two distinct levels of online training support: (a) initial synchronous workshop, plus asynchronous modules; and (b) initial synchronous workshop, plus asynchronous modules, plus scheduled synchronous consultation. However, given the specific rural school context, the implementation strategy was adapted in this study based on an examination of stakeholder training needs, barriers to and facilitators of remote training, and the perceived appropriateness, feasibility, and acceptability of the implementation strategy.

Purpose of the Present Study

The purpose of this pre-implementation mixed-methods study in rural schools was to describe the development of a key component of the training platform (asynchronous modules) and the adaptation of the implementation strategy, both guided by UCD. We employed a mixed-methods design because the collection of quantitative rankings alone would have been insufficient for the purpose of the study (Creswell & Clark, 2011). Rich qualitative data were necessary, alongside quantitative rankings, to understand training needs, barriers, and facilitators and to inform prototype revisions. Qualitative data led the adaptation of the implementation strategy, while quantitative data provided a general assessment of feasibility, appropriateness, acceptability, and usability across prototype iterations. We expected that an analysis of reported training needs and barriers to and facilitators of remote training would inform the adaptation of an implementation strategy that would be acceptable and feasible to implement in the rural school context, and that participants would express willingness to participate in training. The study was conducted prior to a randomized controlled trial designed to examine school professional and student outcomes. The protocol for the larger study has been described in detail elsewhere (Eiraldi et al., 2022).

The contributions of the study to school mental health include the use of UCD for the development of a remote training platform and implementation strategy for school-based mental health professionals in rural schools, and an evaluation of the platform's perceived feasibility, appropriateness, acceptability, and usability.

Method

The study employed a mixed-methods design with a QUAL + quan structure in which qualitative data served a primary role and quantitative Likert-scale rankings served a supportive, secondary role (Creswell & Clark, 2011). The function of the mixed-methods design was complementarity. We used semi-structured interviews and open-ended responses (QUAL) to elaborate upon quantitative findings (quan) and to better understand the process of implementing remote consultation as experienced by stakeholders. The qualitative data provided insight into training needs, barriers, and facilitators that informed prototype revisions and were collected alongside quantitative rankings, which provided a high-level assessment of feasibility, appropriateness, acceptability, and usability across participants.

Participants

The study was conducted with school professionals from 25 K-8 rural schools in Pennsylvania that were implementing PBIS at Tier 1. Most participants were female, white, and non-Hispanic. More than two thirds (68%) of participants were school counselors. Social workers made up 8% of the sample. The remaining 16% included the following roles: reading specialist and PBIS coach, teacher, emotional and autistic support teacher, and behavior interventionist. Roughly equal numbers of school professionals had fewer than 10 years and more than 10 years of experience on the job (see Table 1).

Inclusion Criteria

Any school designated by the US Census Bureau as "rural," with a PBIS program that was implementing Tier 1 with fidelity, with or without a functioning Tier 2, was considered for inclusion in this study. Fidelity of implementation at Tier 1 was important because Tier 1 is foundational for the development of mental health interventions at the advanced tiers of support (Hawken et al., 2009). Due to the COVID-19 pandemic, the fidelity criterion for most schools in Pennsylvania shifted from requiring a score above a certain percentage to simply submitting a score. As such, participant schools entered the study with only a prior indication of implementing PBIS with fidelity, based on the 2019–2020 school year measurement according to criteria set by the Pennsylvania PBS Network (i.e., a score ≥ 70% for Tier 1 on the Tiered Fidelity Inventory; Algozzine et al., 2014). Any school-based professional (e.g., school counselor), with or without experience implementing Tier 2 interventions, based at a school implementing PBIS, was eligible for inclusion in the study.

Measures and Study Procedures

Study staff emailed or called principals of rural schools who were implementing PBIS to explain the study and to ask if they would allow a school mental health professional or other professional from their school to participate. If the principal agreed, study staff described the study to the school professional and asked if they were interested in participating. If they were interested, they were read a consent form and asked to verbally consent. The verbal consent form indicated that agreeing to participate would entail participating in qualitative interviews, completing surveys, watching and rating modules on perceived feasibility, appropriateness, acceptability, and usability, and potentially participating in a training randomized controlled trial.

We employed evaluative and iterative strategies (Kirchner et al., 2018) to ensure that the remote training strategy would be a good fit with the rural school context. Due to unanticipated logistical barriers, we made modifications to the UCD approach. Because recruitment of schools took more time than anticipated, we collected data from schools as they entered the study rather than concurrently from all schools. This prevented us from evaluating several prototypes concurrently, as we had originally planned (Eiraldi et al., 2022). Instead, we evaluated one paper prototype of the interventions and implementation strategy (Prototype 1), followed by one asynchronous video prototype of the interventions and a revised paper prototype of the implementation strategy (Prototype 2). At the conclusion of the evaluation of Prototype 2, we used member checking (Harvey, 2015) to give participants a chance to comment on the perspectives offered by other respondents.

We used a brief questionnaire to collect information about participants' demographic characteristics.

We conducted two semi-structured, qualitative interviews with each of the 25 participants. Theme saturation was achieved by conducting more than 12 interviews in a largely homogeneous population (Guest et al., 2006). The interviews were conducted by one of the co-authors (RC), a qualitative data specialist. Interviews were conducted over the phone and scheduled at a time that was convenient for participants.

The first qualitative interview guide elicited views about past experience with professional training and perceived barriers and facilitators to participating in consultation sessions and conducting groups with students (e.g., What would make it difficult for you to participate in consultation sessions and conduct groups with students?). This interview was conducted immediately after the participant consented to participate in the study.

After analyzing the results of the first interview, and informed by prior training work with school professionals in underserved schools (Eiraldi et al., 2020; Eiraldi et al., 2015), we developed and administered a second interview. For the second interview, we provided participants with a written description of Prototype 1. We described the EBPs that would be offered, the training and consultation components, the rationale for each component, the training modules and the approximate time required for them, and the implementation strategy (i.e., remote training and consultation; Kern et al., 2011). Participants were asked to rate the EBPs and the different components of the training and consultation for feasibility and acceptability using a 5-point Likert-type scale. They were also asked why components did or did not appear feasible or acceptable (Kern et al., 2011) and about their willingness to participate in remote training (see Fig. 1).

Fig. 1 Prototypes and measures

Asynchronous Modules

We developed video modules showing a step-by-step walkthrough of group session preparation for the Coping Power Program (CPP; Lochman et al., 2008) for children at risk for externalizing behavior problems, CBT Anxiety Treatment in Schools (CATS; Khanna et al., 2016) for children at risk for anxiety problems, and Check-in/Check-out (CICO; Hawken et al., 2014) for children at risk for externalizing behavior problems. Additional video modules included strategies for managing child therapy groups, screening procedures for children in Tier 2 and "red flags" for identifying children at risk for externalizing and internalizing disorders, and a review of barriers to implementing mental health EBPs and ways to overcome them.

Implementation Strategy

We developed an implementation strategy with three main components: (a) an initial training workshop; (b) e-learning training modules on demand; and (c) consultation. The consultation component consisted of didactics and coaching. Didactics included: (a) discussing student referrals; (b) conducting a step-by-step walkthrough of the session objectives; (c) reviewing the theoretical principles behind the intervention components for that session; (d) encouraging adherence to the intervention manual; (e) problem-solving barriers to implementation and helping school professionals reflect on past challenges in order to successfully implement the upcoming sessions; and (f) enhancing school professionals' use of empathy and positive reinforcement. Coaching included: (a) setting goals for content delivered from the manual; (b) self-reflection; and (c) receiving performance feedback. We emphasized the importance of implementing the program as intended and the expectation that school professionals would reach a high level of fidelity when implementing the interventions. School professionals would then be asked to reflect on the previous intervention session, and the consultant would offer observations about the previous group session or CICO case, including how the school professional handled student behavior in session, overall level of participation and enthusiasm, and disruptive behavior. Participants would also be told approximately how much time each component would require of them and of the consultants. Finally, participants would be told that the final version of all asynchronous videos and copies of implementation and intervention manuals would be available for download from a project website during the clinical trial phase of the study.

Prototype 2 of Video Modules: Ratings and Qualitative Questions

Participants were emailed instructions and two uniform resource locators (URLs) that they used to watch and rate modules for perceived feasibility, appropriateness, acceptability, and usability of the second prototype. Given that some of the e-learning modules were quite lengthy, we randomly assigned participants to three smaller groups of about eight participants each, so that each group would watch and review different modules. Participants were instructed to watch the modules, rate them, and provide their opinions about them in free-text format. They were asked to complete the Intervention Appropriateness Measure (IAM), Acceptability of Intervention Measure (AIM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017) via Research Electronic Data Capture (REDCap). The three measures each comprise four items rated on a 5-point Likert-type scale (1 = completely disagree to 5 = completely agree). Cronbach's alphas for the measures range from 0.85 to 0.91. A three-factor CFA exhibited acceptable fit (CFI = 0.96, RMSEA = 0.08) and high factor loadings (0.75 to 0.89), indicating structural validity. Seven-week test–retest reliability coefficients ranged from 0.73 to 0.88. Regression analysis indicated each measure was sensitive to change in both directions (Weiner et al., 2017). Participants were also asked to provide comments to expand on their ratings (e.g., "Please comment on the module about CICO"). Participants were also instructed to complete the Usability Subscale (US) of the Telehealth Usability Questionnaire (TUQ; Parmanto et al., 2016) to measure usability. The US is a 7-point Likert-type instrument (1 = strongly disagree to 7 = strongly agree). The Cronbach's alpha of the usability measure is 0.93 (Parmanto et al., 2016). We made slight adaptations to the US (e.g., changing the word "systems" to "training modules") for the evaluation of all instructional modules. After completing the questionnaire, respondents were also asked to comment on their answers (e.g., "What was simple to use about the training modules?").

Prototype 2 of Implementation Strategy: Ratings and Qualitative Questions

Participants were also emailed descriptions of the revised implementation strategy and asked to complete a survey regarding perceived acceptability, feasibility, and willingness to participate in remote training, and to take part in a qualitative interview about their views of the proposed implementation strategy. The survey had 18 questions (e.g., How willing would you be to participate in the coaching portion of the consultation?) rated on a Likert-type scale from 1 (not willing at all) to 5 (extremely willing).

In the second one-on-one interview, the same 25 participants responded to the paper prototype of the remote training strategy. They were asked why components did or did not appear feasible or acceptable (e.g., What makes the coaching part of the consultation feasible or not feasible?) and about their willingness to participate in remote training.

Member Checking

After reviewing results of the quantitative surveys, we summarized the information from the free-text responses. We used a modified synthesized member checking process (Birt et al., 2016; Harvey, 2015) to give participants a chance to comment on the perspectives other respondents had offered. We synthesized and summarized emerging themes from the previous survey and asked participants to rank on a Likert scale the extent to which these themes matched their experience or perspective. We also provided an opportunity for participants to answer open-ended questions to explain their perspectives.

Data Analyses

For the examination of prior training experiences and barriers and facilitators of remote training, we imported transcripts of semi-structured Interview #1 into NVivo (QSR International, 2020), a qualitative data management and analysis software package. Analyses were guided by an integrated approach (Bradley et al., 2007) that included identification of a priori attributes of interest (i.e., constructs important to consider in the development of the remote training strategy) and modified grounded theory, which provides a rigorous, systematic approach to identifying emergent codes and themes.

For the examination of perceived feasibility, appropriateness, acceptability, and usability of the prototypes, we imported transcripts of semi-structured Interview #2 into NVivo for data management and analysis. We also coded open-ended responses from surveys. Over two iterations, we gathered data from 5-point rating scales (AIM, IAM, FIM, usability) and qualitative data (i.e., semi-structured interviews, written answers) simultaneously (Palinkas et al., 2011a, 2011b). We computed descriptive statistics (mean, standard deviation) for acceptability, feasibility, and willingness to participate for each of the components of the training and use of remote technology. Interview and open-ended response data were analyzed to elaborate upon quantitative findings and to better understand the process of implementing remote consultation as experienced by stakeholders (Palinkas et al., 2011b).

Results

The pool of potential participants comprised school professionals from 153 schools in Pennsylvania classified as rural (fringe, distant, remote) according to the US Census Bureau and implementing PBIS at Tier 1. We emailed or called the principals of these schools to explain the study and to ask whether they would allow school staff to participate. We were able to explain the study to 77 principals (50% response rate). The principals from 25 schools agreed to let their school professionals participate (16% participation rate); all participants from the 25 schools consented to participate in the study. One participant had incomplete data but was retained in the study. The final sample was composed of 25 school professionals (see Table 1).

Table 1 Demographic characteristics of behavioral health staff

User-Centered Design Prototype Evaluation and Modification

This section describes stakeholder feedback and the resulting modifications of the training platform and implementation strategy. Data from the 25 participants are organized into three sub-sections: Barriers and Facilitators to Participating in Remote Training, based on Interview #1 data; Asynchronous Modules, based on Survey #2 and member check data; and Implementation Strategy, based on Interview #2, Survey #2, and member check data. We present qualitative theme analysis for barriers and facilitators to participating in remote training, quantitative descriptive statistics and qualitative theme analysis for the asynchronous video modules, and descriptive statistics and theme analysis for the implementation strategy. These data were summarized and discussed by the research team in order to make modifications to the original versions of the training platform and implementation strategy. When quotes are provided, we identify them using P (participant) followed by the participant identification number. We end this section by describing the changes that were made to the modules and implementation strategy.

Barriers and Facilitators to Participating in Remote Training

In the first qualitative interview, participants reported their previous experience with training as well as potential barriers and facilitators to the remote training and consultation process described to them. These barriers and facilitators fell into the following themes: prior experience, perceptions about engagement in remote format, the availability of necessary resources, time as a barrier, and school buy-in and support as a facilitative condition.

Prior Experience

All participants reported previously participating in remote trainings, and all but one participant reported at least some prior training in mental health. Although many participants mentioned graduate courses as a source of this training, participants also mentioned trainings to meet continuing education requirements, PBIS forums, trainings as part of professional association membership, trainings from their regional technical assistance centers, and other trainings and courses. Most participants stated that they had received training on or related to PBIS, CICO, and safe crisis management and intervention. A few participants reported receiving training on trauma-informed approaches, and one participant reported receiving training on cognitive behavioral therapy.

Some participants raised that participating in remote training and consultation would be feasible and/or acceptable due to their prior experiences. As one participant (P14) stated, “Not that COVID has given us a whole lot of positives, but this is one of them, because I think we've all gotten very used to using computers to communicate with each other and just kind of thinking outside of the box. So that doesn't make me nervous.”

Engagement in Remote Format

Several participants plainly stated that participating in remote training would not be a problem, but a few participants noted that some components of in-person training are missed in remote training. As one participant (P17) explained, “I don't like [remote] quite as well as in person because I feel like I get distracted more easily. So, I feel like I have to work a little harder to really focus and pay attention when it's done on teleconference… I just would prefer in-person, if I had my choice.” A different participant (P04) was concerned about the interaction piece of remote trainings, but explained that it would be possible to promote collaboration virtually:

…in a small group where you're able to, you know, just like collaborate together, that would be much different…We hold our staff meetings virtually now and it's a small group and we talk about some different things with kids. And those I do enjoy because—or I get a lot out of—because we can all kind of bounce ideas off of each other.

Another participant (P15) shared that in-person trainings were preferable, but that they have had no problems with remote trainings: “I think it's a little bit more difficult to do remote than it is in person and live, just for the personal interaction piece of it. But I haven't really had any major issues with doing remote trainings.”

Resources

A few participants stated that they had access to computers and internet, which would support the feasibility of the remote training and consultation. One respondent (P15) noted that they often have technology issues at their district, which might pose a barrier, explaining, "Our technology at our school district is not the best. So, there are moments where our internet goes down, we lose power…So I would say my only barrier is our lack of like solid technology."

Time

Time emerged as the most commonly reported barrier to participating in remote training and consultation as well as to implementing interventions. As one participant (P09) stated, "I mean, time is always, like, of the essence…" Most respondents specifically pointed to their own complicated job descriptions, which require them to prioritize emergencies in the school building. A couple of respondents also specifically noted that it can be more difficult to indicate that you are busy when participating in remote training as compared to in-person training; therefore, they may be more likely to be interrupted. As one respondent (P20) explained, "I would still probably be in the school building and I would get interrupted…They would expect me to still be working even though I was in a training."

School Buy-in and Support

Most participants named school and administrator buy-in and support as a primary facilitator for participating in remote training and consultation. One participant (P07) described how they believed support from administration might help mitigate barriers around time:

I don’t really think that there would be difficulty because it lends itself to, like, my actual job and the school district is supporting the participation…It wouldn’t be difficult for me, but I guess the only logistic that would have to be worked out would be in terms of scheduling just to make sure that I have access to the students at the time that we would need them or whatever. But yeah, I can’t really right now think of any barriers, so to speak, mainly because the school district is on board. So, they’re aware; it’s something we’ve committed to working through.

Many echoed this sentiment, stating that their administration would provide support for time and scheduling barriers.

Asynchronous Modules

Participants were randomly assigned to three groups, each of which watched and rated a different set of modules. The first group (n = 8) watched 12 CPP modules, the second group (n = 9) watched 8 CATS modules, and the third group (n = 7) watched 8 CICO modules as well as a few additional modules not associated with a specific intervention. The additional modules included: (1) a module about effective strategies for running groups with children; (2) a module about how to conduct a brief in-service training with school faculty about recognizing signs of externalizing and internalizing problems in students and an overview of the screening process for identifying students for Tier 2 interventions; and (3) a module about barriers to implementing mental health EBPs, including those identified by participants, and ways to overcome them. Participants rated each group of modules for perceived acceptability, appropriateness, feasibility, and usability. The scores were uniformly high. Acceptability scores ranged from 3.89 (CATS; SD = 0.66) to 4.36 (CICO; SD = 0.67), appropriateness scores ranged from 4.18 (Identifying students in need of services; SD = 0.77) to 4.57 (CICO; SD = 0.53), and feasibility scores ranged from 4.17 (CATS; SD = 0.59) to 4.57 (Group management; SD = 0.53). The lowest usability score was 6.11 (CATS; SD = 0.91), and the highest was 6.67 (CPP; SD = 0.44; see Table 2).

Table 2 Asynchronous training video modules

In general, the qualitative data mirrored the quantitative ratings. Most participants felt positively about the training modules; they reported that the modules were useful and easy to use. In open-ended responses, participants reported that they specifically liked that the content was broken down into modules corresponding to parts of the intervention manuals and that the modules were convenient and easy to use. However, participants also provided critical feedback, particularly regarding engagement, sound quality and background noise, formatting, clarity, and the explicit connection of information. Below we summarize the feedback participants provided in these areas.

Engagement

Participants in most groups reported that the modules were somewhat boring and/or repetitive. Several participants across most groups suggested resources or strategies that could strengthen learning, including providing a "training packet" (P03) or physical document for taking notes; breaking information down into more, shorter slides; organizing the bullet points so that they appear in the order in which the narrator discusses them; integrating more visuals and examples; and asking questions during the module to keep the participant engaged.

In the member check, almost everyone agreed that training modules should have supporting documents, such as a “training packet,” PowerPoint slides, or a workbook. Although no one strongly disagreed, participants had mixed perspectives about whether the training modules should have interactive elements, such as questions to answer while watching. Only one person (P10) added an additional remark about this in open-ended responses, explaining, “I complain when training modules have interactive elements, but it does help me focus and motivate me to more completely learn the information.”

Sound Quality/Background Noise

Participants in most of the assigned video groups noted background noise as a quality issue. However, when asked directly about the issue in the member check survey, participants across all groups reported mixed levels of satisfaction with background noise. For example, one participant (P18) wrote, "It was not a major problem."

Formatting

Participants also noted some quality issues such as the size of the text on the slides, inconsistent formatting, and, in one specific video group, verbal information not always aligning with text information. One participant (P12) stated that “overall presentation” made the modules difficult to use and questioned whether slides were “ADA compliant” because they were difficult to read due to color or size. Participants suggested changing the size of the text and cleaning up the audio. In the member check, we found that there was mixed satisfaction with consistent formatting across the CATS and Coping Power groups but moderate to high satisfaction with text size overall.

Content Clarity and Explicit Connection of Information

Although there were some mixed perspectives, most participants thought that the modules were clear and provided information that would help them with implementation, or, as one participant (P01) said, were "to the point." Participants specifically valued the examples provided in the training modules. However, they offered some feedback to improve this component of the modules. A few participants stated that specific modules (those about managing therapy groups) needed more examples, and others (in the CATS group) thought that the examples were unrealistic and insufficiently clear and concise, and that the trainee should be better oriented to the part of the group session they were about to watch in the video example.

Implementation Strategy

The survey scores from the 25 participants about perceived acceptability, feasibility, and willingness to participate in remote training were uniformly high. Acceptability scores ranged from 4.77 (SD = 0.43) for use of remote technology to 4.92 (SD = 0.27) for the CATS group intervention. Feasibility scores ranged from 4.40 (SD = 0.87) for the CPP group intervention to 4.81 (SD = 0.49) for coaching. Willingness to participate scores ranged from 4.58 (SD = 0.90) for the CICO individualized intervention to 4.81 (SD = 0.57) for CATS (see Table 3).

Table 3 Perceived acceptability, feasibility and willingness to participate in training and use remote technology

In the second one-on-one interview in which the same 25 participants responded to the paper prototype of the remote training strategy, participants reported themes that built upon those they reported in the first qualitative interview. These themes were related to prior experience, engagement, resources, time, and school buy-in and support.

Prior Experience

As in the first interview, participants responding to the proposed implementation strategy reported that their prior experiences with training would be a facilitator for participating in remote training and consultation. Specifically, several participants noted that they had experience with telehealth technology and a few shared that they had experiences similar to the didactics protocol described in the paper prototype. In contrast, one participant (P08) noted that the didactics protocol would be new to them and would require “getting comfortable with it and everything.”

Engagement in Remote Format

A few participants responding to the proposed implementation strategy in the second interview maintained a preference for in-person training opportunities, which had been raised in the first interview. However, they similarly did not report that doing trainings remotely would be wholly negative: "I think that it can have its benefits" (P19).

Resources

Participants also echoed sentiments reported in the first interview when reporting that they have resources, specifically computers and internet, that would support feasibility of the training and consultation. While one spoke about occasional internet glitches and another spoke about issues with telephone service, others voiced that they had no concerns about the resources needed to participate.

Time

As in the first interview, the biggest barrier that emerged in responses to the proposed implementation strategy was finding time to participate in online trainings and consultation. Participants spoke about the same issues (having to prioritize emergency situations and juggle multiple duties).

Despite this, participants noted several strategies that would mitigate barriers related to time: having a consistent schedule or planning ahead of time, allowing for flexibility when necessary, and making use of hours outside of the regular school day.

A few participants noted that it is easier to attend online trainings or remote consultation due to convenience. One of these participants (P14) similarly noted that they preferred in-person trainings but considered the convenience of online trainings: “It does create that flexibility and, again, like reduces transportation and stuff like that.” Another participant (P13) also explained that remote trainings are easier to attend as they require fewer logistics to figure out, such as finding coverage: “I have found that meetings in general that have been taking place via Zoom or some type of [virtual] meeting…we’ve had more success in general. Whether that’s people being able to attend, not having to get as much coverage.”

School Buy-in and Support

Echoing responses from the first interview, many participants named school-wide support as a facilitator to participation in the proposed implementation strategy. Several voiced that they had support from their administration, but a few noted that feasibility of the proposed implementation strategy would depend on administration and staff support.

Summary of Changes to Training Platform and Implementation Strategy

Quantitative ratings and qualitative feedback from participants guided revision of the asynchronous video modules and implementation strategy. Quantitative and qualitative data specific to the asynchronous video modules informed the addition of several engagement strategies and refinement of modules to increase their quality and clarity. Data related to the training and consultation implementation strategy informed strategies to mitigate barriers related to time, engagement, and school buy-in and support. We outline the changes made to the modules and implementation strategy in Table 4.

Table 4 Summary of changes to training platform and implementation strategy

Discussion

We used UCD to develop an online training platform and accompanying implementation strategy for school professionals serving children at risk for mental health problems in rural schools and examined stakeholders' responses to the platform and implementation strategy. The training platform addresses an acute need for specialized training in EBPs among professionals in rural schools (Harmon et al., 2007). An important first step was the assessment of participants' previous experiences with training and perceived barriers to and facilitators of remote training. This information was used to develop the paper prototypes of the modules and the implementation strategy to fit the context of rural schools. Most participants (i.e., school counselors or social workers; 76% of the sample) were mental health professionals. A few participants (e.g., a reading specialist and PBIS coach, a regular education teacher) had received little to no mental health training in pursuit of their professional degree. Most participants reported prior experience with in-person and remote training to meet continuing education requirements, such as training on mental health interventions. However, very few reported receiving training on interventions for internalizing problems, and only one participant reported receiving training on an EBP. This is consistent with findings from previous studies indicating that most professionals in rural schools have not been trained on interventions for internalizing disorders, or on EBPs for any disorder (e.g., Siceloff et al., 2017).

The most important barriers reported by participants included having difficulty finding the time to participate in training, receive consultation from members of the research team, or deliver interventions to students. Participants also reported that it could be difficult to obtain buy-in from teachers and caregivers for the completion of measures, and from administrators for the implementation of EBPs. Obtaining parent/guardian consent to allow children to receive mental health interventions was also identified as a potential barrier. The presence of these barriers is consistent with findings from previous studies in rural schools (Moore et al., 2022). Participants identified solutions for dealing with the time barrier, including having flexible times for participation in remote training, receiving advance notice so they can fit training into their schedule, and obtaining buy-in from administrators so they can have some flexibility in their schedule.

Participants identified several concerns about the video modules, including inadequate engagement in the delivery of material, sound quality, formatting of slides, and clarity of content.

We addressed the reported barriers in the revised videos and implementation strategy. For example, we included an EBP for anxiety problems, offered specific training on how to identify and screen children at risk for mental health problems, and provided training on how to run groups effectively. The revised implementation strategy includes a flexible schedule for conducting the initial training and subsequent consultation to better fit the schedules of busy school professionals, as well as more interactive communication during the initial training. To increase buy-in, we added sections to the asynchronous training that highlight the evidence supporting intervention effectiveness and the interventions' impact on students. The implementation strategy now includes sharing information with administrators and reminding them about the need for their support of school professionals delivering the interventions.

We edited the modules to address audio and visual formatting issues, clarified information on several slides that were reported to be confusing, and improved the flow of how information was presented in the modules. We noted that parent/guardian consent could be difficult to obtain in some cases and offered different options for describing the study to caregivers and obtaining consent/assent.

Participants reported that participation in remote training would be feasible and acceptable given prior experiences, and that they have access to computers and internet. However, some participants noted that they had encountered some problems with internet connectivity, and they raised concern that they would likely be interrupted during supervision and implementation of the different interventions.

Many participants in the qualitative interviews reported being excited and motivated to participate in remote training. This is not surprising given that school professionals in rural schools have few opportunities to gain access to quality professional development training (Harmon et al., 2007).

UCD was a very helpful framework for guiding the development and refinement of the training components and implementation strategy. The framework, which uses a participatory approach, helps with the development of products that are responsive to context (Goodman et al., 2012) and that are acceptable, appropriate, feasible, and easy to use (Lyon et al., 2020).

Limitations

The study has some limitations. First, no school psychologists participated in the study. School principals nominated participants who were members of the PBIS leadership team. Although no data were collected about individual members of the larger leadership team, it is possible that school psychologists were not nominated because of their busy testing schedules or because they were not taking part in this team. Second, it is not clear to what extent participant schools are representative of other rural schools. Although the response rate was adequate, the participation rate was low. Other studies conducted in schools have also reported low participation rates (e.g., Heinrichs et al., 2005). Third, we were able to complete only two iterations of the prototypes. This might have limited the refinement of the training platform and implementation strategy, which could result in unforeseen implementation barriers. Fourth, the platform and implementation strategy were largely developed based on data provided by school counselors and social workers; as such, data on feasibility, acceptability, and appropriateness might not generalize to other school professionals (e.g., school psychologists), school faculty, or paraprofessionals. Fifth, the quantitative data analysis was only descriptive. A more robust statistical analysis delineating differences across groups based on participant demographics, EBP-specific components, and module length would have strengthened the results. Sixth, only participant-level data were collected. Future researchers could strengthen the study by including school-level demographic information.

Implications and Future Directions

The participation of school professionals from rural schools in a training development project based on implementation science approaches has implications for training. Implementation science has been described as "essential to the process of translating evidence-based interventions into the unique context of schools" (Forman et al., 2013, p. 77). Training programs in school psychology, counseling, and social work have steadily increased instruction on mental health EBPs and their implementation (Regehr et al., 2007; Shernoff, 2017; Zyromsky et al., 2018). However, a large number of school practitioners still do not use EBPs, either because they have never received appropriate instruction or because they face significant barriers to implementing them (Hicks et al., 2014). One solution to this training gap is to train school professionals in the places where they work. Training professionals in rural schools (a specific school context) requires strategies focused on implementing with fidelity, given the close connection between fidelity and student outcomes (Durlak & Dupre, 2008), and a delivery approach that accounts for barriers and facilitators specific to the rural school context (Paulson et al., 2015).

Given existing barriers such as limited time for training activities, finding the right combination of remote training components (e.g., use of asynchronous modules, synchronous coaching) vis-à-vis fidelity and student outcomes seems like an important next step. This will be addressed in the upcoming clinical trial.

Conclusions

The study contributes to the research literature by providing a step-by-step description of the development of a remote training platform and implementation strategy based on UCD. The use of a participatory approach for the development of the training strategy should increase training buy-in and minimize common implementation barriers to the use of EBPs in a group of under-served schools. Providing school professionals with appropriate implementation strategies (i.e., training) that are effective, available on demand, and built for the specific rural context might enable rural schools to better serve student mental health needs and contribute to narrowing service disparities (Moon et al., 2017; Paulson et al., 2015; Wilger, 2015).