Background

Depression and anxiety disorders are the most common mental health disorders among youth, affecting 20–30% of this population [1]. Evidence-based practices (EBPs), such as cognitive behavioral therapy (CBT), can improve outcomes among youth with these disorders [2,3,4,5,6]. However, less than 20% of youth with depression or anxiety have access to any EBPs, primarily because of the limited availability of mental health providers, stigma, and a lack of tools to implement effective treatments in the community [7,8,9,10,11,12]. Even when EBPs such as CBT are offered, fidelity to CBT treatment can be weak [6, 13,14,15,16], and most recipients do not receive an adequate therapeutic dose [17]. Without effective treatment, mental health disorders in youth can lead to poor developmental and academic outcomes, substance abuse, self-injury, adult psychopathology, and suicide [2, 18,19,20,21], ultimately resulting in immense social and economic costs [2, 18, 22].

EBPs need to be implemented successfully in settings where individuals are most likely to seek care if they are to have a widespread and meaningful impact on public health. For many individuals with mental health disorders, and for youth ages 14 to 21 in particular, non-clinical settings such as schools are attractive options for accessing EBPs [23,24,25]. Youth spend most of their time in schools, which typically employ school professionals (SPs) with training in social work, counseling, or psychology who interface with students on a daily basis [22, 26, 27]. Students have reported more willingness to access mental health services at school than in other community settings [10, 11], and among youth who do receive any mental health care, 50–75% receive it exclusively in schools [12, 28]. However, the school professionals with whom they interact rarely have the training or support needed to provide EBPs [29].

Successful implementation of EBPs outside of traditional treatment settings requires scientific determination of optimal implementation strategies that maximize uptake and quality of care by addressing the organizational and community barriers to sustainability. Implementation strategies are highly specified, theory-based methods that target known barriers to improve uptake at provider and system levels [30]. However, implementation strategies designed to improve uptake of CBT among school professionals have not yet been empirically tested on a large scale. SPs do not routinely receive CBT training, and often report low confidence in their ability to deliver such treatments [31,32,33].

Promising theory-based implementation strategies for improving CBT uptake in schools are Replicating Effective Programs (REP), coaching, and facilitation. These strategies are potentially complementary, but optimal combinations and sequences have not been tested empirically. REP, which is relatively low-burden to end-users [34, 35], focuses on customizing an intervention package to local needs and providing further support through large-group training and ongoing technical assistance [35]. REP has been shown to improve uptake of psychosocial EBPs in community organizations [35,36,37,38,39]; however, it may not be sufficient for providers who require more supervision in delivering EBPs or who are experiencing organizational barriers to EBP adoption [38, 40]. Coaching provides ongoing live supervision of EBP delivery and has shown promise in facilitating CBT adoption in schools [33, 41, 42]. Facilitation involves consultation with an organizational expert who helps providers build strategic thinking skills to enhance organizational and leadership support for CBT implementation at their sites, and it has been shown to enhance uptake of psychosocial mental health interventions [34, 39].

Currently, there is no research to guide how best to combine REP, coaching, and facilitation for the purpose of CBT implementation in school settings. What is known is that schools, and the SPs who deliver mental health services within them, are heterogeneous in terms of barriers to CBT implementation [43]. Optimally efficient CBT uptake in schools may therefore require a “stepped-up” adaptive implementation intervention, whereby more intensive implementation strategies are provided only to schools that do not respond to a less intensive approach. In addressing barriers to uptake, augmentation of REP with coaching may be essential to overcome SP-level barriers, while facilitation may help with institutional barriers. Comparative research is needed to determine how best to combine these strategies to create and optimize an adaptive implementation intervention that maximizes uptake, cost-effectiveness, and sustainability of an established EBP (CBT), ultimately improving student mental health.

This study seeks to build the best possible adaptive implementation intervention involving three theory-based implementation strategies—REP, coaching, and facilitation—using a clustered, sequential multiple assignment randomized trial (SMART) design. The study will foster development of an adaptive implementation intervention to improve frequency of CBT delivery to students by SPs, thereby reducing student mental health symptoms [44,45,46,47,48,49]. The study will take place in high schools across the State of Michigan.

Methods/Design

Aims and objectives

Primary study aim

The primary aim of this study is to compare the effectiveness, with respect to CBT delivery among schools, of an adaptive implementation intervention versus REP alone (the control). The adaptive intervention provides schools with REP + coaching from the start and subsequently augments with facilitation for schools needing additional assistance. The primary outcome is the total number of CBT sessions delivered to students by SPs over an 18-month period. The number of CBT sessions is defined further below and includes both group and individual sessions delivered to students.

Delivery of specific CBT components and whether individual or group sessions were brief (< 15 min) or full-length (≥ 15 min) will also be tracked and examined as secondary outcomes. As an exploratory outcome for this primary aim, we will also examine change in student mental health symptoms over the study period.

Exploratory aims:

  1. To estimate the costs of different implementation interventions and determine the incremental cost-effectiveness of added coaching and/or facilitation.

  2. To assess whether the effect of augmenting REP with coaching or facilitation is moderated by SP or school factors, such as SP knowledge and perceptions of CBT as well as school administrator support of CBT implementation.

  3. To determine whether coaching and facilitation improve CBT knowledge, perceptions, skills, or championing skills among SPs, and which of these account for increases in frequency of CBT delivery and improvement in student clinical symptoms.

Methods

This study employs a clustered, sequential multiple assignment randomized trial (SMART) design to inform development of an adaptive implementation intervention (Fig. 1). The study was reviewed and approved by the University of Michigan Institutional Review Board (IRB; UM Protocol # HUM00132239). The study takes advantage of an ongoing initiative to disseminate CBT training in schools in the State of Michigan, the Transforming Research into Action to Improve the Lives of Students (TRAILS) program. All program delivery, training, and implementation support is provided through TRAILS and is considered non-research by local IRBs and exempt from regulation under our approved IRB protocol.

Fig. 1 Adaptive implementation of school-based CBT study flow and timeline. Potential to benefit from facilitation is defined as ≥ 1 participating SP delivering < 3 cognitive behavioral therapy (CBT) components to < 10 students, or school professionals (SPs) reporting, on average, > 2 barriers to CBT uptake
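As a concrete illustration of this decision rule, the sketch below encodes one reading of the criterion (an SP counts as "low uptake" if they have delivered fewer than 3 CBT components and reached fewer than 10 students); the field names are hypothetical and this is not part of the study software.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class SPRecord:
    components_delivered: int   # distinct CBT components delivered so far
    students_reached: int       # students receiving any CBT from this SP
    barriers_reported: int      # number of barriers to CBT uptake reported

def needs_facilitation(sps: list[SPRecord]) -> bool:
    """True if a school meets the 'potential to benefit from facilitation' rule."""
    low_uptake = any(sp.components_delivered < 3 and sp.students_reached < 10 for sp in sps)
    high_barriers = mean(sp.barriers_reported for sp in sps) > 2
    return low_uptake or high_barriers
```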

Setting

The study will take place in high schools across Michigan’s 83 counties, with CBT delivered by existing SPs to students with depression and anxiety. The REP (which includes a CBT manual package, training, and technical support for SPs), coaching, and facilitation implementation strategies will be provided through the TRAILS program [31].

Study design

Figure 1 details the four phases of this study over an 18-month period; the four phases are 3, 2, 10, and 3 months in duration, respectively.

The run-in phase involves deployment of the REP implementation strategy (CBT manual package, training, and technical assistance), identification of participating schools and SPs, and identification by SPs of students potentially in need of CBT.

For phase 1, eligible schools are randomized with equal probability to continued REP only versus REP combined with coaching (REP+coaching). At the end of phase 1, schools are assessed to determine whether they would potentially benefit from facilitation.

During phase 2, schools that could benefit from facilitation [39] (see Additional file 1: Appendix 1) will be re-randomized with equal probability to continue their implementation strategy from phase 1 (i.e., REP or REP+coaching) or to have their current strategy augmented with facilitation (i.e., REP+facilitation or REP+coaching+facilitation, respectively). The active elements of the coaching and facilitation strategies will be paused during the summer months (June–August 2019) when schools are not in session.

For phase 3, all implementation strategies will be discontinued. Outcomes will be collected longitudinally throughout all phases, from SPs up to 18 months after the baseline assessment and from students up to 15 months after their baseline assessment.

Sites/schools

Over 200 SPs from up to 100 schools across the State of Michigan’s 900+ high schools will be recruited by study staff to participate in the study. Every attempt will be made to recruit at least one eligible school from each county in Michigan and to include rural as well as urban and suburban schools.

Site inclusion criteria

Schools will be eligible if they

  1) Are a high school (grades 9–12) from a school district in one of the 83 counties in Michigan that has not previously participated in a TRAILS CBT training initiative.

  2) Are within a 2-h driving distance of a TRAILS coach (TRAILS coaches are mental health professionals who primarily work in community mental health clinics across Michigan).

  3) Agree to participate in data collection throughout the study duration.

  4) Identify at least one SP who is eligible and agrees to participate in study assessments throughout the study duration.

  5) Allow SP(s) to deliver individual and/or group mental health support services on school grounds, yet outside of the general education classroom environment.

A principal or other senior administrator at each participating school will be asked to provide data on building-wide sociodemographics and leadership support for evidence-based practices.

School professionals identified by schools are eligible if they

  1. Are employed at a Michigan high school

  2. Have a background in clinical school social work, counseling, psychology, or a similar field

  3. Are able to read and understand English and comprehend study assessments

School professionals will be excluded if they have a significant illness or condition that precludes participation in the implementation strategies (including the REP training and student identification process, coaching, or facilitation) or if they are unable to provide informed consent for participation in the study activities.

Student eligibility and recruitment

As part of REP, SPs will be trained during the run-in phase to identify 10 eligible students in need of CBT. Because accurate case finding is critical to successful CBT implementation, training SPs on student identification is a core component of the REP implementation strategy [39] used in previous studies of implementation strategies [40, 50]. SPs will be taught through REP to recognize signs of depression and anxiety in students using public domain screening instruments (the Patient Health Questionnaire-9 modified for teens [PHQ-9T] and the Generalized Anxiety Disorder 7-item scale [GAD-7]) [51].

Students are considered eligible if the SP determines they have at least one symptom of depression or anxiety that impacts their daily functioning and well-being. Students are considered ineligible, as determined by the SP, if they (1) are high school seniors (or would graduate prior to any CBT sessions); (2) are unable to regularly attend school-delivered CBT skills groups; or (3) are unlikely to benefit from CBT skills groups due to cognitive or developmental disability, lack of English proficiency, or significant behavioral difficulties.
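For context, the PHQ-9 and GAD-7 are brief checklists scored by summing item responses (each item 0–3). The sketch below is a minimal illustration of that scoring; the cutoff and the flagging logic are hypothetical simplifications for illustration only and do not replace the SP clinical judgment that determines eligibility in this study.

```python
def phq9_total(items: list[int]) -> int:
    """Sum of 9 items, each scored 0-3 (total range 0-27)."""
    assert len(items) == 9 and all(0 <= i <= 3 for i in items)
    return sum(items)

def gad7_total(items: list[int]) -> int:
    """Sum of 7 items, each scored 0-3 (total range 0-21)."""
    assert len(items) == 7 and all(0 <= i <= 3 for i in items)
    return sum(items)

def flag_for_followup(phq9: int, gad7: int, cutoff: int = 10) -> bool:
    # Illustrative cutoff only; study eligibility rests on the SP's judgment that
    # at least one symptom impacts the student's daily functioning and well-being.
    return phq9 >= cutoff or gad7 >= cutoff
```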

Stratified randomizations

All randomization occurs at the school level. All study-eligible schools are randomized in phase 1 with equal probability to receive either REP or REP+coaching. In phase 2, schools with documented evidence of a need for additional implementation support based on predetermined criteria (see Additional file 1: Appendix 1) will be further randomized with equal probability to continue their phase 1 strategy (REP or REP+coaching) or to have their current strategy augmented with facilitation (REP+facilitation or REP+coaching+facilitation). To ensure balance across study arms, the first randomization will be stratified based on school size (> 500 or ≤ 500 students), location of school (rural or urban), percentage of students in the free/reduced lunch program (≥ 50% or < 50%), and pre-randomization delivery of CBT (any sessions vs. none). The second randomization, among schools that might benefit from facilitation, will be stratified by size, location, and total number of CBT sessions provided in the 8 weeks post first randomization (top 50% vs. bottom 50% within the REP or REP+coaching arm).
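For illustration, a minimal sketch of how the phase 1 stratified, school-level randomization could be carried out is shown below. The use of permuted blocks of two within each stratum is an assumption made for balance; the protocol specifies only equal-probability stratified randomization, and the field names are hypothetical.

```python
import random
from collections import defaultdict

def phase1_randomize(schools, seed=2019):
    """schools: list of dicts with keys 'id', 'size', 'rural', 'frl_pct', 'prior_cbt'.
    Returns {school_id: arm}, balanced within strata via permuted blocks of two."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in schools:
        # stratification factors: size, location, free/reduced lunch, prior CBT delivery
        key = (s["size"] > 500, s["rural"], s["frl_pct"] >= 50, s["prior_cbt"] > 0)
        strata[key].append(s["id"])
    assignment = {}
    for ids in strata.values():
        rng.shuffle(ids)                      # random order within the stratum
        for i, school_id in enumerate(ids):
            if i % 2 == 0:                    # start a new block of two
                block = ["REP", "REP+coaching"]
                rng.shuffle(block)
            assignment[school_id] = block[i % 2]
    return assignment
```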

Evidence-based practice (EBP) to be implemented

The EBP to be implemented is cognitive behavioral therapy (CBT) for youth with depression or anxiety [52,53,54,55,56]. Modular CBT—defined as individual components of CBT, delivered flexibly and responsively to presenting symptoms [47]—will be utilized in particular, due to its strong evidence base and advantages over other manualized protocols for school-based delivery [45, 49]. Modular CBT has been previously found in several studies to be associated with reduced depressive and anxiety symptoms when compared to usual care [44, 57], and among students in particular [48, 58]. CBT has also been delivered successfully for different racial and ethnic groups [8, 57], thus making it ideal for a statewide trial within schools [59, 60]. Core CBT components used in this study are based on previously established interventions [55, 61] and include psychoeducation, relaxation, instruction in identification and replacement of anxious or depressive thoughts, behavioral activation, creation of fear hierarchies, and exposure. Additional emphasis will be placed on active intervention techniques associated with improved engagement and clinical outcomes, such as agenda setting, modeling of skills, practice with feedback, and assignment of take-home practice activities [62].

Implementation strategies and components

REP

Replicating Effective Programs (REP) [36] will be provided to all schools and is based on Rogers' diffusion of innovations model [63] and social learning theory [64]. REP enhances EBP uptake by customizing interventions to fit the needs of specific settings through EBP packaging (tailoring of the modular CBT manual in user-friendly language), didactic training, and ongoing technical assistance provided by the TRAILS program. The package includes an overview of CBT core components, agendas describing how each component is delivered within a session, sample student screening forms, talking points for students, and suggestions for school-based delivery. REP training to be provided by TRAILS covers modular CBT core components, including screening and identification of students. REP technical assistance consists of regularly scheduled conference calls during which SPs may receive support from an expert CBT clinician, and open access to an interactive website that provides additional resources (e.g., video demonstrations, case simulations) (Table 1).

Table 1 Summary of implementation strategies across REP, REP+coaching (REP+C), REP+facilitation (REP+F), and REP+coaching+facilitation (REP+C/F)

Coaching

The coaching implementation strategy (Table 2) is provided by TRAILS clinicians and is derived from the school-based Positive Behavior Interventions and Supports (PBIS) model of coaching for individual development [65]. Coaching involves a CBT training expert attending group sessions led by SPs in person to observe, provide live feedback [66, 67], and model the use of core CBT elements to improve SP competence [65, 68,69,70,71,72]. All SPs from schools randomized to coaching in phase 1 will receive weekly visits from a CBT coach for a minimum of 12 weeks, occurring in the context of the SPs' weekly CBT group. After 12 weeks of on-site coaching, SPs will be evaluated on their CBT skill delivery through a short objective competency quiz. SPs deemed on this basis to need a second round of coaching will receive another full 12 weeks of coaching.

Table 2 Fidelity checklist summary for REP, coaching, and facilitation components

Facilitation

Facilitation (Table 3) is based on the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [73] and promotes provider self-efficacy [74] in mitigating organizational barriers to EBP adoption. Facilitation is delivered to SPs via regular phone contact for at least 10 weeks by an expert in school and mental health care organization, implementation methods, and the use of CBT and EBPs in schools. The facilitator will support SPs in strategic thinking and leadership skills to address organizational barriers, covering the following:

  1. Initiation and benchmarking (week 1): the facilitator contacts each SP to give background on CBT, review potential barriers and facilitators to CBT use (e.g., space to provide CBT, school administration support for the program), and set measurable goals for CBT uptake.

  2. Mentoring (weeks 2–9): the facilitator and SP hold regular weekly calls to develop rapport; the facilitator provides guidance to the SP on overcoming specific barriers to CBT uptake by aligning SP strengths and available influence at the school with the needs of local staff. If needed, the facilitator refers the SP to the REP technical assistant (TA).

  3. Leveraging (weeks 2–10): the facilitator continues calls with the SP and reaches out to school administrators, identifies school/community priorities per administration input, and helps the SP align CBT use/goals with these existing priorities. The facilitator helps the SP summarize and describe the added value of CBT to administrators and other school employees (e.g., consistency with other initiatives).

  4. Ongoing marketing (continuous): the facilitator, leadership, and SP summarize progress and develop sustainability plans.

Table 3 Data sources and measures*

Fidelity monitoring to implementation strategies

Fidelity monitoring will be used to assess whether each site is receiving the core components of each implementation strategy (REP, coaching, and/or facilitation) and to ensure that there is no contamination. Different staff members will serve as REP specialists, coaches, and facilitators. Study staff will train REP specialists, coaches, and facilitators, and meet with them on a regular basis to monitor fidelity. Separate study staff will oversee monitoring of implementation strategy fidelity. Fidelity metrics are described in detail in Table 2. Adequate fidelity to REP is defined by all sites receiving the CBT package, > 90% of SPs receiving training, and at least one monthly contact by the TA specialist to SPs. For coaching, a fidelity checklist [75] will document content covered, post-session feedback provided, session planning and role-play practice that occurred, and provision of resources and materials. The facilitation quantitative fidelity measure [34, 76, 77] will ascertain mode of contact, general content of discussion, and interaction time [39].
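As an illustration of how the REP fidelity thresholds above could be monitored, the sketch below checks the three stated criteria: every site received the CBT package, more than 90% of SPs were trained, and each site had at least one TA contact in every month. The data structure is hypothetical and is not the study's monitoring software.

```python
def rep_fidelity_adequate(sites) -> bool:
    """sites: list of dicts with keys 'package_received' (bool), 'sps_trained' (int),
    'sps_total' (int), and 'monthly_ta_contacts' (list of contact counts, one per month)."""
    all_packages = all(site["package_received"] for site in sites)
    trained = sum(site["sps_trained"] for site in sites)
    total = sum(site["sps_total"] for site in sites)
    training_ok = total > 0 and trained / total > 0.90
    ta_ok = all(all(c >= 1 for c in site["monthly_ta_contacts"]) for site in sites)
    return all_packages and training_ok and ta_ok
```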

Measures

Data sources and measures (Table 3) will ascertain frequency of CBT session delivery by SPs through month 18 (primary outcome), school-level factors (administrator survey), SP characteristics, and a student outcomes survey. Independent study research associates (RAs) will collect all assessments from SPs and school administrators electronically. To protect student anonymity over the course of the study, SPs themselves will facilitate administration of student surveys, also collected electronically via a secure server that immediately de-identifies all student information.

Aim 1 primary outcome (CBT delivery)

The primary outcome is the total number of CBT sessions delivered by each SP to students over the course of 18 months. To assess this outcome, SPs will complete a weekly survey in which they report their weekly CBT delivery in group or individual sessions, as well as the components delivered. Secondary outcomes will include different types of CBT delivery (individual vs. group; full sessions vs. brief) and delivery of specific CBT components. SPs will be compensated for weekly survey completion in the registry, and study staff will follow up with SPs who do not report CBT delivery for 4 weeks to remind them to complete data entry.
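Conceptually, the primary outcome is a sum over these weekly self-report records. The sketch below assumes the weekly survey is exported as a pandas DataFrame with hypothetical column names, and it totals sessions per SP alongside the group/individual and brief/full-length splits used as secondary outcomes.

```python
import pandas as pd

def summarize_cbt_delivery(weekly: pd.DataFrame) -> pd.DataFrame:
    """weekly: one row per SP-week with columns 'sp_id', 'group_sessions',
    'individual_sessions', 'brief_sessions', 'full_sessions' (hypothetical export)."""
    weekly = weekly.copy()
    weekly["total_sessions"] = weekly["group_sessions"] + weekly["individual_sessions"]
    cols = ["total_sessions", "group_sessions", "individual_sessions",
            "brief_sessions", "full_sessions"]
    return weekly.groupby("sp_id")[cols].sum()
```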

School-level measures

A longitudinal survey will be given to consenting school administrators to record percentage of students eligible for free/reduced lunch, average classroom size, attendance rate, number of students referred to psychiatric emergency services, and administrator tenure. Administrators will also complete the Implementation Leadership Scale (ILS) [78] to assess institutional support for EBP. No identifying information will be collected as part of these assessments, and no compensation will be provided. Administrators will also be asked to provide approval for participating SPs to collect academic indicator data on GPA, absences, suspensions, and expulsions for participating students.

SP characteristics

SPs will also complete longitudinal web-based surveys that include demographic background, level of education, job tenure, prior experience administering CBT, and knowledge and perceptions of CBT delivery using the CBT Knowledge Questionnaire [79]; Provider Attitude Survey [80]; Treatment Manuals Survey [81]; and the Psychotherapy Practice Scale [82]. SPs will be compensated for all completed assessments. SPs will also complete the ILS to ascertain leadership support, and two other validated measures related to support for EBPs—the Implementation Climate Scale (ICS) [83] and the Evidence-Based Practice Attitude Scale (EBPAS) [84].

CBT fidelity

Consistent with real-world fidelity monitoring for quality improvement purposes [85], the abovementioned web-based SP weekly assessment will be used to track number of CBT sessions delivered and CBT content delivered each week.

Student outcomes

SPs will be encouraged to identify 10 students that they believe could benefit from CBT prior to and during CBT training. SPs will be trained to create a mini-registry of students using a web-based instrument designed by TRAILS to communicate with other Qualtrics surveys (see Additional file 2: Appendix 2). Students identified by the SP will complete secure electronic surveys on mental health symptoms and health care utilization using the previously described web-based tool (Table 3). SPs will provide students, in person, with an information sheet outlining the study eligibility requirements, assessments, compensation, and risks and benefits. A waiver of documentation of consent and a waiver of parental consent were obtained from local IRBs for ascertaining student outcomes. SPs will be required to provide students with a private location for completing all assessments and will reassure students prior to each assessment that all answers will be de-identified and that SPs will not have access to the responses. Measures will include student sociodemographic characteristics, health behaviors (e.g., substance use), CBT receipt, knowledge, and use of CBT skills, mental health symptoms (PHQ-9T, GAD-7), and access to mental health services and other healthcare use (e.g., ED referrals or admissions). To ensure that students are not coerced into participation, they will be asked to confirm on the web-based survey that they would like to submit their answers. SPs will not be informed if students opt not to submit their answers after completing the survey. Students will be compensated for each survey completed over the 15-month period. In order to protect student privacy from the study staff, SPs will facilitate all student compensation.

Cost estimates

For each implementation strategy, we will calculate the average costs and average outcomes per SP using methods described elsewhere [39]. The primary implementation costs are the personnel time spent by study participants (including SP and school administrator time) in REP activities (e.g., SP training, TA), coaching (e.g., time to hire/train coaches, network maintenance, SP coaching time), and facilitation. Costs will be quantified as hours multiplied by wages and fringe benefits for each person. Wage rates will be obtained from school records; in cases where this information is not available, average wages for each occupational level will be taken from the Bureau of Labor Statistics. Hours will be tracked through attendance logs for each implementation activity.
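The personnel cost calculation reduces to hours multiplied by a loaded wage rate. A minimal sketch under that assumption is shown below; the fringe benefit rate is an illustrative placeholder, not a study parameter.

```python
def personnel_cost(hours: float, hourly_wage: float, fringe_rate: float = 0.30) -> float:
    """Cost of one person's time on an implementation activity:
    hours x (wage plus fringe benefits). fringe_rate is an illustrative assumption."""
    return hours * hourly_wage * (1 + fringe_rate)

# Example: 6 hours of SP training time at $30/hour with 30% fringe benefits
# personnel_cost(6, 30.0) -> 234.0
```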

To assess costs of delivering CBT, 40 randomly selected SPs will also be asked to complete time-motion surveys for 2 weeks (starting 4 weeks after the phase 2 randomization) that ask about time allotted to providing CBT versus other forms of student counseling, care, or crisis management. School services will be translated to costs based on the wage rates of school providers.

Student-level service costs of CBT delivery and other use will also be estimated from study records of participation in CBT sessions, academic indicators, and self-reported utilization survey data on inpatient, emergency department, and outpatient use outside the school setting. Health care costs will be assigned using Current Procedural Terminology (CPT) codes and relative value unit (RVU) weights from the Medicaid Fee Schedule to calculate standardized costs in US dollars for each service, adjusted for annual inflation using the consumer price index.
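A hedged sketch of that service-costing step: each service's CPT code maps to an RVU weight, costs are standardized with a fee-schedule conversion factor, and results are adjusted to a common year using the consumer price index. All numeric values shown are placeholders for illustration, not actual Medicaid or CPI figures.

```python
def service_cost(rvu_weight: float, conversion_factor: float,
                 cpi_service_year: float, cpi_base_year: float) -> float:
    """Standardized cost in base-year US dollars for one service.
    rvu_weight: RVU weight for the service's CPT code.
    conversion_factor: dollars per RVU from the fee schedule (placeholder).
    The CPI ratio adjusts the nominal cost to the base year."""
    nominal = rvu_weight * conversion_factor
    return nominal * (cpi_base_year / cpi_service_year)

# Example with placeholder values: an outpatient visit with RVU weight 2.1 at $34/RVU,
# inflated from 2018 to 2019 dollars
# service_cost(2.1, 34.0, cpi_service_year=251.1, cpi_base_year=255.7)
```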

Study sample retention

We will aim to prevent study attrition by following a planned protocol for obtaining the primary research outcome (total CBT sessions delivered by each SP), even if an SP moves to another institution (which occurred among < 2% of SPs in our previous studies). A study research assistant (RA) will monitor SP weekly reports of CBT delivery. SPs who fail to submit reports for four consecutive data collection waves will receive two personalized emails from the study RA asking for their report. SPs who do not respond will be contacted by phone by study staff. Study staff will maintain brief communication with all SPs through periods of vacation and will provide easy methods for reporting job transitions that could impact data collection.

Analyses

All eligible schools, once consented and randomized at phase 1, will be included in an intent-to-treat data analysis sample for all aims. Analyses of student mental health outcomes, however, will be restricted to schools in which at least one SP provided a list of student names for study participation prior to the first randomization. A detailed analysis plan is available in Additional file 3: Appendix 3.

Primary aim

The primary aim analysis will determine the effect of the most intensive adaptive implementation intervention by comparing the total number of CBT sessions delivered by SPs over the course of 18 months between schools receiving REP alone (the control) and schools receiving the adaptive intervention (REP + coaching, augmented with facilitation for schools that are eligible).

Exploratory aims

For exploratory aim 1 analyses, incremental cost-effectiveness ratios (ICERs) will be calculated for each relevant comparison of implementation interventions by dividing the incremental average costs by the incremental number of CBT sessions delivered, as well as by the incremental number of depression- or anxiety-free days based on PHQ-9T or GAD-7 student score changes between each time point.
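The ICER computation itself is a simple ratio of incremental average cost to incremental average effectiveness (e.g., CBT sessions delivered, or depression- or anxiety-free days). A minimal sketch with made-up numbers:

```python
def icer(cost_a: float, cost_b: float, effect_a: float, effect_b: float) -> float:
    """Incremental cost-effectiveness ratio of intervention A relative to B:
    (mean cost A - mean cost B) / (mean effect A - mean effect B).
    Undefined when the effect difference is zero; dominance cases
    (cheaper and more effective) should be reported separately."""
    delta_effect = effect_a - effect_b
    if delta_effect == 0:
        raise ValueError("Effect difference is zero; ICER is undefined.")
    return (cost_a - cost_b) / delta_effect

# Hypothetical example: icer(4200.0, 2500.0, 30.0, 18.0) -> ~141.7 dollars per additional CBT session
```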

Exploratory aim 2 analyses will assess whether the implementation intervention effectiveness is moderated by SP or school-level factors including SP prior training and baseline perceptions of CBT, as well as perceived school administrator support for adoption of CBT. Results of these analyses will be used to construct a more deeply tailored adaptive implementation intervention that further improves uptake, and particularly SP delivery of CBT.

Exploratory aim 3 analyses will test mechanisms through which the coaching and facilitation implementation strategies increase frequency of CBT delivery and/or improve student mental health outcomes.

Missing data

Missing outcome data may occur due to school or SP dropout or loss of contact with SPs or students. Our sample retention protocol will ensure that all efforts are made to obtain primary outcome measures for all SPs in all 100 schools. For our primary SP-level outcome, based on preliminary data from TRAILS, we anticipate an attrition rate of < 10%. Prior to conducting all primary and secondary data analyses, missing data will be dealt with explicitly using multiple imputation methods for SMART studies [86, 87].

Sample size

The estimated sample size for this study is based on our primary aim: a comparison of the expected number of CBT sessions delivered by SPs between months 1 and 18 in schools receiving the adaptive implementation intervention (REP + coaching + facilitation for schools that are eligible) versus the control (REP only). The sample size calculation for this comparison is a straightforward adjustment to the sample size calculation for a two-sample t test [88]. The first adjustment accounts for the clustering of SPs within schools (estimated intraclass correlation = 0.03) to account for between-site variation induced by within-site correlation in SP CBT delivery outcomes. The second adjustment accounts for the rate of response following each phase 1 treatment by weighting schools differently to account for some schools being re-randomized and contributing to multiple experimental conditions. Using a two-sided test based on k = 100 schools (50 randomized to REP and 50 to REP+coaching in phase 1), on average ≥ 2 SPs per school (anticipated N ≥ 200 SPs), a type 1 error rate of 5%, ICC = 0.03, and assuming phase 1 response rates of 10% (REP only) or 50% (REP+coaching), we will have > 80% power to detect a moderate effect size of D = 0.53 between the two implementation interventions on the number of CBT sessions delivered.
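To make the calculation above concrete, the sketch below reproduces a back-of-the-envelope version of it: a normal approximation to the two-sample t test, inflated by the clustering design effect 1 + (m − 1) × ICC and by a per-arm SMART weighting factor of (2 − response rate), i.e., re-randomized schools contribute with weight 2. This is our illustrative reading of the adjustment described above, not the exact published formula in [88].

```python
from scipy.stats import norm

def detectable_effect_size(k_per_arm=50, m=2, icc=0.03,
                           resp_rate_rep=0.10, resp_rate_coach=0.50,
                           alpha=0.05, power=0.80):
    """Minimum detectable standardized effect size for the primary comparison.
    Normal approximation to the two-sample t test, with a clustering design
    effect of 1 + (m - 1) * icc and a per-arm SMART inflation of (2 - response rate)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    design_effect = 1 + (m - 1) * icc
    # per-arm inflation factors replace the usual 1 + 1 in the two-sample variance
    smart_inflation = (2 - resp_rate_rep) + (2 - resp_rate_coach)
    variance_factor = smart_inflation * design_effect / (k_per_arm * m)
    return z * variance_factor ** 0.5

print(round(detectable_effect_size(), 2))  # ~0.52, consistent with the stated D = 0.53
```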

Trial status

The study has not started as of August 2018. In the run-in phase (October 2018–January 2019), all eligible SPs will receive Replicating Effective Programs (REP) components, including a 1-day didactic training in CBT in mid-January of 2019. The first randomization will occur in late January 2019.

Discussion

To our knowledge, this is the first study to comparatively test adaptive implementation interventions at a population (state) level to promote utilization of a modular CBT intervention outside of traditional clinical settings, as delivered by existing school staff, for school-age youth with depressive and/or anxiety symptoms. This is also the first type III hybrid implementation-effectiveness trial to use a SMART design to understand how best to sequence three implementation strategies (REP, coaching, and facilitation) to improve SP-delivered CBT and student mental health outcomes. The study will also inform more efficient use of implementation resources, as not all schools may require the most intensive implementation strategy. In certain contexts, REP alone may significantly improve uptake of evidence-based preventive health interventions, particularly when financial incentives also support their use. In other contexts, however, optimal uptake will require an approach that augments REP with a more intensive implementation strategy such as facilitation. This SMART design will determine the best way to tailor delivery of more intensive implementation strategies (e.g., coaching, facilitation) to schools that need more than initial REP, and it can also yield a more cost-effective approach.

This study also incorporates implementation strategies from differing theoretical foundations to better understand links between the various strategies and the different mechanisms that can be targeted to overcome barriers to EBP uptake, ultimately leading to more precise implementation. Notably, combining REP, facilitation, and coaching to optimize CBT implementation in school settings provides an innovative way to address provider and organizational barriers, potentially maximizing EBP uptake and impact on student outcomes. In addition to determining optimal implementation strategies to embed CBT in schools for youth with depression or anxiety, this work will elucidate mechanisms of successful implementation to inform the customization of these strategies based on factors specific to different providers, organizations, and communities. Coaching and facilitation have both proven valuable and target different barriers, but the mechanisms by which they foster EBP uptake remain unknown. This study will also help elucidate whether and how these implementation strategies foster frontline provider leadership, including the transformational and transactional leadership skills previously studied in health care settings. Cost-effectiveness analyses will further tie differences in the cost of an implementation strategy (or adaptive sequence of strategies) to differences in important student behavioral and academic outcomes, including other forms of health care utilization and high school graduation rates.

Results from this study will also provide insight into whether improved CBT knowledge, perceptions, or skills among SPs are associated with increases in fidelity to CBT delivery and improved student outcomes. Understanding the mechanisms by which specific implementation strategies such as facilitation and coaching impact EBP uptake will inform their more precise use in different settings. The sequential randomizations embedded in the SMART design allow us to consider how different school- and SP-level factors change over the course of the study and moderate the effectiveness of implementation strategies. These moderation effects can more specifically inform tailored and targeted implementation strategies, improving provision of implementation support to schools as their needs change over time and informing construction of the most effective adaptive implementation intervention for improving mental health outcomes across states and school districts.

Despite the novel design of this study, as well as the comprehensive assessment of implementation strategies, there are limitations that warrant consideration. Notably, the use of a state-wide network of SPs and coaches to implement CBT precluded in-person data collection from students, which would have led to unsustainable study costs. We are also limited to enrolling schools based on the availability of TRAILS-trained coaches. We considered several alternative designs that could be applied to large-scale implementation of school-based CBT. The SMART design used here, however, allows us to compare adaptive implementation interventions directly and to understand whether and how coaching and facilitation work with each other to impact implementation outcomes. Further, the sequential randomizations included in the SMART design allow testing of potential time-varying moderators, that is, how the effectiveness of phase 2 implementation strategies differs by change in key metrics during phase 1. Understanding these dynamics enriches our understanding of which schools benefit most from different implementation strategies and also informs potential mechanisms for change under different implementation strategies.

Conclusions

Overall, the proposed study addresses two major public health priorities in mental health services implementation: (1) reducing the provider capacity shortage affecting school-age youth (i.e., increasing the number of trained mental health providers who reach youth in underserved regions and improving access to quality mental health therapies by utilizing school settings); and (2) enhancing the scientific knowledge base of implementation science by determining optimal adaptive strategies for promoting the uptake of EBPs in community-based settings. This study will also support the deployment of a sustainable infrastructure capable of disseminating evidence-based mental health practices across an entire state’s public school system and determine an optimal, adaptive strategy for cost-effective utilization of this implementation infrastructure. Ultimately, the sustainability of the study is potentially realized through a state-wide system that can effectively train existing SPs in EBPs, with the capacity to continuously and rapidly update SPs as advancing clinical science provides improved treatments. This work has the potential to give hard-to-reach students rapid access to the latest treatments and treatment advances, by creating an adaptive implementation intervention that can potentially be scaled up and spread nationally.