Introduction

Few would deny the value of developing and conducting randomized clinical trials of behavioral interventions and disseminating interventions that are proven more effective than alternative interventions or standard care. In the last 30 years, numerous child and adolescent interventions focusing on a variety of disorders have been developed and rigorously researched, resulting in the identification of many strong evidence-based practices (EBPs; see California Evidence-Based Clearinghouse (CEBC): https://www.cebc4cw.org/registry/). Undoubtedly, this is a major achievement in behavioral health. Yet, after decades of treatment development, testing, and implementation studies to more efficiently bring these models to community-based providers, few children and adolescents who need behavioral health treatment receive an EBP (Ghafari et al., 2022; Okwori, 2022). This gap remains despite many purveyor organizations actively promoting the implementation of evidence-based prevention and treatment interventions for children and adolescents, and a large research base indicating that implementing EBPs is feasible, demonstrates improved clinical outcomes, and is cost effective (e.g., Beidas et al., 2012; Bond & Drake, 2017; Bruns et al., 2016; Godley et al., 2011; Hoagwood et al., 2013; Shelton et al., 2018). As such, we must understand factors associated with the successes and failures of dissemination and treatment utilization in an effort to increase EBP implementation.

The field of implementation science was developed to study the efficacy-to-effectiveness gap and to develop strategies for spreading and sustaining EBPs within behavioral health delivery systems and organizations. This research has led to greater understanding of the core processes of implementation, as well as identification of the barriers to and facilitators of those processes. As knowledge increased, a consensus developed that dissemination comprises two interdependent phases: implementation and sustainment (Aarons et al., 2011; Greenhalgh et al., 2004; Manthey & Goscha, 2013; Willging et al., 2015). Implementation is generally defined as the initial phase of adopting an EBP, in which the focus is on activities designed to integrate interventions into organizations. This phase includes specialized training, often made possible by start-up funds from government or charitable organizations (Mancini & Marek, 2004; Moore et al., 2017; Saldana et al., 2015; Shelton et al., 2018). The current consensus is that the implementation phase for behavioral health interventions takes a minimum of 2 years to complete (Palinkas et al., 2018).

Sustainment refers to continued use of an EBP after the implementation phase. In some circumstances, the sustainment phase begins when start-up funds come to an end or when the training, fidelity monitoring, and assistance from the purveyor are stopped or reduced substantially (Aarons et al., 2011; Bergmark et al., 2019; Moore et al., 2017; Scheirer & Dearing, 2011; Stirman et al., 2012). What it means to be sustained may differ from one EBP to another: some consider a program sustained only if it is delivered entirely to model specifications and meets all fidelity requirements, while others consider sustainment to be maintenance of some, but not necessarily all, intervention components (Hunter et al., 2017; Palinkas et al., 2020; Scheirer, 2005).

Research has begun to identify factors that inhibit or promote implementation, with emerging indications that some of the same factors may predict successful sustainment as well (e.g., Beidas et al., 2019; Bond et al., 2021; Nadeem et al., 2014; Saldana et al., 2015). For example, Aarons et al. (2011) distinguished important “outer service context” factors (e.g., funding, policy, and interorganizational relationships) from “inner service context” factors (e.g., organizational characteristics, leadership, and staffing). Research suggests that both inner and outer context factors indeed play important roles in implementation, and perhaps sustainment, of EBPs (Shelton et al., 2018).

It is generally accepted that although the boundary between implementation and sustainment is not rigid, sustainment should be studied as a phase separate from implementation, as the factors that influence implementation may differentially influence sustainability (Stirman et al., 2012). Arguably, four factors hypothesized to be essential to sustainment are (1) ongoing funding or reimbursement, typically from government sources or third-party payors; (2) leadership and commitment to a particular EBP, or EBPs more generally, from funders, service providers, referral sources, and community leaders; (3) acceptance and commitment from staff implementing the practice such that they can deliver it with fidelity and believe the practice enhances outcomes; and (4) adequate structure and quality of the training program, adaptability of implementation, and quality assurance activities such as fidelity and competence monitoring (e.g., Aarons et al., 2011; Bond & Drake, 2017; Chung et al., 2014; Forehand et al., 2010; Massatti et al., 2008; Novins et al., 2013; Peterson et al., 2014; Swain et al., 2010).

It appears that funding may be the primus inter pares or “first among equals” in predicting sustainability success (e.g., Bergmark et al., 2019; Hailemariam et al., 2019; Massatti et al., 2008; Tibbits, 2015). For example, Bond and colleagues (2014) studied 6-year sustainment of four evidence-based interventions across 49 sites and concluded that “adequate funding was the single most critical factor in an agency’s decision to continue offering an EBP” (p. 2235). In a study of the sustainment of Adolescent Community Reinforcement Approach (A-CRA) at 82 sites throughout the United States, researchers concluded that “…such factors as external funding and political factors may trump such factors as inner setting activities which include strategic planning, evaluation, and communication” (Hunter et al., 2015, p. 2). Summarizing similar findings, Nadeem and colleagues (2014) concluded that:

Sustained funding by the state signals to clinics that EBPs are a priority, provides infrastructure to cover training and consultation costs, engages treatment developers in the process, and fosters interorganizational communication between practitioners using the treatment through training and consultation (p. 147).

Features of the intervention itself, and its associated training, fidelity, and coaching activities have been identified as important but understudied predictors of sustainment (Bond et al., 2014; Novins et al., 2013; Rodriguez et al., 2018). Studies have found that more frequent communication with the EBP trainer, ongoing supervision or feedback, fidelity monitoring, booster training sessions, coaching and consultations, and technical assistance from the trainers or EBP purveyor are key factors in sustainment (Cooper et al., 2013; Flynn et al., 2020; Godley et al., 2011; Manthey & Goscha, 2013; McWilliam et al., 2016; Novins et al., 2013; Schoenwald et al., 2004). Many questions remain regarding the essential ingredients for successful sustainability of EBPs beyond the initial training and implementation phase. The obvious implication of elucidating key sustainment processes and factors is to channel resources and tailor training programs accordingly in an effort to increase the likelihood of sustainment.

In order to contribute to the developing knowledge base concerning sustainment of evidence-based behavioral interventions, the present study focused on one particular EBP, Multidimensional Family Therapy (MDFT; Liddle, 2010; Liddle et al., 2005, 2018), as a representative of EBPs. This study had two primary objectives: (1) to examine short-term (2-year) and longer-term (5-year and 8-year) sustainment rates of MDFT in Europe and North America, and (2) to explore potential factors that may be associated with sustainment of MDFT in particular and the potential implications for behavioral health EBPs more generally.

Methods

Overview

This retrospective, non-experimental study examined MDFT sustainment among all MDFT programs that initiated training in Europe (n = 72) and North America (n = 65) at least 2 years prior to data collection, and hence had the opportunity to sustain for 2 years or longer. Programs received implementation and sustainability services from one of two purveyors of MDFT training and implementation services: Stichting Jeugdinterventies (SJI) in Europe and MDFT International in North America. The study examined the rates of sustainment for full implementation of MDFT. Full implementation consists of yearly certification of therapists and supervisors, and the program demonstrating adequate model fidelity and clinical outcomes as measured by the two purveyor organizations.

Multidimensional Family Therapy

MDFT (Liddle et al., 1992, 2005) is a comprehensive, integrative evidence-based treatment for youth substance misuse, delinquency, mental health, and other problem behaviors. The model has demonstrated effectiveness in several randomized clinical trials with young people of ages 12 to 25 and their families (e.g., Henderson et al., 2010; Liddle et al., 2009, 2018). MDFT therapists work in four treatment domains: adolescent, parent, family, and community. At various times throughout the course of treatment, therapists have sessions alone with the adolescent, alone with the parent(s), and with the adolescent and parent together.

Treatment advances in three stages: Stage 1: developing therapeutic alliances and motivation; Stage 2: promoting change in emotions, thoughts, and behaviors; and Stage 3: reinforcing and “sealing the changes.” The goals of the adolescent domain are to reduce substance misuse and other problem behaviors, improve emotional regulation and coping skills, help teens communicate more effectively with their parents and other adults, and enhance social competence. The parent domain focuses on increasing parents’ behavioral and emotional involvement and attachment with their adolescent; reducing parental conflict and enhancing teamwork; and helping parents find practical and effective ways to influence their teen. Family sessions aim to decrease conflict, deepen emotional attachments, and improve communication and problem-solving skills. The community domain fosters the youth’s and family’s competency with social systems (e.g., educational/vocational, juvenile justice, recreation) and helps young people and families advocate for themselves in these important and influential systems.

MDFT Implementation and Sustainment Approach

MDFT implementation and sustainment is based on the Stages of Implementation Completion (SIC) model outlined by Saldana and colleagues (Saldana, 2014; Saldana et al., 2015), which specifies discrete and systematic steps in the process: (1) Engagement, (2) Consideration of Feasibility, (3) Readiness Planning, (4) Staff Hired & Introductory Training, (5) Fidelity Monitoring Processes in Place, (6) Services and Consultation Begin, (7) Model Fidelity, Staff Competence, and Adherence Tracked, and (8) Competency. Many of the 73 implementation strategies compiled by the Expert Recommendations for Implementing Change (ERIC) project (Powell et al., 2015) are employed within and across these stages. Particular MDFT implementation and sustainment strategies map onto 6 of the 9 ERIC strategy clusters (Waltz et al., 2015): (1) use evaluative and iterative strategies, (2) support clinicians, (3) adapt and tailor to context, (4) provide interactive assistance, (5) train and educate stakeholders, and (6) develop stakeholder interrelationships. The MDFT implementation and sustainment program does not systematically use the following strategies: (a) engage consumers directly (i.e., youth and families), (b) utilize financial strategies, or (c) change infrastructure.

The current examination starts with the Implementation Phase of the SIC model, Stage 4 (Staff Hired & Introductory Training), after Stage 3 (Readiness Planning) is complete and a contract for MDFT training and implementation services has been signed by all parties. MDFT implementation is multi-component, integrated, and developmental. It consists of three primary components: (1) clinical training leading to certification of therapists and supervisors, (2) operational facilitation, and (3) fidelity and outcome monitoring. The implementation phase of MDFT takes a program approximately 9–12 months to complete. The sustainment phase then begins, and sustainment services are required yearly, consisting of (a) clinical support and booster training leading to recertification of therapists and supervisors, (b) operational facilitation as needed, and (c) fidelity and outcome monitoring.

The philosophy of MDFT implementation and sustainment is to provide effective and comprehensive training, guidance, and collaboration in the first year of implementation such that the program can function relatively independently from the purveyors in subsequent years. Although an MDFT program must be supported by one of the purveyor organizations in order to maintain its license and receive booster training, recertification, fidelity and outcome monitoring, and other assistance as needed, the intensity of involvement is greatly reduced after the first year of implementation. This design is intentional, aimed at increasing provider autonomy and maximizing sustainment rates.

Clinical training for both therapists and supervisors includes access to specialized written and video training materials, didactic presentations and workshops, case consultations, and review and analysis of work samples. MDFT training is based on, and isomorphically reflects, the same behavioral change principles as the MDFT clinical approach (Liddle, 2016). Each training method targets one or more aspects of a clinician’s skill and professional development. Operational facilitation consists of ongoing assessment of clinician and program satisfaction with MDFT, identification of actual and potential barriers to implementation, and regular collaboration with program leaders to solve challenges as they arise and to prevent future challenges. Fidelity and outcome monitoring are accomplished through an online database. Therapists and supervisors enter case and supervisory data into a proprietary database provided by the purveyor organizations. Therapists enter information on each MDFT case, including the parameters of treatment (e.g., frequency and duration of sessions, domains targeted), and supervisors enter information on MDFT-specific supervision sessions provided to each therapist. Clinical outcomes are measured by completion of the MDFT Intake and Discharge Evaluation forms, which assess youth, parent, and family functioning on several dimensions (e.g., substance use, mental health, parenting practices, family functioning, whether or not the youth was discharged to a higher level of care). See the MDFT website (https://www.mdft.org/programs) and the CEBC MDFT summary (https://www.cebc4cw.org/program/multidimensional-family-therapy/) for a more detailed description of MDFT training and quality assurance procedures.

Data Collection/Procedures

Administrative staff (ZA & JR) from the two purveyor organizations collected the data from their respective organizations. These two staff members were blind to study objectives. All data used in the current study were part of standard administrative data collection procedures of the two organizations. First, the study team interviewed staff from each organization to identify the information they routinely kept on MDFT programs, and then created an Excel spreadsheet with designated variables that were collected by both organizations as part of their standard procedures and could be used to address the two study objectives. The two administrative staff then entered data from their records into the Excel sheet in an anonymized manner so that the researchers would not know the identity of the programs.

Measures and Variables

Sustainment

A program was defined as sustained if it had maintained continuous uninterrupted licensed MDFT services for at least 2 years past program initiation (i.e., full implementation of MDFT to fidelity). Full implementation consists of initial certification and annual recertification of therapists and supervisors and demonstrating adequate model fidelity as measured by the two purveyor organizations. A discontinued program either voluntarily stopped providing MDFT or failed to meet fidelity standards and hence lost its license to provide MDFT. Program sustainment was measured at three time periods: 2, 5, and 8 years after program initiation.

Sources of Funding

Program sources of funding were coded in one of three categories: 1 = time-limited governmental or foundation grants with specified start and end dates, 2 = ongoing funding with no defined end date (e.g., ongoing governmental contracts), or 3 = mixed (a combination of funding sources such as government contracts plus billing of third-party payors).

Level of Care

MDFT is implemented within various levels of care and service delivery systems. Level of care was coded in 1 of 3 service delivery categories: 1 = outpatient, 2 = in-home programs or hybrid programs providing both in-home and outpatient services, depending on the needs of the client/family, or 3 = partial hospitalization or residential programs.

Location of the Service Delivery

Coded in 1 of 2 geographic settings: 1 = services provided in a rural community, or 2 = services provided in an urban metropolitan area.

Network Membership

Programs were coded as being part of a network if they operated as part of a multi-program initiative characterized as having the same funding source, procedures, reporting requirements, and participation in joint implementation meetings.

Program Perceived Reason for Discontinuation

Both purveyor organizations conducted phone or videoconference “exit interviews” with program leadership (e.g., Clinical Director, Executive Director) to better understand, from their perspective, why the program was discontinuing. Purveyor staff then coded the reported reasons into 1 of 3 categories: 1 = funding, 2 = clinician issues (e.g., failure to meet certification standards), or 3 = agency restructuring (e.g., new leadership with new priorities, changes in programming or target populations).

Data Analysis

Given the categorical nature of the data, we analyzed the data using a series of chi-square tests. The chi-square statistic tests the association between nominal variables: a value of 0 indicates that the two variables are completely independent, and larger values indicate greater departure from independence. The association between sustainment at each time interval (2, 5, and 8 years, with 1 indicating that sustainment had been achieved and 0 that it had not) was examined in relation to funding, level of care, location of the service delivery, and whether or not the program was part of a network. Please see Measures above for operational definitions of these variables. Given the exploratory nature of the research questions, we did not adjust for multiple statistical tests. Prior to conducting the chi-square tests, we evaluated the statistical assumption that each cell has an expected count of at least 5 and determined that the assumption was satisfied in all cases.
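To illustrate the analytic approach, the sketch below shows how one such chi-square test could be run in Python with pandas and SciPy. The data, column names, and category codings are hypothetical assumptions for illustration only and do not correspond to the study dataset.

```python
# A minimal sketch, assuming hypothetical program-level data; the column
# names and values below are illustrative, not the actual study records.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical records: sustained_5yr (1 = sustained, 0 = discontinued);
# funding_source (1 = time-limited grant, 2 = ongoing, 3 = mixed).
programs = pd.DataFrame({
    "sustained_5yr":  [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0],
    "funding_source": [2, 3, 1, 2, 1, 3, 2, 2, 1, 3, 2, 1],
})

# Cross-tabulate sustainment by funding source and run the chi-square test.
table = pd.crosstab(programs["sustained_5yr"], programs["funding_source"])
chi2, p, dof, expected = chi2_contingency(table)

# Check the assumption of at least 5 expected observations per cell.
assumption_met = (expected >= 5).all()
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}, "
      f"expected counts >= 5: {assumption_met}")
```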

Results

Sustainment Rates

Ninety percent (n = 124) of the 137 programs that had the opportunity to sustain for at least 2 years were sustained. Of the 101 programs that had the opportunity to sustain for 5 years or more (i.e., MDFT training began at least 5 years prior to data collection), 87% (n = 88) were sustained. Of the 44 programs that had the opportunity to sustain for 8 years or more, 70% (n = 31) were sustained. Chi-square tests indicated no significant difference in sustainment rates between European and North American programs, and thus all subsequent analyses were conducted on the full sample.

Factors Associated with Sustainment and Discontinuation

Potential differences were examined among programs that sustained versus those that discontinued at the three time periods.

There were no significant differences between sustained and discontinued programs at any of the three sustainment time periods for location or level of care. Being part of a multi-site network with other MDFT programs was associated with longer-term sustainment at 8 years or more (χ2 (1, N = 44) = 9.90, p = 0.002) but not at 2- or 5-year sustainment periods.

Source of funding was not associated with sustainment at the 2-year period or the 8-year period; however, funding source was associated with sustainment at the 5-year period (χ2 (2, N = 101) = 10.68, p = 0.005), favoring ongoing funding (e.g., ongoing governmental contracts) and mixed funding sources over time-limited grant funding. Programs funded by time-limited grants were more likely to discontinue: 37% of programs with time-limited grant funding discontinued during the 5-year period, whereas only 6% of those with ongoing funding and 10% of those relying on mixed financing discontinued during this same period.

Both the North American and European purveyors of MDFT routinely asked providers who discontinued their MDFT program to share their perceived reason for the discontinuation. These data revealed three primary perceived reasons for discontinuation identified by providers: (1) financial (e.g., lost funding), (2) clinical/workforce (e.g., clinicians failed to meet or maintain certification, clinicians did not want to implement the EBP), and (3) restructuring at the provider level (e.g., new leadership). Fifty-three percent of the discontinued programs reported lack of adequate funding as the primary reason for discontinuation, 31% reported that provider internal restructuring issues led to discontinuation, and 16% reported that their MDFT programs closed due to clinician issues.

Discussion

MDFT sustainment, particularly in the long term, compares favorably to rates from previous research on sustainment of evidence-based interventions. For example, MDFT 2-year sustainment at 90% compares favorably with the average 2-year sustainment of EBPs of approximately 50%, ranging from a low of 25% to a high of 88% (e.g., Bond et al., 2014; Cooper et al., 2013; Hodge & Turner, 2016; Peterson et al., 2014; Scheirer, 2005; Stirman et al., 2012; Tibbits et al., 2010). Although there are relatively few studies in behavioral health examining longer-term sustainment, some notable exceptions provide guidance. For example, the National Implementing Evidence-Based Practices Project studied 49 programs providing a variety of interventions, including supported employment and family psychoeducation; 79% of the programs sustained for at least 4 years and 47% sustained for 8 years (Bond et al., 2014; Peterson et al., 2014). Tibbits et al. (2010) examined 67 behavioral interventions in Pennsylvania, finding that 45% were fully sustained for 5 years. In comparison, the 137 MDFT programs in North America and Europe included in this study showed relatively high rates of longer-term sustainment, with 87% of eligible programs sustaining for at least 5 years and 70% for at least 8 years.

Because this is a hypothesis-generating study, a major focus is to consider the results in the context of previous findings and to theorize about key sustainment factors related to MDFT in particular, and EBPs more generally. The findings of the present study suggest, like others, that inadequate or unstable financing is an important reason for discontinuation (e.g., Aarons et al., 2011; Beidas et al., 2019; Bond et al., 2014, 2021). While we hesitate to speculate without access to financial data from the programs, it is interesting that funding source appears to be a meaningful factor for 5-year sustainment but not for the 2-year and 8-year periods. First, very few (10%) MDFT programs discontinued within 2 years; and once a program has sustained for 8 years or longer, it has presumably negotiated stable funding. This leaves the 5-year period perhaps the most vulnerable to discontinuation due to loss of funding; based on typical grant cycles, this is often when start-up funds have ceased. This presents an interesting area for further exploration: if we can learn more about the financial challenges programs face during this vulnerable period, funding may be adjusted to facilitate longer-term sustainment.

These findings also suggest that one possible facilitator of longer-term sustainment is membership in a multi-site provider network, and ostensibly the funding and technical support provided by such a network. This study found that network membership was related to sustainment of 8 or more years, but not to 2- or 5-year sustainment. Perhaps the benefits of such networking and support exert a stronger influence than other factors over the long haul of implementation efforts. These are questions that merit further exploration.

It is important to recognize that this study was limited to information routinely gathered by the two purveyor organizations. Certainly, there are many additional factors that may influence sustainment beyond those measured here. Previous research and theorizing on sustainment suggest that the practical effectiveness of the EBP; fidelity monitoring, coaching, and support from purveyor organizations; intervention features such as cost of training and quality assurance activities; and perceived flexibility and complexity of the intervention may also play important roles in sustainment (Bond & Drake, 2017; Flynn et al., 2020; Hunter et al., 2015; Palinkas et al., 2018; Scheirer, 2011; Shelton et al., 2018). For example, studies have revealed that implementation is very difficult if the EBP training and ongoing consultations are not perceived as worthwhile by the therapists and supervisors who are tasked with administering the EBP (Beidas et al., 2019; Motamedi et al., 2021). One promising study found that 85% of clinicians trained in MDFT reported that the training gave them the skills to be a better therapist (Godley et al., 2001). The extent to which clinicians’ motivation to learn and continue with EBPs influences sustainment, and the most important influencers of clinician satisfaction with EBPs, are potentially fruitful areas of further investigation.

It has been suggested that EBPs that are flexible in their application may be more acceptable to providers than EBPs that are perceived as highly structured and inelastic (Shelton et al., 2018). For example, a study of Family-Focused Therapy (FFT), an EBP for adolescents and young adults with bipolar disorder (Miklowitz, 2010), found that providers valued EBPs that were both flexible (i.e., not a cookbook) and provided structured training, guidelines, and supervision (Chung et al., 2014). MDFT purveyors did not ask continuing programs to identify the reasons for their continuation (i.e., they only asked discontinued programs to discuss their perceived reasons for discontinuation). Future research might explore the influence of clinicians’ perceptions of the quality of clinical training, the level of collaborative and compassionate implementation guidance, and the adaptability and flexibility of both the clinical and implementation model on program sustainment rates.

This report contributes to the growing body of knowledge concerning sustainment of evidence-based behavioral interventions. It is noteworthy for its focus on longer-term sustainment across a relatively large number of programs on two continents. The fact that MDFT demonstrated better-than-average sustainment rates over time provides an opportunity to speculate on the reasons for its sustainment and sets the foundation for hypothesis generation concerning possible keys to sustainment that may be pertinent to other behavioral health interventions. The results suggest that funding source and stability, as well as network membership, may be important factors for sustainment. Additionally, we speculate that the quality and nature of the clinical training, and the adaptability and flexibility of interventions and their associated implementation and sustainment programs, might also be key factors.

This study had important limitations that temper any conclusions based on the results. First, we studied only one EBP and thus cannot know how generalizable the findings are to other EBPs. Second, because this study was based on extant record data that the purveyors collect as part of their standard procedures, factors that might be important to sustainment, such as leadership, organizational characteristics, staffing and turnover, quality of the clinical training provided, adaptability of the intervention, and, of course, improved clinical outcomes, were not examined (Aarons & Farahnak, 2014; Massatti et al., 2008; Shelton et al., 2018). Collecting data directly from provider organizations and clinicians, including in-depth interviews with both discontinued and sustained programs to better understand their perspectives on sustainment, may also provide essential information about EBP sustainment.

Nevertheless, the results of this study have important implications for EBP sustainment and generate ideas concerning potential factors in long-term sustainment that warrant additional research attention. The findings support the growing consensus that not only start-up funding but also ongoing financing, either through governments or third-party payors, is important for longer-term sustainability of EBPs. This study highlights the potential value of multi-site program networks for long-term sustainment (e.g., 8 years or more). Finally, the results point to the potential importance of the quality and nature of the EBP itself. Producing consistently excellent clinical outcomes, being adaptable to different organizations and the clients whom they serve, and providing high-quality, structured, and empowering training and quality assurance may also be essential features of sustainable evidence-based behavioral health interventions. These programmatic and clinician-level factors are all critical areas for further investigation of the keys to EBP sustainability.