Methods

The research team conducted a systematic review in accordance with PRISMA guidelines [22]. A study protocol (CRD42020114111) was registered with PROSPERO prior to article searching and data extraction. An initial search was conducted in December 2019, with a secondary search for new publications in September 2021 due to COVID-related delays.

Database sources and search strategy

The research team searched peer-reviewed literature registered in seven computerized article databases, including Academic Search Premier, Cumulative Index to Nursing and Allied Health Literature, PsycINFO, Social Work Abstracts, ProQuest Criminal Justice Database, ProQuest Sociological Abstracts, and MEDLINE/PubMed. The team also hand searched for relevant articles across four journals (i.e., the Journal of Offender Rehabilitation, Health and Justice, Implementation Science, and the Administration and Policy in Mental Health and Mental Health Services Research) to ensure that no articles were overlooked by database search criteria. Members of the research team met with a reference librarian to confirm search terms for three key concepts: implementation, corrections, and health intervention. A complete list of search terms is in Appendix 1.

Inclusion criteria

Articles were included in the systematic review if they (1) described an empirical study of implementation outcomes, determinants, and/or implementation strategies related to a health intervention; (2) cited an implementation science theory, model, framework, or taxonomy in their approach; (3) were conducted within a criminal-legal setting (i.e., jails, prisons, community supervision, and courts); (4) focused on interventions for adult populations; (5) were conducted within the U.S.; (6) were published in a peer-reviewed journal; (7) were published in English; and (8) were published between January 1, 1998, and August 31, 2021. Since the implementation science field is relatively young, and most public health intervention research in criminal-legal settings began to increase during the 2000s, we selected 1998 as the starting year to capture implementation research. Studies were excluded if they solely described efficacy or effectiveness results of health interventions or EBPs (i.e., did not assess implementation outcomes); were study protocols, viewpoints (e.g., conceptual articles, perspectives), systematic reviews, dissertations, or conference abstracts; focused on youth or juvenile justice populations (whose needs and correctional environments differ from those of the adult population); or were published outside of the aforementioned date range. Studies that were conducted outside of the U.S. were also excluded given the significant differences in criminal-legal and health service systems across the global community.

Study screening, data abstraction and synthesis

Search results were managed using Zotero and uploaded into Covidence [23] for de-duplication, screening, and full text review. All abstracts were reviewed by two members of the research team. Disagreements over study eligibility were discussed until a consensus decision was reached or adjudicated by a third reviewer, and reasons for exclusion were documented. All articles eligible for full text review were evaluated by two reviewers, with consensus discussions and a third reviewer resolving any disagreements on article eligibility for the systematic review.

The team developed a standardized abstraction template to summarize study information including location, study aims, type of criminal-legal setting, description of the intervention being implemented, study design, sample characteristics, analytic approach, outcomes measured, implementation framework(s) applied, relevant framework domains named in the findings, contextual factors, and a description of implementation strategies used and their corresponding targets. All extracted data were reviewed by the research team to ensure accuracy. Specific implementation outcomes (e.g., acceptability, adoption) identified in the studies were extracted and coded using Proctor et al.’s [9] taxonomy of implementation outcomes. This taxonomy is widely used and offers a summary of outcome measurement language referenced in many implementation theories, models, and frameworks. Several studies in this review used Proctor et al.’s taxonomy to define their implementation outcomes of interest, and those specific outcomes were extracted for this review. When authors described implementation outcomes that were not framed using the Proctor taxonomy, two reviewers consensus coded the implementation outcome based on its description in the study and then categorized the outcome using the Proctor taxonomy.

Implementation determinants were extracted and categorized based on the five domains of the Consolidated Framework for Implementation Research (CFIR; [11]). The research team selected this framework as it is one of the most widely cited [13] and recognizable implementation frameworks and was developed by consolidating domains and definitions from multiple implementation frameworks. Additionally, many studies in this review used CFIR to code their implementation determinants. When authors did not use the CFIR domains in their analysis, two reviewers consensus coded the implementation determinant based on its description in the study and then categorized the determinant using the CFIR domains to promote standardized terminology across our synthesis.

We operationalized implementation strategies reported in all implementation trials using the Pragmatic Implementation Reporting Tool [24]. The research team selected this tool because it provides a standardized approach to specifying implementation strategy components used in clinical and implementation research. The Pragmatic Implementation Reporting Tool integrates implementation strategy reporting guidelines from Proctor et al.’s “Specify It” criteria and Presseau et al.’s [25] Action, Actor, Context, Target, Time framework. Implementation trials were defined as studies that evaluated the effectiveness of an implementation strategy on a specified implementation outcome (e.g., the effectiveness of a “facilitation” strategy on adoption of an EBP). Implementation trials that did not report implementation outcomes were not included in this step of the coding. Using the Pragmatic Implementation Reporting Tool [24] as the guide, one reviewer examined all relevant implementation trial manuscripts and identified relevant protocol papers to extract and document information relating to the strategy specification. A second co-author reviewed the data extraction to promote rigor.

Results

A total of 4382 articles were identified through database searches and 30 studies through hand searching (Fig. 1). Of these, 230 articles met criteria for full text review. The majority of excluded articles did not frame the study as implementation science or cite implementation science-relevant references (n = 124); others were ineligible publication types (e.g., protocols, conference abstracts; n = 49), were not conducted in a criminal-legal setting (n = 18), or were conducted outside of the U.S. (n = 15). In total, 24 articles were included in the study sample (see Table 1).

Fig. 1

PRISMA flow chart

Table 1 Study citation, purpose, methods

Design, sample and setting characteristics of included studies

Table 2 summarizes the study design, sample, and setting-related characteristics of all studies included in this systematic review. In terms of study design, 42% (n = 10) of the articles identified were cross-sectional studies (i.e., data collected at a single measurement point), 38% (n = 9) were experimental (i.e., used random assignment), and 21% (n = 5) were pre-experimental studies with pre- and post-test measures in a single group. Few (25%; n = 6) used a hybrid design, an increasingly common approach to addressing effectiveness and implementation aims within the same trial (see [50]). Approximately 42% (n = 10) exclusively employed quantitative methods, 33% (n = 8) used qualitative methods, and 25% (n = 6) used both qualitative and quantitative methods.

Table 2 Study characteristics and settings (n = 24)

Studies included representation from criminal-legal staff, community-based partners, clients, and researchers. Few studies included only criminal-legal staff (21%, n = 5); others included criminal-legal staff and community-based partners (33%, n = 8), staff, community-based partners, and clients (8%, n = 2), or staff and research partners (8%, n = 2). One study included community-based staff and clients (4%, n = 1), and six studies used administrative data (25%, n = 6). Most studies (71%; n = 17) included at least one prison setting, and half (50%; n = 12) included at least one jail setting. Less than a third of studies (29%; n = 7) were conducted in a community corrections setting such as probation or parole. No identified studies focused on specialty courts (e.g., drug or mental health treatment courts).

Approximately 38% (n = 9) of studies investigated interventions to address infectious diseases, 25% (n = 6) addressed substance use, 21% (n = 5) focused on mental health, and 8% (n = 2) addressed co-occurring mental health and substance use. Nearly half of all studies (46%, n = 11) were funded by either phase 1 or phase 2 of NIDA’s CJ-DATS initiative.

Implementation focus and findings

Table 3 presents a summary of the implementation science approaches used in the design and/or analyses of the studies. Implementation science theories, models, frameworks, and taxonomies applied in these studies included the following: Proctor’s Implementation Research Model [51]; Consolidated Framework for Implementation Research [11]; Exploration, Preparation, Implementation, & Sustainment Framework [12]; TCU Program Change Model [52]; Promoting Action on Research Implementation in Health Services [53]; Rogers’ Organizational Diffusion of Innovations [54, 55]; Proctor’s Taxonomy of Implementation Outcomes [9]; Proctor’s Implementation Strategy Specification [15]; Practical, Robust, Implementation and Sustainability Model [56]; and Expert Recommendations for Implementing Change [57].

Table 3 Integration of implementation science methods (n = 24)

Of the 24 studies, 67% (n = 16) examined implementation determinants (e.g., barriers and facilitators). Examples of studies focused on implementation determinants include a pre-implementation assessment of factors that could impact intervention implementation (e.g., [26]), a formative evaluation of an intervention (e.g., [30]), a post-implementation assessment of provider perspectives on implementation of an intervention (e.g., [32, 43]), and a retrospective examination of factors impacting implementation and sustainment of an intervention (e.g., [49]). Of the 16 studies examining implementation determinants, factors impacting implementation were most commonly associated with the inner setting (i.e., the organization in which the intervention was implemented; 88%, n = 14), followed by the outer setting (63%, n = 10), implementation process (38%, n = 6), characteristics of individuals engaged in the implementation effort (31%, n = 5), and intervention characteristics (31%, n = 5).

In addition, 50% (n = 12) of studies measured implementation outcomes. Studies examining implementation outcomes primarily cited those included in the Proctor et al. [9] taxonomy. Of the 12 studies examining implementation outcomes, 42% (n = 5) measured acceptability, which is the perception that a service or practice is “agreeable, palatable, or satisfactory” ([9], p. 67). Further, 33% (n = 4) of studies measured feasibility, or the extent to which an intervention can be used within an organization. In addition, 17% (n = 2) of studies measured penetration or reach, which is the degree to which a practice was integrated within the service setting ([9], p. 70). Additional implementation outcomes measured were appropriateness (8%, n = 1), fidelity (8%, n = 1), cost (8%, n = 1), sustainability (8%, n = 1), and adoption (8%, n = 1).

Few studies explicitly tested implementation strategies (25%, n = 6). Tested implementation strategies included the following: conducting a series of rapid cycle processes [46], conducting a local needs assessment and strategic planning process [37, 45, 46], creating a local change team comprising cross-disciplinary and cross-agency staff [37, 45, 46], utilizing a training coach to support the improvement process [37, 45, 46], promoting network weaving through local councils [29, 47], and providing education and outreach (Friedman et al., 2015; [47, 48]). Three articles [37, 45, 46] examined the implementation strategies from the HIV Services and Treatment Implementation in Corrections (HIV-STIC) studies, which were part of the NIDA-funded CJ-DATS. Two studies [29, 47] examined the same organizational linkage strategy from the MAT Implementation in Community Correctional Environments (MATICCE) studies, also part of CJ-DATS. It is also important to note that the level of detail provided across studies was sometimes incomplete, and determinations related to the implementation strategies were based on the best available information within the included articles and their related protocols.

Application of implementation science frameworks and taxonomies

Table 4 provides an overview of the implementation science methods used in each study. Although all studies cited a framework, application of the frameworks varied by study. Of the 24 studies included in this review, 42% (n = 10) used an implementation science framework or taxonomy to select or develop their data collection methods (e.g., [27, 33, 35]; Tables 1 and 4). For example, in some cases, the Proctor et al. [9] taxonomy was used to inform the researchers’ decisions about which implementation outcomes (e.g., acceptability, appropriateness) to select for a study and how to operationalize them (e.g., [45]). Additionally, implementation science frameworks were used for data analysis and coding in 38% (n = 9) of the studies (e.g., [30, 32, 43, 44, 49]). Most studies (67%; n = 16) applied frameworks in the introduction and discussion sections to frame the study and its results. In another 13% (n = 3) of the studies, an implementation science framework was identified or cited, but the authors did not specify how the framework was applied to their study or did not include sufficient detail to code the application of the framework.

Table 4 Application of implementation science methods

Discussion

This systematic review documented the use of implementation science theory and approaches in studies aiming to implement health interventions in criminal-legal settings. Our review identified 24 articles for inclusion that spanned a 14-year period from 2007 to 2021. Thus, on average, fewer than two articles were published each year that applied implementation science methods to studying the uptake of health interventions in criminal-legal settings. That correctional health research has not kept pace with advancements in implementation science research and methodology is troubling given the complex health needs of the 1.8 million people incarcerated in the nation’s prisons and jails and the 3.7 million people under community supervision [58,59,60]. Mass incarceration and complex co-occurring health conditions both disproportionately impact communities of color and low-income communities [61]. In addition to reforming the criminal-legal system to reduce these inequities, researchers can also investigate effective implementation strategies to rapidly increase access to high-quality health services that meet the health needs of people currently incarcerated and under community supervision. In this way, greater testing of dissemination and implementation strategies in criminal-legal settings can help address health inequities stemming from the societal injustices of mass incarceration and the impact of criminal-legal involvement on social and economic wellbeing [62, 63]. This review provides a summary of published correctional health intervention research that integrated implementation science approaches and a foundation to inform future studies seeking to employ implementation frameworks, strategies, and outcomes in their study designs.

Limitations

There are three notable limitations to consider when interpreting the results of the study. First, it is possible that relevant studies may have been excluded within the parameters of the systematic search. Specifically, if a study was not framed as implementation science or did not clearly indicate the use of implementation science methods, it was excluded. For example, a study that examined barriers and facilitators of implementing a program but did not integrate implementation science within the study justification or methods would be excluded. This is also true for studies that may have been supported by funding from CJ-DATS but did not explicitly integrate implementation science methods. To include as many relevant studies as possible, we scanned articles for implementation science-relevant citations that might indicate an implementation science approach. Second, the study used 1998 as a start date because of the initiation of the Veterans Affairs Quality Enhancement Research Initiative (VA QUERI). Although most of the studies included in our review were from the 2010s, it is possible that earlier relevant studies could have been excluded. Finally, this systematic review focused on identifying implementation approaches, strategies, and outcomes from efforts aimed at providing evidence-based care to the adult criminal-legal involved population. Thus, we are unable to draw any lessons or comparisons from efforts conducted in juvenile justice settings (e.g., JJ-TRIALS). Adult and juvenile correctional settings differ in their populations, structure, and programming. Implementation strategies that are effective in juvenile settings, where there is a stronger focus on healthcare and rehabilitation programming and where youths typically serve shorter sentences, are likely to differ from those used in adult correctional settings.

Implications

Despite these limitations, this study also highlights a number of important future directions related to the implementation of EBPs in criminal-legal contexts.

Focus on determinants reflects complexity of the implementation environment

The fact that implementation determinants make up the majority of the studies in this review, most of which focus on the inner setting (i.e., the organizational context), reflects the need to understand the factors impacting implementation within a complex implementation environment. Correctional health interventions, by nature, often involve an inter-organizational and multi-disciplinary context in which practitioners who may be trained in one type of service (e.g., healthcare) are operating within the context of another agency environment (e.g., corrections). For example, a mental health intervention operated by a private service provider but co-located within a county-based detention center is cross-sectoral (i.e., private and public sector, respectively) and is susceptible to a broad range of multi-level factors originating from both the mental health service system and the criminal-legal system. Influences from both of these systems can create an implementation environment that may require significant intervention adaptation to achieve fit with, or appropriateness for, the implementation environment. Consequently, the fact that much of the implementation science focus in correctional health research has been on contextual inquiry (i.e., understanding implementation determinants) is appropriate and expected. Given the relative nascency of leveraging implementation science within correctional health research, and the variation in types of health interventions and correctional environments (i.e., jail, prison, courts, community supervision), the focus on contextual inquiry and implementation determinants should continue.

Additionally, results show that much of the focus on implementation determinants has been on factors related to the inner setting, or the organization in which the intervention is being implemented. Although critically important, a singular focus on the organizational factors impacting intervention implementation does not account for significant external factors relevant in contextual inquiry. Notably, policy-level factors and those associated with inter-organizational relationships are significant determinants of whether and how an intervention is adopted into practice [64]. For example, recent opportunities to promote Medicaid enrollment prior to release from incarceration will impact costs and access to care in carceral settings and need to be considered in the implementation of new health interventions [65]. In addition, for interventions and implementation strategies focused on enhancing referral networks (e.g., [29, 47]) to increase service uptake, the presence, quality, and characteristics of inter-organizational relationships are critically important. Consequently, as researchers and practitioners continue to examine implementation determinants, a greater focus on domains beyond the inner setting is needed.

Increase the focus on implementation outcomes and strategies

Although contextual inquiry studies focused on implementation determinants are still needed, the field should also build on the knowledge produced by the two NIDA-funded initiatives, JCOIN and CJ-DATS. These initiatives accelerated the application of implementation science methods in correctional health interventions and focused on developing implementation strategies to enhance uptake of EBPs. Moving forward, researchers and practitioners can develop sequential study aims in which implementation determinants are studied in the first phase of research in a small pilot study and then addressed in subsequent stages, first through a pilot test of an implementation strategy and later through a larger study employing rigorous methods to test the efficacy of the implementation strategy. NIDA’s strategy of speeding up translation in correctional settings through federally funded consortium sites could be expanded to other institutes. Resulting studies from such initiatives would provide the field with invaluable information about the variation in the factors that impact implementation by fields of practice and health foci (e.g., substance use, mental health, infectious diseases) and the implementation strategies that enhance uptake of their respective EBPs. In addition to federal funding initiatives to speed up translation, researchers can consider using hybrid designs. Hybrid effectiveness-implementation designs challenge the typical sequencing of efficacy, effectiveness, and implementation research by promoting simultaneous examination of effectiveness and implementation aims [50, 66].

Standardize specification of implementation science methods

Assuming greater focus on the application of implementation science methods in correctional health research moving forward, it is imperative that researchers and practitioners standardize the specification of methods in their reports and articles. Better specification will make it easier for other researchers and practitioners to understand, adapt, and apply these methods to their work and advance the research. Suggestions for better specification of implementation science methods in correctional health research articles include (1) a clear justification of the use of implementation science methods, including citations; (2) a clear description of the implementation focus (i.e., implementation determinants, outcomes, and strategies) as well as the correctional setting (i.e., prison, jail, court, community supervision) and health focus (e.g., mental health, substance use, infectious disease); (3) identification, justification, meaningful operationalization, and integration of selected implementation science frameworks throughout the study (e.g., include detailed descriptions of the chosen framework(s) and rationale; explain how the framework guided the study methods, such as instrument development, data collection, and data analysis; describe the degree to which the framework fit the study context; and explain how the framework can aid in the interpretation of results and transferability); (4) clear definitions of implementation outcomes that map onto an implementation framework; and (5) standardized specification of implementation strategies, preferably using a framework (e.g., [15, 24]). Standardizing these methods helps align correctional health research with other health service fields and better illuminate the role of the correctional context in the implementation of EBPs.

Conclusion

Although application of implementation science methods in correctional health intervention research is limited, integration of these methods appears to be accelerating, likely fueled by federally funded implementation-focused research consortiums. Overall, the implementation research on correctional health interventions has largely focused on understanding the environment in which health interventions are implemented. Although focusing on these implementation determinants is necessary given the complex environment in which health interventions are implemented, the field should increase its focus on developing implementation strategies that address the known factors impeding successful implementation and on standardizing the way that implementation science methods are specified in correctional health research.