Practice to Research and Back in a Social Service Agency: Trying to DO BETTER

Abstract

Background

There is a growing emphasis on integrating evidence-based program requirements into social welfare policies for youth care services in the U.S. This trend highlights the need for increased practitioner understanding of, and involvement in, the research process to develop and implement evidence-based programs for youth with emotional and behavioral disorders who receive residential services.

Objective

The purpose of this review was to provide residential care practitioners and researchers with an understanding of a transdisciplinary translational research approach for social service agencies and the research activities that can be included.

Method

A review of the literature from a collaborative project between a social service agency and a university, which resulted in the development and testing of an aftercare intervention for youth departing residential programs, was used to explain the framework.

Result

The DO BETTER framework outlines a process that (1) focuses on input from practitioners and consumers to help determine problems that impact youth and families, (2) involves researcher-practitioner partnerships that conduct a variety of research activities to create solutions, and (3) provides results that are useful for practitioners. The research activities of the project illustrate the iterative processes of practice to research and back to practice that included youth, caregivers, practitioners, researchers, and experts from other disciplines.

Conclusion

The framework is provided to help researchers plan for collaborative research with social service agencies, and to help non-researchers in agencies become more familiar with research activities to increase their involvement in program design, testing, implementation and sustainability.

Introduction

Conducting research in social service agencies has reached an important juncture with the recent passage of the Family First Prevention Services Act (FFPSA). Enacted as part of the U.S. Bipartisan Budget Act of 2018, the FFPSA requires research evidence and ongoing evaluation of child welfare services for these services to be eligible for Title IV-E funding (Wilson et al. 2019; Bipartisan Budget Act 2018). The FFPSA reflects a growing trend of evidence-based program requirements being integrated into social welfare and health policies in the U.S. This trend highlights the importance of social service agencies having the capacity to conduct, understand, and use research for program evaluation and improvement. This can be a challenge for agencies that provide residential programs for youth with emotional and behavioral disorders. Providers of residential programs are tasked with documenting evidence of effectiveness, but often lack the expertise, time, and financial resources to do so (James et al. 2015). Additionally, practitioners may be hesitant to adopt new practices or may lack the technical support to implement evidence-based programs with which they are unfamiliar (Lipsey et al. 2010). In this article, we illustrate how transdisciplinary translational research has been used to address a key practice-related need within a national social service agency, guided by a framework that promotes a flow of research activities moving from practice to research and back.

The DO BETTER framework described in this paper draws from existing models of translational research and implementation science (see Fixsen et al. 2009; Glasgow et al. 2012; Grzywacz and Allen 2017; Hamilton 2015; Lipsey et al. 2010; Rubio et al. 2010; Saldana et al. 2011; Schindler et al. 2017). In this approach, research activities start with determining the needs and opportunities for research based on ideas shared by practitioners, youth, and families. Different problems initiate different research activities, which staff, consumers, and researchers undertake collaboratively to address the needs. The results of the research are then translated back as possible solutions in a way that is understandable and usable. The framework adopts a mixed methods approach that includes both qualitative (e.g., analyses of narrative data) and quantitative (e.g., analyses of numerical data) methods (Teddlie and Tashakkori 2009, p. 4), implemented in both non-experimental and experimental designs (Mason et al. 2018), so that researchers and practitioners can use all methods possible to address research questions (Creswell and Plano Clark 2011, p. 13).

Traditionally, applied research has implemented a unidirectional research-to-practice approach that emphasizes setting external standards for research evidence, conducting rigorous studies (e.g., randomized controlled trials), using strategies to push research findings out to practitioners, and ensuring the adoption and implementation with fidelity of evidence-based programs (EBPs; Tseng et al. 2017). This traditional approach potentially contributes to the science-to-practice gap because it can limit practitioner involvement in determining the need for, design of, and real-world applicability of studies (Cargo and Mercer 2008). Consequently, findings are not used adequately to improve services, and research is not conducted efficiently (Fixsen et al. 2009). For instance, it can take up to 17 years to get an EBP into routine practice on a large scale (Hamilton 2015; Tseng et al. 2017), in part because those programs have not been developed with the end user in mind (Glasgow et al. 2012). Fixsen et al. (2009) argue that making better use of research involves building science and quality based on the performance of practitioners. This view aligns with participatory research, which seeks to address complex problems by beginning with the issues most important to those who will ultimately use the research: the practitioners and consumers (Cargo and Mercer 2008; Minkler et al. 2003). With the recent passage of the FFPSA and similar policies, social service agencies that provide residential services to youth with emotional and behavioral disorders need a framework for conducting studies that are rigorous, efficient, and responsive to consumer needs, and that include practitioners throughout the entire research process.

Transdisciplinary translational science integrates different perspectives to gain new understandings of how to solve complex problems and puts this understanding to use (Harvard School of Public Health 2019; Stevenson et al. 2013). It combines two types of research: transdisciplinary and translational. Transdisciplinary research emphasizes research collaborations that integrate different perspectives to create solutions beyond what could be achieved separately (Stevenson et al. 2013); this often means bringing together different research disciplines and practice expertise. Translational research is the science of putting science to use (Hamilton 2015) by transforming research findings into effective and widely available clinical applications (Mitchell et al. 2010). Within its stages, T2 translational research involves developing the methodology of participatory research, conducting outcomes research, and developing best practices (Rubio et al. 2010), while T3 involves implementing, disseminating, and scaling up best practices (Hamilton 2015). These stages involve researchers and practitioners partnering to improve the way research is used in practice. Such partnerships use the key problems of practice to shape research agendas based on collaboration, trust, and research interests that are mutually beneficial for researchers and practitioners (Tseng et al. 2017). Research is conducted in a way that is supported by, and informs, the larger organization or partnership (Davies et al. 2007).

To improve the way research is conducted in social service agencies, concepts from the translational research frameworks reviewed above can be applied to increase practitioner and consumer involvement in the research process from its outset. For example, a research-practice partnership between a university and a social service agency, started in 2005 and described in Thompson et al. (2017), was created to address a program need for the agency's therapeutic residential care program. The partnership combined the agency's internally funded research department, which had extensive experience in residential programs and research expertise focused primarily on program development and evaluation, with a university research center experienced in educational and community-based interventions for children and families with emotional and behavioral disorders. One of the problems the partnership specifically worked to address was the need for an aftercare intervention that could provide ongoing support to youth after they were discharged from the program. At the time the project started, there were no evidence-based models for aftercare available (Leichtman and Leichtman 2001; Lieberman 2004; Walter and Petr 2004), and reentry rates for residential programs across the country were as high as 50–75% (McMillen et al. 2008). The collaborative effort of this partnership resulted in the successful design and evaluation of an aftercare intervention for youth and families departing residential programs (Trout and Epstein 2007; Trout et al. 2012, 2013b, 2019). The research activities of this project are illustrated in the DO BETTER framework.

The DO BETTER Framework

To promote further successful practitioner-researcher collaborative efforts, we next describe the DO BETTER framework for promoting partnerships that facilitate translational research in social service agency settings, flowing from practice to research and back to practice. The acronym DO BETTER (Determine the Problem, Ongoing Data Collection, Build a Team, Explore Solutions, Try It Out, Test the Program, Execute Implementation, Reach More Youth and Families; see Fig. 1) describes the different research activities in the process, and Table 1 shows the research concepts that are included in the framework.

Fig. 1
The DO BETTER framework draws from existing models of translational research and implementation science. It outlines research activities that emphasize practitioner, researcher, and consumer involvement so that research activities flow from practice to research and back to practice

Table 1 Research concepts of the DO BETTER framework

Determine the Problem

DO BETTER starts with determining the needs and opportunities for research activities based on problems shared by practitioners, youth, and families. Valuable research questions can emerge from the continuous quality improvement activities that occur within social service agencies. Quality improvement is used to identify and design solutions to organizational problems, which may or may not be useful outside the agency (Bauer et al. 2015; Glasgow et al. 2012; Hartman and Linn 2008; Mowbray et al. 2003). In the practice to research and back approach, researchers assist in collecting and analyzing data through activities such as surveys and focus groups, and the results are then provided back to agency staff to interpret and to determine how they will be used (e.g., training, improving services). As in participatory research (Cargo and Mercer 2008; Minkler et al. 2003), the key is that the ideas start with the practitioners or consumers who will ultimately use the results.

Ongoing Data Collection

Ongoing data collection on practices and outcomes is important for monitoring program quality. Social service agencies may prescribe best practices but still achieve poor outcomes because quality improvement practices are not in place (Forman-Hoffman et al. 2017). Agencies can use quality improvement tools such as the Plan-Do-Study-Act cycle described in Moen and Norman (2010) to establish questions, analyze and compare data, and determine changes that need to be made. This can include monitoring service delivery and model implementation and tracking outcome data. Practitioners and consumers also have experiences and anecdotal information that can be captured on an ongoing basis to improve and evaluate the quality of services. Paying attention to "pride or pain" points, as one practitioner called them, can help prioritize these efforts, and using practices recommended by experienced practitioners can result in more effective programs (Dubois et al. 2002). Creating a system of quality improvement that uses ongoing data collection and analysis can make the process more routine and efficient and can tie day-to-day monitoring of practices to outcomes.
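
To make the Study and Act steps of such a cycle concrete, the sketch below summarizes follow-up records against a placement-stability benchmark. It is a minimal illustration in Python: the record fields, the 12-month window, and the 75% benchmark are hypothetical choices for this example, not the agency's actual data system or thresholds.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class FollowUp:
    youth_id: str
    months_post_discharge: int
    stable_placement: bool  # e.g., living at home and enrolled in school

def study(records: list[FollowUp], window_months: int = 12) -> float:
    """'Study' step of a Plan-Do-Study-Act cycle: summarize the
    placement-stability rate within the follow-up window."""
    cohort = [r for r in records if r.months_post_discharge <= window_months]
    return mean(r.stable_placement for r in cohort) if cohort else float("nan")

def act(stability_rate: float, benchmark: float = 0.75) -> str:
    """'Act' step: flag the program for review when the observed rate
    falls below a locally chosen benchmark (illustrative value)."""
    return "review practices" if stability_rate < benchmark else "continue monitoring"

records = [FollowUp("y1", 6, True), FollowUp("y2", 12, False), FollowUp("y3", 9, True)]
rate = study(records)
print(f"Stability within 12 months: {rate:.0%} -> {act(rate)}")
```

In practice, the same summary would be produced routinely (e.g., annually, as in the follow-up surveys described next) so that trends, rather than single snapshots, drive the Act decisions.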

Researchers can play an important role in ongoing data collection and analysis by using different quantitative and qualitative research techniques. For example, routine follow-up surveys conducted with parents helped identify the research problem for the collaborative project on aftercare mentioned above (Trout and Epstein 2007; Trout et al. 2013b, 2019). The need for aftercare was originally voiced by parents whose children had departed the residential program. The following is one of the authors' firsthand accounts:

Follow-up survey data were routinely collected by the service agency on youth placement status after departure and shared with program directors and staff on an annual basis. In reviewing these data, we noticed a trend of comments related to aftercare. Parents acknowledged their children had done well in the program, but the families could not provide the same level of support at home that their children had received in the program. Consequently, some of the children went back to old problematic behaviors after they were discharged.

The survey data collected routinely from consumers converged with the anecdotal experience of residential program staff and with the reentry rates for residential programs across the country. The need to prevent youth from returning to problematic behaviors and subsequently reentering care was a concern shared by both parents and practitioners. The agency attempted to create an aftercare program to improve the quality and sustainability of the outcomes achieved by the residential program, but there was limited information on best practices.

Build a Team

Research-practice partnerships can be built to create teams that join practitioners with researchers to address problems of practice, such as the need for new interventions (Tseng et al. 2017). In these partnerships, the research issues addressed are important to practitioners because they address a programmatic or consumer need, and beneficial to researchers because the issues fit their interests and expertise. In social services, "practitioners are the intervention" (Fixsen et al. 2009, p. 523): they provide insight into what research studies and findings are useful, while researchers provide generalizable knowledge, an understanding of where that knowledge comes from and its limitations, and the methodological skills to conduct high-quality research (Grzywacz and Allen 2017; Hamilton 2015). These partnerships also create a collective knowledge base, built over time through iterative work, for addressing future problems (Tseng et al. 2017).

As described in Thompson et al. (2017), the social service agency partnered with a university to address practice-related needs through research. The need for aftercare also resonated with a transdisciplinary advisory panel assembled by the partnership. This panel included experts from different disciplines (e.g., education, child welfare, juvenile justice, prevention) from around the country who were brought together to focus the research agenda for the social service agency. The panel's consensus and collective expertise identified aftercare as a significant priority, and development of a program called On the Way Home® (OTWH) began as a result.

Explore Solutions

In research-practice partnerships, both researchers and practitioners contribute important information needed to understand the problem and explore solutions (Middlemiss 2016). Knowledge flows between practitioners and researchers, not solely from researchers to practitioners (Hamilton 2015). For instance, researchers and practitioners can work together in co-designing and testing new interventions (Tseng et al. 2017). This approach can increase the use of research because practitioners (1) bring knowledge of programs and practices to the design and implementation of studies, and (2) are more likely to trust the findings as a result (Hamilton 2015).

For the aftercare project, the lack of available research on best practices meant that the theory and model had to be developed from the beginning. It was apparent early on that determining the scope of the aftercare model would be critical for addressing the variety of aftercare needs of youths in residential programs. For example, youths placed in residential programs can have internalizing disorders, disruptive behaviors, academic problems, delinquent behavior, drug and alcohol use, and exposure to trauma (Briggs et al. 2012; Farmer et al. 2004; Lyons and Schaefer 2000; Trout et al. 2008). To address this diversity of needs, the initial aftercare model was aimed at supporting youth home and school placement stability overall. Federal funding (PIs: Trout and Epstein, #R324B070034; #R324A120260) was then sought and obtained to begin the project.

During the project, different research activities were used to refine the model based on the perspectives of those who would use the service. For example, focus groups and surveys were conducted with youth, parents, school teachers, direct care staff, and agency leaders (Trout et al. 2014a, b). The convergent and divergent views gained from these research activities were critical in refining the scope of the types of supports needed. Youth and parents shared that education and relationships were two of the most important domains for aftercare (Trout et al. 2014a, b). Direct care staff indicated that mental health and substance abuse supports were of the highest importance for these youth (Trout et al. 2014a). Attorneys who provided legal representation to youth and families shared that family supports were the most important (Tyler et al. 2018). A national survey of agency leaders indicated that even though children may depart with decreased symptoms, they remain at risk for unsafe behaviors as they transition from the structure of the residential program back to the community without continuous family and safety supports (Tyler et al. 2017).

Different themes emerged from the research that highlighted the guiding principles for the aftercare model. The staff providing aftercare services needed to be supportive and available, able to assist in preparing the child to return to the "real world", and able to help identify local supports in the community (Trout and Epstein 2010). Transition planning that promoted continuity and consistency, 24-hour phone support, and help for families in coordinating services were voiced consistently across the different groups of participants (Trout and Epstein 2010; Tyler et al. 2014, 2018). Unique perspectives were also shared by different participants. Youth stated that they wanted to feel normal and no longer wanted to feel like they were in treatment services, while parents were concerned about the time commitment and the qualities of the staff providing the services (Trout and Epstein 2010). Attorneys were concerned about parents' expectations for their children when they returned home, such as not expecting any further problems or a lack of optimism about their child's future success (Tyler et al. 2018). Agency leaders expressed the importance of reinforcing skills training for youth and parents (Tyler et al. 2017) to help youth sustain the gains they made in the residential program.

These suggestions were supported by two principles that were key for aftercare: (1) family empowerment, the degree to which parents perceived their ability to navigate services on their child's behalf; and (2) caregiver self-efficacy, the degree of confidence in performing caregiver roles to support and manage family needs (Bandura et al. 2011; Trout et al. 2013b, 2019). These principles were tied to the theory of change, which details the outcomes expected to result from the intervention based on the targeted variables (Schindler et al. 2017). For aftercare, the targeted outcomes were placement stability in the home and school (Trout et al. 2013b, 2019).

With help from the transdisciplinary team, three core components of support were defined for the OTWH Model: Family, School, and Homework (Trout et al. 2012, 2013a, b, 2019). Residential staff from the agency provided expertise in residential programming to maximize continuity between the residential and aftercare programs. Experts from outside residential care were also involved in developing the activities that make up each component. Curriculum and training experts for Common Sense Parenting® (CSP; Burke et al. 2006), the parent training program developed and used by the service agency, were involved in the design of the Family Support component of OTWH. CSP is an evidence-based program that improves family functioning by training parents in the critical skills necessary to support adolescents' academic and behavioral success (Mason et al. 2015, 2016), which were important outcomes for OTWH. CSP, however, is a classroom group-based curriculum, so it was modified for OTWH so that consultants could provide the training to parents one-on-one in the youth's home.

For the School Support component of OTWH, the university provided expertise in educational supports for students with emotional and behavioral disorders. Expertise from the developers of an evidence-based dropout prevention program called Check and Connect (C&C; Christenson et al. 1997) was also sought. These experts provided insight into the design of the School Support component. C&C involves monitoring high-risk educational behaviors of students to prevent school failure and building communication among schools, students, and families. After consultation with the C&C developers, C&C was modified for the School Support component of OTWH so that it could be implemented by both the OTWH consultant and a program-identified school contact person who primarily served as the liaison between the consultants and the teachers.

The Homework Support component had four primary goals: (1) improve homework completion, (2) decrease tension and friction related to homework completion, (3) improve home-school communication, and (4) build youth independence (Trout et al. 2019). Essentially, the homework study hour used by the residential program was adopted so that youth could have the same structure at home. This included a 13-item homework checklist that the youth and caregiver completed with the guidance of the OTWH Consultant, combined with individualized supports for students such as tutoring, behavioral reinforcers, and contracts.

Try It Out

In the early stage of program development, researchers and practitioners can work together to determine feasibility and try out the program. Evidence-based programs are manualized and evaluated packages of best practices, while evidence-based practices are components that can be mixed and matched in programs to address the many variations of the communities and youth they serve (Hamilton 2015). Schindler et al. (2017) describe the benefit of using micro trials (Howe et al. 2010) to establish feasibility and conduct preliminary evaluation to refine program components, strategy, and materials. In a similar fashion, critiquing the design of the program and its components early on was an important part of the developmental stage for OTWH. During this process, the youth, caregivers, and stakeholders who participated in the research activities provided valuable input into the design.

The pilot phase of the program provided important information that was used to adapt the design of the model (see Trout et al. 2013a). The original version of the Family Support component was highly individualized and based on input from parents to identify which parenting skills would be most helpful for them. Parents and consultants could choose from a wide range of potentially applicable parenting skills. This proved to be a challenge early on, because it was difficult for parents to determine which parenting skills they needed to focus on for their child's return home while their child was still living in the residential program. Most youth did not have problems immediately upon returning home, so parenting skills were not identified. As a result, parents and consultants became overwhelmed with too many choices, resulting in parenting plans that were vague. When the youth did begin to struggle, typically two to three months after reunification, support provided by the OTWH Consultant was often reactive and focused on crisis situations, rather than reinforcing the desirable behavior youth had initially demonstrated when they returned home. The lesson learned from the pilot phase was the need for parenting plans that could promote a positive transition for the youth returning home by improving continuity and reinforcing the skills the youth had learned in the residential program. As a result, the activities came to include parent training for all parents in encouraging good behavior, preventing problems, correcting problems, and teaching self-control, with specific examples linked to the gains the child had made in the residential program. This helped the youth, parents, and consultant create an individualized parenting plan that was proactive for the child and family.

As a result of the pilot phase, greater emphasis was placed on the delivery of the three primary supports of OTWH (i.e., family, school, homework). Evaluation of service delivery was then conducted on the revised OTWH model after the first year of full implementation. OTWH Consultants spent more than 50% of their time providing direct service to youth and families in the three areas, which averaged two hours per week (Trout et al. 2013a). There was greater emphasis on the core sessions of CSP focused on encouraging good behavior and preventing problem behavior, to help parents reinforce the progress youth had made in the residential programs. This also addressed one of the key concerns shared by youth, who did not want to "start over in treatment."

Feedback from youth, parents, and consultants affirmed that the revisions to the design were beneficial. The benefits to participants are another important element of this research process, since the purpose of the study should be framed with the end user in mind (Minkler et al. 2003). The consultants indicated that providing prescribed parent training made them better equipped to consult with parents and reduced the time spent on crisis management. Parents reported that they appreciated how available the consultants were to them. Additionally, consultants helped parents use the 24-hour Boys Town Hotline® (Boys Town.org 2019) operated by the service agency, which gave them a community resource they could use on their own. As the OTWH model took shape, it addressed several of the suggestions identified in the focus groups: continuity for youth, availability of support, staff competency, and helping youth and families connect to resources.

Test the Program

Once a model has been established and tried, rigorous evaluation is needed to test the program and determine efficacy and effectiveness. Efficacy is whether an intervention can work under ideal circumstances, while effectiveness is whether the intervention works in practice (Haynes 1999). Translational research involves this transition from efficacy trials to effectiveness trials to determine whether the intervention produces the same outcomes in real-world settings (Grzywacz and Allen 2017). Practitioners may be aware of the need for such trials, but often lack the resources and technical support to conduct them on their own (James et al. 2015; Lipsey et al. 2010). Studies have traditionally been done by researchers without practitioner involvement, which can limit the application of the results in practice once the study is completed (Cargo and Mercer 2008; Glasgow et al. 2012). With a transdisciplinary translational research approach, practitioners are actively involved in the research process, which can increase their understanding of the results and limitations of the study (Hamilton 2015) and give them the knowledge needed to implement the intervention in practice and influence policy (Tseng et al. 2017).

The FFPSA standards require programs to be evaluated based on causal evidence resulting from research studies using randomized or quasi-experimental group designs (Wilson et al. 2019). Randomized controlled trials (RCTs) randomly assign participants to treatment or control groups and are the gold standard approach for program evaluation (Jaccard and Bo 2018). Two RCTs (Trout et al. 2013b, 2019) were conducted over ten years to test OTWH in youths' homes and schools. The first trial was part of the development study and included a sample of 82 youth and families randomly assigned to OTWH or services as usual. Overall, OTWH had a significant impact on home and school placement stability at 12 months post discharge from the residential program. Returning to care or discontinuing enrollment in the community school was three to over five times less likely for youth in the OTWH group compared to youth who received services as usual (Trout et al. 2013b).

The second RCT was an effectiveness and replication study that included 187 youth and families. Families who received OTWH reported significantly higher levels of empowerment and self-efficacy than families in the control group. The findings of the previous RCT for placement stability at 12 months did not replicate; in the second study, placement stability for the services-as-usual group improved compared to the first RCT. This may have been due to contamination from the dissemination efforts advocating for aftercare based on the results of the first study, which occurred in the same agency and schools in the region. However, there was a large and significant difference at the 9-month follow-up, indicating that the odds of positive placement for youths in OTWH were over three times greater than for youth who received services as usual (Trout et al. 2019).
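
For readers less familiar with how group differences like these are expressed, the sketch below computes an unadjusted odds ratio and 95% confidence interval from a 2x2 table. The counts are hypothetical and purely illustrative; the published OTWH trials report model-based estimates, not this simple calculation.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) confidence interval.
    a = treatment successes, b = treatment failures,
    c = control successes,   d = control failures."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical placement-stability counts at follow-up (illustration only).
or_, lo, hi = odds_ratio_ci(a=35, b=7, c=28, d=14)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")  # OR = 2.50 here
```

An odds ratio above 1 favors the treatment group; "over three times greater" in the second trial corresponds to an odds ratio above 3 for positive placement.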

Execute Implementation

Once a program has been tested and shown to be effective, the next phase of research is to ensure it is implemented in practice as defined. Translating the knowledge of what works into practice can be a considerable challenge (Lipsey et al. 2010). This requires practitioners and researchers working together on the implementation of evidence-based programs and practices to produce expected youth and family outcomes with fidelity (i.e., adherence to a model). Fidelity involves implementing the whole package essential to achieving identified outcomes and determining which elements of the model are critical (Hamilton 2015; Tseng et al. 2017). It is important to note that effectiveness of a program and fidelity are both needed to achieve desirable outcomes. Poor outcomes could be a result of good fidelity of an ineffective model or an effective model implemented poorly (Fixsen et al. 2009).

Implementation science is the study of methods to promote the uptake of research findings into routine practice to improve the quality and effectiveness of services (Bauer et al. 2015; Glasgow et al. 2012). The DO BETTER framework incorporates models of implementation science (see Bauer et al. 2015; Chamberlain et al. 2011; Fixsen et al. 2009; Rogers 2003; Saldana et al. 2011; Saldana 2014) that are based on a continuous exchange of information and improvement cycles that transfer research-based interventions to real-world settings (Fixsen et al. 2016). One model, the Stages of Implementation Completion (SIC; Saldana et al. 2011), measures the time spent across three phases (pre-implementation, implementation, and sustainability) to fully adopt and implement a new program. The pre-implementation phase has three stages that occur before a new program is adopted: (1) engagement, (2) consideration of feasibility, and (3) readiness planning (Saldana et al. 2011). Agency leaders are typically more involved in this early phase as they gain awareness of the new program, determine the need for more information, and then decide whether they will try the new program or find another one (see Rogers 2003). The time spent in the early stages of pre-implementation is a strong predictor of actual provision of services (Saldana et al. 2011).
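
Because the SIC treats time-to-completion as data, the measurement itself is simple to sketch. The stage names below follow the pre-implementation stages just described, but the dates and the dictionary layout are hypothetical; the actual SIC instrument defines its own stages, milestones, and scoring.

```python
from datetime import date

# Hypothetical stage start/end dates for one adopting site.
pre_implementation = {
    "engagement": (date(2024, 1, 10), date(2024, 2, 1)),
    "consideration_of_feasibility": (date(2024, 2, 1), date(2024, 3, 15)),
    "readiness_planning": (date(2024, 3, 15), date(2024, 5, 1)),
}

# Duration per stage, in days; SIC-style measures use durations like
# these to predict whether a site ever reaches service delivery.
for stage, (start, end) in pre_implementation.items():
    print(f"{stage}: {(end - start).days} days")
print("total:", sum((e - s).days for s, e in pre_implementation.values()), "days")
```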

Four stages are described in the implementation phase of the SIC: (1) staff are hired and trained, (2) monitoring processes are put in place, (3) services and consultation begin, and (4) services, consultation, and fidelity monitoring continue (Saldana et al. 2011). Staff selection is a critical part of implementation (Fixsen et al. 2009). For example, direct experience working with youth in the residential program became an important staff selection characteristic that proved to be valuable for OTWH. Staff who had experience in the residential program were able to establish rapport with parents based on a mutual understanding of working with the youth in a family-like setting. This added credibility to the parent training and coaching the consultants provided to parents on how to address their child's behavior. In addition to training and consultation, other core components of implementation included evaluation, data systems, and administrative support (Fixsen et al. 2009). For the OTWH project, this included the creation of manuals and materials for pre-service staff training and consultation training for supervisors that were linked to model fidelity, evaluation tools, and data collection systems to measure service delivery.

Fidelity involves implementing the essential model components to achieve the identified outcomes and evaluating which elements of the model are critical (Hamilton 2015; Tseng et al. 2017). The development phase of the OTWH project (Trout et al. 2013a) provided benchmarks for consultants and supervisors to evaluate the implementation of services. Report logs were used to collect data on the time practitioners spent on the required activities of OTWH. This gave staff and supervisors a way to objectively measure how much time was spent on the essential components of the model and the associated responses of youth and parents to the services provided. These data were combined with direct observation of staff performance to determine fidelity to the model. Ongoing consultation and evaluation were used to monitor the implementation of OTWH service delivery and assess fidelity so that services were directed toward achieving the desired outcomes of home and school placement stability for youth departing residential care.

The final phase of the SIC addresses the competency of agency leadership and practitioners to sustain the model, which is measured by certification of model implementation (Chamberlain et al. 2011; Saldana et al. 2011). To establish certification criteria, a framework like the Standardized Program Evaluation Protocol (SPEP; Lipsey 2009; Lipsey et al. 2010) could be used to compare program implementation to what was found to be effective in the research study of OTWH. SPEP focuses on key factors related to program effectiveness, such as the amount of intervention delivered (i.e., duration, total contact hours), the quality of implementation (e.g., training and monitoring of the service), and outcomes based on the type of service (Lipsey et al. 2010). For example, training on OTWH emphasized a minimum of two hours of direct service per week, with more than 50% of the time focused on the three components of OTWH (i.e., family, school, and homework support), to achieve the outcomes of home and school placement stability. The amount of direct service, time allocation to the core components, quality of implementation, and outcomes achieved are factors used in the certification process to scale up OTWH.
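
Benchmarks like these lend themselves to routine, automated checks against the report-log data described earlier. The sketch below tests one consultant-week of logged hours against the two stated thresholds; the component names and log format are hypothetical stand-ins, not the OTWH data system.

```python
CORE_COMPONENTS = {"family", "school", "homework"}

def meets_benchmarks(week_log: dict[str, float],
                     min_hours: float = 2.0,
                     min_core_share: float = 0.50) -> bool:
    """Check one consultant-week against the delivery benchmarks:
    at least two hours of direct service, with more than half of
    that time spent on the three core components."""
    total = sum(week_log.values())
    if total == 0:
        return False
    core = sum(hours for component, hours in week_log.items()
               if component in CORE_COMPONENTS)
    return total >= min_hours and core / total > min_core_share

# Hypothetical report-log entry: component -> hours this week.
week = {"family": 1.0, "school": 0.5, "homework": 0.75, "crisis": 0.5}
print(meets_benchmarks(week))  # True: 2.75 h total, ~82% on core components
```

Flagged weeks would then be paired with the direct observation and consultation described above rather than treated as a verdict on their own.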

Reach More Youth and Families

The final stage of the DO BETTER framework involves scale-up and long-term sustainability. Scaling-up can include expansion of a pilot program to scale within an organization or replication of a program with other organizations (Hartman and Linn 2008). In both cases, the implementation considerations mentioned before apply.

Dissemination efforts are also needed to target the distribution of information to specific audiences who could benefit from the evidence-based intervention, to promote scale-up (Glasgow et al. 2012). Practitioner involvement in the OTWH project provided a deeper understanding of OTWH's development, results, and limitations, which aided the transfer of knowledge to other practitioners looking to adopt the intervention. For example, conference presentations and peer-reviewed papers on the OTWH project were prepared to disseminate information to residential program practitioners and policy makers to increase the use of aftercare services. This included presentations advocating for policy changes and for access to the funding needed to improve the availability of aftercare. A survey of different organizations showed that access to external funding was the best predictor of agencies providing aftercare services (Tyler et al. 2016). To aid in advocacy efforts, financial information was calculated to compare the cost of providing a new service such as OTWH to the cost of reentry into residential care and detention facilities.
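
The logic of such a cost comparison is straightforward expected-value arithmetic, as the toy calculation below shows. Every figure in it is invented for illustration; actual costs and reentry rates vary widely by program, region, and funding source, and the OTWH analyses used the agency's own data.

```python
# Purely hypothetical per-youth figures (U.S. dollars, one year).
AFTERCARE_COST = 4_000      # providing an OTWH-style aftercare service
REENTRY_COST = 60_000       # one reentry to residential care or detention
REENTRY_RATE_USUAL = 0.30   # assumed reentry rate without aftercare
REENTRY_RATE_AFTERCARE = 0.10

def expected_cost(with_aftercare: bool, n_youth: int = 100) -> float:
    """Expected total cost for a cohort, combining service cost
    with the expected cost of reentries."""
    rate = REENTRY_RATE_AFTERCARE if with_aftercare else REENTRY_RATE_USUAL
    service = AFTERCARE_COST * n_youth if with_aftercare else 0
    return service + rate * n_youth * REENTRY_COST

print(expected_cost(with_aftercare=False))  # 1,800,000
print(expected_cost(with_aftercare=True))   # 1,000,000
```

Under these invented assumptions the aftercare cohort costs less overall; the point for advocacy is that an added service can pay for itself when it prevents enough expensive reentries.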

Discussions with policy makers about the need for aftercare were aided by the passage of the FFPSA in 2018. The FFPSA requires discharge planning and family-based aftercare support for at least 6 months post-discharge for youth who receive services in qualified residential treatment programs (QRTPs; Bipartisan Budget Act 2018, p. 192). The OTWH intervention provides an example of how aftercare for youth departing residential care can be designed to prevent reentry and improve child permanency, which are targeted outcomes of the FFPSA (Wilson et al. 2019). Fortunately, the collaborative effort of practitioners, researchers, youths, families, and other experts who worked on the OTWH project resulted in a potential solution, and the service agency was ready to respond to the new requirements. Future work is needed to scale up and sustain successful implementation of OTWH in different places over time to reach more youth and families.

Conclusion

Opportunities and Challenges

Traditional research approaches have presented challenges for practitioners, researchers, and policy makers who need to get research into practice. The DO BETTER framework is an attempt to respond to recommendations for more efficient processes for conducting research within real-world service settings (Mason et al. 2018); however, there are limitations. Most agencies do not have the financial resources to conduct research, and the OTWH project, for example, would not have been possible without two large federally funded research grants. Conducting randomized controlled trials in real-world settings also poses challenges, such as ethical considerations around randomization and the recruitment of sample sizes large enough to detect significant effects. State wards were not allowed to participate in the OTWH research studies due to a state policy, which prevented the involvement of foster care youth in the design and testing of the intervention. Establishing research partnerships also requires finding the right researchers who are willing to partner on a common cause, resolving disagreements on the direction of projects or conclusions, and the time and commitment to complete large research studies (Cargo and Mercer 2008; Thompson et al. 2017; Tseng et al. 2017). Although a good deal was accomplished during the OTWH project, the activities described above still took over 12 years, and research on large-scale dissemination is still needed.

To enhance approaches like the DO BETTER framework, emphasis on evaluating programs through repeated testing and refinement in small-scale experiments across multiple locations is recommended (Cronbach et al. 1981; Hamilton 2015; Howe et al. 2010; Schindler et al. 2017). Micro trials, for instance, are randomized experiments designed to test the effects specific components have on risk and protective factors (Howe et al. 2010). Conducting research in this manner could speed up the process of learning from and sharing research successes and failures (Schindler et al. 2017). For example, the second randomized controlled trial of OTWH could have included other smaller programs in different regions to address recruitment challenges and prevent contamination. Consistent use of processes such as these could address the needs for efficiency and evidence that are important considerations for conducting research in social service agencies.

Another limitation of the DO BETTER framework is the lack of detailed activities related to implementation during scale-up of OTWH. As stated above, the DO BETTER framework links to existing models of implementation science; however, further research is needed to define the best practices specific to implementation of the program when scaled up. For aftercare programs, more research is needed to determine the necessary pre-implementation considerations of readiness and fit (Saldana et al. 2011) for organizations that are considering delivery of an aftercare program. Additionally, research could evaluate the core components needed for implementation of an aftercare program, such as staff training, agency consultation, evaluation, and data support (Fixsen et al. 2009). Moving forward, it will be important that components such as these are clear and concise so that the practices can be measured and adhered to, and the likelihood of achieving successful outcomes is improved (Lipsey et al. 2010).

Summary

This article illustrates the benefits of using a transdisciplinary translational research process to improve access to aftercare for youth departing residential programs. The DO BETTER framework was developed so researchers could more effectively engage community partners and non-researchers, and so non-researchers could become more familiar with the research activities that are involved in getting programs designed, tested, implemented and sustained. Increased practitioner and consumer involvement from the start could improve the identification and use of evidence-based solutions in practice. Ongoing efforts are needed to operationalize these approaches to evaluate and improve the quality of services for youth and families in a timely manner.

References

  1. Bandura, A., Caprara, G. V., Barbaranelli, C., Regalia, C., & Scabini, E. (2011). Impact of family efficacy beliefs on quality of family functioning and satisfaction with family life. Applied Psychology: An International Review, 60(3), 421–448. https://doi.org/10.1111/j.1464-0597.2010.00442.x.

  2. Bauer, M. S., Damschroder, L., Hagedorn, H., Smith, J., & Kilbourne, A. M. (2015). An introduction to implementation science for the non-specialist. BMC Psychology, 3(32), 1–12. https://doi.org/10.1186/s40359-015-0089-9.

  3. Bipartisan Budget Act of 2018. (2018). Retrieved November 12, 2019, from https://www.congress.gov/115/bills/hr1892/BILLS-115hr1892enr.pdf.

  4. Boys Town.org. (2019). Boys Town Hotline. Retrieved November 18, 2019, from https://www.boystown.org/hotline/Pages/default.aspx.

  5. Briggs, E. C., Greeson, J. K., Layne, C. M., Fairbank, J. A., Knoverek, A. M., & Pynoos, R. S. (2012). Trauma exposure, psychosocial functioning, and treatment needs of youth in residential care: Preliminary findings from the NCTSN core data set. Journal of Child and Adolescent Trauma, 5(1), 1–15.

  6. Burke, R., Herron, R., & Barnes, B. (2006). Common sense parenting (3rd ed.). Boys Town, NE: Boys Town Press.

  7. Cargo, M., & Mercer, S. L. (2008). The value and challenges of participatory research: Strengthening its practice. Annual Review of Public Health, 29, 325–350.

  8. Chamberlain, P., Brown, C. H., & Saldana, L. (2011). Observational measure of implementation progress in community-based settings: The Stages of Implementation Completion (SIC). Implementation Science, 6(116), 2–8. https://doi.org/10.1186/1748-5908-6-116.

  9. Christenson, S. L., Evelo, D., Sinclair, M., & Thurlow, M. (1997). Check & Connect: A model for promoting students’ engagement in school. Retrieved November 12, 2019, from http://ici.umn.edu/checkandconnect/.

  10. Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage Publications.

  11. Cronbach, L. J., Ambron, S. R., Dornbusch, S. M., Hess, R. D., Hornik, R. C., et al. (1981). Toward reform of program evaluation. San Francisco, CA: Jossey-Bass Publishers.

  12. Davies, H., Nutley, S., & Walter, I. (2007). Academic advice to practitioners—The role and use of research-based evidence. Public Money & Management, 27(4), 232–235. https://doi.org/10.1111/j.1467-9302.2007.00585.x.

  13. Dubois, D. L., Holloway, B. E., Valentine, J. C., & Cooper, H. (2002). Effectiveness of mentoring programs for youth: A meta-analytic review. American Journal of Community Psychology, 30(2), 157–197.

  14. Farmer, E. M., Dorsey, S., & Mustillo, S. A. (2004). Intensive home and community interventions. Child Adolescent Psychiatric Clinics of North America, 13(4), 857–884.

  15. Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531–540. https://doi.org/10.1177/1049731509335549.

  16. Fixsen, D. L., Schultes, M. T., & Blase, K. A. (2016). Bildung-Psychology and implementation science. European Journal of Developmental Psychology, 13(6), 666–680. https://doi.org/10.1080/17405629.2016.1204292.

  17. Forman-Hoffman, V. L., Middleton, J. C., McKeeman, J. L., Stambaugh, L. F., Christian, R. B., et al. (2017). Quality improvement, implementation, and dissemination strategies to improve mental health care for children and adolescents: A systematic review. Implementation Science, 12(93), 1–21.

  18. Glasgow, R. E., Vinson, C., Chambers, D., Khoury, M. J., Kaplan, R. M., & Hunter, C. (2012). National Institutes of Health approaches to dissemination and implementation Science: Current and future directions. American Journal of Public Health, 102(7), 1274–1281.

  19. Grzywacz, J. G., & Allen, J. W. (2017). Adapting the ideas of translational science for translational family science. Family Relations, 66(4), 568–583. https://doi.org/10.1111/fare.12284.

  20. Hamilton, S. F. (2015). Translational research and youth development. Applied Developmental Science, 19(2), 60–73. https://doi.org/10.1080/10888691.2014.968279.

  21. Hartman, A., & Linn, J. F. (2008). Scaling-up. A framework and lessons for development effectiveness from literature and practice. Washington, DC: The Brookings Institute.

  22. Harvard School of Public Health. (2019). Harvard transdisciplinary research in energetics and cancer center. Retrieved from https://www.hsph.harvard.edu/trec/aboutus/definitions/.

  23. Haynes, B. (1999). Can it work? Does it work? Is it worth it? BMJ, 319(7211), 652–653. https://doi.org/10.1136/bmj.319.7211.652.

  24. Howe, G. W., Beach, S. R. H., & Brody, G. H. (2010). Microtrial methods for translating gene-environment dynamics into preventive interventions. Prevention Science, 11, 343–354. https://doi.org/10.1007/s11121-010-0177-2.

  25. Jaccard, J., & Bo, A. (2018). Prevention science and child/youth development: Randomized explanatory trials for integrating theory, method, and analysis in program evaluation. Journal of the Society for Social Work & Research, 9(4), 651–687.

  26. James, S., Thompson, R., Stenberg, N., Schnur, E., Ross, J., Butler, L., et al. (2015). Attitudes, perceptions, and utilization of evidence-based practices in residential care. Residential Treatment for Children and Youth, 32, 144–166. https://doi.org/10.1080/0886571X.2015.1046275.

  27. Leichtman, M., & Leichtman, M. L. (2001). Facilitating the transition from residential treatment into the community: I. The problem. Residential Treatment for Children and Youth, 19(1), 21–27.

  28. Lieberman, R. E. (2004). Future directions in residential treatment. Child and Adolescent Psychiatric Clinics of North America, 13(2), 279–294.

  29. Lipsey, M. W. (2009). The primary factors that characterize effective interventions with juvenile offenders: A meta-analytic overview. Victims and Offenders, 4, 124–147.

  30. Lipsey, M. W., Howell, J. C., Kelly, M. R., Chapman, G., & Carver, G. (2010). Improving the effectiveness of juvenile justice programs: A new perspective on evidence-based practice. Washington DC: Center for Juvenile Justice Reform at Georgetown University.

  31. Lyons, J. S., & Schaefer, K. (2000). Mental health and dangerousness: Characteristics and outcomes of children and adolescents in residential placements. Journal of Child and Family Studies, 9(1), 67–73.

  32. Mason, W. A., Cogua-Lopez, J., Fleming, C. B., & Scheier, L. M. (2018). Challenges facing evidence-based prevention: Incorporating an abductive theory of method. Evaluation and the Health Professions, 41, 155–182.

  33. Mason, W. A., Fleming, C. B., Ringle, J. L., Thompson, R. W., Haggerty, K. P., & Snyder, J. J. (2015). Reducing risks for problem behaviors during the high school transition: Proximal outcomes in the Common Sense Parenting trial. Journal of Child and Family Studies, 24(9), 2568–2578.

  34. Mason, W. A., January, S.-A. A., Fleming, C. B., Thompson, R. W., Parra, G. R., Haggerty, K. P., et al. (2016). Parent training to reduce problem behaviors over the transition to high school: Tests of indirect effects through improved emotion regulation skills. Children and Youth Services Review, 61, 176–183.

  35. McMillen, C. J., Lee, B. R., & Jonson-Reid, M. (2008). Outcomes for youth residential treatment programs using administrative data from the child welfare system: A risk-adjustment application. Administration and Policy In Mental Health, 35(3), 189–197.

  36. Middlemiss, W. (2016). Building a foundation for resiliency from the inside out. Family Relations, 65, 7–8. https://doi.org/10.1111/fare.12186.

  37. Minkler, M., Glover Blackwell, A., Thompson, M., & Tamir, H. (2003). Community-based participatory research: Implications for public health funding. Public Health Advocacy Forum, 93(8), 1210–1213.

  38. Mitchell, S. A., Fisher, C. A., Hastings, C. E., Silverman, L. B., & Wallen, G. R. (2010). A thematic analysis of theoretical models for translational science in nursing: Mapping the field. Nursing Outlook, 58, 287–300. https://doi.org/10.1016/j.outlook.2010.07.001.

  39. Moen, R. D., & Norman, C. L. (2010). Circling back: Clearing up myths about the Deming Cycle and seeing how it keeps evolving. Quality Progress, 43(11), 22–28.

  40. Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24(3), 315–340.

  41. Rogers, E. (2003). Diffusion of innovations (5th ed.). New York: Free Press.

  42. Rubio, D. M., Schoenbaum, E. E., Lee, L. S., Schteingart, D. E., Marantz, P. R., Anderson, P. R., et al. (2010). Defining translational research: Implications for training. Academic Medicine, 85(3), 470–475. https://doi.org/10.1097/ACM.0b013e3181ccd618.

  43. Saldana, L. (2014). The stages of implementation completion for evidence-based practice: protocol for a mixed methods study. Implementation Science, 9(43), 1–11.

  44. Saldana, L., Chamberlain, P., Wang, W., & Brown, C. H. (2011). Predicting program start-up using the stages of implementation measure. Administration and Policy in Mental Health and Mental Health Services Research, 39(6), 419–425.

  45. Schindler, H. S., Fisher, P. A., & Shonkoff, J. P. (2017). From innovation to impact at scale: Lessons learned from a cluster of research-community partnerships. Child Development, 88(5), 1435–1446. https://doi.org/10.1111/cdev.12904.

  46. Stevenson, D. D., Shaw, G. M., Wise, P. H., Norton, M. E., Druzin, M. L., et al. (2013). Transdisciplinary translational science and the case of preterm birth. Journal of Perinatology, 33, 251–258. https://doi.org/10.1038/jp.2012.133.

  47. Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Los Angeles, CA: Sage Publications.

  48. Thompson, R. W., Duppong Hurley, K., Trout, A. L., Huefner, J. C., & Daly, D. L. (2017). Closing the research to practice gap in therapeutic residential care: Service provider-university partnerships focused on evidence-based practice. Journal of Emotional and Behavioral Disorders, 25(1), 46–56. https://doi.org/10.1177/1063426616686757.

  49. Trout, A. L., & Epstein, M. H. (2007). On the way home: A family-centered academic reintegration intervention model (CFDA #R324B070034). Washington, DC: U.S. Department of Education, Institute of Education Sciences.

  50. Trout, A. L., & Epstein, M. H. (2010). Developing aftercare, Phase I: Consumer feedback. Children and Youth Services Review, 32(3), 445–451.

  51. Trout, A. L., Hagaman, J. L., Chmelka, M. B., Gehringer, R., Epstein, M. H., & Reid, R. (2008). The academic, behavioral, and mental health status of children and youth at entry to residential care. Residential Treatment for Children & Youth, 25, 359–374. https://doi.org/10.1080/08865710802533654.

  52. Trout, A. L., Hoffman, S., Epstein, M. H., & Thompson, R. W. (2014a). Family teacher and parent perceptions of youth needs and preparedness for transition upon youth discharge from residential care. Journal of Social Work, 14(6), 594–604.

  53. Trout, A. L., Hoffman, S., Huscroft-D’Angelo, J., Epstein, M. H., Duppong Hurley, K., & Stevens, A. L. (2014b). Youth and parent perceptions of aftercare supports at discharge from residential care. Child and Family Social Work, 19(3), 304–311.

  54. Trout, A. L., Jansz, C., Epstein, M. H., & Tyler, P. (2013a). Evaluating service delivery in aftercare for school-aged youth departing out-of-home care. Journal of Public Child Welfare, 7, 142–153. https://doi.org/10.1080/15548732.2013.770356.

  55. Trout, A. L., Lambert, M. C., Epstein, M. H., Tyler, P., Stewart, M., Thompson, R. W., et al. (2013b). Comparison of On the Way Home aftercare supports to usual care following discharge from a residential setting: An exploratory pilot randomized controlled trial. Child Welfare, 92(3), 27–45.

  56. Trout, A. L., Lambert, M. C., Thompson, R., Duppong Hurley, K., & Tyler, P. (2019). On the Way Home: Promoting caregiver empowerment, self-efficacy, and adolescent stability during family reunification following placements in residential care. Residential Treatment for Children and Youth. https://doi.org/10.1080/0886571x.2019.1681047.

  57. Trout, A., Tyler, P., Stewart, M., & Epstein, M. (2012). On the Way Home: Program description and preliminary findings. Children and Youth Services Review, 34(6), 1115–1120. https://doi.org/10.1016/j.childyouth.2012.01.046.

  58. Tseng, V., Easton, J. Q., & Supplee, L. H. (2017). Research-practice partnerships: Building two-way streets of engagement. Social Policy Report, 30(4), 1–17.

  59. Tyler, P. M., Thompson, R. W., Trout, A. L., Lambert, M. C., & Synhorst, L. L. (2016). Availability of aftercare for youth departing group homes. Residential Treatment for Children and Youth, 33(3–4), 270–285. https://doi.org/10.1080/0886571X.2016.1232183.

  60. Tyler, P. M., Thompson, R. W., Trout, A. L., Lambert, M. C., & Synhorst, L. L. (2017). Important elements of aftercare services for youth departing group homes. Journal of Child and Family Studies, 26(6), 1603–1613. https://doi.org/10.1007/s10826-017-0673-0.

  61. Tyler, P. M., Trout, A. L., Epstein, M. H., & Thompson, R. (2014). Provider perspectives on aftercare services for youth in residential care. Residential Treatment for Children and Youth, 31(3), 219–229. https://doi.org/10.1080/0886571X.2014.943571.

  62. Tyler, P. M., Trout, A. L., Huscroft-D’Angelo, J., Synhorst, L., & Lambert, M. C. (2018). Promoting stability for youth returning from residential care: Attorney perspectives. Juvenile & Family Court Journal, 69(3), 5–18.

  63. Walter, U. M., & Petr, C. G. (2004). Promoting successful transitions from day school to regular school environments for youths with serious emotional disorders. Children & Schools, 26(3), 175–180.

  64. Wilson, S., Price, C. S., Kerns, S. E. U., Dastrup, S. R., & Brown, S. R. (2019). Title IV-E prevention services clearinghouse. Handbook of standards and procedures. Version 1.0. Office of Planning Research and Evaluation. Administration for Children and Families, U.S. Department of Health and Human Services.

Funding

This research was supported by Grant Numbers R324B070034 and R324A120260 from the U.S. Department of Education, Institute of Education Sciences. The statements in this manuscript do not necessarily represent the views of the U.S. Department of Education, Institute of Education Sciences.

Author information

Corresponding author

Correspondence to Patrick M. Tyler.

Ethics declarations

Conflict of interest

None of the authors have declared any conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Cite this article

Tyler, P.M., Mason, W.A., Vollmer, B. et al. Practice to Research and Back in a Social Service Agency: Trying to DO BETTER. Child Youth Care Forum 50, 149–165 (2021). https://doi.org/10.1007/s10566-020-09548-3

Keywords

  • Translational research
  • Transdisciplinary research
  • Research to practice