Approximately two-thirds of youth in the United States experience a traumatic event—such as abuse, community or school violence, natural disaster, loss of a loved one, neglect, or serious accident—before age 16 (Copeland et al., 2007). Within schools, 25% of high school students have been in at least one physical fight, 20% have been bullied, and 17% have experienced cyberbullying (Substance Abuse and Mental Health Services Administration [SAMHSA], n.d.). Child traumatic stress can result from any of these experiences and may cause students to have difficulty focusing, feel anxious or depressed, act with impulsivity or aggression, and/or eat or sleep poorly; if ongoing, these symptoms can result in a number of adverse outcomes, including learning difficulties, discipline challenges, and long-term health problems (SAMHSA, n.d.). Children’s reactions to trauma are shaped by their environments and by how responsive adults are to their needs. Thus, it is critical to build trauma-informed school systems in which educators are equipped with the training and skills to respond effectively to students who display symptoms of traumatic stress in order to promote their resilience or recovery.

Trauma-informed practices in schools (TIPS) have gained momentum in the last two decades as educators have increasingly recognized the potential impact of trauma on students and the need for schools to address symptoms of traumatic stress (Thomas et al., 2019). In 2014, SAMHSA provided a transformational definition of a trauma-informed system as one that:

  1. Realizes the widespread impact of trauma and understands potential paths for recovery;

  2. Recognizes the signs and symptoms of trauma in clients, families, staff, and others involved with the system;

  3. Responds by fully integrating knowledge about trauma into policies, procedures, and practices;

  4. Seeks to actively resist re-traumatization of both persons served and staff. (SAMHSA, 2014, p. 9)

By nature of this definition, becoming a trauma-informed system requires multi-tiered strategies within schools; stand-alone interventions are not adequate. The first tier involves universal strategies to prevent and address symptoms of traumatic stress for all students. This includes skill development for all teachers and educational support staff to support students in the regular education environment, as well as training in how to manage the vicarious trauma staff may experience from hearing a student’s story and how to cope with their own stressors so that they have the emotional bandwidth to support students. The second tier provides additional supports for students who have been exposed to trauma and whose symptoms can be addressed over a short period of time and/or in group settings by specialized staff. The third tier involves intensive, individualized supports characterized by specific trauma interventions provided by trauma-trained mental health practitioners. For tiers two and three to be effective, teachers need skills to support their students in the classroom whenever they can and to identify when to refer students for extra support. Since 2014, some state Departments of Education have begun to provide guidance on TIPS, and at times TIPS is included with other social and emotional initiatives, yet there remains wide variation in TIPS information and resources (Thomas et al., 2019).

In their systematic review of research attention to TIPS, Thomas et al. (2019) found that no TIPS model has been systematically implemented and researched. They found that barriers to TIPS research include the complex nature of school systems paired with the need for interdisciplinary contributions from the fields of education and mental health. In addition, teachers are often positioned as a source of referral to an alternative provider rather than as key providers of TIPS consistent with their role. Finally, extant research has yet to fully consider the role of school climate as an integral component of TIPS. A healthy school climate is critical for the success of school initiatives and is indicated when school members feel cared for, respected, and engaged (Austin et al., 2013). This paper details a process for overcoming these barriers when partnering with schools to build TIPS.

Phifer and Hull (2016) discuss several key TIPS implementation guidelines in their review of the TIPS literature, which focused on three case studies of district attempts to implement TIPS. First, transforming school systems to implement TIPS takes time. Each district took five years to invest in the changes that were needed, including relationship building, developing a comprehensive plan and timeline, and providing enough reinforcement to promote sustainability. In addition, Phifer and Hull noted the significant need for teacher professional development at both the preservice and in-service stages. They recommended integrating TIPS into teacher education programs and requiring more rigorous and sustained professional development for teachers that transforms their teaching practice to respond more effectively to challenging behaviors. Finally, they recommended strong university–community partnerships that provide the oversight and expertise needed to build successful three-tiered approaches to TIPS (tier 1: the whole class; tier 2: small group interventions; tier 3: intensive individualized support; Phifer & Hull, 2016). Thus, additional research is needed to document and evaluate processes through which university–community partnerships can develop interventions and implementation approaches to support school systems in implementing TIPS.

In this paper, we detail results of a pilot implementation study resulting from a university–community partnership that provided the personnel resources, subject matter expertise, research, and funding to build capacity for TIPS in participating school districts. We identify how the partnership formed; describe the way university partners engaged stakeholders in providing continuous input and feedback; and summarize the first two phases of this project, needs assessment and curriculum development, in order to provide sufficient context and background for the implementation study. Next, we provide the methods, results, and discussion of Phase 3, the pilot implementation study of administrator, teacher, and preservice teacher trainings. As part of our discussion, we provide lessons learned and future directions as we endeavor to build upon the current TIPS research literature and provide a road map and resources for building TIPS in districts that may not have access to universities or other experts in their communities.

The University–Community Partnership

This university–community partnership is embedded in a community characterized by extreme socioeconomic disparities, a dual-majority White/Latine racial composition, tourism, and abundant natural beauty. Schools in participating districts can be classified as suburban, with large within-district variation related to factors such as socioeconomic status and English language proficiency. The university is a minority-serving institution with a student body that is diverse across a variety of demographics. Our process was designed to identify the distinct training needs of our participants and adapt to each district’s complex system within this unique context.

A community-generated and collaborative partnership was built at the onset of the COVID-19 pandemic to provide and evaluate sustained professional development supporting educators with TIPS, with the ultimate, indirect aim of benefiting students. The partnership was initiated by Ocean School District (Footnote 1). In conversation with a local foundation about how to help teachers as they transitioned from remote instruction to in-person learning for the 2020–2021 school year, district leaders conveyed that early in the pandemic they recognized the tremendous pressure teachers faced while adapting to remote instruction. As a result, during spring of 2020, district leaders had offered numerous resources to their teachers (e.g., lists of mental health resources, virtual guided wellness activities, opportunities to connect with others). Yet, district leaders found that teachers rarely took advantage of the offered resources despite a clear need for support. Approaching the 2020–2021 school year, Ocean School District wanted to better understand the needs of their teachers in order to support them and their students.

To address their concerns, Ocean district leaders met with a local foundation to discuss the possibility of a needs assessment. The foundation engages in philanthropy to drive pivotal, data-driven changes in education, development, environment, and social awareness primarily in the local community. This includes a portfolio focused on traumatic stress and resiliency that started in health care and has moved into social service agencies, schools, and law enforcement. In listening to district leaders, the foundation suggested they partner with university faculty who have expertise in resilience and traumatic stress in order to design and fund a study to understand educator needs related to TIPS.

Ocean School District and the foundation already had a strong relationship with the university; in the Fall of 2020, they initiated meetings with the dean and key school leaders to identify a team of experts to engage in the needs assessment. Through a series of conversations, the principal investigators (PIs) of this research were identified: a professor in school psychology who studies system transformation to help vulnerable youth thrive, a professor in clinical psychology with expertise in disaster mental health and resilience and recovery following potentially traumatic events, a teaching professor in special education with expertise in positive behavioral supports and specialized academic instruction, and a teaching professor with expertise in teacher professional development and learning communities.

After discussions with the foundation about the goals of this project, the PIs proposed a multi-phase project: Phase 1 entailed a needs assessment with educators (i.e., all school-based certificated staff) in K-12 school settings to understand perspectives including facilitators and barriers to implementing TIPS to support students and themselves (fall and winter 2021–2022). Phase 2 involved a review of existing, available, and free resources that mapped onto the needs assessment and the creation of professional development modules (winter and spring 2021–2022). Phase 3 was focused on a pilot implementation of the training modules (academic year 2022–2023). The team was provided funding for Phase 1 with additional funding to be granted on a phase-by-phase basis depending on results and educator engagement.

Stakeholder Input

District input and guidance are best practices for any applied research project, particularly in complex systems such as schools. Moreover, one of the most consistent findings in the limited available research is that TIPS initiatives are only successful with administrator and teacher buy-in (Brown et al., 2022). The PIs recruited a second partner district, Mountain School District, to participate in the needs assessment. This allowed for better generalization of results to the community by not relying on a sole school district, while also serving this additional district in improving their implementation of TIPS. The first step in the needs assessment was to develop advisory committees for each district that would co-develop the research questions and methods and provide constructive feedback to researchers regarding the findings of TIPS implementation in their schools.

Districts formed their own advisory committees with the goal of including key administrators with decision-making power, school site leaders, and district mental health staff. The Ocean School District advisory committee included the district chief operations officer, director of community engagement, assistant superintendent of human resources, and assistant superintendent of student services. The Mountain School District advisory committee included the assistant superintendent of instructional services, assistant superintendent of pupil services, one school site principal, one school psychologist, and one teacher on special assignment for curriculum development. Research team members met with each advisory committee separately once per month for the first six months of the project and then at key review and decision-making intervals thereafter.

Phase 1: Mixed Methods Needs Assessment

The initial sequence of meetings focused on the needs assessment. The research team integrated the input of both districts and confirmed mutual approval of the resulting methods, developing survey and focus group protocols that addressed their combined needs while respecting limits on time. Each advisory committee reviewed the proposed protocols and provided feedback that the research team iteratively integrated until a final version was approved by both districts. During this phase, advisory committees also helped develop the study design and recruitment strategies.

Recruitment and incentives differed based on district advisory committee preferences. Staff members at Mountain provided a list of 150 (56% of eligible) randomly selected certificated staff, and Ocean provided a list of 300 (37% of eligible) randomly selected certificated staff. Ocean opted to raffle off five $100 gift cards and one $500 gift card; all participants were entered into the raffle upon completion of the survey. Mountain opted to have all survey participants receive a $5 gift card, and all participants were entered into a raffle to win an additional $250 gift card. Teachers from each district were also provided the opportunity to participate in district-specific focus groups. Focus group participants from both districts were each given a $40 gift card.

In order to recruit participants, the research team sent an informational email with the survey link and instructions for how to sign up for the focus group. After an initial recruitment email, weekly reminders were sent over a two-month period. A total of 178 survey participants were successfully recruited (response rates of 40% for Ocean and 39% for Mountain). Of the participants who completed the survey portion of the study, the majority were female (81.9%) and identified as non-Latinx White (69.2%). Most reported being general education teachers (57.6%), worked at the elementary school level (53.9%), and had been in education for over ten years (69.2%). Full reports of the demographic breakdown and the focus groups can be found in Aragón et al. (2024).

To assess whether there were any significant differences between the two samples, independent-samples t tests and chi-square analyses were conducted. Only one variable, assessing implementation barriers, indicated a statistically significant difference across districts, with a higher mean barrier score in Ocean (M = 1.81, SD = 0.08) than in Mountain (M = 1.45, SD = 0.15), t(77) = 2.25, p = 0.027. Given that there were no other significant differences, we aggregated the datasets to increase generalizability and statistical power. The themes that emerged from the teacher focus groups were also similar across districts.
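To make this analytic step concrete, the following is a minimal sketch of such a sample-equivalence check; it is not the authors’ analysis code, and the pandas DataFrame `df` and its columns (“district”, “role”, “barrier_score”) are hypothetical.

```python
# Minimal sketch (not the study's code) of a district-equivalence check,
# assuming a hypothetical DataFrame `df` with columns "district", "role"
# (categorical), and "barrier_score" (continuous).
import pandas as pd
from scipy import stats

def compare_districts(df: pd.DataFrame) -> None:
    # Chi-square test of independence for a categorical demographic variable.
    crosstab = pd.crosstab(df["district"], df["role"])
    chi2, p_chi, dof, _ = stats.chi2_contingency(crosstab)
    print(f"role x district: chi2({dof}) = {chi2:.2f}, p = {p_chi:.3f}")

    # Independent-samples t test on the implementation-barrier score.
    ocean = df.loc[df["district"] == "Ocean", "barrier_score"].dropna()
    mountain = df.loc[df["district"] == "Mountain", "barrier_score"].dropna()
    t, p_t = stats.ttest_ind(ocean, mountain)
    print(f"barrier score: t = {t:.2f}, p = {p_t:.3f}")
```

When, as here, few meaningful differences emerge, the district datasets can simply be concatenated (e.g., with pd.concat) before further analysis.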

Needs assessment results illuminated the experiences of educators, particularly teachers, during the pandemic (Authors, 2023). Participating educators reported experiencing low levels of secondary traumatic stress but high levels of burnout, with teachers on average endorsing feeling burnt out at least once a week. Teachers were asked what resources they used during the pandemic, as well as what else they would have liked to receive. Teachers most often used lists of mental health resources for parents, teachers, and students; virtual guided wellness activities; and opportunities to connect with others. They found connecting with others to be the most helpful, as well as receiving training and in-person wellness activities. They reported wanting more training opportunities in advocating for additional resources, consultation for skill implementation, and support groups. Teachers wanted clear communication from administrators at the school and district level, as well as greater appreciation and more interpersonal connection. Specifically, they voiced a desire for their administrators to listen to and acknowledge teacher emotions and respond accordingly.

Overall, school climate emerged as the key consideration in teachers’ ability to implement resources. Results suggested that a school climate that is supportive and has both resources and opportunities for training will bolster teachers’ own expertise and existing skills in addition to helping them build new skills. The complete methods and results of this needs assessment can be found in Authors (2023).

Phase 2: Curriculum Development

The university team presented each district advisory committee with the results of the needs assessment and engaged members in a discussion of next steps for supporting TIPS. There was a consensus that professional development was a key next step because the most common desired resources expressed by participants in the needs assessment were related to training and support for implementing TIPS (Authors, 2023). Given specific feedback from teachers about administrators (e.g., “Actually engage with us. Right, be present, walk around, come-come and talk to me once in a while. Any of us,” Authors, 2023, p. 6), advisory committees identified the need to develop two curricula: (1) assisting administrators in how to best support their staff who are experiencing trauma, secondary traumatic stress, and/or burnout, and (2) teaching educators how to support themselves and their students with traumatic stress and related symptoms. In consultation with advisory committees, researchers identified freely available existing resources and trainings to address the continuum of needs, resulting in a three-module administrator training and an eight-module teacher training. The three administrator modules were: (1) creating a trauma-informed school campus and climate, (2) building collaborative partnerships using compassionate leadership and psychological safety, and (3) supporting teachers with secondary traumatic stress and teacher self-care. The eight teacher modules were divided into two sections, building a trauma-informed campus (e.g., multi-tiered systems, crisis response, family-to-school connections) and creating trauma-informed classroom practices (e.g., understanding trauma signs and symptoms, the importance of teacher-student relationships, creating classroom expectations and safe spaces). Within this general framework, the PIs also incorporated advisory committee input for content based on pressing needs they were experiencing that year, including supporting students who are grieving and what to do to support students during and after natural and human-caused disasters.

The general content of the curricula for teachers and administrators was organized based on Trauma-Informed Schools for Children in K-12: A System Framework from the National Child Traumatic Stress Network (NCTSN), Schools Committee (2017). The system framework consists of 10 essential elements, which served as the big ideas for the scope and sequence of knowledge and skills in each curriculum. To further build the training, the team conducted a thorough scan and integrated relevant existing free and publicly available resources and trainings (listed in Table 1). The content and sequence of modules were tailored for teachers or administrators. Decisions about what content to emphasize for the two audiences were guided by the results of the needs assessment surveys and focus groups, advisory committee input, and the professional expertise of the university team. Furthermore, as the modules were delivered, formative feedback was gathered after each session; this feedback informed adjustments to the content and sequence of subsequent modules.

Table 1 Primary curriculum resources listed alphabetically by citation

The structure of each module was adapted from a lesson planning template used in an accredited university-based teacher education program (TEP). Employing this adapted lesson planning template, the university team took specific big ideas based on the NCTSN essential elements and created learning activities for each module. The adapted lesson planning template consisted of a module objective and module procedures. In consultation with the advisory committees, both of which opted for in-person training, the module procedures outlined how content would be delivered and how participants would engage in assessment and activities at the beginning, middle, and closure of the session. Content was designed to be presented by university team members, with participants taking part in a range of activities throughout the session, such as discussion, question and answer, reading and reflection, or viewing other multimedia content. After the closure of the learning activities, the sessions were designed to end with opportunities for participants to provide feedback via surveys.

Current Study

The current study focuses on the third phase of the project, the pilot implementation of TIPS training modules, with the goal of obtaining process and outcome evaluation data to further refine and improve the TIPS administrator and teacher training modules before broader scale implementation. The research questions for the current study are:

  1. What do process data reveal about (a) the acceptability, appropriateness, and feasibility of TIPS for administrators and teachers, and (b) the feedback of participants on the quality of the trainings and trainers?

  2. What do outcome data reveal about improvement in participants’ knowledge about trauma and its effects and their perceived ability to implement TIPS?

Methods

District Recruitment

In working with the advisory committees, the university team understood that districts’ needs were continually evolving in the aftermath of COVID-19; learning loss, racial trauma, and exacerbated disparities across many metrics are examples of the competing, albeit intertwined, concerns that local districts were grappling with. Ocean School District had a leadership change and a reprioritization of key district goals during the needs assessment and curriculum development phases. Thus, their advisory committee, which experienced turnover among multiple members, declined to participate in the pilot implementation of the professional development during the 2022–2023 school year. Instead, the research team was able to recruit Valley School District to participate in both the administrator and teacher trainings. Valley School District was recruited because its superintendent expressed interest in the project after hearing about it at a community meeting and because of the available space created by the loss of Ocean School District. In addition, the university TEP was recruited for participation due to (1) a keen interest by the funder in integrating TIPS training into preservice teacher education, and (2) the understanding by the university team and TEP leadership that training in TIPS is critical for preservice teachers to navigate their profession, particularly in the aftermath of COVID-19.

Participants

The PIs worked with advisory committees to determine the implementation strategy that would work best within each of the participating school districts. Table 2 details key information about each district participating in at least one phase of the project. It is important to note that not all participants completed all data elements. Exact numbers of respondents used in analyses can be found in Tables 3, 4, and 5.

Table 2 Demographics of participating school districts and the teacher education program
Table 3 Process evaluation data for administrator and teacher candidate trainings
Table 4 Process evaluation data for teacher training at Valley School
Table 5 Independent samples t tests for outcome variables by training group

The Mountain School District advisory committee decided to focus on the administrator training (n = 19) in 2022–2023 and obtain their school leaders’ understanding and buy-in before proceeding with the teacher training. Administrators included the district superintendent, assistant superintendents (e.g., pupil services, instruction), and principals. Most of these administrators had not yet been exposed to TIPS training. The grant provided each administrator with a $25 gift card for completing each session’s evaluation.

The superintendent of Valley School District was an individual advisor to the project. She reviewed results of the needs assessment and curriculum plan and found it to be relevant to their needs. This small school district had received training in trauma-informed practices four years prior and found it very helpful. The superintendent wanted updated training and a renewed focus due to staffing changes and the impact of COVID-19. She required that all of her administrators (n = 4) attend all sessions of the training, and also required that all of her teachers (n = 36) attend at least the first session. Subsequently, she was able to use professional development funds to pay any teacher who volunteered (n = 12) $400 to complete the remaining seven modules and participate in the evaluation.

The PIs also worked with their university’s TEP leadership to identify how the teacher modules would fit into the already crowded curriculum. Leadership developed a small advisory committee to guide the project that included the director and associate director of the teacher education program. This TEP is completed within one year from start to finish; thus, the course and meeting schedule is tight. In addition, similar to the local districts, the TEP had a variety of other priority topics, including diversity, equity, and inclusion. Yet, the TEP advisory committee acknowledged the importance of TIPS and their teacher candidates’ interest in and motivation to learn TIPS. After reviewing the modules and their fit within existing courses, the advisory committee recommended a focus on the four classroom-based modules, with adaptations based on the needs of preservice versus in-service teachers. These four sessions of the teacher training were conducted with teacher candidates (n = 65) over their academic year as part of a mandatory “lunch and learn” series.

Measures

Measures represent process and outcome evaluation data collection efforts. The research team discussed what kind of data would be collected in these surveys with members of the advisory committees, who gave their approval.

Process Evaluation

The purpose of the process evaluation, assessed via a survey given to participants at the end of each session, was to assess participants’ perceptions of the quality of the training and their ability to use the information provided in their work, and to solicit feedback to inform future training efforts.

Acceptability, Appropriateness, and Feasibility. Three items from previously validated scales were used to assess the acceptability, appropriateness, and feasibility of the information provided. One item each was pulled from the Acceptability of Intervention Measure (“This training session is appealing to me”), the Intervention Appropriateness Measure (“This training session seems fitting”), and the Feasibility of Intervention Measure (“The things I learned in this training session seems doable”). These scales were all developed by Weiner et al. (2017) and were rated on a scale of 1 (“completely disagree”) to 5 (“completely agree”). The original scales each comprised four items, with Cronbach’s alphas ranging from 0.85 to 0.91; the full measure showed acceptable fit (CFI = 0.96, RMSEA = 0.08), high factor loadings (0.75–0.89), and test–retest reliability coefficients ranging from 0.73 to 0.88. We selected one item per scale for brevity, given advisory committee concerns about keeping the assessments succinct so as not to burden participants.

Quality of Trainings and Trainers. Participants’ perceptions of the quality of the trainings and trainers were assessed using five questions developed by the research team, each evaluating a specific training aspect rather than one latent construct. Items assessed trainer knowledge of the subject matter, ability to explain and illustrate concepts, and ability to answer questions completely, as well as the usefulness of the information received and of the training materials/handouts; items were rated on a scale from 1 (poor) to 5 (excellent) and analyzed individually.

Open-Ended Feedback. At the end of each session, participants were also asked to provide open-ended feedback about what the trainers did well, what they felt the trainers could do to improve, what other TIPS topics they would like to learn about, and what suggestions they had for training improvement. These open-ended questions were developed in consultation with advisory committees.

Outcome Evaluation

The purpose of the outcome evaluation, assessed through pretest to posttest changes, was to understand whether the training had the intended impact on participants’ knowledge of trauma and confidence in implementing key concepts and skills.

Knowledge of Trauma. Participants’ current knowledge around trauma was assessed using three questions developed by the research team. The three questions are, “I can support students who have experienced trauma,” “I can recognize the risk factors associated with trauma,” and “I can recognize the common symptoms associated with experiencing trauma.” Response options range from 1 (strongly disagree) to 5 (strongly agree). Each item was scored individually.

Implementation Leadership Scale (ILS). The ILS (Aarons et al., 2014) assessed leadership skills in implementing evidence-based practice. The scale comprises four subscales (proactive, knowledgeable, supportive, and perseverant), each consisting of three items. Participants completed the scales relevant to their job role: administrators completed all subscales, whereas teachers and teacher candidates completed the knowledgeable and perseverant subscales. Example items include “I have developed a plan to facilitate implementation of evidence-based practice” (proactive subscale) and “I am knowledgeable about evidence-based practice” (knowledgeable subscale). Items are rated on a scale of 0 (not at all) to 4 (very great extent). The scale has demonstrated both convergent and divergent validity, and internal consistencies in the initial validation study were high, ranging from 0.95 to 0.98 (Aarons et al., 2014). Internal consistencies for the samples in the present study ranged from 0.69 to 0.97. Nearly all alphas were above 0.80, except for the supportive subscale (0.69) and perseverant subscale (0.79) within the Mountain administrator posttest sample.

Norwegian Principal Self-Efficacy Scale (NPSES). The NPSES (Federici & Skaalvik, 2011) assessed administrator self-efficacy for instructional leadership. Used in the United States and Europe, this scale was chosen based on its focus on school principals and its relevance to evaluating project outcomes. The scale comprises five subscales (three items each): develop goals, guide teachers, create a positive and safe learning environment, motivate teachers, and develop a collective culture. Participants completed the scales relevant to their job role: administrators completed all subscales, whereas teachers and teacher candidates completed the safe learning environment subscale. Example items from the NPSES include “Promote a safe school environment for students which is free from bullying” (positive and safe learning environment subscale) and “Develop a collective culture in which everyone works to achieve shared goals” (develop a collective culture subscale). Response options range from 1 (not at all certain) to 7 (absolutely certain). The scale has been validated via confirmatory factor analysis (Federici & Skaalvik, 2011), and Cronbach’s alphas have demonstrated high internal consistency for both the total scale and the five subscales (exact alphas were not reported, only that they demonstrated internal consistency; Skaalvik, 2020). Internal consistencies for the samples in the present study ranged from 0.56 to 0.95. Nearly all alphas were above 0.80, except for the positive and safe learning environment subscale within the teacher education pretest (0.79), Valley teacher pretest (0.79), and Valley teacher posttest (0.56) samples.
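For readers who wish to reproduce the subscale reliabilities reported above on their own data, the following is a minimal sketch of the Cronbach’s alpha computation; it is not the study’s analysis code, and the DataFrame of item responses is hypothetical.

```python
# Illustrative sketch of Cronbach's alpha for one subscale (hypothetical data,
# not the study's analysis code). Each column holds one item's responses.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()                       # listwise deletion of missing data
    k = items.shape[1]                           # number of items in the subscale
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Example with a hypothetical three-item subscale (e.g., "knowledgeable"):
subscale = pd.DataFrame({"item1": [3, 4, 4, 2, 3],
                         "item2": [3, 4, 3, 2, 3],
                         "item3": [2, 4, 4, 1, 3]})
print(round(cronbach_alpha(subscale), 2))
```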

Procedures

Researchers designed a mixed methods pretest–posttest evaluation as a pilot test to refine the core curriculum and consider a menu of possible adaptations. The pretest was administered at the beginning of the first session, before any content had been administered. The posttest was administered directly after the final session. Teachers and administrators in Valley District completed the pretest in November 2022 and the posttest in May 2023. Administrators in Mountain District completed the pretest in October 2022 and the posttest in March 2023. Teacher candidates in the TEP program completed the pretest in October 2022 and the posttest in May 2023.

In addition to the pretest–posttest evaluation, participants were asked a series of closed- and open-ended questions at the end of each session to gain feedback after each module, which the researchers used to plan future sessions. Advisory committees were engaged in discussing post-session feedback to help us implement mid-course corrections in the training. The surveys needed to be anonymous to maintain confidentiality; therefore, we were unable to match pretest to posttest responses for more stringent data analysis, but we were able to note general trends in scores using independent-samples t tests. Results were reviewed with each advisory committee in debriefing meetings to review the overall feasibility, acceptability, and impact of the training and to plan for future phases. The project was approved as exempt by the university’s institutional review board for human subjects.
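Because the anonymous design prevented linking individuals across time points, outcomes were compared at the group level. The sketch below, offered as an illustration under stated assumptions rather than the authors’ analysis code, shows how unlinked pretest and posttest groups can be compared item by item with independent-samples t tests; the DataFrames `pre` and `post` and their column names are hypothetical.

```python
# Minimal sketch of comparing anonymous (unlinked) pretest and posttest groups.
# Hypothetical DataFrames `pre` and `post` share outcome columns such as
# "support_students", "recognize_risk_factors", "recognize_symptoms".
import pandas as pd
from scipy import stats

def compare_pre_post(pre: pd.DataFrame, post: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for col in pre.columns.intersection(post.columns):
        a, b = pre[col].dropna(), post[col].dropna()
        t, p = stats.ttest_ind(b, a)            # independent samples, not paired
        rows.append({"outcome": col,
                     "pre_mean": a.mean(), "post_mean": b.mean(),
                     "t": round(t, 2), "p": round(p, 3)})
    return pd.DataFrame(rows)
```

Such unpaired comparisons are typically less powerful than paired tests on linked data, so group-level trends should be interpreted with that limitation in mind.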

Results

In-Session Process Data Across Training Groups

Mountain and Valley School District Administrators

In Mountain School District, process evaluation data on the quality of training were highly positive across the three sessions, with the majority of ratings being “good” or “excellent” for the quality of both the training itself and the trainers. Administrators perceived that there was sufficient time allotted, that the training was useful to their work, and that the training was appealing and of interest. In Valley School District, session one was rated less useful than subsequent sessions, but still in the “good” range overall. Subsequent trainings were rated as more useful, and the quality ratings also improved. Table 3 details the process data results for administrator and teacher candidate trainings.

Valley School District Teachers

Implementation data for Valley School District suggested that teachers rated the quality of sessions from “good” to “excellent” in terms of having sufficient time allotted, the usefulness of the training experience for their work, the relevancy of topics covered, and how appealing, fitting, and doable the trainings’ lessons were. Across most sessions, teachers rated the trainers’ ability to answer questions, ability to explain concepts, and knowledge of subject matter as “good” or “excellent.” See Table 4 for the process evaluation data for the teacher trainings.

TEP Candidates

Process evaluation data on the quality of the training and trainers were positive (Table 3). The majority of participants rated the trainers’ knowledge of the subject matter as “good” or “excellent” across the three sessions and also rated the usefulness of the information received as “good” or “excellent.” There was variability in the level of agreement on having sufficient time allotted, which was likely due to the one-hour limit of the “lunch and learn” format, whereas the other teacher trainings allotted 75–90 min. The majority of teacher candidates rated the usefulness of the training experience for their work as “good” to “excellent.” Regarding the quality of the trainers, teacher candidates were highly satisfied with the trainers’ ability to answer questions, ability to explain concepts, and knowledge of subject matter, with all ratings falling between “good” and “excellent” across sessions.

Open-Ended Feedback

We asked open-ended questions during in-session surveys as a form of continuous progress monitoring that allowed us to make regular adaptations to our training content and format. As such, we did not formally analyze the feedback for themes but integrated it into subsequent sessions. In terms of strengths (“What specifically did the trainer do well?”), feedback across all training groups included having a good balance of presenters and perspectives, explaining how trauma is manifested in schools, and providing relatable examples. In addition, participants noted that trainers provided useful materials and examples, listened, were attentive, and provided opportunities for discussion and reflection.

Suggestions for improvement (“What recommendations do you have for the trainer to improve?”) from teacher comments included more strategies that teachers could implement and more specific suggestions tailored to their unique scenarios. Training topics that teachers requested (“Are there other topics related to trauma-informed schools for which you would like further training?”) included working with children with anxiety, how to talk with a student who is “shutdown,” working with students who have lost a parent, and how to support students who witness “meltdowns” by other students. During the trainings, many teachers described specific situations in which they requested help applying general principles to current student trauma concerns. For example, several teachers had students in their classes who had a parent with a terminal illness, so we added content focused on responding to grief and death. Later in the year, a school in the community had a false school shooter threat and lockdown, prompting more time and attention to how to respond in various lockdown scenarios. We also bolstered content related to what teachers can do individually and in their classroom and when they should seek consultation from a school psychologist or mental health team. Another category of areas for improvement noted by teachers and TEP candidates in our open-ended survey questions included wanting more time with the materials and more assistance implementing training strategies. Thus, we made sure to request more time (i.e., 90 min) for training sessions and included several opportunities for discussion and questions. We realized that providing less, but more in-depth, information was more effective than covering the full breadth of information.

Outcome Data Across Training Groups

Mountain and Valley District Administrators

Outcome evaluation results for Mountain School District revealed statistically significant improvement from pretest to posttest in knowledge about TIPS, including administrators’ perceived ability to support trauma-affected students and to recognize common symptoms. There was not a significant increase in administrators’ ability to recognize risk factors associated with trauma. For perceived ability to implement TIPS, there were statistically significant increases in how knowledgeable administrators felt, how proactive they were in TIPS implementation, and their ability to persevere in implementation. However, there were no changes in administrators’ perceptions of being supportive of employee implementation. Regarding changes in administrators’ leadership self-efficacy, there was a significant increase in administrators’ ability to create a positive and safe learning environment. There were no changes in perceived ability to develop clear and achievable goals for the school, guide teachers, motivate teachers, or develop a collective school culture. See Table 5 for the complete outcome data across training types. Valley School District had only three administrators; therefore, statistical analysis of their data could not be conducted.

Valley District Teachers

Outcome evaluation data (in Table 5) indicated that teachers’ perception of their knowledge of trauma-informed practices significantly increased. At posttest, they rated themselves as significantly more able to support students who experience trauma, recognize risk factors associated with trauma, and recognize common symptoms associated with trauma. Their ability to overcome challenges in implementing trauma-informed practices in their classrooms also improved, as indicated by their increased ratings of knowledge about implementing TIPS and their ability to persevere in implementing TIPS. There was not a significant difference in ratings of whether they could create a safe school environment.

Teacher Candidates

The outcome evaluation data (Table 5) were consistent with that of the other training groups, in that there was a statistically significant improvement in knowledge about trauma. The average scores for the ability to support students who experience trauma, recognize risk factors, and recognize common symptoms were all significantly higher at posttest. Perceived ability to implement trauma-informed practices also improved as indicated by their knowledge, ability to persevere in implementing TIPS, and their confidence in creating a safe school environment.

Discussion

Creating trauma-informed systems can help children as they contend with life stressors and potentially traumatic events that can temporarily disrupt their functioning, and can support their resilience and recovery (Bonanno, 2021; NCTSN, n.d.). Challenges to implementing trauma-informed practices in educational settings have been noted, such as the complex nature of school systems, underscoring the need for support in implementing TIPS (Thomas et al., 2019). This study addressed several of the needs for TIPS implementation proposed by Phifer and Hull (2016), starting with developing a strong university–community partnership, engaging district advisory teams to co-develop university researcher protocols, and providing training to teachers as well as teacher candidates. Implementing the first three phases of our tailored TIPS training illuminated features of TIPS that may be common across diverse school districts. An interesting finding was how similar the results of the needs assessment were across the two participating school districts, and how consistent they were with existing literature indicating that teachers craved emotional support and tools for supporting students through trauma (Chan et al., 2021). Teachers in our study desired training in TIPS, needed their administrators’ support, wanted their administrators to engage them as people, and requested that their time be valued and prioritized. This was further supported by the fact that, even though Valley School District neither initiated the partnership nor participated in the needs assessment, they still found the content to be relevant and matched to their needs.

Our process also identified features of TIPS training that required adaptation to meet the needs of each participating school district and the TEP. Specifically, advisory committees provided key input about logistical and implementation features such as participant recruitment and incentives, as well as the timing, length, and number and type of modules offered. One district was new to TIPS and wanted to move slowly and carefully into the transformation (i.e., Mountain School District), whereas the other school district wanted to quickly provide this training as a booster on a topic they had received training on in the past (i.e., Valley School District). Prior knowledge of and experience with TIPS was thus another district-level factor we identified that should be considered in adapting training. Overall, these results are consistent with the finding that TIPS training and support benefit from the participation of administrators, teachers, parents, and students in an iterative process that is responsive to unique and evolving school community needs (Davis et al., 2020).

Overall, our pilot implementation phase (Phase 3) demonstrated that our TIPS training was well received by administrators, teachers, and teacher candidates. Process evaluation results across administrator, teacher, and teacher candidate trainings revealed strong, positive ratings of the quality of the training and trainers, the usefulness of the training, and the appeal and interest of the content. There was variability in the ratings of having sufficient time, with teacher candidates rating that lower than administrators and teachers.

For outcome evaluation results, across the three training types there was significant improvement in most indicators. For administrators, most aspects of self-reported knowledge of trauma improved, other than identifying risk factors. Specifically, administrators improved in their perception that they could support trauma-affected students and recognize common symptoms. There were statistically significant increases in how knowledgeable administrators felt, how proactive in TIPS implementation they perceived themselves to be, and their perceived ability to persevere in implementation. However, there was no change in administrators’ perception of being supportive of employee implementation. Regarding changes in administrators’ leadership self-efficacy, there was a significant increase in administrators’ self-reported ability to create a positive and safe learning environment. There were no changes in perceived ability to develop clear and achievable goals for the school, guide teachers, motivate teachers, or develop a collective school culture, suggesting that more time and attention may be needed in these areas.

For teachers, self-reported knowledge of trauma increased in all areas. In addition, their perceived ability to overcome challenges in implementing trauma-informed practices in their classrooms improved, as indicated by their increased self-ratings of knowledge about implementing TIPS and their ability to persevere in implementing TIPS. There was not a significant difference in their self-reports of whether they could create a safe school environment, which is something more systemic and less under their direct control. They also already felt fairly confident in their ability to contribute to this. The teacher candidate outcome evaluation data were very similar to the teacher outcomes, with the exception that the teacher candidates felt more confident in their ability to create a safe school environment at posttest compared to pretest. As they are learning to be teachers, there likely was more room for improvement.

Overall, these results are consistent with prior research finding that TIPS training can improve the attitudes, knowledge, and self-assessed skills of teacher candidates (Brown et al., 2022) and continue to build evidence for the acceptability, feasibility, knowledge gain, and confidence building of TIPS training for educators. Although the literature is clear that administrative support is key for successful implementation of new initiatives (e.g., McCluskey et al., 2008), this study is novel in including an administrator training designed to teach the unique skills administrators need to support their schools and teachers in practicing TIPS.

Strengths and Limitations

This study documented how university–community partnerships can facilitate TIPS training for teachers and administrators despite the myriad complexities of and barriers to school systems change. We were fortunate that a local school district approached the university with a funder to support a rigorous approach to addressing TIPS in their schools; recruiting schools to engage in such intensive work would otherwise be much more challenging. Working with district advisory committees and starting with a needs assessment to guide curriculum development were notable strengths. Results provide a road map for doing this work in other school districts, with evidence for the feasibility, acceptability, and potential impact of TIPS implementation.

Naturally, there were several limitations of this research. The focus on three elementary school districts and one TEP in a single geographic area limits the generalizability of results to other communities. However, these trainings should be tailored to the needs of the local community to some degree, and it is hoped that this summary of the process can serve as a framework for other communities interested in this work. Of additional concern is the attrition of Ocean School District after they initiated the project and participated in the needs assessment phase, especially as they serve the highest proportion of students receiving free and reduced-price lunch. This is not uncommon in community implementation in complex service systems, and others interested in doing this work should consider and prepare for this possibility. New leaders in Ocean School District have expressed a strong desire to engage in TIPS, but they had multiple initiatives they felt were more urgent and pressing that required their administrators’ and teachers’ extremely limited professional development time. An additional key limitation was the challenge of tracking participants over time. For this pilot phase, we were unable to establish an effective protocol for linking participant identities over time; thus, we are only able to compare group mean differences. In addition, we did not have a control group in this pilot phase of the project. A randomized controlled trial is the gold standard for evaluating the efficacy of a program; the challenge is randomization at the school or district level, given that schools and districts in this community vary considerably from one another. Next steps include a wait-list control trial. Finally, due to advisory committee input and the reality of limited time, we used single-item measures of constructs rather than validated scales in some cases.

Lessons Learned

Several lessons have been learned throughout the project. First, we recognized that participants bring their own unique experiences to the training. We realized that, while variation in types of student examples may reach saturation over several training administrations, there may always be unique contexts and situations that require adaptation. Thus, we concluded that a manualized, pre-recorded, or remote curriculum would likely not be as effective as an in-person, tailored, and responsive training. A hybrid option might also be effective, where a core training is delivered through a variety of potential modes with an in-person opportunity for tailored content and consultation.

Feedback from participants regarding the need for support in implementing what they learned reinforced evidence from professional development literature that a one-and-done training is insufficient to elicit change (Hunzicker, 2011). Even though our eight-module training offered over the course of a year was likely more effective than a one- or two-day training covering the same material all at once, we recognized that additional booster sessions, the availability of group and individual consultation, and assistance with tier two and three school-based supports would likely be needed, depending on how well districts are already providing multi-tiered TIPS.

Finally, we found that teacher professional development time is a limited resource with few options for delivery: in the few days of summer or before school starts, during regular but limited staff meetings, or by missing parts of the school day covered by a substitute teacher. Thus, university personnel and funders must be flexible and creative in working with advisory committees to identify available teacher time in the midst of competing priorities. Paying teachers for their time is one solution, although not all teachers have the bandwidth or time to work additional hours despite their interest. We imagine a future where preservice teachers receive thorough training in TIPS so that professional development can build from a strong foundation.

Future Directions

Future directions in the field of TIPS identified by Phifer and Hull (2016) include developing and maintaining comprehensive professional development opportunities and determining what helps teachers implement and sustain the practices. They also recommended that, in order to establish multi-tiered provision of TIPS, schools provide professional development to all educators, parents, and students at tier 1; provide expert consultation at tier 2; and have trained mental health professionals provide evidence-based treatments at tier 3. Accordingly, next steps in this project are to work with advisory committees to build and test districts’ internal capacity for TIPS training and development, to scale up the training with a more rigorous multi-perspective outcome evaluation, to add additional features for sustaining what participants learn, and to ensure that existing mental health staff (e.g., school psychologists, school counselors) have in-depth training in TIPS to provide consultation and individualized support.

A primary goal of this university–community partnership has been to provide ample training to participating schools so that they can sustain TIPS without requiring university expertise and foundation funding. Thus, in collaboration with district advisory committees, we are currently developing Phase 4: train-the-trainers for district mental health providers. In this next phase, we plan to engage district mental health providers to build upon their knowledge of TIPS and develop their expertise to the degree needed to provide TIPS training and support. At the same time, we will also develop Phase 5: scale-up with multi-perspective outcome data and consultation support for teachers implementing TIPS. In this phase, we plan to engage the broader school community (i.e., students and parents) as advisory committee members and participants in TIPS initiatives and research. Including student and parent voice, in addition to the perspectives of administrators and teachers, will be key to the broader success of TIPS work. We also plan to examine school-level outcomes including attendance, discipline, and academic performance. This would be paired with consultation to improve the practice of TIPS within Tier 2 and Tier 3 supports that are already in place and with the development of a model of consultation (e.g., Mayworm et al., 2016) through which teachers can obtain support for their Tier 1 TIPS practices from the mental health practitioners in their schools.

Conclusion

There has been a growing number of trauma-related resources for schools available online in recent years, but the actual uptake and usefulness of these resources is unknown. Until state departments of education change training standards to require attention to TIPS in teacher education and practice, these resources will reach only schools that recognize the importance of TIPS and devote advisory committee attention and limited professional development time to it. Even for districts that are eager to participate, it is likely that intensive university–community–school partnerships will be needed to support the school climate and practices required to successfully implement TIPS. Online resources are a great start, but educator questions and needs quickly move beyond the general principles discussed in online curricula to the specific challenges they face on a daily basis, which can be better addressed by experts, in person. Rigorous research focused on the implementation, sustainment, and outcomes of TIPS is needed to understand the full impact of TIPS and drive forward these necessary policy and praxis changes.