Background

Regular participation in physical activity is essential for young people’s physical, psychological, emotional, and cognitive health [1]. However, only 27% to 33% of children and adolescents meet the recommended 60 min of moderate to vigorous physical activity per day across the globe [2]. Physical activity begins to decline during childhood and continues throughout adolescence [3, 4]. Although some of the decline in physical activity may have a biological basis, increased academic and work commitments (i.e., lack of time), low perceived competence, and lack of interest and support from peers have been identified as barriers to participation among adolescents [5, 6]. Schools are internationally recognized as key settings for promoting physical activity, given many children and adolescents attend school for a substantial portion of their time [7]. In addition, most education systems have policies and curricula that mandate physical activity opportunities for young people during school hours. Schools also have qualified personnel (i.e., teachers and support staff) responsible for supporting the education, health, and well-being of young people.

Despite their potential, school-based physical activity interventions have had limited effect on young people’s objectively measured physical activity [8,9,10,11,12,13]. For example, a recent individual participant pooled meta-analysis of randomized controlled trials found that school-based interventions led to increases of 1.5 min/day of vigorous-intensity and 1.3 min/day of moderate-intensity physical activity [12]. Jago and colleagues recently suggested that the failure to consider important school contextual factors (e.g., school setting, ethos, staff, and sociodemographic factors) has contributed to the small effects [14, 15]. Poor implementation of physical activity programs and policies by teachers and other school staff has been offered as another reason for the limited effects [16].

Teachers (i.e., generalist and specialist physical education) are recognized as ‘key agents of change’ responsible for the implementation of school-based physical activity interventions [17,18,19]. Considering their frontline position in implementing physical activity programs and policies in primary and secondary school settings, with a range of related tasks (e.g., designing physical activity curricula, organizing sports activities, or coordinating active breaks during class time), there is an urgent need to consider the barriers and facilitators teachers experience in the delivery of interventions. Naylor and colleagues conducted a systematic review of the factors influencing the implementation of school-based physical activity interventions and found that ‘time’ was the most commonly cited barrier [20]. Other influencing factors were resource availability and quality (e.g., activity resources, personnel, facilities) and a supportive school climate (e.g., shared vision and administrative support) [20]. Using the Theoretical Domains Framework as a guide, Nathan and colleagues also reviewed the barriers and facilitators that influence the implementation of physical activity policies in schools. Their review of 17 studies found the most commonly reported domains were ‘environmental context and resources’ (e.g., availability of equipment, time, or staff), ‘social influences’ (e.g., support from school executives), ‘goals’ (e.g., perceived priority of the physical activity policy), and ‘skills’ (e.g., teachers’ capability to implement the policy) [21]. In summary, the most commonly reported barriers to the implementation of physical activity programs and policies in schools include inadequate teacher training, time constraints, lack of motivation, and low perceived priority. Failure to consider these factors (i.e., determinants of implementation) in the co-creation and feasibility stages may help explain the modest effects of previous school-based interventions.

Given the multiple challenges experienced by teachers, there is a need to identify and evaluate the impact of school-based implementation support strategies (i.e., methods used to enhance the adoption and implementation of interventions) [22, 23]. Previous reviews have examined the effect of staff professional development within school-based physical activity interventions [24] and the specific features associated with intervention fidelity and student physical activity [25]. Lander and colleagues [24] found that teacher professional development sessions lasting one day or more, delivered using multiple formats, and including subject and pedagogical content were more effective. More recently, Ryan and colleagues [25] demonstrated that the use of behavior change techniques informed by the COM-B model, such as ‘Action planning’ and ‘Feedback on the behavior’, was associated with better implementation and increases in children’s physical activity.

Although previous studies have attempted to examine the impact of implementation strategies on the key determinants of teachers’ implementation of physical activity, most have relied on unvalidated tools (i.e., designed specifically for their study) [20, 26]. There are more than 60 implementation theories, models, and frameworks [27, 28]. We selected the Capability, Opportunity, and Motivation Behavior (COM-B) model for this study because it offers a robust framework for understanding behavior and has proven utility in guiding interventions [17, 29]. Moreover, the COM-B model is now included in the ‘Individuals’ domain of the updated ‘Consolidated Framework for Implementation Research (CFIR)’, which is one of the most highly cited frameworks in implementation science [30]. Utilizing the COM-B model to assess teachers' capability, opportunity, and motivation to implement physical activity interventions within schools may offer insights into teacher-level determinants of implementation, and the way in which these may impact implementation of interventions. Such insights are essential for informing the development and evaluation of teacher-delivered physical activity interventions. Therefore, the aim of our study was to develop and evaluate a brief tool for assessing teachers' capability, opportunity, and motivation to implement physical activity programs and policies in schools. The tool was designed to be adaptable, making it appropriate for the evaluation of different physical activity programs and policies in primary and secondary school settings.

Methods

Our study involved three research phases (see Fig. 1). In Phase 1, we explored items for the Capability, Opportunity, and Motivation to deliver Physical Activity in School Scale (COM-PASS) through a Delphi study with academic experts. In Phase 2, we assessed how teachers interpreted the COM-PASS items, and refined the tool using 'think-aloud' interviews with primary and secondary school teachers. In Phase 3, we explored the structural validity of the COM-PASS scores using confirmatory factor analyses (CFA) and structural equation modelling. The COM-PASS was designed to assess teachers’ capability, opportunity, and motivation to deliver specific physical activity interventions (i.e., programs or policies). The measure was not designed to assess teachers’ general capability, opportunity, and motivation to promote physical activity in school. Ethics approval was obtained from the University of Newcastle Human Research Ethics Committee (H-2021–0418) and the New South Wales Department of Education (State Education Research Application Process (SERAP): 2022215).

Fig. 1 Phases of the development of the COM-PASS

Phase 1: Delphi study—scale development and content validity assessment

The aim of the first phase was to develop items for the COM-PASS and assess content validity using a Delphi study approach [31]. International academic experts (n = 45) who were first or senior author on a peer-reviewed school-based physical activity intervention in the last five years were invited to review the COM-PASS tool by completing three review rounds of a 20-min (per round) online survey using the QuestionPro software [32]. The first version of the tool (round 1) included 13 items and was based on items developed by Keyworth et al. [33] using the COM-B model [29] (see Supplementary File 1).

Two researchers (A.V. and D.R.L.) adapted the scale developed by Keyworth et al. [33] for physical activity promotion in the school setting. Academic experts were then asked to rate each item on the degree to which it matched the definition of the six COM-B model constructs: (i) physical capability, (ii) psychological capability, (iii) physical opportunity, (iv) social opportunity, (v) reflective motivation, and (vi) automatic motivation [29], using a 5-point scale ranging from ‘1 = Poor match’ to ‘5 = Excellent match’. The survey included space for experts to make amendments and provide suggestions. Academic experts who accepted the invitation were requested to complete their feedback within two weeks. A reminder was sent to experts who had not completed the survey after the given time, and extra time was given if requested. A.V. and D.R.L. reviewed the feedback after each round and amended the items accordingly, until each item reached an average rating of 4.50 out of 5 or higher. Prior Delphi studies have utilized consensus thresholds ranging from 55 to 100% [31, 34]; however, because our COM-PASS items were grounded in the existing COM-B constructs, we used a consensus threshold of ≥ 4.50 out of a total of 5. The total timeframe of the Delphi study was eight months (November 2022 to July 2023).
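The consensus rule described above can be sketched in a few lines; the item names and expert ratings below are illustrative only, not the study's actual Delphi data:

```python
# Flag Delphi items whose mean expert rating falls below the
# consensus threshold of 4.50 (on the 5-point 'match' scale).
# Item IDs and ratings are hypothetical examples.

THRESHOLD = 4.50

def items_below_threshold(ratings):
    """Return the IDs of items whose mean rating is below THRESHOLD."""
    return [item for item, scores in ratings.items()
            if sum(scores) / len(scores) < THRESHOLD]

round_ratings = {
    "Q1": [5, 4, 5, 5],  # mean 4.75 -> consensus reached
    "Q2": [4, 4, 5, 4],  # mean 4.25 -> flagged for revision
}

print(items_below_threshold(round_ratings))  # ['Q2']
```

Items flagged in this way would be amended and carried into the next review round, as described above.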

Phase 2: Teacher interviews—teachers’ interpretation assessment

In Phase 2, we recruited primary (n = 5) and secondary (n = 5) school teachers currently teaching in Australia via convenience sampling within our networks. The main aim of this phase was to evaluate how teachers understood and interpreted the COM-PASS items. Seeking input from members of the target population can offer valuable insights into both content relevance and representativeness [35, 36] and substantive aspects of validity [35]. We discussed the second version of the COM-PASS (i.e., after processing expert feedback on the first version) using a modified ‘think-aloud’ interview protocol [37,38,39] to further refine and pre-test the initial 17 items and their response options (a 5-point Likert scale ranging from ‘1 = Strongly disagree’ to ‘5 = Strongly agree’).

Primary and secondary teachers completed a 20-min interview online (n = 8) or face-to-face (n = 2) with one author (A.V.). All interviews were audio and/or video recorded after obtaining consent. The teachers were instructed to read each COM-PASS item out loud and answer, for each item separately, the question ‘What, in your own words, does the question mean to you?’. Subsequently, the participants answered the following questions regarding the overall tool: (a) ‘Did the answer choices include your answer?’, (b) ‘Did you understand how to answer the questions?’, (c) ‘Did the questionnaire leave anything out you felt was important?’ [37, 38], and (d) ‘Do you have any other comments?’. The interview script and the COM-PASS items used for this assessment can be found in Supplementary File 2. All interviews were transcribed (A.V.), reviewed (A.V. and D.R.L.), and amended accordingly (presented in Table 2 in the Results section). We used a constant comparison approach [40] to identify sentences and phrases in which teachers raised concerns regarding one or more items, focusing on problematic and alternative interpretations of items. Participants received a 20-dollar (AUD) gift voucher to acknowledge their contribution. Detailed transcripts were attached to the email invitation sent to the academic experts for their second review of the COM-PASS, in which they evaluated the extent to which the items matched the COM-B constructs (Phase 1: Delphi study, round 2).

Phase 3: Structural validity assessment

In Phase 3, we explored the structural validity of scores derived from the COM-PASS in a sample of primary and secondary school teachers distinct from that in Phase 2 [35, 41]. Participants were recruited using convenience sampling. First, we recruited teachers attending two Australian teacher physical education conferences (i.e., the Personal Development, Health and Physical Education Conference in New South Wales and the Australian Council for Health, Physical Education and Recreation Conference in Victoria). Second, we sent email invitations to our network of teachers in Australia, Germany, and the United Kingdom. Finally, we invited teachers from an ongoing implementation-effectiveness trial of the Australian Resistance Training for Teens program [42].

The COM-PASS items were included in a brief 10-min survey that included a 3-min video describing the Resistance Training for Teens (RT4T) program [42]. Teachers were asked to use RT4T as a reference when completing the COM-PASS items. We used CFA to explore structural validity because the COM-PASS tool was developed using the COM-B model [43]. We conducted analyses using IBM SPSS AMOS 29.0 software [44] and report the following fit indices: i) the comparative fit index (CFI) [45], ii) the Tucker-Lewis index (TLI) [46], and iii) the root mean square error of approximation (RMSEA) [47]. The CFI and TLI compare the fit of a hypothesized model with that of a baseline (worst-fitting) model [48], while the RMSEA assesses how far a hypothesized model is from a perfect model. Hu and Bentler suggest that CFI and TLI values larger than 0.95 and an RMSEA value smaller than 0.06 indicate relatively good model fit to the observed data [45]. Our CFA included correlated residuals, as failing to correlate residuals may lead to parameter bias [49]. Additionally, Cronbach alphas were calculated to evaluate the measurement reliability of the separate capability, opportunity, and motivation constructs. Missing data were handled by the item mean substitution method, in which the mean item score was substituted for each missing value of a particular item; this has been identified as an appropriate approach when 20% or fewer of the items on a scale are missing [50]. The readability of the final tool was assessed using the Flesch Reading Ease Score to indicate its suitability for use with teachers, using a 100-point scale ranging from ‘0 = Very difficult’ to ‘100 = Very easy’ [51].
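Two of the statistics described above can be computed directly from their standard formulas. The sketch below is illustrative only (the study's analyses were run in AMOS): it implements the RMSEA from the model chi-square, and Cronbach's alpha from a respondents-by-items score matrix; the alpha input data are hypothetical.

```python
import math

def rmsea(chi_sq, df, n):
    """Root mean square error of approximation:
    sqrt(max(chi^2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
    using sample variances (ddof = 1) throughout."""
    k = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in items]) for i in range(k)]
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# The fit reported in Phase 3 (chi^2 = 122.6, df = 66, N = 196)
# reproduces the published RMSEA of 0.066:
print(round(rmsea(122.6, 66, 196), 3))  # 0.066
```

The CFI and TLI additionally require the chi-square of the baseline (null) model, which AMOS reports but which is not reproduced here.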

Results

Phase 1: Delphi study – scale development and content validity assessment

Three ranking review rounds were completed by 38 academic experts (84.4% response rate). Items received an average score of 4.04 in the first round, 4.51 in the second round, and 4.78 in the third (final) round. The third round was deemed the final version, as all but one item received an average score of ≥ 4.50 (see Table 1). Although one item (Q14) scored slightly below our chosen threshold at 4.45, we decided to retain it after careful consideration of the comments received.

Table 1 Results of the third ranking Delphi round of the COM-PASS by academic experts

Phase 2: Teacher interviews—teachers’ interpretation assessment

We conducted interviews with primary (n = 5) and secondary (n = 5) school teachers (approximately 20 min in duration) to assess their interpretation of the COM-PASS. The second version of the COM-PASS (i.e., after review round 1 was completed by the experts) was used for this phase so that any amendments could be approved by the academic experts in the following review round. Teachers’ interpretations aligned well with the intended meaning of all COM-PASS items based on the COM-B model [29]. All teachers answered ‘yes’ to the question ‘Did the answer choices include your answer?’, and half of the teachers commented that the tool and answer options were clear and easy to answer. The question ‘Did you understand how to answer the questions?’ was answered with ‘yes’ by all teachers. Regarding the question ‘Did the questionnaire leave anything out you felt was important?’, all teachers mentioned nothing was left out, except for one teacher who suggested adding the question ‘How easy did you find it to use the program materials/resources?’, as they had experienced challenges with a program application for tablets in their school and could not use it as much as they wanted due to technical issues. This item was added to the revised version of the COM-PASS (round 2) and reviewed by the academic experts to ensure the item was representative of the construct. However, the item was subsequently removed based on a low score of 4.03, comments received from the academic experts (e.g., that the item fits better in a process evaluation), and discussions among the authors. Teachers had no further comments in response to the question ‘Do you have any other comments?’, and half of the teachers expressed appreciation for the tool and described the COM-PASS as a clear questionnaire.

As a result of the three review rounds by the experts (Phase 1) and the ‘think-aloud’ interviews with teachers (Phase 2), the COM-PASS tool was refined three times: concerns raised by experts and teachers were discussed (A.V. and D.R.L.) and actions were taken accordingly (see Table 2). Changes to the final tool included the amendment of examples in five questions to provide greater clarity, the addition of four items, the removal of three items, the rewording of two items, and the merging of two items.

Table 2 Findings from the ‘think-aloud’ teacher interviews and expert reviewers

Phase 3: Structural validity assessment

In Phase 3, the final version of the COM-PASS was completed online by 196 teachers [male n = 100 (51%), female n = 96 (49%); primary n = 44 (22%), secondary n = 152 (78%); Australian n = 155 (79%), German n = 10 (5%), British n = 31 (16%)] (see Table 3). Teachers used the Resistance Training for Teens program as a reference when completing the scale [42]. Three missing values (0.1% of total responses) were replaced by the mean value of the respective item. Internal consistency was confirmed for all constructs (i.e., capability: α = 0.75, opportunity: α = 0.75, motivation: α = 0.81). Supplementary File 3 presents the correlations among the COM-PASS items and the descriptive statistics (i.e., mean (M), standard deviation (SD), minimum, maximum, and sample size). The final version of the COM-PASS obtained a Flesch Reading Ease Score of 54.6, equivalent to a reading level of 10th to 12th grade of high school [51]. Figure 2 presents an overview of the CFA, conducted using IBM SPSS AMOS 29 Graphics software [44], with the three-factor model containing the factors capability, opportunity, and motivation. Findings from the CFA with the three components aligned with the COM-B model constructs (i.e., capability, opportunity, and motivation) demonstrated adequate fit (χ2 = 122.6, df = 66, p < 0.001, CFI = 0.945, TLI = 0.924, RMSEA = 0.066), and standardized factor loadings ranged from 0.43 to 0.80. A final version of the COM-PASS, including answer options on a 5-point Likert scale anchored by 1 (Strongly disagree) and 5 (Strongly agree), can be found in Appendix 1.
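The Flesch Reading Ease Score reported above follows the standard formula, FRE = 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). A minimal sketch of the calculation is shown below; the word, sentence, and syllable counts are illustrative, not those of the COM-PASS text (syllable counting itself is non-trivial and is taken as given here):

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease on a 0 (very difficult) to 100 (very easy)
    scale; scores in the 50-60 band correspond roughly to a
    10th-12th grade reading level."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Illustrative counts only (100 words, 5 sentences, 150 syllables):
print(round(flesch_reading_ease(100, 5, 150), 3))  # 59.635
```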

Table 3 Internal consistency of the final COM-PASS items and constructs

Fig. 2 Standardized factor loadings and inter-factor correlations from the COM-PASS confirmatory factor analysis

Discussion

The aim of our study was to develop and evaluate a brief tool for assessing teachers’ capability, opportunity, and motivation to deliver physical activity programs and policies in schools. Our findings provide preliminary support for the internal consistency and structural validity of scores derived from the COM-PASS in primary and secondary school teachers. The measure was designed to evaluate the effects of implementation support strategies in school-based physical activity interventions in efficacy, effectiveness, and dissemination studies. The COM-PASS may also have utility for evaluating the effects of pre-service (university undergraduate students) and in-service (current teachers) professional learning courses focused on physical activity promotion in schools.

It has been suggested that teacher professional development to support the delivery of school-based physical activity interventions should be informed by relevant theory and include evidence-based behavior change techniques [25]. However, prior to our study, we were not aware of any validated measures designed to assess teachers’ capability, opportunity, and motivation to deliver physical activity programs in schools. Importantly, our brief measure has been designed to evaluate different physical activity programs and policies across the research translation pathway (i.e., from feasibility to dissemination). McKay and colleagues [28] recently proposed a minimum set of implementation outcomes (i.e., adoption, dose delivered, reach, fidelity, and sustainability) and determinants (i.e., context, acceptability, adaptability, feasibility, compatibility, cost, culture, dose, complexity, and self-efficacy) for the evaluation of physical activity interventions delivered at scale. The COM-PASS overlaps with some of the determinants outlined by McKay and colleagues (e.g., self-efficacy), but is focused at the teacher level, as teachers are largely responsible for the delivery of physical activity interventions in schools. In addition, the COM-PASS has been designed for use in feasibility, efficacy, and effectiveness trials.

The COM-PASS has good content and structural validity and is considered appropriate by teachers. Positive feedback from teachers highlighted the user-friendly nature of the tool [52], which had a Flesch Reading Ease Score of 54.6 (i.e., a reading level of 10th to 12th grade of high school) [51]. All but one of the final items were scored ≥ 4.50 by academic experts, indicating that the COM-PASS items are well aligned with the COM-B model [29]. Findings from our CFA suggest that scores derived from the COM-PASS fit a three-factor model aligned with the COM-B model (i.e., capability, opportunity, and motivation). Moreover, our Cronbach alpha results suggest that the three sub-scales have acceptable internal consistency (α > 0.70). Although our measure included items aligned with the six COM-B constructs (i.e., physical capability, psychological capability, physical opportunity, social opportunity, reflective motivation, and automatic motivation), we opted for a more parsimonious three-factor solution. Previous studies have identified an inverse association between questionnaire length and response rate [53], and researchers often encounter difficulties in persuading teachers to complete follow-up surveys in school-based research. This is especially true in large-scale dissemination studies, which have lower response rates than feasibility, efficacy, and effectiveness trials [54,55,56].

Teachers play an important role in the delivery of school-based physical activity interventions, but few studies have examined the impact of implementation support strategies on teacher-level determinants (e.g., feasibility, acceptability, and capability). Ryan and colleagues [25] found evidence to support the use of the behavior change techniques ‘Action planning’ and ‘Feedback on behavior’ in staff training to increase students’ physical activity. However, the authors noted a lack of thorough reporting on the implementation of school-based physical activity interventions and highlighted the need for valid and reliable tools [25]. As such, there is a need for pragmatic measures that are feasible to use in real-world settings, such as schools [57]. The COM-PASS addresses this shortfall and may have utility for measuring the impact of implementation support strategies on teachers’ capability, opportunity, and motivation to deliver physical activity programs and policies in schools.

Future research

As noted by Beets and colleagues [58] in their Theory of Expanded, Extended and Enhanced Opportunities for youth physical activity, teachers are largely responsible for the effects of school-based physical activity interventions by creating new opportunities for students to be active at school (expanding), making existing opportunities longer (extending), and making the most of existing opportunities (enhancing). We encourage researchers to use the COM-PASS to explore the role of teachers’ capability, opportunity, and motivation as mediators of intervention effects on students’ physical activity levels. We also encourage researchers to conduct further validation studies of the COM-PASS in diverse samples of primary and secondary school teachers. For example, future studies should examine the test–retest reliability and responsiveness of the COM-PASS. There is also a need for further studies to examine the appropriateness of the tool when adapted for the evaluation of different physical activity programs and policies.

Strengths and limitations

A notable strength of this study is the involvement of academic experts and teachers in developing a pragmatic tool. In addition, our measure was developed using the COM-B model, which has been identified as an appropriate framework for assessing and guiding physical activity interventions [17, 29]. However, there are some limitations that should be noted. First, most of the participants in Phase 3 (i.e., factorial validity) were Australian secondary school teachers. Further studies examining the factorial validity of the COM-PASS in primary and secondary teachers across the globe are needed. Second, the sample size in our factorial validity study was below the 250-participant threshold recommended for confirmatory factor analyses [45]. It is important to note that our study was conducted during the post-COVID-19 period, when schools and teachers were experiencing high levels of disruption and absenteeism [59]. Despite these limitations, our findings provide preliminary evidence for the content and structural validity of the COM-PASS.

Conclusions

The development and evaluation of the COM-PASS tool represents an important step towards bridging the gap between research and practice in school-based physical activity research. Our research has shown that the COM-PASS has good content validity, internal consistency, and structural validity. We have also demonstrated that the measure is considered appropriate by teachers. We developed the COM-PASS to help researchers navigate the design, evaluation, and dissemination of school-based physical activity interventions. The tool may also have utility in university and school settings for evaluating the effects of physical activity courses for preservice and in-service teachers. The COM-PASS is free to use and is available upon request from the corresponding author.