The Fidelity of Training in Behaviour Change Techniques to Intervention Design in a National Diabetes Prevention Programme

Abstract

Background

The National Health Service Diabetes Prevention Programme (NHS-DPP) is a behavioural intervention for people identified as high risk for developing type 2 diabetes that has been rolled out across England. The present study evaluates whether the four commercial providers of the NHS-DPP train staff to deliver behaviour change technique (BCT) content with fidelity to intervention plans.

Method

One set of mandatory training courses across the four NHS-DPP providers (seven courses across 13 days) was audio-recorded, and all additional training materials used were collected. Recordings and training materials were coded for BCT content using the BCT Taxonomy v1. The BCTs and the depth of training of BCT content (e.g. instruction, demonstration, practice) were checked against providers' intervention plans.

Results

Ten trainers and 78 trainees were observed, and 12 documents examined. The number of unique BCTs in audio recordings and associated training materials ranged from 19 to 44 across providers, and staff were trained in 53 unique BCTs across the whole NHS-DPP. Staff were trained in 66% of BCTs that were in intervention plans, though two providers trained staff in approximately half of BCTs to be delivered. The most common way that staff were trained in BCT delivery was through instruction. Training delivery style (e.g. experiential versus educational) varied between providers.

Conclusion

The observed training shows dilution from providers' intervention plans. NHS-DPP providers should review their training to ensure staff are trained in all key intervention components, and should train BCTs thoroughly (e.g. demonstrating and practicing how to deliver them) to enhance BCT delivery.

Introduction

In 2004, the World Health Organization reported that the worldwide prevalence of type 2 diabetes had increased to 422 million people [1]. In the United Kingdom (UK), 3.8 million adults were reported to have the condition in 2015, with a projected increase to 4.9 million by 2025 [2]. In response to this, the National Health Service Diabetes Prevention Programme (NHS-DPP) was launched by Public Health England in 2016. This is a behavioural intervention for adults with elevated blood glucose levels, or non-diabetic hyperglycaemia, to slow or stop their progression to type 2 diabetes [3]. The NHS-DPP is one of several diabetes prevention programmes internationally, which have prevented progression from non-diabetic hyperglycaemia to type 2 diabetes in attendees [4, 5]. Early results suggest that the NHS-DPP has been effective in achieving these outcomes [6]. During the first three waves of implementation of the programme in 2016–2019, the NHS-DPP was delivered by four independent provider organisations who each secured contracts to deliver the service in localities across England [7].

The main aims of the programme were to bring about weight loss and reduced blood glucose levels through delivery of behaviour change techniques (BCTs) to change dietary and physical activity behaviours [8]. BCTs are the ‘active ingredients’ of interventions that produce behavioural change in individuals [9]. Nineteen core BCTs were stipulated for inclusion in diabetes prevention programmes by an evidence review underpinning the NHS-DPP intervention [25]. The majority of these BCTs were designed to improve self-regulation of behaviours [10], through prompting goal setting, action planning, problem solving and self-monitoring. NHS England specified that staff employed by providers should be sufficiently trained in the delivery of the service and behaviour change content [8].

A fidelity assessment of the NHS-DPP is critical to establish reasons for its effectiveness, or lack thereof, and whether its benefits are comparable to the published trials in reducing the onset of type 2 diabetes. Intervention fidelity describes whether an intervention was delivered as intended [11]. Without adequate assessment of fidelity, it cannot be ascertained whether intervention effectiveness is due to the intervention being implemented as planned, or if it is due to other factors added to or omitted from the intervention [12]. The National Institutes of Health Behaviour Change Consortium (NIH-BCC) model defines five domains for assessing fidelity at each stage of an intervention. These are study design (whether the planned intervention is in line with underlying theory), provider training (whether deliverers are trained in key components of the planned intervention), treatment delivery (whether the intervention's key components are actually delivered), treatment receipt (whether recipients understand the intervention) and treatment enactment (whether recipients incorporate the key components of the intervention in their day-to-day lives) [11].

Previous fidelity assessments have mainly focused on fidelity of delivery. For example, in trials of complex healthcare interventions, treatment delivery is the most common component of fidelity measured, reported by 96% of surveyed researchers [13]. Further, in routine practice as opposed to research studies, fidelity evaluations of health interventions are less common, but still focus on fidelity of delivery (e.g. the NHS England Stop Smoking Service [14]). However, if the other domains of the NIH-BCC model are not accounted for, evaluators can be less confident in drawing accurate conclusions about what happened in an intervention and why. For example, if staff training for a programme is poor, delivery of the programme is also likely to be poor, which in turn is likely to lead to poor receipt of the intervention [15]. Without a fidelity assessment of training, evaluators will be less certain about how to enhance fidelity of delivery. For these reasons, researchers recommend encompassing the whole fidelity model and, where feasible, considering each of the five domains of fidelity [15].

To date, a handful of studies have assessed training fidelity in various settings, including dementia care [16, 17], ‘train the trainer’ model in suicide prevention [18] and a physical therapy intervention [19]. In relation to health-related behaviours specifically, a systematic review evaluated fidelity assessments in physical activity interventions and found only two studies reported assessment of training fidelity, but noted a lack of clear distinction between fidelity of training and fidelity of delivery [20]. Other work has looked at how to optimise staff training using fidelity strategies in interventions to increase physical activity in those with type 2 diabetes [21, 22], though this work did not include an assessment of training fidelity specifically.

Adequate training of intervention deliverers provides them with information about the theory underpinning the programme and the necessary skills required for the intervention, therefore ensuring competence across deliverers [23]. However, this includes not only providing intervention deliverers with adequate knowledge, but also showing them how to deliver key intervention features (‘showing how’) and allowing them to demonstrate these new skills by ‘doing’ [24]. Thus, when staff are being trained in delivering behaviour change content, as is the case in the NHS-DPP, this model [24] suggests that they should not only be told which BCTs to deliver, but also shown how to deliver BCTs for specific activities and given the opportunity to practice BCT delivery.

In line with the NIH-BCC guidance [11], the authors of the current paper have previously evaluated fidelity of providers’ planned BCTs to BCTs specified in the evidence base [8, 25]. This evaluation of the NHS-DPP study design found that providers planned to deliver 74% of the BCTs that an evidence review indicated [26]. There are no studies assessing fidelity of staff training in diabetes prevention programmes internationally, despite trials being implemented in multiple countries [4, 5, 27,28,29]. For the NHS-DPP in England, a formal fidelity assessment was not conducted in the previous evaluation of the pilot in 2015 or subsequent phased roll-out in 2016 [30]. Further, to the authors’ knowledge, there are no other formal fidelity assessments of staff training in multi-site public health interventions, and there are none that focus on training behaviour change techniques. A fidelity assessment of staff training of BCTs in the NHS-DPP is important because if lack of fidelity in the delivery of the programme is detected, it needs to be clear whether this is due to ineffective training or other contextual factors in the delivery of the intervention.

The aims of the current study were to (1) describe the behaviour change content of staff training across the four provider organisations delivering the NHS-DPP, (2) evaluate fidelity of BCT training to the four providers’ intervention plans, and (3) describe the depth of BCT training and the delivery style of staff training. This paper provides a unique and comprehensive evaluation of the fidelity of BCTs that staff were trained to deliver in the NHS-DPP compared to providers’ intervention plans.

Methods

Design and Participants

This was an observational study comparing the mandatory staff training from each of the four NHS-DPP providers to those providers' intervention plans. The four NHS-DPP providers were commercial organisations who each secured contracts to deliver the NHS-DPP in localities across England in 2018–2019.

Trainers were employed by each of the four providers or the intervention developers and delivered the training to newly appointed facilitators employed by each of the providers via face-to-face group training courses. Trainee facilitators were required to attend mandatory training courses on the group delivery of the NHS-DPP as part of their induction to the job role before they were allocated their own groups to deliver in the field. The appointed trainee facilitators were not health professionals, but they came from backgrounds including nutrition, personal training and public health.

Procedures

Researchers attended one set of mandatory staff training for each of the four providers between February and December 2018. Training courses were sampled according to when each NHS-DPP provider was recruiting new staff and delivering staff training during the evaluation period (2018–2019). The four provider training courses were observed in four different geographical areas. Written informed consent was obtained from participants prior to the training session starting and prior to researchers turning on the audio-recorder. Participants consented to researchers taking notes on the content of the training sessions.

An audio-recorder was placed next to the trainer at the front of the room to capture all training content, including BCTs, delivered during the training sessions, and a new file was used for each 30–120-min session throughout the day.

Researchers requested the pre-course reading materials supplied to trainees from the management staff employed by each of the providers with whom researchers were in contact. Such documentation was either sent via e-mail or hard copies were posted to the research team. The research received ethical approval by the North West – Greater Manchester East NHS Research Ethics Committee on 1 August 2017 (Reference: 17/NW/0426).

Materials

Documents detailing providers’ intervention plans were obtained prior to researchers observing the NHS-DPP staff training sessions. These consisted of the following for each provider:

(a) Framework response bids describing the proposed service delivery, which were submitted by providers during service procurement

(b) Programme manuals containing a session-by-session protocol for facilitators to follow when delivering the programme.

Assessment of training content consisted of the following for each provider:

(a) Audio-recordings of NHS-DPP staff training courses (n = 47 audio recordings captured across the 13 training days observed across all four providers)

(b) Additional researcher field notes written during each training session, capturing any other notable observations such as other training content covered (e.g. training of group facilitation behaviours, group management) and the delivery style of training (e.g. the use of educational materials, role play, the general rapport and interaction between trainers and trainees, and the types of discussions covered during the training)

(c) All pre-course reading materials distributed to trainees prior to each of the training courses, e.g. pre-training handbooks, journal articles

Initial assessments (a consultation which service users attended prior to being enrolled onto the NHS-DPP group sessions) were not part of the formal behaviour change intervention as they determined eligibility for the group sessions [8], and not all providers trained staff in how to deliver initial assessments (as sometimes it was sub-contracted out to another healthcare professional). Therefore, we did not include the initial assessment protocols within the main fidelity analysis, although sensitivity analyses were conducted and detailed in the results.

Analyses

BCT coding used the Behaviour Change Technique Taxonomy v1 (BCTTv1; [9]), defining 93 distinct BCTs. Coding was documented using author-developed data collection forms (see Electronic Supplementary Material 1 for the data collection checklist used in the staff training observations and Electronic Supplementary Material 2 for BCT coding instructions). Researchers underwent training in the use of the BCTTv1 [32] and a set of coding rules were developed through team discussions following guidance from taxonomy authors. Coding rules were based on those previously used to code providers’ intervention plans [26].

Intervention plans comprised each provider’s programme manuals and framework response bid combined, as these documents gave the most comprehensive description of the BCTs that providers planned to include in their programmes. BCTs identified in the NHS-DPP intervention plans are reported elsewhere [26]. These ‘design’ criteria were compared with the staff training of BCTs identified in the audio-recordings and associated training materials. Assessment of the BCTs in the NHS-DPP intervention plans demonstrated moderate to strong agreement between coders [31] (kappa values ranged from 0.75 to 0.88; [26]).

Researcher REH independently coded all training materials supplied by each provider, and the audio-recordings of each provider's staff training courses, for BCTs that staff were trained to deliver. A new instance of a BCT was coded when a new intervention activity was described or if a different health behaviour (e.g. diet, physical activity) was targeted. The level of target behaviour was also documented when coding the BCT 'information about health consequences' (e.g. levels of the target behaviour 'diet' included information about carbohydrates, fats, sugar, etc.) as researchers felt these were distinct pieces of information targeting distinct behaviours.

Researcher LMM double-coded 10% of the audio-recordings of staff training sessions (n = 5 audio recordings from training courses). Interrater reliability (IRR) was calculated using the kappa statistic to determine consistency between coders [31]. Identified coding discrepancies were discussed between REH, LMM and DPF until agreement was met.
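The interrater reliability calculation described above can be sketched in code. The following is a minimal, illustrative Python sketch, not the study's analysis code: it computes Cohen's kappa from two coders' presence/absence judgements across taxonomy items, and the rater data shown are hypothetical.

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders' categorical judgements
    (e.g. BCT coded present = 1 / absent = 0 for each taxonomy item)."""
    assert len(coder1) == len(coder2) and len(coder1) > 0
    n = len(coder1)
    # Observed agreement: proportion of items where coders match
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected chance agreement, from each coder's marginal frequencies
    freq1, freq2 = Counter(coder1), Counter(coder2)
    expected = sum(freq1[c] * freq2[c] for c in freq1.keys() | freq2.keys()) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical presence/absence codes for 10 BCTTv1 items in one session
rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 1, 0, 0, 0, 0, 1, 1, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))  # → 0.58
```

Kappa corrects raw percentage agreement for the agreement expected by chance, which is why it is preferred over simple agreement when coders apply a binary presence/absence scheme to many items.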

The depth of BCT training was also coded to capture the varying methods in which staff were trained in specific BCTs. One of the following labels was given to each of the BCTs coded in the face-to-face staff training sessions:

(a) Informed about BCT (e.g. background reading and providing background information about the BCT)

(b) Directed to deliver BCT (e.g. the BCT is mentioned or referred to when describing an intervention activity)

(c) Instructed how to deliver BCT (trainer delivers instruction on how the BCT can be delivered in group sessions)

(d) Demonstrated how to deliver BCT (trainer demonstrates the delivery of the BCT; e.g. demonstrates how to conduct a problem solving activity)

(e) Practiced how to deliver BCT (trainees practice delivering the BCT, e.g. trainees practice delivering a problem solving activity)

(f) Modelled how to deliver BCT (the training delivery models the intervention delivery so trainees could experience the BCT from the patients' perspective, e.g. role play of a problem solving activity so trainees can experience participating in this activity)

To assess the extent of staff training fidelity to intervention plans, the BCTs in training courses and pre-course reading were compared to the BCTs present in each of the four providers' intervention plans (programme manuals and framework response documents). The proportion of additional BCTs that staff were trained in but that were not specified in providers' intervention plans was also calculated.
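The fidelity comparison described above amounts to a set comparison between planned and trained BCTs. As a minimal, illustrative Python sketch (not the study's analysis code, and using hypothetical BCT labels), it could be computed as follows:

```python
def training_fidelity(planned, trained):
    """Fidelity of training to plan: the proportion of planned BCTs that
    appeared in training, plus BCTs omitted from or added to training."""
    planned, trained = set(planned), set(trained)
    covered = planned & trained
    return {
        "fidelity_pct": round(100 * len(covered) / len(planned), 1),
        "omitted": sorted(planned - trained),     # planned but never trained
        "additional": sorted(trained - planned),  # trained but never planned
    }

# Hypothetical BCT labels, for illustration only
plan = {"1.1 Goal setting (behavior)", "1.2 Problem solving",
        "1.4 Action planning", "2.3 Self-monitoring of behaviour"}
train = {"1.1 Goal setting (behavior)", "1.2 Problem solving",
         "3.1 Social support (unspecified)"}
result = training_fidelity(plan, train)
print(result["fidelity_pct"])  # → 50.0 (2 of 4 planned BCTs trained)
```

Treating fidelity as set overlap on unique BCTs means the metric ignores how often or how deeply each BCT was trained; that dimension is captured separately by the depth-of-training coding described above.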

Results

Study Sample

Each provider had a different number of mandatory training courses that staff were required to attend, which lasted between 2 and 5 days depending on the provider delivering the training. The final sample of NHS-DPP staff training consisted of seven mandatory training courses (two training courses for provider A, three training courses for provider B, one each for providers C and D). All attending trainers (n = 10) and trainees (n = 78) consented to the researchers attending, observing and audio-recording the NHS-DPP staff training courses. Both trainers and trainees had a diversity of backgrounds (see Table 1).

Table 1 Number of mandatory training courses and participants consented for each provider

BCTs Present in Staff Training

Staff were trained in a total of 53 unique BCTs across the four providers. Each provider trained staff on between 19 and 44 unique BCTs in their face-to-face training and accompanying pre-course reading. Kappa values ranged from 0.61 to 0.80 for the staff training sessions, demonstrating moderate to strong agreement between coders [31], prior to resolving discrepancies (see Table S1 in Electronic Supplementary Material 3 displaying IRR values for providers’ intervention plans and Table S2 in Electronic Supplementary Material 3 displaying IRR values for the staff training sessions).

In the face-to-face training courses, staff were trained in the use of specified BCTs on 72, 207, 292 and 57 distinct occasions for providers A, B, C and D, respectively. For three of the providers, staff were most commonly trained in the BCT 'Information about health consequences', trained 23 times with provider A, 73 times with provider B and 45 times with provider C. For provider D, staff were most commonly trained in the BCT 'Social support (unspecified)', trained 11 times, followed by 'Information about health consequences', trained 10 times.

Overall, there were 11 BCTs in which all staff were trained across providers: action planning, behavioural practice, behavioural substitution, goal setting for behaviours, goal setting for outcomes, information about health consequences, information about emotional consequences, problem solving, self-monitoring of behaviours, self-monitoring of outcomes and unspecified social support (see Electronic Supplementary Material 4 for BCT definitions, according to the BCTTv1; [9]). Ten of these BCTs had been recommended for inclusion in intervention delivery in the NHS commissioning specification [8] or public health guidelines [25].

Fidelity of Trained BCTs to Intervention Design

The BCTs present in the staff training of each provider (pre-course reading and training courses) were compared to the planned BCTs in each providers’ intervention plans (framework responses and programme manuals; see Table 2). Sixty unique BCTs were specified in the intervention plan documents across all four providers. Providers B and C had the highest fidelity of BCTs delivered in staff training (81.6% and 85.1%, respectively), whereas providers A and D only trained staff in approximately half of BCTs they were planning to deliver (46.3% and 51.3%, respectively). Overall, fidelity of the intervention training to the planned intervention was 66.1%.

Table 2 Unique BCTs specified in providers' intervention plans compared to BCTs present in each provider's staff training

The number of BCTs omitted in staff training varied between providers; 22, 7, 7 and 19 BCTs were omitted in the staff training that were present in the intervention plans of providers A–D, respectively. There were also some additional BCTs delivered in the training which were not present in providers’ intervention plans, ranging from no additional BCTs in provider A’s training to five additional BCTs delivered in provider B’s training (see Table 2). Electronic Supplementary Material 5 shows some sensitivity analyses, comparing trained BCTs to providers’ individual framework response documents and programme manuals separately. There were minimal differences in fidelity scores between framework response and programme manual for providers B and C. Providers A and D had higher fidelity to their programme manuals.

Depth of BCT Training

For each provider, training for the majority of BCTs was face-to-face, although provider D trained 23% of BCTs in materials alone. Across all four providers, the most common way that staff were trained in BCTs during the face-to-face courses was by instructing them how to deliver a BCT (see Table 3). However, there were some BCTs which staff were only either 'informed about' or 'directed to deliver', with no further elaboration on how these BCTs should be delivered in a group session. Staff were only 'informed about' or 'directed to deliver' a total of three (15.8%), five (14.7%), six (14.6%) and two (11.8%) unique BCTs with providers A–D, respectively. This included some self-regulatory BCTs: for example, provider A only directed staff to deliver 'action planning', and providers B and D only informed or directed trainees about 'self-monitoring of behaviour', with no further training on how to deliver these BCTs. Tables S7–S10 in Electronic Supplementary Material 6 provide a further breakdown of which behaviours were targeted for each BCT trained by each of the providers, and the depth of training for each BCT targeting each health behaviour.

Table 3 Number of BCTs trained and depth of BCT training across each provider's face-to-face training courses

Although providers A and D trained staff in fewer BCTs overall in their face-to-face training (19 and 17 unique BCTs, respectively), a higher proportion of the unique BCTs they trained were practiced by trainees or modelled in actual delivery (i.e. trainers delivered the training using a desired BCT so that trainees could experience the BCT from the patient's perspective; for example, a trainer might run a problem solving activity on managing difficult conversations so that trainees experience participating in a problem solving activity), compared to the other providers (see Table 3). A summary of key BCTs that staff were trained in across providers' face-to-face courses is shown in Table 4.

Table 4 Summary of key BCTs that staff were trained in during face-to-face courses across providers

Delivery Style of Staff Training

The key characteristics of each provider's staff training courses are summarised in Table 5. There was variation in the way each provider delivered their staff training. For example, provider A's training was more self-directed: trainees role-played sessions described in the manual and received feedback from the trainers. Provider B delivered training that followed an educational format (e.g. use of PowerPoint and trainees taking notes). Provider C's training was experiential and focused on the delivery of the sessions in the manual, with the trainer providing instruction on how each activity should be delivered, and provider D took a more informal approach, in which trainees had input on areas where they felt they needed more training. Pre-course reading documents also varied between providers, and included journal articles, pre-training handbooks, and reading to supplement the programme manual.

Table 5 Summary of key characteristics of provider training in the NHS-DPP

All providers encouraged trainees to role-play some aspects of delivery, but some providers had more emphasis on this than others did. Two providers had a particular emphasis on training group facilitation behaviours (e.g. open listening, empathising and group management).

Discussion

Overall, providers trained staff in 66% of BCTs present in their NHS-DPP intervention plans. The current research team’s previous fidelity evaluation comparing BCTs specified in providers’ intervention plans to BCTs specified in the evidence base showed that providers planned to deliver 74% of BCTs [26]. Thus, a drift in fidelity from the NHS-DPP design to the training is evident. Fidelity was notably higher for two providers in comparison to the other two providers who only trained staff in approximately half of BCTs they had planned to deliver.

Despite variation across providers in their training delivery style, all four providers did train staff in 11 common BCTs, the majority of which were self-regulatory; that is, BCTs designed to help individuals take control of their behaviour, such as goal setting, self-monitoring and problem solving. Such BCTs have the strongest evidence for effectiveness in behaviour change [10], and the evidence review underpinning the NHS-DPP stated that these BCTs should be embedded in the programme [25]. It is encouraging that all four providers trained their staff in these BCTs. However, three of the self-regulatory BCTs were only directed to be delivered, without demonstration to trainees or the opportunity for them to practice delivering the BCT themselves, which would have increased their capability of delivering these techniques [24].

What This Study Adds to the Literature

To our knowledge, this study is the first thorough examination of fidelity of staff training of BCTs to the intervention design for any diabetes prevention programme in the world. The most comparable research in terms of staff training of behaviour change interventions is research from the English Stop Smoking Service, which found that the face-to-face skills training course for stop smoking practitioners appeared to increase trainees’ confidence in delivering smoking cessation support, including the delivery of BCTs for behavioural support [33]. Further, research found that service users in the Stop Smoking Service were more likely to have quit smoking if their practitioner had completed the relevant staff training, thus highlighting the importance of evidence-based training for staff delivering behaviour change programmes [34].

The only other study to date that has applied the BCTTv1 [9] to staff training courses is research which evaluated the use of BCTs in continued professional development courses for medical staff [35]. In this study, researchers coded BCTs delivered in training courses to change health professional practice behaviours [35]. However, the current study assessed whether NHS-DPP trainee facilitators were trained in the delivery of BCTs to participants attending diabetes prevention group sessions.

Implications for Practice

Results from the current study highlighted that fidelity of BCTs in the NHS-DPP staff training to the intervention plans was 66% across the four providers, though two providers only trained staff in approximately half of the BCTs in their intervention plans. Providers should review their training to ensure staff are trained in all key components of their planned intervention designs; if the training does not include the key behaviour change components, then it is likely that these key components will also be missing in the delivery of the intervention [15]. When interpreting effectiveness of the NHS-DPP, it must be taken into account that training varies across providers and staff may be trained in only half of the planned BCTs. The findings reported here will be useful when considering the lack of fidelity of delivery of the programme that our subsequent research has identified [36, 37], in terms of whether deficiencies in training appear to impact on delivery across providers.

Our observations suggested that the most common way in which providers trained their staff was by instructing them how to deliver BCTs, sometimes without demonstration of how to deliver these BCTs in group settings or the practice of BCT delivery. The importance of role-play in staff training has been emphasised as a way to assess skill acquisition [12]. For example, previous research demonstrated that training for a walking intervention that involved role-play with feedback and competency assessments resulted in 80% fidelity of delivery in primary care; this was high fidelity in comparison to previous interventions [38]. Further, a systematic review and meta-analysis found that high-quality staff training improved health outcomes in behaviour change interventions, especially training that included a combination of educational and practical activities rather than educational components alone [39]. Thus, providers could further review their training to ensure that staff are trained thoroughly enough that trainees are clear on exactly how to deliver particular BCTs for different activities. Providers should therefore allocate enough training days or sessions to deliver comprehensive training in the use of BCTs specifically, especially as trainees may come from a range of backgrounds with varying experience of delivering BCTs.

Implications for Research

This study is the first known assessment of BCT content and the first thorough fidelity evaluation of a national diabetes prevention programme in the world. The paper extends previous fidelity research, which to date has focused more on evaluating fidelity of intervention delivery [14]. The author-developed framework for assessing the depth of BCT training could be used in future evaluations to determine the comprehensiveness of BCT training delivered to staff and may help to identify gaps in behaviour change content which could be trained more thoroughly.

The current evaluation did highlight that providers who trained their staff in fewer BCTs, and thus had lower fidelity of trained BCTs to intervention plans, trained a higher proportion of BCTs in more depth (e.g. role-playing BCT delivery rather than just instructing staff to deliver a BCT) compared to providers who demonstrated higher fidelity. Future research could assess whether the depth of BCT training has an impact on: (a) the fidelity of delivery of BCTs in the field, and (b) the overall effectiveness of the intervention, especially for self-regulatory BCTs, for which there is the most evidence of effectiveness in changing health behaviours [10]. Such research may establish whether providing in-depth training on how to deliver BCTs (such as demonstration and practice) would increase the actual delivery of BCTs in the field, and whether this subsequently has an impact on the overall effectiveness of a programme.

In addition to the assessment of trained BCTs, the training of facilitation skills such as active listening, empathising and group management are also important for effective delivery of group intervention sessions [40], which may have a subsequent impact on group rapport and retention of service users on the programme. Training in facilitation skills was observed across provider training in the NHS-DPP, though some providers placed more emphasis on this than others did. It was beyond the scope of the current evaluation to assess fidelity of staff competencies other than the behaviour change content of the NHS-DPP. However, future research could further assess fidelity of trained group facilitation behaviours and the impact this has on the delivery and outcomes of an intervention.

Strengths and Limitations

This fidelity analysis used a standardised BCT framework [9] and obtained all relevant documentation (e.g. pre-course reading materials for all mandatory staff training) to complete the analysis. The use of audio recordings to capture staff training content is considered a ‘gold standard’ for fidelity evaluations [12], and the authors, as external evaluators, have demonstrated that it is a reliable method for assessing fidelity of BCTs in staff training. This study is one of the first fidelity evaluations of a national programme and, to the authors’ knowledge, one of the only studies to assess fidelity of staff training to the intervention design with a focus on behaviour change content. Further, the researchers developed a coding framework to assess the depth to which staff were trained in BCTs; to our knowledge, this is the first study to assess the depth of BCT training.

Despite the merits of the current study, researchers were only able to observe one set of core training courses for each provider. The authors do not know the extent to which the same results would have been obtained had a different set of training courses been observed. The staff training for each provider was observed in four different geographical locations across England, obtaining as diverse and varied a sample of training as was feasible. However, the authors cannot be sure whether providers selected training courses and sites based on what they thought would represent their best training; if so, there may be a bias towards observing the ‘better’ training courses.

Further, the authors did not observe any ‘top-up’ training courses or other forms of continued professional development, due to the time and resources required for intensive observation and the variation in the types of further training delivered across providers. However, the core training courses observed were mandatory for NHS-DPP facilitators to deliver the programme in the field, and facilitators were delivering group sessions on the basis of this core training alone; the observations therefore offer the best representation of the training that all staff across the NHS-DPP must have received.

Conclusions

This fidelity analysis found that, overall, providers trained their staff in 66% of the BCTs present in their intervention plans. The research team’s previous document analysis of the NHS-DPP design, which compared BCTs specified in providers’ intervention plans to the BCTs specified in the underlying evidence base, yielded 74% fidelity of BCTs [26]. A drift in fidelity from the intervention design to the training stage is therefore evident, and may result in a further dilution in fidelity when BCTs are delivered in the NHS-DPP. Given that BCTs are the ‘active ingredients’ that can produce behaviour change in individuals, it is vital that staff are adequately trained in how to deliver these techniques in group settings encouraging lifestyle behaviour change. Our results suggest that providers may need to incorporate more comprehensive BCT training into their core training courses, to ensure that trainee staff are not only told which BCTs should be delivered in the NHS-DPP, but also shown how to deliver these BCTs for various group activities and given the opportunity to practice BCT delivery during their training courses.

References

1. World Health Organization. Global report on diabetes. Geneva: World Health Organization; 2016.

2. National Cardiovascular Intelligence Network (NCIN). Diabetes prevalence model for local authorities and CCGs. London: Public Health England; 2015. https://www.gov.uk/government/publications/diabetes-prevalence-estimates-for-local-populations

3. NHS England. NHS Diabetes Prevention Programme (NHS DPP). 2017. https://www.england.nhs.uk/diabetes/diabetes-prevention/2017/

4. Tuomilehto J, Lindström J, Eriksson JG, et al. Prevention of type 2 diabetes mellitus by changes in lifestyle among subjects with impaired glucose tolerance. New Eng J Med. 2001;344(18):1343–50.

5. Knowler WC, Barrett-Connor E, Fowler SE, et al. Reduction in the incidence of type 2 diabetes with lifestyle intervention or metformin. New Eng J Med. 2002;346(6):393–403.

6. Valabhji J, Barron E, Bradley D, et al. Early outcomes from the English National Health Service Diabetes Prevention Programme. Diabetes Care. 2020;43(1):152–60.

7. Hawkes RE, Cameron E, Cotterill S, Bower P, French DP. The NHS Diabetes Prevention Programme: an observational study of service delivery and patient experience. BMC Health Serv Res. 2020;20(1):1–12.

8. NHS England. Service specification no. 1: provision of behavioural interventions for people with non-diabetic hyperglycaemia. Version 01. March 2016. https://www.england.nhs.uk/wp-content/uploads/2016/08/dpp-service-spec-aug16.pdf

9. Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95.

10. Michie S, Abraham C, Whittington C, McAteer J, Gupta S. Effective techniques in healthy eating and physical activity interventions: a meta-regression. Health Psychol. 2009;28(6):690–701.

11. Bellg AJ, Borrelli B, Resnick B, et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol. 2004;23(5):443–51.

12. Borrelli B. The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Public Health Dent. 2011;71:S52–63.

13. McGee D, Lorencatto F, Matvienko-Sikar K, Toomey E. Surveying knowledge, practice and attitudes towards intervention fidelity within trials of complex healthcare interventions. Trials. 2018;19(1):504.

14. Lorencatto F, West R, Christopherson C, Michie S. Assessing fidelity of delivery of smoking cessation behavioural support in practice. Implement Sci. 2013;8(1):40.

15. Toomey E, Hardeman W, Hankonen N, et al. Focusing on fidelity: narrative review and recommendations for improving intervention fidelity within trials of health behaviour change interventions. Health Psychol Behav Med. 2020;8(1):132–51.

16. Fletcher S, Zimmerman S, Preisser JS, et al. Implementation fidelity of a standardized dementia care training program across multiple trainers and settings. Alzheimer’s Care Today. 2010;11(1):51–60.

17. Teri L, McKenzie GL, Pike KC, et al. Staff training in assisted living: evaluating treatment fidelity. Am J Geriatr Psychiatry. 2010;18(6):502–9.

18. Cross WF, Pisani AR, Schmeelk-Cone K, et al. Measuring trainer fidelity in the transfer of suicide prevention training. Crisis. 2014;35:202–12.

19. Hurley DA, Keogh A, Mc Ardle D, et al. Evaluation of an e-learning training program to support implementation of a group-based, theory-driven, self-management intervention for osteoarthritis and low-back pain: pre-post study. J Med Internet Res. 2019;21(3):e11123.

20. Lambert JD, Greaves CJ, Farrand P, Cross R, Haase AM, Taylor AH. Assessment of fidelity in individual level behaviour change interventions promoting physical activity among adults: a systematic review. BMC Public Health. 2017;17(1):765.

21. Avery L, Sniehotta FF, Denton SJ, Steen N, McColl E, Taylor R, Trenell MI. Movement as medicine for type 2 diabetes: protocol for an open pilot study and external pilot clustered randomised controlled trial to assess acceptability, feasibility and fidelity of a multifaceted behavioural intervention targeting physical activity in primary care. Trials. 2014;15(1):46.

22. Avery L, Charman SJ, Taylor L, Flynn D, Mosely K, Speight J, Lievesley M, Taylor R, Sniehotta FF, Trenell MI. Systematic development of a theory-informed multifaceted behavioural intervention to increase physical activity of adults with type 2 diabetes in routine primary care: movement as medicine for type 2 diabetes. Implement Sci. 2015;11(1):99.

23. Horner S, Rew L, Torres R. Enhancing intervention fidelity: a means of strengthening study impact. J Spec Pediatr Nurs. 2006;11(2):80–9.

24. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):S63–7.

25. National Institute for Health and Care Excellence (NICE). PH38 Type 2 diabetes: prevention in people at high risk. London: National Institute for Health and Care Excellence; 2012 (updated September 2017). https://www.nice.org.uk/guidance/ph38/resources/type-2-diabetes-prevention-in-people-at-high-risk-pdf-1996304192197

26. Hawkes RE, Cameron E, Bower P, French DP. Does the design of the NHS Diabetes Prevention Programme intervention have fidelity to the programme specification? A document analysis. Diabet Med. 2020;37(8):1357–66.

27. Kosaka K, Noda M, Kuzuya T. Prevention of type 2 diabetes by lifestyle intervention: a Japanese trial in IGT males. Diabetes Res Clin Pract. 2005;67(2):152–62.

28. Pan XR, Li GW, Hu YH, et al. Effects of diet and exercise in preventing NIDDM in people with impaired glucose tolerance: the Da Qing IGT and Diabetes Study. Diabetes Care. 1997;20(4):537–44.

29. Ramachandran A, Snehalatha C, Mary S, Mukesh B, Bhaskar AD, Vijay V. The Indian Diabetes Prevention Programme shows that lifestyle modification and metformin prevent type 2 diabetes in Asian Indian subjects with impaired glucose tolerance (IDPP-1). Diabetologia. 2006;49(2):289–97.

30. Penn L, Rodrigues A, Haste A, et al. NHS Diabetes Prevention Programme in England: formative evaluation of the programme in early phase implementation. BMJ Open. 2018;8(2):e019467.

31. McHugh ML. Interrater reliability: the kappa statistic. Biochem Med. 2012;22(3):276–82.

32. BCTTv1 Online Training. (n.d.). https://www.bct-taxonomy.com/

33. Brose LS, West R, Michie S, McEwen A. Evaluation of face-to-face courses in behavioural support for stop smoking practitioners. J Smok Cessat. 2012;7(1):25–30.

34. Brose LS, McEwen A, Michie S, West R, Chew XY, Lorencatto F. Treatment manuals, training and successful provision of stop smoking behavioural support. Behav Res Ther. 2015;71:34–9.

35. Pearson E, Byrne-Davis L, Bull E, Hart J. Behavior change techniques in health professional training: developing a coding tool. Transl Behav Med. 2020;10(1):96–102.

36. French DP, Hawkes RE, Bower P, Cameron E. Is the NHS Diabetes Prevention Programme intervention delivered as planned? An observational study of intervention delivery. Ann Behav Med. 2021. [In press].

37. Hawkes RE, Warren L, Cameron E, French DP. An evaluation of goal setting in the NHS England Diabetes Prevention Programme. Psychol Health. 2021. https://doi.org/10.1080/08870446.2021.1872790

38. Williams SL, McSharry J, Taylor C, Dale J, Michie S, French DP. Translating a walking intervention for health professional delivery within primary care: a mixed-methods treatment fidelity assessment. Br J Health Psychol. 2020;25(1):17–38.

39. Hatfield TG, Withers TM, Greaves CJ. Systematic review on the effect of training interventions to improve the skills of health professionals in promoting health behaviour change, with meta-analysis of behavioural outcomes. BMC Health Serv Res. 2020;20:593.

40. NHS England. The Facilitator’s Toolkit. 2017. https://www.england.nhs.uk/improvement-hub/wp-content/uploads/sites/44/2017/11/Facilitator-Toolkit.pdf


Acknowledgments

This work is independent research funded by the National Institute for Health Research (Health Services and Delivery Research, 16/48/07 – Evaluating the NHS Diabetes Prevention Programme (NHS DPP): the DIPLOMA research programme (Diabetes Prevention – Long Term Multimethod Assessment)). The views and opinions expressed in this manuscript are those of the authors and do not necessarily reflect those of the National Institute for Health Research or the Department of Health and Social Care. We would like to thank the NHS-DPP providers for assisting in the organisation of observations at each of the staff training sessions, and we are grateful to all the trainers and attendees who consented to observations of NHS-DPP staff training sessions. We would also like to thank Peter Bower and Sarah Cotterill from the DIPLOMA team who provided valuable feedback during the manuscript preparation.

Author information

Corresponding author

Correspondence to David P. French.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 38 KB)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Hawkes, R.E., Cameron, E., Miles, L.M. et al. The Fidelity of Training in Behaviour Change Techniques to Intervention Design in a National Diabetes Prevention Programme. Int J Behav Med. (2021). https://doi.org/10.1007/s12529-021-09961-5


Keywords

  • Behaviour change techniques
  • Fidelity
  • Staff training
  • Diabetes prevention
  • Type 2 diabetes