Introduction

The translation of research evidence into health practice and policy relies on healthcare organisations and systems having sufficient research capacity and capability [1,2,3]. Health organisation executives and policymakers globally recognise the need to invest in research capacity building (RCB) initiatives and interventions delivered in healthcare settings [2,3,4]. RCB strategies encompass a range of initiatives designed to promote individual, team and organisational research skills and competence, and to influence attitudes towards research [2, 5,6,7]. Initiatives designed to build individual and organisational research capacity may include education and training programs, funding for embedded researchers (e.g., fellowships, scholarships) and other research support roles (e.g., research librarians, knowledge brokers), strategic collaborations with academic partners, and the development of research infrastructure [2, 6, 8]. RCB strategies often comprise a combination of these approaches [8], and notably, research education and training programs are a sustaining feature of many [2, 3, 6, 8,9,10,11]. This is likely related to the insufficient coverage of research in undergraduate health curricula and the need for supplementary education to fill research knowledge and skill gaps, particularly for non-medically trained healthcare professionals. Medically trained healthcare professionals typically have a greater inclination toward, and engagement in, research than their nursing and allied health counterparts [4, 8, 12, 13]. Given that nursing and allied health form the majority of the health workforce [14, 15], there is increasing interest in RCB strategies that target nurses and allied health professionals to enhance the delivery of evidence-informed care across all healthcare settings and services [8, 16,17,18]. Allied health comprises a range of autonomous healthcare professions including physiotherapy, social work, podiatry, and occupational therapy [16].

This review was commissioned by an academic health science centre in Australia to inform the research education and training component of its health organisation RCB strategy. Given the typically multidimensional nature of RCB strategies, their functions and impacts at the various levels are inextricably related [2, 5]. This makes it difficult to discern research education and training interventions from other elements of RCB strategies. For example, embedded researchers may form part of a broader organisational RCB strategy and, in the scope of their work, may perform an ad hoc education function (e.g., through their interactions with novice researchers) [11, 19]. Aligning with the purpose of this work, this review defines research education and training programs as organised initiatives or interventions that are either discrete (e.g., standalone workshops or research days) or longer in duration (e.g., research courses or a series of workshops or lectures), in which a curriculum is developed and shared with multiple participants with a view to developing and applying research skills [2, 5]. Healthcare settings are considered those in which the provision of healthcare is core business (e.g., hospitals, community-based health services, cancer care services, family medicine clinics); these are therefore the settings in which research evidence needs to be applied or translated to reduce the gap between research knowledge and practice [2, 20].

An initial search of the Cochrane Database of Systematic Reviews, JBI Evidence Synthesis, PROSPERO, and Google Scholar for reviews of research education and training programs delivered in health settings yielded no existing or planned reviews. On further review of the RCB and research education literature, and concomitant discussions with four content experts (i.e., educators, and academic and clinician researchers concerned with research capacity building), it became apparent that research education programs take different forms, occur in pockets within health organisations across health districts and regions, are not always formally evaluated, and often fail to account for adult learning principles and theories. The decision to conduct a scoping review, rather than a conventional systematic review, was based on three key factors: 1) the heterogeneity evident in research education program characteristics; 2) the absence of an existing synthesis of evidence for research education programs delivered in health settings [5]; and 3) the need to identify gaps in knowledge about these programs.

This systematic scoping review sought to map the research education and training programs delivered to nurses and allied health professionals working in health settings and the evidence supporting these approaches. The specific review objectives were to describe the:

  1. Types of research education programs delivered in health settings in high-income countries

  2. Theoretical or pedagogical principles that underlie the programs

  3. Approaches to research education program evaluation

  4. Types of outcomes reported

Methods

This review used the Joanna Briggs Institute’s (JBI) scoping review methodology. As per the JBI methodology, search terms were developed for Population, Concept and Context (PCC). The review question, objectives, inclusion/exclusion criteria and search strategies were developed and documented in advance (Additional File 1 Scoping Review Protocol). The review is reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) extension for scoping reviews (Additional File 2 PRISMA-ScR checklist [21]).

Search strategy

The researchers identified a set of key papers based on their knowledge of contemporary research education programs and in consultation with four content experts from two high-income countries. These papers were used to identify the key search terms. In consultation with the research librarians (SH and HS, see Acknowledgements), the research team conducted preliminary scoping searches to test the search terms and strategy (3–10 March 2022). These searches informed decisions about the final search terms. A tailored search strategy was developed for each academic database (Additional file 3 Search Strategy).

Academic databases searched included PubMed, Ovid MEDLINE, Embase, CINAHL, VOCEDPlus, PEDro, Scopus, ERIC, Informit Health Database, JBI, and Google Scholar. Selected grey literature platforms, identified through our knowledge of relevant websites and organisations, were also searched. Where larger search yields were observed (e.g., via Google and Google Scholar), only the first 250 items were reviewed (Additional file 4 Grey literature search). The final research database searches were conducted between 12 and 15 March 2022 by a researcher with extensive systematic literature searching experience (Author 2), in consultation with a research librarian. Grey literature searches were conducted on 17 March 2022. Searches of the reference lists of included records and forward citation searches were also undertaken.

Inclusion criteria and exclusion criteria

Literature was selected according to defined inclusion and exclusion criteria developed using the PCC framework (see Table 1). Research education or capacity building programs delivered to qualified health professionals working in health settings (excluding programs delivered as part of tertiary study) in high-income countries (HIC), as defined by the Organisation for Economic Co-operation and Development (OECD), were included [22]. The decision to include studies from HICs only was made with a view to introducing a level of homogeneity in the broader resource contexts of the study populations [23, 24]. No date limits were applied, and all types of literature published up to 17 March 2022 were included. Due to resource limitations, only literature published in English was included.

Table 1 Inclusion and exclusion criteria

Study selection, quality appraisal and data extraction

Citations were imported into Covidence (Veritas Health Innovation, Melbourne, Australia) for screening. Titles and abstracts were initially screened independently by two reviewers, with conflicts resolved by a third (independent) reviewer. Similarly, full texts were reviewed by two researchers, and the reasons for exclusion were noted (Additional file 5 Excluded studies). Data were extracted from the included texts by five researchers. Formal quality appraisal is not typically undertaken as part of scoping review methodology and was not undertaken for the papers included in this review [25].

Data extracted were tabulated and results were synthesized using a descriptive approach guided by the review objectives, as per scoping review methodology. Outcomes measured and reported in the papers were mapped to the modified Kirkpatrick educational outcomes typology [26, 27]. Recognising the complex interactions between individuals, research education programs, organisational and other factors, and the various outcomes produced [2], the modified Kirkpatrick typology enables the identification of outcome measures at multiple levels and within these inter-related domains [26].

Results

Of the 207 citations considered for full text screening, 60 met the inclusion criteria and nine additional papers were located through a citation search of the initial set (Fig. 1 PRISMA Flow Diagram) [28].

Fig. 1

PRISMA Flow Diagram

Research education program characteristics

When, where and to whom research education programs were delivered

A total of 69 papers, describing 68 research education and training programs, were reviewed. The implementation of the programs spanned five decades, with almost half (n = 33) implemented in the most recent decade. Research education programs were delivered in the United States of America (n = 22), Australia (n = 20), the United Kingdom (n = 9), Canada (n = 5), Denmark (n = 2), Qatar (n = 2), and one each in Argentina, Finland, Japan, Italy, Singapore, Sweden, Spain, and The Netherlands. The geographical distribution of programs by country is presented in Fig. 2. Research education programs were targeted and delivered to different healthcare professional groups. Programs were delivered most frequently to nurses and midwives (n = 35), followed by mixed professional groups (n = 18), allied health professionals (n = 13), and pharmacists (n = 2). The characteristics of included programs are provided in Table 2.

Fig. 2

Geographical distribution of research education programs. This image was generated by the authors via Microsoft Excel using the Map function

Table 2 Research education program characteristics

How research education programs were formatted and delivered

Research education programs were delivered in several different formats and over varying durations. Some were delivered as standalone single study days, workshops or sessions [29,30,31,32,33,34], and others as a series of several short sessions or workshops [35,36,37,38,39,40,41,42,43,44,45]. The majority of papers described integrated research education courses of short duration (i.e., one to four months) [46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65], medium duration (i.e., five to 11 months) [9, 66,67,68,69,70,71,72,73,74,75,76], or longer duration (i.e., one year or longer) [77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94].

Programs almost always included a didactic element (e.g., lectures, seminars), delivered by an experienced academic or clinician-researcher (a researcher with a primary healthcare qualification [95]) or an individual with content expertise (e.g., a biostatistician [48], librarian [33, 57, 66], ethics committee member [57] or data manager [42]). Most of the programs were multifaceted and included a mix of didactic teaching as well as group discussion, online teaching (e.g., teleconferences or modules), or the practical application of theoretical principles between education sessions. Several were described as single-mode research education programs (e.g., seminars, lectures, or online modules only) [29,30,31, 33, 37,38,39, 46, 48, 49, 53,54,55, 87]. Timing was described as an important consideration in several papers, with an emphasis on minimising the impact on participants’ working day or clinical duties; for example, sessions were held early (8 am), before the working day began [9, 51], or on weekends [32, 63, 71].

Features and content of research education programs

The curricula or research education content described in the papers reflected the aims of the programs. Program aims were broadly categorised according to the level of intended participants’ research engagement: research use or consumption (n = 28) and research activity (n = 31) [96]. Where the program content focused on searching, retrieving, and appraising research literature, and considering it in the context of clinical practice (i.e., evidence-based practice), this was considered engagement at the research user or consumer level. Slightly more programs were concerned with developing research skills to engage in and conduct research activity. These programs included content related to research methods, data collection and analysis techniques, protocol development and ethics applications [31, 35, 37, 39, 42, 43, 48, 49, 52, 53, 57, 59, 63, 64, 67, 68, 73, 77,78,79,80,81,82,83,84,85, 90,91,92]. Seven programs were orientated toward developing participants’ skills for research dissemination, typically writing for publication [9, 32, 33, 47, 51, 74] or preparing research posters and seminars [88]. It was assumed that the participants in the programs concerned with writing for publication had already undertaken a research activity and needed further education and support to formally disseminate their findings. Two programs were specifically focused on developing participants’ skills to complete a systematic review [46, 76]. Three programs included content directly related to implementing research in practice [60, 80, 86].

Fourteen programs required that participants had overt support from their manager to participate (e.g., written approval or direct selection of participants) [46, 51, 58, 62, 75, 79,80,81, 83, 85, 91,92,93,94]. Two papers described participants’ departments being actively supportive of their participation in the research education program [59, 86]. One paper referred to managers’ positive role modelling by engaging in the research education program [39] and another described the criteria used to determine the suitability of participants based on their context (i.e., supportive managers who were interested in research and willing to release participating staff for half day each week) [88]. Five papers described manager or leadership support as being a key enabler to participants engaging in the education program [56, 60, 75, 89, 91] and four papers referred explicitly to the lack of organisational, managerial, or collegial support as key limitations to, or a negative influence on participants’ learning experience [49, 77, 84, 88].

Nine papers described the integration of opportunities to acknowledge the achievements of program participants. Opportunities were described as formal events held at the conclusion of the program to celebrate the participants’ completion [58, 66, 80, 83], recognition via staff communications or at an organisation-wide event [37], opening participants’ project presentations to a wider healthcare organisation audience [92], or by managers providing opportunities for participating staff to present their work to colleagues [81, 82]. One program included the acknowledgment of contact hours for nurse participants to attain continuing professional development points for their professional registration [54] and another referred to participants’ “recognition and exposure” within and beyond their organisation, as a participant-reported benefit (46, e–145).

Theories and pedagogical principles

Understanding how people learn effectively is fundamental to the design of any educational program. Thus, the second aim of this review was to determine which pedagogies (teaching methods) were employed for adult learners undertaking research education and training. Few of the studies (n = 13) included in this review explicitly stated which pedagogical strategies informed the design and delivery of the education programs. However, where possible, we extracted the pedagogical strategies that appeared to be present (see Table 2).

Education programs generally included a mix of active and passive learning strategies. Active learning can be defined as activity that engages students as participants in the learning process, whereas with passive learning, students receive information from the instructor but have little active involvement [97]. Passive forms of learning, or didactic approaches, included seminars, lectures, reading, and exams. Five programs were described with respect to the didactic learning component only, with no reference to, or implication of, any underlying pedagogy or learning theory [39, 45, 48, 49, 53].

Commonly, education programs included some form of experiential learning. Experiential learning, or “learning by doing” is a type of active learning whereby students apply knowledge to real-world situations and then reflect on the process and experience [98]. Examples of experiential learning described in the education programs include simulations, role-play, preparation of research protocols, grant proposals, manuscripts, and appraisal of research. Lack of experiential learning, or “practical experience”, was described as a limitation in one paper [38]. Quizzes were utilised in two programs [42, 66] to reinforce participants’ learning.

Social cognitive theories of learning, such as self-efficacy theory [99], were explicitly mentioned in seven studies [31, 47, 54, 56, 61, 71, 72]. Self-efficacy theory posits that a person’s belief in their capabilities provides the foundation for performance and accomplishment. If a person has low self-efficacy (little belief in their capabilities) and fear related to the task at hand, they will likely avoid that task for fear of failure. Education programs using a self-efficacy framework focused on increasing participants’ self-efficacy through coaching, support, social modelling, and mastery experiences. Five studies referred to Rogers’ Diffusion of Innovations theory [37, 50, 60, 68, 71], which posits that identifying and working with highly motivated individuals is an efficient way to promote the wider adoption of new behaviours and practices [8].

Two studies were informed by the Advancing Research and Clinical practice through close Collaboration (ARCC) Model which is based on cognitive-behavioural theory and control theory, and therefore designed to address barriers to desired behaviours and practice [65, 100]. Other programs described drew on the transtheoretical model of organisational change [62], Donald Ely’s conditions for change [37], the knowledge to action framework [52] and the Promoting Action on Research Implementation in Health Services (PARiHS) Framework [72].

Mentoring was a feature of more than half of the programs (n = 37). In these programs, novice researchers were paired with an experienced researcher, typically to support their application and practice of the knowledge gleaned through their education or training [101]. In three papers describing programs that did not include mentoring, mentoring was identified as a critical element for future research education programs [37, 78, 92]. Several evaluations of programs that included mentoring illustrated that it was required throughout the life of the program and beyond [9, 32, 67, 68, 73, 81, 84]. Harding et al. [46] found that mentors, as well as mentees, benefited from the research education program in terms of their own learning and motivation.

Social theories of learning, or collaborative learning approaches, were also frequently utilised (n = 40). Collaborative learning approaches are based on the notion that learning is a social activity at its core, shaped by context and community. Such approaches promote socialisation and require learners to collaborate as a group to solve problems, complete tasks, or understand new concepts. Collaborative approaches utilised included journal clubs [38, 50, 54, 69, 70, 87], writing groups [32, 51], classroom discussions [33, 36, 72, 76, 80, 94], interactive group workshops or activities [29, 31, 46, 47, 56, 75, 82, 84, 86, 93], and development of team research projects [78, 79]. These approaches were often reported to enhance cultural support with participants networking, sharing resources, and celebrating successes together. One program employed a self-guided learning approach through the use of computer-based learning modules [55].

Approaches to program evaluation

Less than half of the included papers accurately and comprehensively described the methodology and methods used to evaluate the research education program [9, 30, 38, 46, 54,55,56, 60,61,62,63, 65, 69,70,71, 75, 77, 79, 82, 84,85,86, 89, 100, 102]. The remaining papers referred to the data collection techniques used without describing the overarching approach or methodology. Therefore, rather than referring to the approach to program evaluation as quantitative, qualitative or mixed methods, Table 3 refers to the data collection techniques (e.g., surveys, interviews, facilitator reflections, audit of research outputs).

Table 3 Research education program evaluation and outcomes reported

Most programs were evaluated using surveys (n = 51), some in combination with other outcome measures. More than half of the program evaluations (n = 38) used pre- and post-intervention surveys. Other evaluation methods included interviews, focus groups, attendance rates, and outcomes audits (e.g., ethics applications, manuscripts submitted for peer review or published, grant applications, grants awarded, or adherence to evidence-based guidelines). Twelve evaluation studies included a control group [36, 38, 51, 60, 65, 68,69,70, 77, 79, 86, 100]. Three evaluations were informal and did not explicitly draw on evaluation data but rather on general feedback and the authors’ own reflections and observations, including observed research progress [35, 37, 94]. Evaluations of longer-term outcomes were described in seven papers, where surveys were undertaken or outcomes were otherwise measured between one and five years after the programs were completed [44, 51, 76, 84, 85, 89, 93].

Outcomes measured and described

Program outcome measures were mapped to Barr et al.’s modified Kirkpatrick educational outcomes typology [27]. The typology categorises reported educational outcomes according to their level of impact, ranging from individual learner-level outcomes through to the impact of the educational program on the organisation and on healthcare consumer outcomes. See Table 4 below for descriptions of the outcome levels and the corresponding citations.

Table 4 Evaluation outcomes according to Barr et al.’s modified Kirkpatrick typology

Almost all program evaluations included a mix of outcome measure types or levels. In addition to the modified Kirkpatrick level outcomes, other types of outcomes and impacts were measured and reported. Program participant engagement was measured and reported with reference to interest and uptake, attendance, and drop-out rates in five evaluations [48, 54, 74, 78, 87]. Twelve program evaluations explored participants’ experiences or perspectives of barriers to engaging in research in their health setting [34, 36, 49, 56, 71, 77, 81, 82, 84, 86, 88, 89] and four evaluations included program cost calculations [51, 60, 83, 90]. One evaluation measured group cohesion, participant (nurse) productivity and nursing staff retention [100].

Programs that were evaluated over a longer period demonstrated a high success rate with respect to manuscript publication [34, 51, 76] and longer-term development of research skills, experience, and engagement [44, 84, 89], and highlighted the value of mentoring to participants’ enduring engagement with research and to their development of research confidence and leadership skills [84]. One evaluation study included administrative leaders [89] and one included training participants’ managers [93]; however, none included senior executives or healthcare consumers.

Discussion

To the authors’ knowledge, this is the first systematic scoping review of the research education literature. The findings of the review support existing evidence of the continued relevance of research education and training to RCB endeavours [2, 16]. Indeed, research education appears to have been a mainstay RCB strategy over the last five decades. This review sought to explore the features or characteristics of research education and training programs delivered to nurses and allied health professionals working in health settings in HICs, the pedagogical principles or learning theories underpinning the programs, how programs were evaluated, and the types of outcomes reported.

Common features and approaches to the delivery of research education were identified. Common pedagogical features of research education programs included multifaceted delivery, allowing flexibility in engaging with the program and content [5, 103], experiential learning [2, 103], and social or collaborative learning principles [103]. These underpinning principles were implied more frequently than they were explicitly stated. The integration of mentoring to reinforce the knowledge gleaned through research education programs appears to be a critical element and a key component of contemporary research education and capacity building [2, 3, 104].

This review also highlights some differences in the programs, particularly in terms of duration, which varied from single sessions or workshops to three-year programs. The curricula or educational content tended to reflect the aims of the programs which mapped to two different levels of engagement with research: research use or consumption and research activity. Some programs were specifically focused on advanced research skills, namely writing for publication, which is a particularly challenging aspect of the research process for clinicians [7, 51].

Findings indicate that organisational context and support are pivotal to the cultivation and completion of research activity [2, 6, 7, 49, 77, 84, 88, 105]. Although this review focused specifically on papers describing research education programs targeting individual-level research capacity, several organisation-related factors were integrated into the programs. Middle or executive level manager support for program participants was evident in numerous papers, either through explicit support or permission, or positive role modelling. This resonates with existing evidence related to organisational factors enabling research [7, 106, 107]. Schmidt and colleagues [106] have previously highlighted a lack of managerial support for research training participants and their projects as a factor influencing withdrawal. Several programs incorporated events or other opportunities for participants to present their work or to be otherwise recognised [37, 46, 54, 66, 80,81,82,83]. This facilitated organisation-level acknowledgement and celebration of individuals’ research activity and achievement, reinforcing organisational support for research [2].

This scoping review highlights some evidence of the impact of research education beyond the individual participants, on their colleagues and organisations more broadly. This broader impact can be attributed to participants actively sharing their new knowledge and skills with their colleagues and teams [108]. Rogers’ Diffusion of Innovations theory can also underpin RCB strategies targeted at the individual level and explain how and why they have a broader impact on organisational research capacity and culture [104].

Research education program outcome measures tended to reflect the lower levels of the modified Kirkpatrick typology, with comparatively few studies reporting organisation-level impacts and none reporting health consumer outcomes. Although it is recognised that measuring and demonstrating direct links between RCB initiatives and health consumer outcomes is difficult [109], RCB initiatives, including research training, typically aim to promote the delivery of evidence-informed care, which in turn improves health consumer outcomes [110]. Some program evaluations included self-reported measures from participants who did not engage in the research education program, enabling comparisons between groups. Senior and executive managers, and healthcare consumers, however, were not involved in any of the evaluations reported. This limits knowledge of the outcomes and impacts beyond the individual participant level. Moreover, the program evaluation methods were generally poorly described. This is somewhat paradoxical given the subject matter; however, it is not a problem unique to research education and capacity building. Indeed, poor evaluation is a widespread problem evident in multiple key healthcare areas, such as Aboriginal Health in Australia [111], supportive care services for vulnerable populations [112], and continuing education for healthcare professionals [113]. Factors contributing to poor program evaluation likely include time constraints, inaccessible data, and inadequate evaluation capacity and skills, as described in other scoping reviews of health and health professions education programs [111,112,113].

Although it is encouraging to see broadening interest in RCB initiatives for the nursing and allied health professions, including research education, investment in rigorous, carefully planned, broadly targeted and long-term evaluation is required. This will ensure that research education programs maximise the outcomes for individuals and organisations, and that the crucial impact on health consumer outcomes can be measured.

Strengths and methodological limitations

The strengths of this scoping review are its adherence to an established and systematic approach and its wide and comprehensive search, which included 11 research databases and multiple grey literature databases and search engines. The methodological and content expertise within the research team, including expertise in scoping review, systematic review and realist review methodologies, and in research education and capacity building strategies, strengthened the rigour of the review. Moreover, the consultation with content experts during the development of the search strategy ensured the review was well informed and shaped to meet the needs of those concerned with RCB.

Nonetheless, this review is limited by several factors. Research education, training, and RCB more broadly are poorly defined concepts [2]; as such, it is acknowledged that the search strategy may not have retrieved all relevant literature. This is acceptable, given the scoping review aimed to provide an overview of the breadth and depth of the literature and used content expertise to balance the comprehensiveness of the review with the capacity to answer the research questions [114]. It is, however, recommended that the findings of this review inform a more focused systematic review of the literature.

It is well established that research education and training alone do not sufficiently influence research capacity and capability at an individual or organisational level [1, 7]. Indeed, barriers to nurse- and allied health-led research, including time constraints, demanding clinical workloads, enduring workforce shortages, a lack of organisational support and research culture, limited funding, and inadequate research knowledge and skills, persist [7, 12, 39, 47, 115]. These factors were not analysed as part of the review. The explicit focus on research education meant that some RCB strategies with education as a component may have been missed.

The authorship team were situated in Australia, with limited knowledge of complementary search engines internationally, and lacked the resources to execute extensive international grey literature searches. These limited grey literature searches may introduce a level of publication bias. Publications in languages other than English were excluded for reasons of feasibility and limited resourcing. Through engagement with content experts early in the review, it was noted that many education programs are not formally documented, evaluated, or published in peer-reviewed or grey literature, and are therefore not accessible to others outside the organisation. This means that the review of published literature may not entirely represent research education programs in health settings.

Conclusion

Research education is a cornerstone RCB strategy for nurses and allied health professionals working in health settings. Education is typically aimed at enhancing individual clinician-level research capacity; however, there is some evidence that the outcomes of individual-level research education can influence organisational research capacity and culture. Moreover, strategies targeted at the organisational level can be integrated into research education programs. Mentoring, experiential learning, and collaborative learning have gained recognition as key features of research education programs and facilitate the application of new knowledge and skills in practice. Evaluation continues to focus on lower levels of educational impact or traditional research outputs; there is a need for greater attention to organisational culture, longer-term capacity building outcomes and health consumer impacts. Approaches to the evaluation of research education programs should incorporate the experiences and perspectives of managers, executives, health consumers and other stakeholders concerned with research capacity and the delivery of evidence-informed care. This will ensure that RCB strategies and initiatives with greater impact at the individual and organisational levels can be supported and that the impact of such initiatives can be measured at the population health level.