Scholars commonly acknowledge inconsistent and sparse reporting about the design and implementation of complex interventions within the published literature [1–3]. Complex interventions (also referred to as multiple interventions) deliberately apply coordinated and interconnected intervention strategies targeted at multiple levels of a system [4]. Variable and limited reporting of complex interventions compromises the ability to answer questions about how and why interventions work through systematic assessment across multiple studies [3]. In turn, limited evidence-based guidance is available to inform the efforts of those responsible for the design and implementation of interventions, and a gap remains between research and practice.

The momentum within the last five years to identify promising practices in many fields [5–7] increases the urgency and relevance of understanding how and why interventions work. However, community health programs involve a set of highly complex processes [8–10]. It has been argued that much of the research on these programs has treated the complex interactions among intervention elements, and between intervention components and the external context, as a 'black box' [4, 11–14]. Of particular relevance to these programs are failures to describe or take into account community involvement in the design stages of an intervention [8]; the dynamic, pervasive, and historical influences of inner and outer implementation contexts [12, 14–17]; or pathways for change [13, 14]. A comprehensive set of propositions to guide the extraction of evidence relevant to the planning, implementation, and evaluation of complex community health programs is missing.

Our research team was interested in applying a set of propositions that arose out of a multiple intervention framework to examine reports on community health interventions [4]. To this end, we present a set of propositions that reflects best practices for intervention design, implementation, and evaluation for multiple interventions in community health, and we conduct a preliminary assessment of information reported in the published literature that corresponds to the propositions.

Propositions for the design, implementation and evaluation of community health interventions

The initial sources for propositions were primary studies and a series of systematic and integrative reviews of many large-scale multiple intervention programs in community health (e.g., in the fields of tobacco control, heart health, injury prevention, HIV/AIDS, and workplace health) [8, 10, 18–24]. By multiple interventions, we mean multi-level and multi-strategy interventions [4]. Common to many of these reports were notable failures of well-designed research studies to achieve expected outcomes. Insights for propositions include researchers' reflections on the failure of their multiple intervention effectiveness studies to yield hypothesized outcomes, and reviews of community trials elaborating reasons why some multiple intervention programs have not demonstrated their intended impact [8, 10, 22, 23, 25, 26]. The predominant and recurring reasons for these failures are addressed in the initial set of propositions for how and why interventions contribute to positive outcomes.

The propositions arise from and are organized within a multiple interventions program framework (see Figure 1 and Table 1). The framework is based on social ecological principles and supported by theoretical and empirical literature describing the design, implementation, and evaluation of multiple intervention programs [8–10, 18–21, 25–29]. The framework has four main elements, and several processes within these elements. The propositions address some of the common reasons reported to explain failures in multiple intervention research.

Figure 1

Multiple Interventions Program Framework. (adapted from Edwards, Mill & Kothari, 2004, reproduced with permission).

Table 1 Summary of propositions for multiple interventions in community health


Methods

The preliminary assessment involved three main steps: selection of a sample of multiple intervention projects and publications, development of a data extraction tool, and data extraction from the publications.

Selection of a sample of multiple intervention projects and publications

A first set of criteria was established to guide the selection of a pool of community-based multi-strategy and multi-level programs to use as case examples. The intent was not to be exhaustive, but to identify a set of programs addressing a particular health issue that we anticipated might report details relevant to the propositions. The team decided that reporting of such intervention features would most likely be represented in: a community-based primary prevention intervention program; a program that was well-resourced and evaluated, and thus represented a favorable opportunity for a pool of publications that potentially reported key intervention processes; and a health issue that had been tackled using multiple intervention programs for a prolonged period, thus reflecting the maturation of ideas in the field.

Over the last 30 years, community-based cardiovascular disease prevention programs have been conducted world-wide and their results have been widely published. The first pioneering community-based heart health program was the North Karelia Project in Finland, launched in 1971 [30]. Subsequent efforts included research and demonstration projects in the United States and Europe, such as the Minnesota Heart Health Program and Heartbeat Wales [9, 31, 32]. Although specific interventions varied across these projects, the general approach was similar. Community interventions were designed to reduce major modifiable risk factors in the general population and priority subgroups, and were implemented in various community settings to reach well-defined population groups. Interventions were theoretically sound and informed by research in diverse fields such as individual behaviour change, diffusion of innovations, and organizational and community change. Combinations of interventions employed multiple strategies (e.g., media, education, policy) and targeted multiple layers of the social ecological system (e.g., individuals, social networks, organizations, communities). Many of these exemplar community heart health programs were well-resourced relative to other preventive and public health programs, including large budgets for both process and outcome evaluations. Thus, community-based cardiovascular disease prevention program studies were chosen as the case exemplar from which to select publications to explore whether specific features of interventions, as defined by the propositions, were in fact described.

To guide the selection of a pool of published literature on community-based heart health programs, a second set of criteria was established. These included: studies representative of community-based heart health programs that were designed and recognized as exemplars of multiple intervention programs; studies deemed to be methodologically sound in an existing systematic review; and reports published in English. Selection of published articles meeting these criteria involved a two-step process. First, a search of the Effective Public Health Practice Project [33] was conducted to identify a systematic review of community-based heart health programs. The most recent was by Dobbins and Beyers [25], who identified a pool of ten heart health programs deemed methodologically moderate or strong. From this pool, a subset of three projects was selected: the North Karelia Heart Health Project (1971–1992), Heartbeat Wales (1985–1990), and the Minnesota Heart Health Program (1980–1993), all of which were well-resourced, extensively evaluated, and provided a pool of rigorous studies describing intervention effectiveness.

Second, a subset of primary publications identified in Dobbins and Beyers' [25] systematic review was retrieved for each of the three programs. In total, four articles were retrieved and reviewed for the Minnesota Heart Health Program [34–37] and five articles for Heartbeat Wales [38–42]. For Heartbeat Wales, a technical report was also used because several of the publications referred to it for descriptions of the intervention [43]. The primary studies and detailed descriptions of the project design, implementation, and evaluation for the North Karelia Project were retrieved from its book compilation [30].

Development of a data extraction tool

The team was interested in identifying the types of intervention information reported, or not reported, in the published literature that corresponded with the best processes featured in the propositions for the design, delivery, and evaluation of multiple intervention programs. To enhance the consistency, accuracy, and completeness of this extraction, a systematic method was used to extract the intervention information reported in the selected research studies. Existing intervention extraction forms [44, 45] were first critiqued to determine their relevance for extracting the types of intervention information corresponding to the propositions. These forms provided closed-ended responses for various characteristics of interventions, but did not allow for the collection of information on the more complex intervention processes reflected in the propositions. Thus, the research team designed a data extraction tool that would guide the extraction of intervention information compatible with the propositions.

To this end, an open-ended format was used to extract verbatim text from the publications. Standard definitions for the propositions were developed (see Tables 2 through 7 in the results section), informed by key sources that described pertinent terms and concepts (e.g., sustainability, synergy) [46–51]. To enhance the completeness and consistency of data extraction, examples were added to the definitions following an early review of data extraction (see below).

Table 2 Summary of data reported for integrating theory (proposition one)
Table 3 Summary of data reported for creating synergy (propositions two and three)
Table 4 Summary of data reported for achieving adequate implementation (propositions four and five)
Table 5 Summary of data reported for creating enabling structures and conditions (proposition six)
Table 6 Summary of data reported for modification of interventions during implementation (propositions seven and eight)
Table 7 Summary of data reported for facilitating sustainability (proposition nine)

Data extraction from the publications

Pairs from the research team were assigned to one of the three heart health projects. Information from the studies was first extracted independently, and then the pairs for each project compared results to identify any patterns of discrepancies. Throughout the process, all issues and questions related to the data extraction were synthesized by a third party. Early on, examples were added to the definitions of the propositions to increase consistency of information extracted with respect to content and level of detail. Through discussion within pairs and across the research team, consensus was reached on information pertinent to the propositions, and each pair consolidated the information onto one form for each project. The consolidated form containing the consensus decisions from each pair was then used to compare patterns across the full set of articles. All members of the research team participated in the process to identify trends and issues related to reporting on relevant intervention processes. These trends and issues are described in the next section.


Results

Results are reported for each proposition in order from one through nine, and grouped according to the themes shown in the multiple interventions program framework (Figure 1). For each proposition, results are briefly described in the text. These descriptions are accompanied by a table that includes the operational definition for the proposition, findings related to reporting on the proposition, and illustrative verbatim examples from one or more of the projects.

Integrating theory (proposition one)

Information regarding the use of theories was most often presented as a list, with limited description of the complementary or unifying connections among the theories in the design of the interventions. Commonly, intervention programs projected changes at multiple socio-ecological levels, such as individual behaviour changes, in addition to macro-environmental changes. However, while theories were used for interventions targeting various levels of the system, the integration of multiple theories was generally implicit and simply reflected in the anticipated outputs. Although less common, the use of several theories was made more explicit through description of the use of a program planning tool, such as a logic model (Table 2).

Creating synergy (propositions two and three)

General references were frequently made regarding the rationale for combining, sequencing, and staging interventions as an approach to optimizing overall program effectiveness and/or sustainability. In particular, references to this were most often found in proposed explanations for shortfalls in expected outcomes. However, specific details regarding how intervention strategies were combined, sequenced, or staged across levels, as well as across sectors and jurisdictions, were usually absent. Thus, insufficient information was provided to understand potential synergies that may have arisen from coordinating interventions across sectors and jurisdictions. In contrast, more specific details were reported for the combining, sequencing, and staging of interventions within levels of the system (e.g., a series of interventions directed at the intrapersonal level) (Table 3).

Achieving adequate implementation (propositions four and five)

Proposition four specifically considers the quantitative aspects of implementation. Information reported ranged from general statements to specific details. Although the population subgroups targeted by the intervention were often clearly identified, information regarding the estimated reach of the intervention was generally non-specific. The amount of time for specific intervention strategies and the overall program tended to be reported in time periods such as weeks, months or years. Information regarding specific exposure times for interventions tended to be unavailable. The intensity of interventions was provided in some reports, with authors describing strategies that included the passive receipt of information, interaction, and/or environmental changes. A description of investment levels is also a marker of the intensity of an intervention strategy. However, investment descriptions were quite variable, ranging from no information to general information on investment of human and financial resources. In addition, challenges to reporting costs and benefits were often acknowledged.

Proposition five considers the quality of implementation, represented by qualitative descriptions of the intervention. Reporting regarding the quality of the implementation was primarily implicit (Table 4).

Creating enabling structures and conditions (proposition six)

Reporting of information relative to the deliberate creation of structures and conditions was limited and generally implicit, often embedded in the details of intervention implementation (Table 5).

Modifying interventions during implementation (propositions seven and eight)

Although authors acknowledged the importance of flexibility in intervention delivery, information regarding adaptations to environmental circumstances was vague. Reference to context was often in discussion sections of studies, and provided as a partial explanation for unintended or unexpected outcomes. There was minimal description regarding the modification of interventions in response to information gained from process/formative evaluation, outcomes, or population trends – the core of proposition eight. Again, authors acknowledged the significance of process/formative evaluation in informing intervention implementation, with some examples to illustrate how interventions were guided in response to information gathered. At other times, in the summative evaluation, reporting focused on using process evaluation results to explain why expected outcomes were or were not achieved, rather than how the process evaluation results did or did not shape the interventions during implementation. Suggestions for improved program success, based on information gained from formative evaluations, were noted in some discussions (Table 6).

Facilitating sustainability (proposition nine)

Reporting on elements regarding the intention to facilitate sustainability of multiple intervention benefits was also variable. Authors made reference to the notion of sustainability at the outset of projects and described the conditions and supports in place to facilitate continued and extended benefits. Elements of sustainability represented in program outcomes were also described in some detail. In other examples, reporting focused only on sustainability of the program during the initial research phase of program implementation and discussed the desirability of continuing the program beyond the research phase (Table 7).


Discussion

The primary purpose of this paper was to conduct a preliminary assessment of information reported in the published literature on 'best' processes for multiple interventions in community health. It is only with this information that questions of how and why interventions work can be studied in systematic reviews and other synthesis methods (e.g., realist synthesis). The best processes were expressed as a set of propositions that arose from and were organized within a multiple interventions program framework. Community-based heart health programs were used as case examples.

Although some information was reported for each of the nine propositions, there was considerable variability across studies in the quantity, specificity, and explicitness of the information provided. Several possible explanations may account for the insufficient reporting of implementation information. Authors are bound by word count restrictions in journal articles, and consequently, process details such as program reach might be excluded in favour of reporting methods and outcomes [3]. Reporting practices also reflect what has traditionally been viewed as important in intervention research: an emphasis on reporting to prove the worth of interventions rather than to improve community health interventions. This follows from the emphasis on answering questions of attribution (does a program lead to the intended outcomes?) rather than questions of adaptation (how does a dynamic program respond to changing community readiness, shifting community capacity, and policy windows that suddenly open?) [16, 52].

An alternative explanation is that researchers are not attending to the processes identified in the propositions when they design multiple intervention programs. Following these propositions requires a transdisciplinary approach to integrating theory, implementation models that allow for contextual adaptation and feedback processes, and mixed methods designs that guide the integrative analysis of quantitative and qualitative findings. These all bring into question some of the fundamental principles that have long been espoused for community health intervention research, including issues of fidelity, the use of standardized interventions, the need to adhere to predictive theory, and the importance of following underlying research paradigms. When coupled with the challenges of operationalizing a complex community health research study that is time- and resource-limited, it is perhaps not surprising that the propositions were unevenly and weakly addressed.

It would be premature to generalize these results to other programs. The three multiple intervention programs selected for this study (the North Karelia Project, Heartbeat Wales, and the Minnesota Heart Health Program) were implemented between 1971 and 1993, and represented the 'crème de la crème' of heart health programs in terms of study resources and design. In particular, the North Karelia Project continues to receive considerable attention due to the impressive outcomes achieved [17]. We think it would be useful to apply the data extraction tool developed by our team to some of the more contemporary multiple intervention programs targeting chronic illness. Our findings would provide a useful basis of comparison to determine whether there has been an improvement over the past decade in the reporting of information pertinent to the propositions. Before embarking on this step, it would be helpful to have further input on the data extraction tool, particularly from those involved in developing new approaches to extract data on the processes of complex interventions within the Cochrane initiative [3].


Conclusion

Study findings suggest that limited reporting on intervention processes is a weak link in published research on multiple intervention programs in community health. Insufficient reporting prevents the systematic study of processes contributing to health outcomes across studies. In turn, this prevents the development and implementation of evidence-based practice guidelines. Based on the findings, and recognizing the preliminary status of the work, we offer two promising directions.

First, it is clear that a standard tool is needed to guide systematic reporting of multiple intervention programs. Such a tool could both inform the design of such research and ensure that important information is available to readers of this literature and to systematic analyses across studies. In addition, a research tool that describes best processes for interventions could benefit practitioners who are responsible for program design, delivery, and evaluation.

Second, the reasons for limited reporting on intervention processes need to be understood. Some issues to explore include the influence of publication policies for relevant journals, and the types of research questions and processes that are used.

It is through a more concerted effort to describe and understand the 'black box' processes of multiple intervention programs that we will move this field of research and practice forward. It is our contention that a shift to more inclusive reporting of intervention processes would lead to a better understanding of the successful or unsuccessful features of multi-strategy and multi-level interventions, and thereby improve the potential for effective practice and outcomes.