Program Logic Modelling and Complex Positive Psychology Intervention Design and Implementation: The ‘Resilient Futures’ Case Example


Positive psychology interventions (PPIs) and programs differ markedly in implementation quality. Lower-quality implementation is associated with interventions that include multiple components (or PPIs), are delivered across multiple layers (individual, workgroup/classroom and organisation/school) or agency sites, and include cohorts with heterogeneous or complex needs (e.g., trauma). This paper collectively refers to these interventions as ‘complex PPIs’ or ‘complex programs’. Drawing upon the implementation science literature, we argue that logic modelling represents a method to guide the design and implementation of ‘complex’ interventions. We describe a growth-focused logic model and implementation methodology (titled intentional practice) that operationalises positive psychology outcomes and processes to support program developers in achieving a balance between fidelity and adaptation. The Resilient Futures program, developed by the South Australian Health and Medical Research Institute (SAHMRI) Wellbeing and Resilience Centre, is described as a case example of the approach. This program represents a large-scale, multi-site implementation of a resilience and wellbeing skill-building program for 850 young people from disadvantaged backgrounds with heterogeneous needs. The case study describes how logic modelling and intentional practice consolidated early theory-building work to operationalise the intervention at the program level, and supported multi-site nuancing and translation of the program to individual sites and agency/youth needs. The article argues that logic modelling offers a flexible and evidence-based method for program designers, facilitators and researchers to design and implement complex PPIs or programs. Key considerations for utilising logic models are offered.


Psychology has long aimed to increase happiness and wellbeing. With a more solid evidence base emerging from Fordyce’s pioneering program to increase personal happiness (Fordyce 1977, 1983) and Diener’s groundbreaking work in measuring and conceptualising subjective wellbeing (Diener 1984, 1994), ‘positive psychology interventions’ (PPIs) have emerged as empirically tested strategies, exercises and activities designed to promote happiness and wellbeing (Parks and Schueller 2014). PPIs within the literature focus on outcomes such as optimism, meaningfulness, resilience, gratitude, kindness, and compassion. PPIs draw upon a range of strategies, activities and methods, including character strength identification, mindful awareness and savouring approaches, as well as goal-setting and coaching techniques.

There is emerging evidence for the efficacy of PPIs. To date there have been two overarching meta-analytic reviews of selected randomised controlled trials on the impact of PPIs (Bolier et al. 2013; Sin and Lyubomirsky 2009). Bolier and colleagues (2013) isolated 39 studies involving 6139 participants and reported statistically significant (p < .01) improvements in subjective (mean d = .34) and psychological wellbeing (mean d = .20), and a reduction in depressive symptoms (mean d = .23). Sin and Lyubomirsky’s (2009) review of 51 intervention trials involving 4266 participants revealed significant promotion of wellbeing (mean r = .29) and reduction in depressive symptoms (mean r = .31). The degree of impact of PPIs is similar to that of broader interventions targeting resilience outcomes (e.g., Leppin et al. 2014, mean d = .37).

Beyond efficacy trials, Hone et al. (2015) assessed the intervention utility of 40 PPI trials, encompassing 10,664 adult participants, that were implemented in real-world settings. This review highlighted sparse reporting on aspects such as participation rates, methods used to select delivery agents, differences between participants and non-participants, and program maintenance and costs. It also highlighted issues with sample selection: 68% of participants were female, 87% were Caucasian, 62% were educated to university level, and the average age was 43 years.

In short, while there is emerging evidence regarding the efficacy of PPIs in improving wellbeing and reducing depressive symptoms, greater attention needs to be paid to their design and implementation in real-world and complex settings (e.g., White 2016). We now demonstrate that the program development and implementation science literature offers significant utility in this regard.

Implementation of Complex Positive Psychology Interventions or Programs

Meta-analyses and systematic reviews have identified notable variability in outcome effect sizes for PPIs (Bolier et al. 2013; Iddon et al. 2016; Malouff and Schutte 2017; Sin and Lyubomirsky 2009; Sisk et al. 2018), a theme also noted across the broader educational literature (e.g., social and emotional learning [SEL] interventions: Durlak et al. 2011). Within positive psychology, there is a movement away from a ‘what works’ towards a ‘what works for whom’ perspective, which acknowledges the interaction effect between intervention (PPI) and person, or ‘person-activity fit’ (Schueller 2014). As the literature develops, there is likely to be increased understanding of the participant, programmatic and contextual factors that moderate PPI effect sizes. As an example, a recent meta-analysis reviewed the relationship between interventions targeting ‘growth mindsets’ and academic achievement. The researchers found that such interventions have differential effects across individual cohorts, with young people from lower socio-economic backgrounds and at higher risk of educational disengagement most likely to benefit (Sisk et al. 2018). More broadly, the educational, wellbeing, forensic and clinical literatures have isolated covariates associated with stronger intervention effects (Durlak and DuPre 2008; Fixsen et al. 2005; Lipsey 2009). Drawing upon this literature, Raymond (2018b) proposed five principles associated with programs and interventions (targeting social, emotional and wellbeing competencies) that deliver higher-impact outcomes, as demonstrated by larger intervention effect sizes. These include:

  • Conceptually sound – the intervention has clearly articulated outcomes, and activities and methods that are logically coordinated to deliver these outcomes.

  • Skill-focused – the intervention brings an intent to behaviourally expressed skill development (e.g., mindfulness, problem solving).

  • Targeted – the intervention is directed at individuals whose psychological or behavioural presentation is the focus of the intervention.

  • Responsive – the intervention activities and implementation delivery seek to engage, positively challenge, motivate, and stimulate growth outcomes in the target cohort.

  • Program fidelity (integrity) – the intervention is implemented as intended and designed, with rigour brought to implementation quality.

There is wide variability in the degree to which such principles are operationalised in PPI program design and implementation, with ‘program fidelity’ (or consistency of delivery as per intended design) having particular relevance to the delivery of PPIs within real-world settings (Hone et al. 2015). Interventions that include multiple components (e.g., two or more PPIs), are delivered across multiple levels (e.g., individual, workgroup or classroom, organisational) or implementation sites, and target cohorts with heterogeneous or complex needs (e.g., trauma) are at higher risk of both lower implementation quality and fidelity (Malti et al. 2016; Michie et al. 2009). Within the positive psychology discipline, multi-component interventions are characteristic of whole-of-agency (entire school or organisation) or multi-site interventions. The field of ‘positive education’, in particular, has brought significant attention to system-based implementation of PPIs (e.g., Norrish 2015). We collectively refer to such intervention types as ‘complex PPIs’ or ‘complex programs’.

The word ‘complex’ denotes heterogeneity in programmatic and setting conditions, while the word ‘program’ denotes the importance of planning and integration to intervention design and implementation. Such programs have also been described as ‘complex behaviour change interventions’ (Michie et al. 2009). ‘Complexity’ as a construct has been isolated in the literature as an important program design and implementation consideration (Craig et al. 2008; Michie et al. 2009). Across positive psychology, there are examples of complex programs being delivered in a high-quality manner with fidelity. A leader in this field is Geelong Grammar, a private school in Victoria, Australia, which has embedded positive psychology principles, values and content across multiple levels of the school (Norrish 2015; O’Connor and Cameron 2017; Seligman et al. 2009; White 2013). O’Connor and Cameron (2017) describe a planned and resourced implementation schedule targeted at multiple levels (student, teacher, family, leadership and organisational) and integrating a range of teaching strategies (both implicit and explicit). Another leading example is St Peter’s College, a K–12 private boys’ school in Adelaide, South Australia (Waters and White 2015; White and Waters 2015). Their implementation strategy used appreciative enquiry to underpin the organisational change process (Waters and White 2015), and embedded positive psychology constructs across multiple levels including sport, curriculum, leadership programs, counselling and wellbeing services (White and Waters 2015). White and Waters reported that aspects of their design and implementation (e.g., infusing character strengths) occurred through a ‘shotgun’ approach whereby the PPI was implemented through “multiple initiatives rather than one universal program” (p. 74). We argue for a programmatic approach to complex PPI design and implementation.

It is important to note that the two case examples were underpinned by highly resourced implementation strategies that are not easily replicated across other complex programs (or positive education locations) or within not-for-profit community service settings. More generally, positive education is not demonstrating the same levels of traction or outcomes (e.g., not ‘sticking’, White 2016). White and others (e.g., O’Connor and Cameron 2017; Slemp et al. 2017) have argued for a planned and resourced approach to positive psychology implementation, supported through systems and underpinned by implementation science. This view is consistent with the broader literature on complex program design and implementation, which argues for detailed attention to planning (Crosby and Noar 2011; Michie et al. 2011) and the clear articulation of intervention components (Eldredge et al. 2016; Kok et al. 2016). The implementation science literature brings focus to the high-quality translation (or implementation) of scientific knowledge into practice through organisational, structural, financial, and professional strategies (Albrecht et al. 2013; Fixsen et al. 2005). The increased focus on ‘implementation’ has emerged owing to wide slippages in the translation of evidence into practice across multiple human service disciplines (Fixsen et al. 2009). While a strong argument is made for the implementation of evidence-based practices, increasing importance is being placed on contextualising programs and interventions to their cohorts and settings (Malti et al. 2016). In other words, program developers need to find the “right mix of fidelity and adaptation” (Durlak and DuPre 2008, p. 341) or ‘flexibility within fidelity’ (Kendall et al. 2008). This movement towards flexibility has brought an implementation focus to ‘evidence-based adaptation’ (Ghate 2016).
This thinking has been particularly highlighted in child and youth cohorts, where interventions should be matched to specific developmental (Malti et al. 2016) or presenting needs (e.g., trauma; see Brunzell et al. 2016 for an integration of positive psychology and trauma-informed practice).

We argue that complex positive psychology programs (or positive education interventions) should be designed and implemented in an intentional and programmatic manner, thus maximising the outcomes that can be obtained from finite resources. Given the importance of contextualising programs to setting and cohort, we suggest the operationalisation of complex interventions through a ‘home-grown’ model that draws upon local wisdom and knowledge, and brings focus to both evidence-based practice and adaptation. We contend that such modelling should cohesively describe: (1) desired localised outcomes, (2) evidence-based methods or intervention components (e.g., PPIs) that target those outcomes, and (3) the theory of change (or practice philosophy) that underpins the program. Program logic modelling is introduced as a method to support such an endeavour.

Program Logic Modelling

Logic modelling is a method to strengthen program design and implementation, and offers particular utility for multi-component or complex programs. It is an approach that conceptually describes the relationship (or logic) between an individual program’s processes (program delivery and activities) and its associated outputs or outcomes (Cooksy et al. 2001). In other words, it offers a simplified way to “systematically link problems, activities, and outcomes” (Renger et al. 2007, p. 202). Such models are often represented in table or diagram form with the following organising categories: (1) inputs or resources, (2) program activities, (3) outputs or deliverables, and (4) desired outcomes or program impact. These categories are populated with content specific to the individual program, target group and its context. Logic models bring strong attention to the principle of “conceptually sound” program design, or the coherent and evidence-informed mapping of program processes and outcomes (Raymond 2018b).

Program logic models are routinely applied across diverse settings and disciplines, including to operationalise multi-site school interventions (Holliday 2014), research programming (O’Keefe and Head 2011), public health promotion (Crosby and Noar 2011), conservation planning (Margoluis et al. 2009), forensic treatment (Stinchcomb 2001) and organisational development and coaching (Oosthuizen and Louw 2013), as just some examples. Logic models offer a method to support agencies or work teams to develop a shared understanding of the program model, including its delivery and underpinning philosophies (English and Kaleveld 2003; McLaughlin and Jordan 2004). The models can also be used to describe the relationship between short-, medium- and longer-term outcomes (Julian 1997), thereby operationalising program evaluation (Cooksy et al. 2001; Jordan 2013). Logic models also have a key role in bringing accountability to intervention outcomes and the use of finite resources (Hernandez 2000).

An important feature of logic models is that they ‘illuminate’ and ‘magnify’ key elements or components (e.g., specific intervention activities, PPIs) within program design and implementation (Bordage 2009). By illuminating these elements, program developers can bring higher awareness and energy to their quality implementation. Logic models bring focus to clearly operationalising intervention processes and outcomes. Such an approach has particular relevance to positive psychology as an emerging discipline, given there have been increasing calls to bring rigour and consistency to the operationalisation of positive psychology constructs (Hone et al. 2014; Pawelski 2016) and interventions (Crane et al. 2016; Iddon et al. 2016). The current lack of coherence of constructs remains an impediment to the growth of the field (White 2016).

There are a number of strengths to logic modelling as an approach. First, logic modelling offers a method to coherently describe program models and illuminate key intervention features. Coherence of description, supported through a communication strategy, is a key driver of program fidelity and implementation quality within complex programming (Blase et al. 2012; Fixsen et al. 2013; Fixsen et al. 2005; Michie et al. 2009). Second, the method seeks to isolate the ‘heavy lifters’ or ‘active ingredients’ of an intervention, or the fixed versus flexible intervention components (Malti et al. 2016). In other words, it provides a method of “identifying and distinguishing core elements and adaptable characteristics from an EBP [evidence-based program], then supporting the implementation of the adapted model” (Aarons et al. 2012, p. 7). When logic modelling is supported by a scientific approach to local implementation and adaptation (Stricker and Trierweiler 2006), program adaptations can occur in an intentional, cost-effective and efficient manner (Raymond et al. 2018).

A core weakness of logic models is that they represent fixed, point-in-time model constructions, which cannot be easily and dynamically operationalised for heterogeneous programs or client-centred variables (Julian 1997; Renger et al. 2007). In other words, while they support dynamic and evidence-based adaptation of the intervention at the program design level, they cannot easily describe or operationalise adaptation (or contextualisation) to specific client needs, personalities or backgrounds. In light of these limitations, a growth-focused logic model and intentional practice implementation method was developed (Raymond 2018b).

Growth-Focused Logic Modelling (Intentional Practice)

Intentional Practice is an implementation methodology where individuals, programs and organisations bring ‘mindful awareness’ to the intent of an intervention, including the desired outcomes (‘what’) and the processes or mechanisms (‘how’) by which they are achieved (Raymond 2018b). Developed with reference to the implementation, positive psychology and trauma-informed science literature (see Raymond 2016a, 2018b), the method asks the individual, program or institution to bring mindful awareness to key questions such as:

  • What is the intent, energy or philosophy driving the intervention (or supporting role)?

  • What outcome is the focus of the intervention?

  • How, or by which method or process, is this outcome being achieved?

  • Is the intervention having an activating or growth effect?

These questions can be operationalised at multiple levels. This includes across an entire institution (e.g., strategic intent), through program design and implementation (Raymond et al. 2018; Raymond 2016a, 2018b; Raymond 2019; Raymond and Lappin 2016), and in moment-to-moment practice or supporting adult roles (e.g., teachers, counsellors, professionals, youth workers). With respect to the latter, intentional practice has been applied to operationalise trauma-informed care (Raymond 2019), clinical interventions (Raymond 2018a) and implicit and explicit learning methods (Raymond et al. 2018). To support the translation across all three layers (institutional, program design and implementation, and moment-to-moment practice), a model of intentional practice (Life Buoyancy Model; LBM) has been developed (Raymond 2018b), and is depicted in Fig. 1 below.

Fig. 1

Life Buoyancy Model. * Figure 1 reproduced with permission of the Licensor through PLSclear, Raymond 2018b

The LBM provides an organising scaffold for the population of ‘home-grown’ (or contextualised) program models that bring high awareness to conceptually sound intervention design and implementation. The LBM comprises two primary categories, (1) outcomes (‘what’) and (2) processes (‘how’), which are further delineated through secondary categories, and the modelling is underpinned by a growth-focused practice philosophy. Within program design, the primary and secondary categories of the LBM are populated with context-specific content. Each is considered in turn, with detailed evidence supporting each category available elsewhere (Raymond 2018b).


‘Outcomes’ articulate the intent or purpose of the intervention, or “what” the intervention is designed to achieve. In the model, this primary category is further delineated into a hierarchy of short-, medium- and long-term outcomes. The long-term outcome (or impact) represents the vision or desired intervention impact, with the medium-term outcomes representing intermediate goals that have a conceptual or evidence-informed relationship with the long-term outcome. A core feature of the LBM is that it brings the strongest attention to the immediate intent (or growth intent) of the intervention (short-term outcomes). The short-term outcomes have an evidence-informed relationship with the medium-term outcomes and are delineated under the organising categories of:

  1. Awareness – the knowledge or insight an individual holds of themselves, others, their world, their past and future, their actions, intended outcomes and identity.

  2. Skills – the behaviours, actions and coping responses an individual applies to respond to day-to-day demands (e.g., self-regulation) and thrive across all life domains.

  3. Mindsets – the thinking processes (patterns of thoughts/beliefs) an individual holds about themselves, others and their world.

The short-term outcomes (or ‘growth intent’) remain a central feature of intentional practice and the LBM because they provide a mechanism for interventions to be operationalised into moment-to-moment practice (e.g., facilitation, coaching, counselling, clinical work). In other words, personnel can be trained and coached to bring a ‘growth intent’ to their roles, matched to the growth of awareness, skills or mindsets articulated in the broader model. This provides a communication bridge for the program model to be translated into direct practice.


‘Processes’ bring attention to “how” the intervention delivers its stated outcomes, and they are divided into the secondary categories of ‘intervention components’ and ‘activating experiences’. Intervention components include the activities, learning experiences, training sessions, program deliverables or core communication patterns that are delivered within the intervention (Raymond 2018b). The LBM (and intentional practice method) does not prescribe intervention components, but instead values creativity, innovation and the use of pre-existing interventions or program components that are delivered in an intentional manner, aligned with evidence and with high awareness. In other words, the modelling asks the practitioner or program developer to critically reflect upon their intervention and isolate the ‘heavy lifters’ or intervention components central to their work. At the counselling, coaching or clinical level, intervention components may include individual PPIs, clinical interventions (e.g., cognitive restructuring, motivational interviewing), psycho-education or homework exercises (for an integration of PPIs and clinical interventions see Raymond 2018a). At the program level, they include training, staff coaching and mentoring, embedding systems and templates, and individual programs (e.g., a social and emotional learning program, a group of PPIs).

Intervention components were conceptualised from the viewpoint that interventions can be sub-divided into meaningful parts or ‘modules’, which can be implemented collectively or independently. This has been drawn from the therapeutic (Chorpita et al. 2005; Lyon et al. 2014) and the program design and implementation literature (Eldredge et al. 2016; Kok et al. 2016). This modular or ‘common elements’ approach seeks to identify the ‘main ingredients’ within an intervention (Chorpita et al. 2005). It is founded upon the premise that evidence-based programs can be deconstructed into core components and adaptable features (see Dynamic Adaptation Process [DAP], Aarons et al. 2012). Chorpita et al. (2005) suggest that modular approaches offer a balance between flexibility, prescription and structure, in a manner where the intervention is contextualised to the local setting and specific client needs. A metaphor to describe intentional practice and intervention components is the wagon wheel (adapted from Brunk et al. 2014). As depicted in Fig. 2, the hub of the wheel represents intentional practice as the unifying implementation method, with each spoke supporting the external rim representing an individual ‘intervention component’.

Fig. 2

Intentional practice as represented by a wagon wheel

When the spokes are clearly articulated and strong in implementation, the wheel retains its shape and fidelity is achieved. When the spokes are missing or weak, the wheel (or program or service delivery) can distort and become less effective. For example, consider a whole-of-agency wellbeing intervention that includes the intervention components of (1) staff resilience training, (2) a peer mentoring program, and (3) an employee assistance program. If one of these components is missing, or delivered with low quality, the collective intervention is compromised. Across the broader intervention literature, complex programs are often delivered through ‘bundled’ or multi-component interventions, each of which brings a different content focus. It is not unusual for such interventions to be delivered in a non-integrative manner with low fidelity (Michie et al. 2009), analogous to a weak or distorted wheel (or rim). This point is likely to have particular relevance to the delivery of complex programs within positive psychology.

‘Activating experiences’ bring attention to three process factors (validation, curiosity and coaching) that engage, positively challenge, motivate, and stimulate (or activate) growth outcomes (Raymond 2018b). These factors were drawn from the therapeutic, educational and positive psychology literature, and bring focus to key common drivers or elements that ‘activate’ the potential of the intervention components or modules. They are briefly summarised as follows:

  • Validation is when the intervention (and its delivery) is experienced in a manner that makes the individual (or group) feel heard, understood and valued (‘I am noticed’ and ‘I matter’).

  • Curiosity is when the intervention activates curious and enquiring reflection linked to the intervention and/or its application (‘I wonder what this means’).

  • Coaching is a process that focuses on the understanding and application of the learning content through expressed actions and skills (‘how can I apply this’).

These process elements support the nuancing of the LBM and intentional practice method to specific client needs and contexts. They are applied in two main ways. First, once a program logic has been developed, they support the articulation of a program’s theory (or how intervention components deliver growth outcomes). Second, they are applied to operationalise moment-to-moment practice (e.g., teaching, facilitation, clinical work). For example, consider a teacher who is bringing a ‘growth intent’ to growing awareness of ‘what is mindfulness’ for a group of 13-year-old students. The intervention component is ‘direct classroom instruction’. Within the delivery of the intervention, the facilitator ensures the group feels heard and validated (e.g., the voice of participants is empowered), the delivery evokes curiosity (e.g., supported by audio-visual aids, humour) and the intervention includes time and practice in ‘how’ the content can be applied (e.g., mindfulness activities).

Practice Philosophy (Growth Intent)

The LBM was designed to be applied across multi-disciplinary settings. To achieve this aim, the model was framed and organised through positive psychology principles and constructs, for example ‘growing’ individual capacity for optimal functioning (Seligman and Csikszentmihalyi 2000). The LBM’s categories were labelled with reference to this literature. Importantly, the model isolates ‘growth’ as the core category of intent to be brought to focus within intervention design and implementation. That is, the intent or purpose of interventions is to ‘grow’ (or build) the capacity of individuals for improved wellbeing and behavioural outcomes (Raymond 2016a). This growth-focused orientation is operationalised through individuals and programs adopting a ‘growth intent’. The growth intent construct has been drawn from the self-determination (Deci and Ryan 2000) and growth mindset literature (Dweck 2012). For this reason, the LBM is described as a growth-focused model of intentional practice.

The reason the model explicitly names growth as an intent is that many complex programs are embedded within agencies and systems that have competing needs and different intents. For example, an agency (e.g., school) may bring focused energy and intent to ‘manage’ client behaviour or organisational risk, or bring focus to ‘compliance’ in terms of organisational systems or funder demands. By naming ‘growth’ as an intent, the practice philosophies of positive psychology are magnified and illuminated, and can be more easily embedded alongside existing systems (see Raymond and Lappin 2017 for a case example).


Intentional practice is an awareness-raising methodology designed to support the delivery of higher-impact intervention outcomes (Raymond 2018b). By ‘describing’, as opposed to ‘prescribing’, intervention conditions, it provides a flexible method for program developers to bring creative flair to their work, but within a framework of intentionality and awareness of intent, desired outcomes (‘what’) and method (‘how’). Intentional practice was infused as an implementation methodology of the Resilient Futures program.

Case Example: Resilient Futures

Resilient Futures is a complex positive psychology program. The program was designed to increase the education participation, wellbeing and life prospects of young South Australians at risk of negative future outcomes or disadvantage (e.g., young people at risk of vocational and educational disengagement, mental illness, offending, poverty and social exclusion). The program was developed by the South Australian Health and Medical Research Institute’s (SAHMRI) Wellbeing and Resilience Centre (WRC). Substantial philanthropic funding was provided by the Wyatt Trust and the James and Dianna Ramsey Foundation. The Resilient Futures program underwent an extensive co-design and iterative development process. Detailed information on the program, including key pivots made within design and implementation, and preliminary outcome data, is described elsewhere (Raymond et al. 2018). This case study focuses specifically on the use of logic modelling to operationalise the relationship between program outcomes (‘what’), methods (‘how’) and the underpinning program philosophy or intent.

The broad objectives of the three-year project were twofold. First, to measure and build the resilience and wellbeing of 850 disadvantaged young people (aged 16–20) by providing resilience skills training, reinforced through ongoing adult mentoring and access to online tools and resources. Second, to invest in the skills and knowledge of youth workers and educators. This investment was also founded upon a systems view of improving the professional practice of building wellbeing and resilience skills in young people, and bringing a focus to addressing key distal and proximal factors associated with social exclusion and disadvantage.

In order to recruit participants from traditionally hard-to-reach cohorts, the WRC collaborated with key agencies in program implementation. These agencies were selected based on their demonstrated long-term engagement with vulnerable young people in disadvantaged communities, and their commitment to structured adult-to-youth relationships through mentoring, coaching or case engagement approaches. Initial partners included: three alternative schools focused on reconnecting disengaged learners to mainstream education; a youth agency providing case management and learning to disengaged learners and young people in custodial care; an international aid agency focused on youth justice and custodial care; a specialist youth mental health provider; and a therapeutic care agency specialising in residential services for young people who have been removed from their families.

Resilient Futures meets the definition of a complex positive psychology program. The program included multiple individual PPIs that were mapped to 10 resilience skills (e.g., growth mindset, mindfulness, values-based goals) and were collectively referred to within the program as the RF-10 skills (for an individual skill breakdown see Raymond et al. 2018). The program also included a range of complex implementation features, including multiple (1) program components (e.g., explicit learning and mentoring), (2) service delivery sites (e.g., education, alternative care, mental health and youth justice) and (3) target layers (youth, agency staff and agency system). The participant cohort was heterogeneous in nature, presenting with comorbid and complex needs, and with backgrounds of developmental trauma, school disengagement and low literacy. Given that complex programs are at higher risk of low implementation quality and associated poorer outcomes (Michie et al. 2009), logic modelling was employed to develop a coherent understanding of desired program outcomes and the methods to deliver them. The modelling was employed in a two-stage process, mapped to the iterative development of the program (see Raymond et al. 2018). First, a realist theory building methodology was employed (Westhorp 2014). This was then consolidated and described through a growth-focused logic model (LBM). These two stages are now described in turn.

At the point of project initiation, a realist methodology was applied to conceptualise the relationship between desired program outcomes (‘what’) and the method, or ‘how’, change would occur, and in which contexts. The realist methodology offers significant utility in understanding the impact of novel, iterative interventions, and brings particular focus to isolating how change occurs, for whom, and in which circumstances, thereby providing a foundation to scale out interventions to new contexts (Westhorp 2014). Realist methodologies provide a mechanism to understand and operationalise complex interventions (Fletcher et al. 2016; Wong et al. 2013). A realist theory of change model was developed by independent evaluators on the Resilient Futures program through consultation with the WRC project team and key stakeholders. This process enabled the research and project teams to develop a preliminary and shared understanding of the core program outcomes and components to guide early implementation.

Through an action research methodology (described in Raymond et al. 2018), supplemented by an independent realist process evaluation, the realist theory model was further refined to include new intervention components targeted at the youth, staff and agency levels. This consolidated model is shown in Fig. 3 (reproduced from Community Matters 2017).

The model depicted in Fig. 3 was applied as a working theory of change that described the relationship between program components or methods (‘how’) and both intermediate and longer-term outcomes (‘what’). As noted within Fig. 3, the outcomes and methods layer upon each other. In other words, intervention components (e.g., ‘workers conduct resilience skills training program with young people’) are associated with intermediate outcomes (‘young people increase in self-awareness’) and long-term impact (‘improved life outcomes’). Following piloting, refinement and program stabilisation, greater attention was paid to program fidelity (or the consistent delivery of components). This also occurred at the time intentional practice was infused as an implementation methodology, both at the program design layer and at the layer of facilitator (moment-to-moment) service delivery. These two applications are now described in turn.

Fig. 3

Realist theory of change of the Resilient Futures program (Community Matters 2017)

In terms of the program design layer, the project team, supported by partner stakeholders, sought to isolate the ‘intervention components’ that were essential to program delivery, and to articulate a hierarchy of outcomes associated with the program. Drawing upon the action research and realist evaluation (Community Matters 2017), and applying the LBM as a categorising framework, the Resilient Futures program was operationalised through a logic model. This model is reproduced in Table 1.

Table 1 Resilient Futures program logic model (reproduced with permission from Raymond et al. 2018)

Table 1 identifies 16 intervention components (analogous to ‘spokes’ of a wheel) that provide the Resilient Futures program structure and robustness. These components are categorised against youth, agency staff and agency system layers. The operational definition of each intervention component is provided in Tables 2, 3 and 4 below.

Table 2 Description of Resilient Futures’ intervention components with underlying growth intent (youth domain)
Table 3 Description of Resilient Futures’ intervention components with underlying growth intent (agency staff domain)
Table 4 Description of Resilient Futures’ intervention components with underlying growth intent (agency system domain)

The Resilient Futures program logic (Table 1) also describes a hierarchy of short-, medium- and long-term outcomes. The short-term outcomes are delineated under the categories of (1) awareness (knowledge or insight), (2) skills (expression of awareness in action), (3) mindsets (thought processes or beliefs), and (4) resources (broader system-based outcomes associated with the program). Resources was included as a separate category to capture short-term outcomes that the WRC project team wished to bring awareness to within the program but that could not be accommodated in the other three categories. These short-term outcomes represent the immediate growth intent of the program and have predictive qualities (or an evidence-informed relationship) with the medium-term outcomes and long-term program impact. The intentional practice methodology asked the Resilient Futures program to bring ‘mindful awareness’ to the outcomes (‘what’), the method (intervention components, or ‘how’) and the intent of the program. To this effect, the intent or purpose of each intervention component is described in Tables 2, 3 and 4.

As noted in Tables 2, 3 and 4, across all intervention components, the intent of each component is framed through the verb ‘grow’ (or a close derivative, e.g., ‘build’, ‘strengthen’, ‘activate’). For example, the intervention component ‘explicit learning of RF-10 skills’ has the intent to ‘grow young people’s awareness of RF-10 skill constructs’ (see Table 2). Growth-focused intent represents the underpinning practice philosophy of the entire Resilient Futures program and enables positive psychology constructs to be operationalised from the program design through to local moment-to-moment practice (facilitation, mentoring, training).

The Resilient Futures program was prototyped and iterated over two years, a process that consolidated into the logic model described in Table 1. The WRC project team employed an implementation strategy in which partner agencies formulated individual implementation plans based upon the logic model. That is, intervention components were delivered in a staged manner, in line with partner needs, context and experience in the program. The staged implementation was structured around the delivery of core intervention components, which included the implicit teaching of RF-10 skills, supported through intentional adult mentoring and coaching.

As articulated within Tables 2, 3 and 4, ‘intentionality’ (or mindful awareness of the ‘what’ [outcomes] and the ‘how’ [method]) underpinned the entire Resilient Futures program. Importantly, intentionality as a method was infused into facilitator or moment-to-moment service delivery (e.g., through implicit and explicit teaching, mentoring, coaching or case management relationships). Intentional practice was employed as a flexible teaching and facilitation method to teach resilience skills across multiple sites and to support the individual tailoring of implicit and explicit teaching across heterogeneous participant cohorts. Given that the learning content could not be manualised or prescriptively defined across sites, intentional practice brought both rigour and flexibility to the teaching method. To support this endeavour, and as described in Raymond et al. (2018), the RF-10 skills were broken down into anchor points (building blocks or competencies), mapped to the categories of (1) awareness, (2) skills and (3) mindsets. These were designed to be brought to awareness in moment-to-moment facilitation or service delivery and thus represent the immediate ‘growth intent’ of individually tailored interventions. Individual agency staff and sites were supported and coached to develop their own intervention components (or ‘how’ scripts and strategies) mapped to the RF-10 anchor points. Staff were trained and coached to draw upon the three LBM activating processes of validation, curiosity and coaching to strengthen intervention impact. To demonstrate the methodology, consider a case example in which a staff member isolates a growth intent to build a young person’s awareness of ‘what is mindfulness’. Having isolated this intent, the staff member then articulates the ‘how’ scripts and strategies designed to deliver it, contextualised to the needs of the individual. This contextualisation includes bringing awareness to the activating processes of validation, curiosity and coaching. In practice, the final interventions ranged from traditional didactic approaches to project-based learning conducted within groups.
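The facilitator-level method described above can be loosely sketched in code. This is purely an illustrative sketch under our own assumptions: the function name, data shapes and example scripts are hypothetical and do not represent actual Resilient Futures tooling; only the three anchor-point categories and the three activating processes come from the program description.

```python
# Hypothetical sketch of a facilitator-level intervention plan record.
# Only the category and process names come from the program description;
# everything else (names, structure, example scripts) is assumed.

ANCHOR_CATEGORIES = ("awareness", "skills", "mindsets")
ACTIVATING_PROCESSES = ("validation", "curiosity", "coaching")

def make_plan(skill, anchor_category, growth_intent, strategies):
    """Record a growth intent ('what') and its 'how' scripts for one RF-10 skill.

    strategies maps each activating process to a facilitator script.
    """
    if anchor_category not in ANCHOR_CATEGORIES:
        raise ValueError(f"unknown anchor category: {anchor_category}")
    missing = [p for p in ACTIVATING_PROCESSES if p not in strategies]
    if missing:
        raise ValueError(f"no strategy for: {', '.join(missing)}")
    return {"skill": skill, "anchor": anchor_category,
            "intent": growth_intent, "how": dict(strategies)}

# The mindfulness case example from the text, expressed in this form:
plan = make_plan(
    skill="mindfulness",
    anchor_category="awareness",
    growth_intent="grow awareness of what mindfulness is",
    strategies={
        "validation": "acknowledge the young person's current experience",
        "curiosity": "invite open questions about noticing and attention",
        "coaching": "guide a brief noticing exercise, then debrief",
    },
)
```

The point of the sketch is only that each tailored intervention stays anchored to a named skill, anchor-point category and growth intent, however much the ‘how’ scripts vary by site and individual.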

In summary, the infusion of intentional practice and growth-focused logic modelling into the design and implementation of the Resilient Futures program enabled a coherent description of the program that could be directly translated to service delivery through a unifying and clear implementation method (intentionality). Preliminary evidence indicated that meaningful outcomes were being achieved for young people (wellbeing and resilience awareness and skill expression), for agency staff and across the broader system (e.g., the infusion of wellbeing content into agency practice) (Raymond et al. 2018). The methodology employed by Resilient Futures serves as an innovative case study of how the science of wellbeing and resilience can be translated into practice through attention to the program development and implementation literature.


Positive psychology is an emergent discipline that has brought focus to a range of constructs, interventions (PPIs) and approaches that can significantly strengthen individual and collective wellbeing (Hone et al. 2015). Despite this, compared to other disciplines (e.g., clinical psychology, education), positive psychology is relatively under-developed, and there are increasing calls to understand the factors that moderate intervention outcomes (Schueller 2014). Across the wider literature, there is widespread understanding that ‘complex’ interventions or programs, that is, those that include multiple components (or PPIs), are delivered across multiple layers (individual, workgroup/classroom and organisation/school) or agency sites, and include cohorts with heterogeneous or complex needs (e.g., trauma), are at higher risk of lower quality implementation. The operationalisation of positive education within individual schools or institutions (e.g., a whole-of-school approach) meets the definition of a ‘complex program’.

We argue for strong attention to be paid to implementation and program development science, and specifically to the design and implementation of interventions that are (1) conceptually sound, (2) skill-focused, (3) responsive to client need and interest, (4) faithful to the program design (fidelity or integrity) and (5) targeted (Raymond 2018b). This includes attention being paid to both ‘evidence-based practice’ and ‘evidence-based adaptation’, which is particularly relevant for child and youth interventions (Ghate 2016; Malti et al. 2016) and for clients with complex needs (e.g., trauma).

We also see utility in locally generated or home-grown models that operationalise conceptually sound program design. Logic modelling is suggested as an approach to operationalise this endeavour. This article has described and reviewed intentional practice (and the underpinning LBM), a growth-focused logic model and implementation methodology, to support the design and operationalisation of complex programming. Drawing upon the Resilient Futures case example, logic modelling was applied to consolidate early theory building work (the realist method) and provide a cohesive description of the relationship between program outcomes (‘what’) and processes (‘how’). Logic models provide an opportunity to assess and operationalise program fidelity, and to support design changes being made in an intentional and cost-efficient manner. Complex programs that are cohesively described and articulated are more likely to be implemented with higher quality (Michie et al. 2009). Despite this, logic models are not a panacea for program implementation; they need to be supported through an intentional, resourced and monitored implementation strategy, targeted to all layers of the supporting system (Fixsen et al. 2009; O’Connor and Cameron 2017; Slemp et al. 2017; White 2016).

In the case of Resilient Futures, the consolidated logic model drew upon action research and theory building methods that included input from scientists, program developers and partner agency staff. One may question whether such logic models can be operationalised by non-scientists within local settings. While the implementation science literature identifies a key role for ‘purveyors’ (or coaches) to walk alongside agencies and individuals in program development and implementation planning and monitoring (Fixsen et al. 2005, 2009), such resources may not be available. We suggest there is no reason why non-scientists cannot strive to develop conceptually sound positive psychology interventions, or whole-of-agency positive education programs, that clearly articulate a hierarchy of outcomes (‘what’) and intervention components (‘how’). Across positive education, it is reported that positive psychology content and processes are not delivering sustainable outcomes (e.g., not ‘sticking’; White 2016). White and others (e.g., O’Connor and Cameron 2017; Slemp et al. 2017) have argued for an intentional approach to positive psychology implementation, supported through systems and underpinned by implementation science. The use of logic modelling (and programmatic approaches) and of intentional practice as an implementation methodology arguably represents a mechanism to support this endeavour. To this effect, agencies and individuals who wish to design such programs (and logic models), nuanced to their context, are encouraged to consider the following steps:

  1. Describe all desired program outcomes, and then work to categorise them as short-, medium- and long-term. Operationalise all short-term outcomes in descriptions that can be understood by all local personnel.

  2. Describe all the key intervention components or methods that will be employed to deliver the stated outcomes. Seek to operationalise these in descriptions that would be understood by all local personnel.

  3. Seek to understand the philosophy, values or broad intent (e.g., growth intent) that underpins the entire program design, and how these map to the underpinning intent and philosophies of the local agency. Ensure this is clearly articulated within the model.

  4. Formulate an intent or theory for each intervention component by articulating a relationship with a short-term outcome (or growth intent). Draw upon scientific evidence to articulate and review this relationship.

  5. Work to find a creative way to represent the content in a logical manner.
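As a minimal sketch only (all class and field names here are our own assumptions, not part of any published tooling), the five steps above could be captured in a simple data model that links each intervention component (‘how’) to a stated short-term outcome (‘what’) under an explicit philosophy, making unlinked components easy to flag:

```python
# Hypothetical data model for a growth-focused logic model; names assumed.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str   # plain-language wording for local personnel (step 1)
    horizon: str       # "short", "medium" or "long" (step 1)

@dataclass
class Component:
    description: str   # the intervention method or 'how' (step 2)
    growth_intent: str # linked short-term outcome (step 4)
    evidence: str = "" # scientific rationale for the link (step 4)

@dataclass
class LogicModel:
    philosophy: str    # broad underpinning intent, e.g. growth (step 3)
    outcomes: list = field(default_factory=list)
    components: list = field(default_factory=list)

    def unlinked_components(self):
        """Flag components whose intent matches no stated short-term outcome."""
        shorts = {o.description for o in self.outcomes if o.horizon == "short"}
        return [c.description for c in self.components
                if c.growth_intent not in shorts]

model = LogicModel(philosophy="growth")
model.outcomes.append(Outcome("grow awareness of mindfulness", "short"))
model.components.append(Component("explicit skills training",
                                  "grow awareness of mindfulness"))
model.components.append(Component("adult mentoring", "build coping skills"))
# "adult mentoring" is flagged: its intent has no matching short-term outcome
print(model.unlinked_components())  # → ['adult mentoring']
```

A check like `unlinked_components` is one way step 4 could be made routinely auditable: every ‘how’ must name a ‘what’ before the model is considered complete.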

These steps are provided to operationalise intentional methods within program design. They point to the importance of bringing an integrated and planned approach to intervention design and planning, and, by doing so, uphold the Gestalt principle that the ‘whole is greater than the sum of the parts’. In other words, positive psychology interventions that bring integration and cohesiveness to a vision and intent are more effective than those delivered in compartmentalised or ad hoc ways. We have sought to provide new insights designed to support the positive psychology discipline’s transition beyond the ‘science of wellbeing’ towards scholarly leadership in the ‘implementation of wellbeing science’.


  1. Aarons, G. A., Green, A. E., Palinkas, L. A., Self-Brown, S., Whitaker, D. J., Lutzker, J. R., et al. (2012). Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science, 7(1), 32.

  2. Albrecht, L., Archibald, M., Arseneau, D., & Scott, S. D. (2013). Development of a checklist to assess the quality of reporting of knowledge translation interventions using the workgroup for intervention development and evaluation research (WIDER) recommendations. Implementation Science, 8, 52.

  3. Blase, K., Van Dyke, M., Fixsen, D., & Bailey, F. (2012). Implementation science: Key concepts, themes and evidence for practitioners in educational psychology. Handbook of implementation science for psychology in education. In B. Kelly & D. Perkins (Eds.), Handbook of implementation science for psychology in education: How to promote evidence-based practice (pp. 13–34). London: Cambridge University Press.

  4. Bolier, L., Haverman, M., Westerhof, G. J., Riper, H., Smit, F., & Bohlmeijer, E. (2013). Positive psychology interventions: A meta-analysis of randomized controlled studies. BMC Public Health, 13(1), 119.

  5. Bordage, G. (2009). Conceptual frameworks to illuminate and magnify. Medical Education, 43(4), 312–319.

  6. Brunk, M. A., Chapman, J. E., & Schoenwald, S. K. (2014). Defining and evaluating fidelity at the program level in psychosocial treatments. Zeitschrift für Psychologie, 222(1), 22–29.

  7. Brunzell, T., Stokes, H., & Waters, L. (2016). Trauma-informed positive education: Using positive psychology to strengthen vulnerable students. Contemporary School Psychology, 20(1), 63–83.

  8. Chorpita, B. F., Daleiden, E. L., & Weisz, J. R. (2005). Modularity in the design and application of therapeutic interventions. Applied and Preventive Psychology, 11(3), 141–156.

  9. Community Matters. (2017). Resilient Futures: Evaluation report, 2017. Unpublished report, SAHMRI Wellbeing and Resilience Centre.

  10. Cooksy, L. J., Gill, P., & Kelly, P. A. (2001). The program logic model as an integrative framework for a multimethod evaluation. Evaluation and Program Planning, 24(2), 119–128.

  11. Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I., & Petticrew, M. (2008). Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ, 337.

  12. Crane, R. S., Brewer, J., Feldman, C., Kabat-Zinn, J., Santorelli, S., Williams, J. M. G., & Kuyken, W. (2016). What defines mindfulness-based programs? The warp and the weft. Psychological Medicine, 47(6), 990–999.

  13. Crosby, R., & Noar, S. M. (2011). What is a planning model? An introduction to PRECEDE-PROCEED. Journal of Public Health Dentistry, 71(s1), S7–S15.

  14. Deci, E. L., & Ryan, R. M. (2000). The "what" and "why" of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268.

  15. Diener, E. (1984). Subjective well-being. Psychological Bulletin, 95, 542–575.

  16. Diener, E. (1994). Assessing subjective well-being: Progress and opportunities. Social Indicators Research, 31, 103–157.

  17. Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.

  18. Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82(1), 405–432.

  19. Dweck, C. (2012). Mindset: How you can fulfil your potential. London: Constable & Robinson.

  20. Eldredge, L. K. B., Markham, C. M., Ruiter, R. A., Kok, G., & Parcel, G. S. (2016). Planning health promotion programs: An intervention mapping approach. San Francisco: John Wiley & Sons.

  21. English, B., & Kaleveld, L. (2003). The politics of program logic. Evaluation Journal of Australasia, 3(1), 35–42.

  22. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, the National Implementation Research Network (FMHI publication # 231).

  23. Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531–540.

  24. Fixsen, D. L., Blase, K. A., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79(2), 213–230.

  25. Fletcher, A., Jamal, F., Moore, G., Evans, R. E., Murphy, S., & Bonell, C. (2016). Realist complex intervention science: Applying realist principles across all phases of the Medical Research Council framework for developing and evaluating complex interventions. Evaluation, 22(3), 286–303.

  26. Fordyce, M. W. (1977). Development of a program to increase personal happiness. Journal of Counseling Psychology, 24, 511–521.

  27. Fordyce, M. W. (1983). A program to increase happiness: Further studies. Journal of Counseling Psychology, 30, 483–498.

  28. Ghate, D. (2016). From programs to systems: Deploying implementation science and practice for sustained real world effectiveness in services for children and families. Journal of Clinical Child and Adolescent Psychology, 45(6), 812–826.

  29. Hernandez, M. (2000). Using logic models and program theory to build outcome accountability. Education and Treatment of Children, 23(1), 24–40.

  30. Holliday, L. R. (2014). Using logic model mapping to evaluate program fidelity. Studies in Educational Evaluation, 42(Supplement C), 109–117.

  31. Hone, L. C., Jarden, A., Schofield, G., & Duncan, S. (2014). Measuring flourishing: The impact of operational definitions on the prevalence of high levels of wellbeing. International Journal of Wellbeing, 4(1), 62–90.

  32. Hone, L., Jarden, A., & Schofield, G. (2015). An evaluation of positive psychology intervention effectiveness trials using the RE-AIM framework: A practice-friendly review. The Journal of Positive Psychology, 10(4), 303–322.

  33. Iddon, J. E., Dickson, J. M., & Unwin, J. (2016). Positive psychological interventions and chronic non-cancer pain: A systematic review of the literature. International Journal of Applied Positive Psychology, 1(1), 133–157.

  34. Jordan, G. B. (2013). Logic modeling: A tool for designing program evaluations. In A. N. Link & N. S. Vonortas (Eds.), Handbook on the theory and practice of program evaluation (pp. 143–165). Cheltenham: Edward Elgar Publishing Inc.

  35. Julian, D. A. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning, 20(3), 251–257.

  36. Kendall, P. C., Gosch, E., Furr, J. M., & Sood, E. (2008). Flexibility within fidelity. Journal of the American Academy of Child and Adolescent Psychiatry, 47(9), 987–993.

  37. Kok, G., Gottlieb, N. H., Peters, G. J. Y., Mullen, P. D., Parcel, G. S., Ruiter, R. A. C., et al. (2016). A taxonomy of behaviour change methods: An intervention mapping approach. Health Psychology Review, 10(3), 297–312.

  38. Leppin, A. L., Bora, P. R., Tilburt, J. C., Gionfriddo, M. R., Zeballos-Palacios, C., Dulohery, M. M., et al. (2014). The efficacy of resiliency training programs: A systematic review and meta-analysis of randomized trials. PLoS One, 9(10), e111420.

  39. Lipsey, M. W. (2009). The primary factors that characterize effective interventions with juvenile offenders: A meta-analytic overview. Victims and Offenders, 4(2), 124–147.

  40. Lyon, A. R., Lau, A. S., McCauley, E., Vander Stoep, A., & Chorpita, B. F. (2014). A case for modular design: Implications for implementing evidence-based interventions with culturally diverse youth. Professional Psychology: Research and Practice, 45(1), 57–66.

  41. Malouff, J. M., & Schutte, N. S. (2017). Can psychological interventions increase optimism? A meta-analysis. The Journal of Positive Psychology, 12(6), 594–604.

  42. Malti, T., Noam, G. G., Beelmann, A., & Sommer, S. (2016). Toward dynamic adaptation of psychological interventions for child and adolescent development and mental health. Journal of Clinical Child and Adolescent Psychology, 45(6), 827–836.

  43. Margoluis, R., Stem, C., Salafsky, N., & Brown, M. (2009). Using conceptual models as a planning and evaluation tool in conservation. Evaluation and Program Planning, 32(2), 138–147.

  44. McLaughlin, J. A., & Jordan, G. B. (2004). Using logic models. Handbook of Practical Program Evaluation, 2, 7–32.

  45. Michie, S., Fixsen, D., Grimshaw, J. M., & Eccles, M. P. (2009). Specifying and reporting complex behaviour change interventions: The need for a scientific method. Implementation Science, 4(1), 40.

  46. Michie, S., van Stralen, M. M., & West, R. (2011). The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implementation Science, 6, 42–42.

  47. Norrish, J. M. (2015). Positive education: The Geelong Grammar School journey. Oxford: Oxford University Press.

  48. O’Connor, M., & Cameron, G. (2017). The Geelong Grammar positive psychology experience. In E. Frydenberg, A. J. Martin, & R. J. Collie (Eds.), Social and emotional learning in Australia and the Asia-Pacific: Perspectives, programs and approaches (pp. 353–370). Singapore: Springer.

  49. O’Keefe, C. M., & Head, R. J. (2011). Application of logic models in a large scientific research program. Evaluation and Program Planning, 34(3), 174–184.

  50. Oosthuizen, C., & Louw, J. (2013). Developing program theory for purveyor programs. Implementation Science, 8(1), 23.

  51. Parks, A. C., & Schueller, S. E. (2014). The Wiley-Blackwell handbook of positive psychological interventions. West Sussex, UK: John Wiley and Sons.

  52. Pawelski, J. O. (2016). Defining the ‘positive’ in positive psychology: Part I. A descriptive analysis. The Journal of Positive Psychology, 11(4), 339–356.

  53. Raymond, I. J. (2016a). Can intensive wilderness programs be a catalyst for change for young people at risk of offending, educational disengagement and poor wellbeing? (Doctoral Thesis). Flinders University, Adelaide. Retrieved from

  54. Raymond, I. J. (2016b). IMPACT: Respond rather than react (introduction to intentional practice). Adelaide: Life Buoyancy Institute.

  55. Raymond, I. J. (2018a). Intentional practice: A positive psychology intervention planning and implementation method. Clinical Applications of Positive Psychology: An International Perspective, 1(fall 2018). Retrieved 10th December 2018 from

  56. Raymond, I. J. (2018b). A programme logic framework designed to strengthen the impact and fidelity of wellbeing and behavioural interventions. In P. Slee, G. Skrzypiec, & C. Cefai (Eds.), Child and adolescent well-being and violence prevention in schools (pp. 199–208). London: Routledge.

  57. Raymond, I. J. (2019). Intentional practice as a method to reduce the implementation gap between science and practice in the delivery of trauma-informed residential care. (submitted to Residential Treatment for Children and Youth).

  58. Raymond, I. J., & Lappin, S. (2016). Early intervention youth boot camp program: 2015 program implementation review summary report. Adelaide: Connected Self Pty Ltd. Retrieved 30th September 2018 from

  59. Raymond, I. J., & Lappin, S. (2017). EIYBC Program implementation review (2013–2016) and future directions. Adelaide: Connected Self Pty Ltd. Retrieved 18th July 2018 from

  60. Raymond, I., Iasiello, M., Jarden, A., & Kelly, D. (2018). Resilient Futures: An individual and system-level approach to improve the well-being and resilience of disadvantaged young Australians. Translational Issues in Psychological Science, 4(3), 228–244.

  61. Renger, R., Page, M., & Renger, J. (2007). What an eight-year-old can teach us about logic modelling and mainstreaming. The Canadian Journal of Program Evaluation, 22(1), 195–204.

  62. Schueller, S. M. (2014). Person–activity fit in positive psychological interventions. In A. C. Parks & S. Schueller (Eds.), The Wiley Blackwell handbook of positive psychological interventions (pp. 385–402). John Wiley & Sons.

  63. Seligman, M. E. P. (2012). Flourish: A visionary new understanding of happiness and well-being. New York: Simon and Schuster.

  64. Seligman, M. E. P., & Csikszentmihalyi, M. (2000). Positive psychology: An introduction. The American Psychologist, 55(1), 5–14.

  65. Seligman, M. E. P., Ernst, R. M., Gillham, J., Reivich, K., & Linkins, M. (2009). Positive education: Positive psychology and classroom interventions. Oxford Review of Education, 35(3), 293–311.

  66. Sin, N. L., & Lyubomirsky, S. (2009). Enhancing well-being and alleviating depressive symptoms with positive psychology interventions: A practice-friendly meta-analysis. Journal of Clinical Psychology, 65(5), 467–487.

  67. Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., & Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 1–23.

  68. Slemp, G. R., Chin, T. C., Kern, M. L., Siokou, C., Loton, D., Oades, L. G., et al. (2017). Positive education in Australia: Practice, measurement, and future directions. In E. Frydenberg, A. J. Martin, & R. J. Collie (Eds.), Social and emotional learning in Australia and the Asia-Pacific: Perspectives, programs and approaches (pp. 101–122). Singapore: Springer Singapore.

  69. Stinchcomb, J. B. (2001). Using logic modeling to focus evaluation efforts. Journal of Offender Rehabilitation, 33(2), 47–65.


  70. Stricker, G., & Trierweiler, S. J. (2006). The local clinical scientist: A bridge between science and practice. Training and Education in Professional Psychology, S(1), 37–46.

  71. Waters, L., & White, M. (2015). Case study of a school well-being initiative: Using appreciative inquiry to support positive change. International Journal of Wellbeing, 5(1), 19–32.


  72. Westhorp, G. (2014). Realist impact evaluation: An introduction. London: Overseas Development Institute. Retrieved 15th March 2018 from

  73. White, M. (2013). Positive education at Geelong Grammar School. In S. A. David, I. Boniwell, & A. Ayers (Eds.), The Oxford handbook of happiness (pp. 657–670). Oxford: Oxford University Press.

  74. White, M. A. (2016). Why won’t it stick? Positive psychology and positive education. Psychology of Well-Being, 6(1), 2.


  75. White, M. A., & Waters, L. E. (2015). A case study of ‘the good school:’ Examples of the use of Peterson’s strengths-based approach with students. The Journal of Positive Psychology, 10(1), 69–76.


  76. Wong, G., Greenhalgh, T., Westhorp, G., Buckingham, J., & Pawson, R. (2013). RAMESES publication standards: Realist syntheses. BMC Medicine, 11(1), 21.




Acknowledgements

The Resilient Futures project was supported by substantial philanthropic funding from the Wyatt Trust and the James and Dianna Ramsey Foundation. We sincerely thank these funding bodies, and acknowledge the significant effort and commitment of the partner agencies within this project. We also thank Joseph Van Agteren for invaluable feedback on an earlier draft of this paper.

Author information



Corresponding author

Correspondence to Ivan Raymond.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Raymond, I., Iasiello, M., Kelly, D. et al. Program Logic Modelling and Complex Positive Psychology Intervention Design and Implementation: The ‘Resilient Futures’ Case Example. Int J Appl Posit Psychol 3, 43–67 (2019).



Keywords

  • Program design
  • Resilient Futures
  • Implementation science
  • Logic modelling
  • Intentional practice
  • Positive psychology