Sustainability Science, Volume 14, Issue 6, pp 1531–1548

Improving conservation practice with principles and tools from systems thinking and evaluation

  • Andrew T. Knight
  • Carly N. Cook
  • Kent H. Redford
  • Duan Biggs
  • Claudia Romero
  • Alejandro Ortega-Argueta
  • Cameron D. Norman
  • Beverly Parsons
  • Martin Reynolds
  • Glenda Eoyang
  • Matt Keene
Open Access
Original Article
Part of the following topical collections:
  1. Ecosystems, Biodiversity, and Natural Resource Management


Achieving nature conservation goals requires grappling with ‘wicked’ problems. These intractable problems arise from the complexity and dynamism of the social–ecological systems in which they are embedded. To enhance their ability to address these problems, conservation professionals are increasingly looking to the transdisciplines of systems thinking and evaluation, which provide philosophies, theories, methods, tools and approaches that show promise for addressing intractable problems in a variety of other sectors. These transdisciplines come together especially around praxis, i.e., the process by which a theory or idea is enacted, embodied or realized. We present a review and synthesis of the learnings about praxis that have emerged from The Silwood Group, a consortium of conservation professionals, professional evaluators, and complexity and systems thinkers. The Silwood Group believes that for conservation activities to achieve ambitious goals, they should benefit nature without compromising the well-being of people, and that framing a praxis for conservation in the context of social–ecological systems will provide the greatest potential for positive impact. The learnings are presented as four key principles of a ‘praxis for effective conservation’. The four principles are: (1) attend to the whole with humility; (2) engage constructively with the values, cultures, politics, and histories of stakeholders; (3) learn through evaluative, systemic enquiry; and (4) exercise wisdom in judgement and action. We also provide descriptions and references for tools and methods to support such praxis and discuss how the thinking and approaches used by conservation professionals can be transformed to achieve greater effectiveness.


Complexity Knowing–doing gap Learning Praxis Transdisciplinarity Transformative learning Wicked problems Wisdom 


Nature conservation initiatives typically operate in complex and dynamic social–ecological systems that necessitate grappling with ‘wicked problems’ (Rittel and Webber 1973). Such problems may not only be insoluble in the short- and medium-term, but may be exacerbated by negative feedbacks created when people attempt to solve them. The general failure to reverse widespread and growing pressures on nature (Butchart et al. 2015), despite substantial investment in both conservation science and practice, reflects this ‘wicked’ state.

Tradeoffs are increasingly accepted as necessary for conservation initiatives, which must navigate power and political relationships while attempting to simultaneously achieve conservation, development and human well-being goals. The high rates of failure amongst initiatives aiming to navigate these trade-offs, be they protected areas, Integrated Conservation and Development Projects, Biosphere Reserves or Community-Based Conservation initiatives, demonstrate the historically pervasive challenge posed by balancing the multiple values, politics and power inherent in these contexts (McShane et al. 2011). A new approach to conceptualizing and practicing conservation is urgently required.

Whilst the discipline of conservation aspires to be increasingly effective at grappling with wicked problems (Game et al. 2014), many conservation initiatives struggle to recognize and instill the learning processes necessary to grapple with the ever-changing challenges facing conservation (Redford et al. 2018). Conservation professionals have begun to place increasing emphasis on understanding the most effective ways to learn through their work (Pullin and Knight 2001; Dicks et al. 2014), especially recognizing the need for a relationship between what we learn, know and do.

Numerous models aim to embed learning into conservation practice, for example, adaptive management (Holling 1978), The Open Standards for the Practice of Conservation (CMP 2013), management effectiveness evaluation (Hockings et al. 2006), and structured decision-making (Gregory et al. 2012). These models tend to emphasize more rigorous measurement of effectiveness and disciplined recording of activities. These are important activities, but they are insufficient for effectively grappling with the complexity of wicked problems. Systems thinking (including complexity science) and evaluation are two transdisciplines that seek positive outcomes to complex, collaboratively defined problems. By integrating different approaches focused upon learning to create actionable shared knowledge (Wickson et al. 2006) across multiple contexts, these transdisciplines have philosophies, theories, methods and tools that, if integrated into conservation science and practice, offer insights conservation professionals can use to positively transform their work.

The Silwood Group (i.e., us, the authors) is a team of scholars and practitioners from the fields of conservation, evaluation, and complexity and systems thinking that brings together over 200 years of expertise in the design, management and assessment of over 1000 initiatives from across the sectors of business, development, education, environmental management, healthcare, natural resources management, and social services. We formed The Silwood Group in late 2014 to improve the ability of conservation professionals, volunteers, funders and other stakeholders to grapple with wicked problems. Recognizing the opportunity to increase the conservation community’s exposure to well-established philosophies, theories, methods and tools from systems thinking and evaluation as applied in other sectors, we offer this learning to the broader conservation sector to promote the achievement of conservation goals for a more sustainable and equitable future. We frame our consolidated knowledge through a lens of praxis.

We seek to present and enrich the concept of praxis for conservation professionals and organizations that aim to improve their practice. We broadly define effective conservation as any purposeful activity that involves people successfully working towards achieving their explicitly stated goal of ensuring the persistence of nature, in ways that do not compromise human well-being. We recognize that individuals’ perspectives on what constitutes ‘effective’ will vary according to their values, beliefs and context. We present four principles of a praxis for effective conservation, each illustrated through examples. A glossary of useful terms (Table 1) and a compendium of tools and their potential applications (Table 2) are provided to assist readers new to evaluation and systems thinking. We note that our personal experience indicates that a subset of conservation professionals struggle deeply with accepting the validity of non-reductionist philosophies, theories, methods, tools and approaches. The widely accepted use of these in other sectors, however, is a testament to their robustness and utility and, we argue, to their potential for driving the transformation of the conservation sector, through learning, towards increasingly effective thinking and practice.
Table 1

A glossary of terms defining approaches and concepts derived primarily from the transdisciplines of evaluation and systems thinking as used by the authors when developing the praxis for effective conservation approach presented in this article. We posit that they can prove useful for conservation professionals seeking to adopt a praxis for effective conservation approach to their practice




Term | Definition | Field of origin | Foundation references

Adaptive management

A process for taking action in a way that reduces uncertainty about the system being managed

Environmental management

Holling (1978)


Boundaries

Boundaries are the limits of a system, and define what is inside a system of interest and what is outside

Systems thinking

Williams and Hummelbrunner (2010)


Context

Specific circumstances that form the setting for an intervention or event, statement, or idea, which provides its meaning. For example: “…the conservation proposals need to be considered in the context of European directives”. Contrast with “situation” (see below)

Policy review and social sciences

None, but used in, for example, adaptive management in international development

Culturally responsive evaluation

A holistic framework for centering evaluation in culture, rejecting culture-free evaluation and recognizing that culturally defined values and beliefs lie at the heart of any evaluative effort


Hood et al. (2015)

Developmental evaluation

An approach supporting development of innovations and adaptations of interventions in dynamic environments


Patton (2010)


Dualisms

Alternative options presented as either/or decisions




Dualities

Seemingly alternative options that can be true simultaneously, characterized by both/and statements (e.g., light can be both a wave and a particle)




Epistemology

Different philosophies about how knowledge is generated


Patton (2002)


Evaluation

A process of making value judgements about the merit, worth and significance of an intervention with the purpose of understanding how to achieve better outcomes


Scriven (1991)

Impact evaluation

The systematic process of measuring the intended and unintended causal effects of a project, program, or policy by comparing what actually happened with an intervention to what would have happened without it (i.e., the counterfactual)


Gertler et al. (2011)


Interrelationships

The connections between elements in the system

Systems thinking

Williams and Hummelbrunner (2010)

Leverage points

Places within a complex system where a small shift in one element can produce big changes in the system as a whole

Systems thinking

Meadows (2008)

Management effectiveness evaluation

Evaluating the elements of the management process (e.g., objectives, planning, inputs, actions) to make judgements about why particular outcomes were achieved

Environmental management

Hockings et al. (2000)


Merit

A judgement about whether an intervention was effective, had the desired impact, was high quality


Scriven (1991)

Open Standards for the Practice of Conservation

An approach to planning for, conducting, and evaluating management to facilitate learning and improvement based upon agreed principles of practice and measurement

Conservation biology

CMP (2013)

Participatory evaluation

An approach where persons trained in evaluation methods and logic work in collaboration with those not so trained to implement evaluation activities


Cousins (2003)


Perspectives

The views, values and beliefs of different stakeholders about a system

Systems thinking

Williams and Hummelbrunner (2010)


Praxis

The process by which a theory or idea is enacted, embodied or realized, where theory informs practice and practice informs theory to increase effectiveness by embedding learning


McKeon (1974)


Reductionism

An epistemology based on knowledge being developed through the scientific method—it seeks to reduce complexity and uncertainty by breaking systems down into components


Patton (2002)


Reflection

A process of actively pausing to critically consider actions and their outcomes


Dewey (1933)

Resilience thinking

A way of exploring human and natural systems as complex entities continually adapting through cycles of change, which seeks to understand the qualities of a system that must be maintained or enhanced to achieve sustainability

Ecology, economics, natural resource governance

Folke (2006)


Significance

A judgement about whether an intervention was important and should be a priority


Scriven (1991)


Situation

General circumstances or state of affairs that may affect or be affected by interventions, i.e. the ‘real-world’ flux of events, people, and ideas. For example: “…changing situations regarding issues of biodiversity and climate change may trigger different conservation initiatives”. Contrast with “context” (see above)

Systems thinking

No specific foundation reference, but refer to: Reynolds and Holwell (2010)

Social–ecological system

Systems where the human and nature dimensions of a space of interest to people are coupled and interacting

Resilience thinking

Berkes et al. (2000)

Social learning

The process of acquiring new, or modifying existing, knowledge, behaviors, skills, values, or preferences whilst interacting with others, emphasizing iterations of participation, communication and interaction as key elements of the learning process


Wenger (1998)


Stakeholder

A person with an interest or concern in a specific issue




System

A set of elements or parts that is coherently organized and interconnected in a pattern or structure that produces a characteristic set of behaviors, often classified as its “function” or “purpose”

Systems Thinking

Meadows (2008)


Systematic

According to an agreed set of methods or an organized plan

General use


Systemic

A systemic problem or change is a basic one, experienced by the whole of an organization or a country and not just particular parts of it

General use

Systems approach to evaluation

An approach that draws on systems thinking and complexity to develop an holistic view of a context (interrelationships, perspectives and boundaries) being evaluated

Systems thinking and Evaluation

Thomas and Parsons (2017)

Systems thinking

Systems thinking is concerned with understanding the dynamics that influence and shape systems, revealing leverage points that guide actions to move a system towards a desired state

Systems thinking

Meadows (2008)


Transdisciplinarity

The theory and practice of cooperatively formulating, understanding and solving the contemporary complex problems facing society through a single methodology that embodies the unity of existing, and creation of new, integrative knowledge within and across disciplines

Education, quantum physics

Nicolescu (2002)


Transdisciplines

Fields in their own right but whose theory and methods are applied across several fields

Ecological economics

Max-Neef (2005)


Transformation

An intended change driven through strategically enacted means and processes



Transformative learning

Transformative learning involves positively contesting assumptions and worldviews to challenge entrenched beliefs and enable essential change


Mezirow (1991)

‘Wicked’ problems

Challenges that are messy, intractable, subject to multiple interpretations, and for which solutions are, at present, not evident

Political sciences

Rittel and Webber (1973)


Wisdom

The evaluation of understanding to make judgements about how to apply said understanding to individual challenges and contexts

Philosophy, systems thinking

Ackoff (1989)


Worldview

How a person’s values and experiences combine to shape the way they interpret and understand the contexts and situations they inhabit


Thompson et al. (1990)


Worth

A judgement about whether an intervention has value to affected stakeholders, or was needed


Scriven (1991)

Table 2

A list of methods, tools and approaches useful for enabling a ‘praxis for effective conservation’ that support the four principles synthesized from the fields of systems thinking and evaluation

Approach or tool | Description | Explanatory literature | Example application

Adaptive action

An enquiry-based, iterative problem-solving process stimulated by addressing three questions: (1) What? (what is the current situation); (2) So what? (what are the implications of that situation); (3) Now what? (what is involved in changing the situation)

Eoyang and Holladay (2013)

Conservation: None known.

Other: Moore and Maland Cady (2015)

Agile software development

A set of values and principles under which requirements and solutions evolve through collaborations of self-organizing teams, advocating adaptive planning, evolutionary development, early delivery, continuous improvement, and encouraging rapid and flexible responses to change

Schwaber and Beedle (2001)

Conservation: None known.

Other: Dybå and Dingsøyr (2008)

Appreciative inquiry

A process for engaging a wide range of stakeholders to determine what they value and its implications for shared action. It uses an holistic framework comprising stages: Define (set the positive focus of inquiry); Discovery (identify exceptionally positive moments); Dream/Design (create a desired image of a shared future); Destiny (take action). The process includes participants interviewing one another to stimulate dialogue about positive experiences

Watkins and Cooperrider (2000)

Conservation: Nyaupane and Poudel (2011).

Other: Preskill and Catsambas (2006)

Boundary critique

Boundary critique involves checking (or reflecting on) systems’ boundaries according to changing realities (‘facts’) and changing values of the stakeholders associated with any complex situation. Boundary judgements can be grouped into four sets of questions relating to (1) motivation, (2) control, (3) knowledge, and (4) legitimacy

Ulrich (2000)

Conservation: Foote et al. (2007).

Other: Ulrich and Reynolds (2010)

Consequence table/matrix

Summarises different management alternatives in relation to how they perform relative to different objectives. This matrix can be used to reveal where there are trade-offs in the ability to maximise benefits for all objectives, and select the alternative that is most acceptable to different stakeholders. Using a participatory process, whereby all stakeholders are involved, the agreed objectives are outlined and different management alternatives are scored in terms of the likely outcomes of each relative to the different management objectives

Gregory et al. (2012)

Conservation: Gregory and Long (2009).

Other: Gregory and Gregory (2010)
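The scoring logic behind a consequence table can be sketched in a few lines of code. The objectives, alternatives and scores below are purely hypothetical; in practice they would be agreed through the participatory scoring process described above.

```python
# A minimal, hypothetical consequence table: rows are management alternatives,
# columns are stakeholder-agreed objectives, scores run 1 (worst) to 5 (best).
objectives = ["Habitat condition", "Local income", "Feasibility"]
alternatives = {
    "No-take reserve":   [5, 2, 3],
    "Seasonal closure":  [3, 4, 4],
    "Gear restrictions": [2, 5, 5],
}

# Display the matrix so trade-offs across objectives are visible side by side.
print(f"{'Alternative':<18}" + "".join(f"{o:>20}" for o in objectives))
for name, scores in alternatives.items():
    print(f"{name:<18}" + "".join(f"{s:>20}" for s in scores))

def dominated(a, b):
    # a is dominated by b if b is at least as good on every objective
    # and strictly better on at least one (higher scores are better here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Flag alternatives no stakeholder should prefer under these scores.
for name, scores in alternatives.items():
    others = (s for n, s in alternatives.items() if n != name)
    if any(dominated(scores, other) for other in others):
        print(f"{name}: dominated, can be set aside")
```

Here no alternative dominates another, which is exactly the situation where the table earns its keep: the residual choice is a genuine values-based trade-off for stakeholders to negotiate.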

CDE (containers, differences, exchanges)

A complex adaptive systems method drawn from Human Systems Dynamics. It explores the way in which framing systems properties as containers (C), differences (D), and exchanges (E) can enable us to understand and influence how complex systems work. It addresses the following questions: (1) What are the conditions that shape a self-organizing process? (2) What interventions might influence the path and outcomes of a self-organizing process?

Eoyang (2004)

Conservation: None known.

Other: Eoyang (2007)

DIKW framework

The data–information–knowledge–wisdom framework, describing the progressive transformation of raw data into information, knowledge and, ultimately, wisdom

Ackoff (1989)

Conservation: None known.

Other: Awad and Ghaziri (2004)

Lean start-up

Incorporating aspects of design thinking and lean manufacturing, a methodology for developing businesses and products that shorten product development cycles by adopting a combination of hypothesis-driven experimentation, iterative product releases, and validated learning

Ries (2011)

Conservation: None known.

Other: Harms (2015)

Mental models mapping

A process of eliciting and sharing cognitive frameworks of individuals and groups that can be used to construct a shared vision

Johnson-Laird (1983)

Conservation: Biggs et al. (2011).

Other: Nonaka and Takeuchi (1995)

Multi-criteria decision analysis

A transparent approach for identifying actions that perform best when aiming to achieve multiple objectives that involves outlining and weighting multiple objectives or performance criteria, and rating alternatives in terms of how they perform against each criterion

Cochrane and Zeleny (1973)

Conservation: Huang et al. (2011).

Other: Le Gales and Moatti (1990)
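A weighted-sum aggregation, the simplest form of multi-criteria decision analysis, can be sketched as follows. The criteria, weights and ratings are illustrative assumptions only; real applications would elicit and normalise these with stakeholders.

```python
# Hypothetical criteria weights (summing to 1) and 0-10 performance ratings;
# for "cost-effectiveness", a higher rating means a cheaper option.
weights = {"biodiversity": 0.5, "livelihoods": 0.3, "cost-effectiveness": 0.2}
ratings = {
    "Strict protected area": {"biodiversity": 9, "livelihoods": 3, "cost-effectiveness": 4},
    "Community reserve":     {"biodiversity": 6, "livelihoods": 8, "cost-effectiveness": 6},
    "Payment scheme":        {"biodiversity": 5, "livelihoods": 7, "cost-effectiveness": 8},
}

def weighted_score(alternative):
    # Aggregate performance as the weight-adjusted sum across all criteria.
    return sum(weights[c] * r for c, r in ratings[alternative].items())

scores = {alt: weighted_score(alt) for alt in ratings}
for alt, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{alt}: {s:.2f}")
```

Sensitivity to the weights matters: re-running the aggregation under different stakeholder weightings can reorder the alternatives, and surfacing that reordering is itself useful input to negotiation.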

Logic models

A graphical way to organize information and display thinking. Depicts the implicit maps we carry in our minds of how the world does or should work

Knowlton and Phillips (2013)

Conservation: Margoluis et al. (2009).

Other: McLaughlin and Jordan (1999)

Rich pictures

Rich pictures are usually drawn prior to analysing a situation when it is unclear which parts of a situation have particular importance. They are an attempt to encapsulate the points of interest concerning a situation. They can be invaluable in communicating issues between groups of people where there are cultural or language differences. Drawings, pictures and text can provide the basis for developing the shared understanding needed to enable further dialogue

Checkland (1989)

Conservation: Sayer et al. (2007).

Other: Bell and Morse (2013)


Rubrics

Rubrics are performance criteria for scoring constructed responses (qualitative data) to specific assessment questions. They are used when a scale from low to high performance makes sense. They have been used extensively in education and are increasingly being used in evaluation. A matrix is constructed outlining each performance criterion and a description of the different levels of performance (“very poor” to “outstanding”) relative to each criterion. Rubrics can be generic or customized for a particular situation

Arter and McTighe (2001)

Conservation: Allen et al. (2014).

Other: Arter and McTighe (2001)
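The mechanics of applying a rubric can be sketched briefly. The criteria, level labels and judgements below are hypothetical; a real rubric would also carry a written description of each performance level per criterion.

```python
# A minimal, hypothetical rubric: qualitative performance levels per criterion
# are mapped onto a numeric scale so judgements can be compared and summarised.
LEVELS = {"very poor": 1, "poor": 2, "adequate": 3, "good": 4, "outstanding": 5}
CRITERIA = ["stakeholder engagement", "monitoring design", "reporting"]

def score_response(judgements):
    # Convert per-criterion level judgements into numeric scores,
    # failing loudly if a criterion is missing or a level is unknown.
    return {c: LEVELS[judgements[c]] for c in CRITERIA}

judgements = {
    "stakeholder engagement": "good",
    "monitoring design": "adequate",
    "reporting": "outstanding",
}
scores = score_response(judgements)
print(scores, "total:", sum(scores.values()))
```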

Scenario planning

A large group process that takes a wide range of disparate stakeholders through a structured exploration of possible alternative futures. It may encompass many different approaches to creating alternative visions of the future based on key uncertainties and trends, and exploring actions that will move a group toward desirable futures

Wack (1985)

Conservation: Wildlife Conservation Society and Bio-Era (2007).

Other: Schoemaker (1995)

Simple rules

Simple Rules are instructions to inform the behavior of agents in a Complex Adaptive System. Whether by conscious agreement or by unspoken assent, agents of a CAS appear to engage with each other according to a short list of simple rules. Those Simple Rules shape the conditions that characterize the dominant patterns of a system

Zimmerman et al. (2008)

Conservation: None known.

Other: Stewart (2016)
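The idea that a short list of simple rules shapes system-level patterns can be illustrated with a toy agent model; the rule, parameters and interpretation below are our own illustrative assumptions, not drawn from the cited sources.

```python
import random

random.seed(1)  # reproducible toy run
positions = [random.uniform(0, 100) for _ in range(10)]  # agents on a line

def step(positions):
    # Simple rule: each agent moves 10% of the way toward the group mean.
    mean = sum(positions) / len(positions)
    return [p + 0.1 * (mean - p) for p in positions]

for _ in range(50):
    positions = step(positions)

# No agent was told to cluster, yet clustering is the emergent pattern.
spread = max(positions) - min(positions)
print(f"spread after 50 steps: {spread:.3f}")
```

Each agent follows one local rule with no central coordination, yet a system-wide pattern (convergence) reliably emerges, which is the core intuition behind identifying the Simple Rules operating in a Complex Adaptive System.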

Structured decision-making

A purposive process that explicitly and quantitatively assesses the trade-offs and consequences of choosing amongst a set of alternative actions so as to identify optimal actions that balance diverse stakeholder objectives in a state of uncertainty

Gregory et al. (2012)

Conservation: Gregory and Long (2009).

Other: Martin et al. (2009)

Praxis and conservation

Praxis is the purposive process of acting on, embodying or realizing an idea, theory or concept. The concept of praxis has a long history, stretching back to Aristotle (384–322 BC). The modern use of praxis has many lineages that might be traced back to the Enlightenment period of critical philosophy initiated with Immanuel Kant (1724–1804), on the back of questioning the mind–body dualism (e.g. theory vs. practice) most commonly associated with René Descartes (1596–1650). Since the eighteenth century, the application of praxis, beyond simple philosophical discourse, to achieve societal transformation is evident in work ranging from general political economy per Karl Marx (1818–1883), the radical pedagogy and educational studies of Paulo Freire (1921–1997), modern feminist and cultural critiques (e.g. Linda Alcoff; Alcoff 2006) and sociology more broadly, including structuration theories coupling social structure and agency (e.g., Anthony Giddens; Giddens 1984). From these various bodies of learning, we find that a useful praxis has three key attributes that are specifically relevant to effective conservation.

First, praxis acknowledges and embraces dualities (i.e. both/and) to promote science and action, knowing and doing. Praxis challenges the notion of dualisms (i.e., either/or), such as the false divides between science and action (e.g., Toomey et al. 2017) and between knowing and doing (e.g., Holling 1978). Dualisms also direct conservation professionals to assign success or failure singularly to either outcomes or the process of planning and implementing action. Instead, embracing dualities endorses the interdependence of different elements, as reflected in notions of science-in-action and adaptive management (Pfeffer and Sutton 1998). As such, practice may precede, and be designed to generate, the knowledge necessary for increasingly effective conservation (Cook and Wagenaar 2012).

Second, praxis facilitates a type of learning which is essential to improve the effectiveness of conservation initiatives (Dicks et al. 2014). Such praxis is a process that creates space for acknowledging the political dimensions of conservation problems by articulating, revealing and negotiating power dynamics and a diversity of perspectives, particularly from those on the margins (Freire 1970). Praxis fosters conversations about values, providing an alternative to political and positional bargaining, by making transparent our default responses that maintain, and do not allow questioning of, the mental models and disciplinary allegiances that stymie effective action and transformation (Pielke 2007).

Third, praxis is continually attentive to the goals and directions of purposive transformation, i.e., an intended change driven through strategically enacted means and processes. A useful praxis remains mindful of who and/or what may be marginalized by the politics and power imbalances that pervade all conservation initiatives. ‘Good’ praxis recognizes failure as a rich source of learning, and explicitly and continually experiments with new approaches and processes developed from both successful and failed activities to solve entrenched conservation problems. The praxis process can then facilitate learning to inform future actions and, where and when necessary, adjust goals and activities as part of an iterative process. Taking the time to reflect on the diversity of elements comprising a conservation context and the ways in which they interact and evolve is an essential prerequisite for attending to purposive transformations in conservation contexts and embedding learning within individuals and organizations (Salafsky et al. 2002).

Systems thinking: describing and understanding situations

Reductionist sciences, such as analytical chemistry, population ecology and social psychology, often generate knowledge about how entities or phenomena function by systematically reducing a ‘whole’ into ever-smaller components. In contrast, the discipline of systems thinking purposively attends to the relationships and interactions between parts identified as relevant and the interconnected whole of situations (Reynolds and Holwell 2010). Since the mid-20th century, systems thinking has provided frameworks and tools (Table 2) to reveal the context of conceptually bounded problem situations. These are described and rendered as systems, which can be simply defined as a collection of entities perceived by someone as interacting together to do something. Inherent in this definition is a condition that systems are not predetermined but rather purposeful. As such, there are multiple valid perspectives on, and representations of, a purpose, problem, or situation (Cilliers 2005).

Deciding what constitutes a ‘whole’ system in a given context involves making decisions about what parts and processes, natural and social, are included and excluded. These ‘boundary judgements’ (Ulrich and Reynolds 2010) demarcate a perceived system from its broader situation, environment, and histories, and may be referred to as the ‘system of interest’. Given that boundaries of a system of interest are human constructs, systems are inevitably partial as they: (1) delimit only a subset of all possible inter-relationships; and (2) inevitably serve to meet the needs of some stakeholder groups better than others (Ulrich 2003). When the demarcation of a system of interest does not comprise an explicit process, misunderstanding and conflict may arise when different stakeholders make different boundary judgements based on their different values, experiences and priorities. For example, the ways in which power is distributed within both implicit and explicit political processes influences who is involved in decision-making, which may promote or curtail elite capture of benefits derived from purposive transformations in a system.

Systems thinking in practice comprises three activities: (1) understanding interrelationships between elements; (2) engaging with multiple perspectives; and (3) reflecting on boundary judgements (Reynolds and Holwell 2010). Figure 1 illustrates this for variables related to these three activities—interrelationships ranging from a small number of tight interrelationships to many loose interrelationships; multiple perspectives ranging from a few explicit convergent perspectives to many implicit, divergent perspectives; and boundary judgements ranging from a few closed and fixed boundaries to many open flexible boundaries. In social–ecological systems, interrelationships include stakeholders and their relation to one another (Checkland 2000). Stakeholders have unique perspectives, determined by individual values and worldviews (Biggs et al. 2011), meaning a system of interest has multiple potential boundaries related to physical, spatial, temporal, and social attributes. Boundaries may be fixed, for example, using the perspective of one stakeholder group at the expense of others, or more helpfully, adaptable to situational changes.
Fig. 1

A conceptual tool comprising the three dimensions of a system that are often most useful for establishing a foundation for an effective conservation initiative, namely: (1) interrelationships; (2) multiple perspectives; and (3) boundary judgements. This conceptual tool can assist a group of stakeholders to: (1) identify a ‘system of interest’ in which they wish to intervene; and (2) structure an evolving process for deciding “what action is next” to deliver an increasingly complete understanding of their ‘system of interest’. The understanding generated through this tool can then be applied to the design and selection of conservation strategies, inclusive of philosophies, theories, methods, tools, mechanisms and approaches that are well-matched to the characteristics of the ‘system of interest’. The small cube in the lower-front corner represents ‘systems of interest’ that might be most usefully understood through experimental studies (i.e., a system presenting fixed boundaries few in number; a few tight interrelationships; and a few explicit but convergent perspectives). In contrast, the small cube in the upper-back corner represents ‘systems of interest’ typical of ‘wicked’ conservation challenges (i.e., the system has many open and flexible boundaries; many loose interrelationships; and many implicit and divergent perspectives)
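The three dimensions of Fig. 1 can be made concrete as a simple characterisation exercise. The 0–1 scoring scale, the aggregate measure and the two example systems below are our own illustrative assumptions, not part of the figure itself.

```python
from dataclasses import dataclass

@dataclass
class SystemOfInterest:
    # Each dimension scored 0-1: 0 = few/tight/fixed, 1 = many/loose/open.
    name: str
    interrelationships: float
    perspectives: float
    boundaries: float

    def wickedness(self) -> float:
        # Crude aggregate: systems near (1, 1, 1) resemble 'wicked' problems;
        # systems near (0, 0, 0) suit controlled experimental study.
        return (self.interrelationships + self.perspectives + self.boundaries) / 3

lab_trial = SystemOfInterest("greenhouse germination trial", 0.1, 0.1, 0.1)
landscape = SystemOfInterest("transfrontier conservation area", 0.9, 0.9, 0.8)
for s in (lab_trial, landscape):
    print(f"{s.name}: {s.wickedness():.2f}")
```

Scoring a candidate initiative along these axes, even roughly, can help a stakeholder group decide whether reductionist experimental methods or the systemic approaches in Table 2 better match their ‘system of interest’.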

Understanding the importance of boundary judgements is integral to the work of all conservation professionals. For example, conservation biologists may be required to map the spatial distribution of a plant species’ habitat to inform restoration activities, or a protected area manager to decide which stakeholders are most affected by management decisions. In many cases, decisions must be made as to which elements of a system are in bounds and which are out. The existence and effect of the different perspectives presented by stakeholders are likewise embedded within conservation initiatives. Professionals associated with conservation have historically taken a narrow view of the systems they work in, bounding systems in ways that largely exclude the people, institutions and political processes that impact them (Dowie 2011). Conservation biologists are often highly proficient in the use of reductionist experimental methods to identify causal relationships within systems of interest where boundaries are fixed and not open to interpretation or change, relationships are few and tightly connected, and perspectives explicitly stated and converging around similar values (Fig. 1). However, these characteristics rarely typify conservation situations. If a subset of perspectives becomes privileged, the knowledge used to make judgements and take action is incomplete and therefore inadequate. Emergence of the concept of social–ecological systems (Berkes and Folke 1998) represented a move to address the limitations of a reductionist perspective as it affects conservation challenges, and provides a platform for further theoretical and practical advances (Liu et al. 2007).

Evaluation: values, learning and judgement

The formalization of the transdiscipline of evaluation can be traced back to federally funded social programs in the United States in the 1960s, which were accompanied by a requirement for an evaluation to determine their effectiveness. The field has since expanded its focus from narrowly defined programs to large-scale initiatives and processes, as well as community-based development and advocacy. It has had a longstanding emphasis on learning and improvement rather than simply proving that an intervention “works” (WKKF 2017). For the past two or three decades, four themes of particular emphasis have been: the participation of stakeholders (participatory evaluation; Cousins 2003); the importance of attention to culture (culturally responsive evaluation; Hood et al. 2015; Thomas and Parsons 2017); the evolving nature of interventions (developmental evaluation; Patton 2011); and attention to complexity, especially in social systems and networks (systems-oriented evaluation; Preskill and Gopal 2014; Parsons 2012; WKKF 2017).

The transdiscipline of evaluation comprises a deep body of knowledge and scholarship incorporating three dimensions: values, methods and use (Christie and Alkin 2012). Rather than a one-size-fits-all approach, evaluation today consists of a portfolio of philosophies, theories, methods, tools, social networks and knowledge to suit a wide range of contexts (Table 2) (e.g., WKKF 2017; Davidson 2005). The process of evaluation permeates all dimensions of an initiative, and is focused squarely on learning and the utility of processes and outputs (Patton 2008; Christie and Alkin 2012).

The process of evaluation has been defined across many sectors as the determination of merit (e.g., how effective was an intervention?), worth (e.g., how valuable was an intervention?) and/or significance (e.g., how important was an intervention?) (Scriven 2007). In contrast, in conservation, evaluation is most commonly a process for determining only the effectiveness of an intervention (i.e., merit) (Mascia et al. 2014), with less emphasis placed on an intervention’s worth and significance, though this situation is improving (e.g., Romero et al. 2017). It is typically implemented as a solitary concluding activity of a management cycle (Schwartz et al. 2017), and is often not implemented at all (e.g., Kapos et al. 2008; Redford et al. 2018). Evaluation is comparatively new, but increasingly familiar, to conservation professionals, particularly those promoting evidence-based conservation (Keene and Pullin 2011), as is evident from the increasing number of studies assessing the effectiveness of protected areas (Geldmann et al. 2013; Gill et al. 2017).

While research typically aims to create new generalizable knowledge, evaluation generates situation-specific information for decision-making. This process occurs within the context of stakeholder values and judgements, serving as a means to communicate those judgements to others with the aim of influencing decisions. A common misconception is that an evaluator makes such judgements objectively, as an independent third party, and while this was formerly the norm (e.g., in the development arena; Easterly 2013), it is no longer common in many sectors. Whilst striving for independence may be useful in some circumstances, many evaluation approaches emphasize the role of stakeholders as active participants in establishing evaluative criteria, decision-making and learning (Patton 2011). Evaluation more often acknowledges the different perspectives, values, culture, politics and histories of people within the situation being evaluated (Hood et al. 2015; Samuels and Ryan 2011). The processes of making meaning from data and providing useful results, i.e., going beyond designing and conducting basic research or inquiry, have become increasingly sophisticated, recognizing varying contexts and purposes. Over recent decades, the evaluation field has expanded in terms of its range of quantitative and qualitative methodologies and the scope of its focus. The evaluative thinking and practices common amongst conservation professionals today typically represent a very small subset of all that are potentially useful to them (Baylis et al. 2016). Accordingly, we outline a set of approaches and tools we believe are useful in Table 2.

The discipline of evaluation is active in at least 158 regional, national and international professional organizations totaling approximately fifty thousand members. These institutions and processes ably support the development of a praxis that is well-suited to tackle the diversity and complexity of conservation situations.

Principles of praxis for effective conservation

In a world where all biophysical systems have been impacted by humanity, what form of praxis will best contribute to effective, worthy and relevant conservation (Ison and Schlindwein 2015)? Here, we present a ‘praxis for effective conservation’ approach embodied in four principles. These emerged from our review and synthesis of both our collective expertise and the fields of evaluation, systems thinking, and conservation; the design and delivery of two workshops in 2014; and the learning generated in the process of drafting this paper:
  1. Attend to the whole with humility;

  2. Engage constructively with the values, cultures, politics and histories of stakeholders;

  3. Learn through evaluative, systemic enquiry; and

  4. Exercise wisdom in judgement and action.


In introducing these principles, we reinforce our belief that effective conservation initiatives recognize whole social–ecological systems, ensuring the persistence of nature without compromising human well-being. For each principle, we discuss how common approaches to conservation could be transformed through its application. We also present a suite of evaluation and systems thinking approaches and tools that may support praxis (Table 2), and offer examples of their use in linked social–ecological systems.

Attend to the whole with humility

Uncertainty and complexity are intrinsic qualities of living systems. Effective conservation action must move beyond reductionist science, giving due attention to the uncertainty permeating these systems (Holling 2001). Humility allows people to accept that, despite all we do know, in most systems of interest, uncertainty is high and predictability low. Likewise, it is increasingly recognized by conservation professionals that we cannot know, understand, or gather data on all dimensions of ever-changing systems (e.g., Cowling et al. 2010). Humility is also fundamental to including stakeholders in collaborations that acknowledge the existence and validity of different values and types of knowledge. Humility is the foundation upon which trust is developed.

In contrast to a traditional view of conservation (i.e., people excluded from nature; Mace 2014), Attending to the whole requires consideration of the richness of nature–human interrelationships (i.e., people connected with nature; Zylstra et al. 2014), genuine engagement with multiple perspectives, and careful reflection on where to draw system boundaries (Reynolds 2011). Recognition of dualities (e.g., conservation and development, traditional and contemporary, outsiders and locals) across a range of spatial and temporal scales (Valters 2015) sets the foundation for a praxis for effective conservation. Developing a shared understanding of these, and other, dualities might begin with the use of tools (Table 2) such as mental models, logic models and theories of change, which capture and communicate how individual actors understand a system (Biggs et al. 2011); boundary critique, which can assess the consequences of working with specific values and realities to make judgements (Ulrich and Reynolds 2010); and rich pictures, which qualitatively and holistically identify phenomena that influence a system (Table 2; Bell and Morse 2013). Accurately and precisely conceptualizing a system enables understanding (e.g., through complex adaptive systems models), analysis (e.g., exploring the implications of bounding a system), identification of leverage points (e.g., targeting incentives for human behavior change), and hence purposive transformation of complex, complicated and/or conflictual situations (e.g., through design of a process for protecting rhinoceros from criminal poaching).

Engage constructively with the values, cultures, politics and histories of stakeholders

The power imbalance between those who directly and indirectly benefit from the use of natural resources and those who bear the potential costs of conservation choices has been underrepresented in conservation (Barry and Oelschlaeger 1996), as conservation practice has historically been driven by the values and politics of Western conservation scientists and practitioners (Adams and Mulligan 2003). These persistent, long-term power imbalances manifest as structural inequities that, ironically, contradict the value systems of many conservation professionals. Fortunately, recognition of the need to Engage constructively with the values, cultures, politics and histories of stakeholders, and other social dimensions, is gaining momentum in the formulation of social–ecological approaches to conservation (e.g., Bennett et al. 2017). Where the diversity of local values and knowledge has not been engaged, conservation decisions can produce polarized views that lead to local peoples’ displacement or to resource-use restrictions aimed at fencing-in nature (Adams and Mulligan 2003). Even within the confines of the scientific community there are polarized worldviews derived from different values and perspectives. For example, species triage (i.e., prioritizing species with the greatest potential to be conserved, rather than the most endangered) is highly controversial, as it can lead to decisions that accept extinctions to ensure greater overall conservation outcomes (Bottrill et al. 2008). Such judgements are unacceptable to some because they may mean the loss of species valued for personal, cultural or religious reasons (Jachowski and Kesler 2009). Similarly, wildlife hunting is abhorrent to some, while others believe sustainable exploitation is a right that facilitates private and communal land conservation, and may ultimately achieve conservation goals most effectively (Naidoo et al. 2016).

Engaging constructively with stakeholders, including those whose values and worldviews reinforce the dichotomies that impede the conversations necessary for developing and implementing a useful praxis for effective conservation, is central to identifying equitable and resilient approaches to conservation challenges (Tuhiwai Smith 2018). For example, culturally responsive evaluation (amongst other approaches) evolved to address challenges such as: power inequities, especially among groups traditionally under-served or marginalized; elite capture of benefits; and the validity of different cultural perspectives as it relates to decision-making (Hood et al. 2015).

There are different ways to help make disparate stakeholders’ perspectives visible (Table 2). Platforms and tools for building dialogue (e.g., consequence tables and focus groups, appreciative inquiry, participatory mapping), when used to include the full range of stakeholders, can capture and make transparent the multiple values, objectives and expectations common to conservation initiatives. These approaches are also important for revealing the inevitable trade-offs and identifying actions that could maximize benefits and minimize conflicts between different actors and their objectives (Gregory et al. 2012). Scenario planning can capture the role of values in understanding present drivers of change and envision future social–ecological contexts through deliberation and negotiation (Malinga et al. 2013). While a few of these tools are commonly used in conservation, the widespread application of a more comprehensive toolkit (Table 2) would tap a still-unrealized potential: making the role of values, cultures, politics, histories and expectations explicit, revealing how they influence a system, and promoting collaborative judgements that encourage wiser action.

Learn through evaluative, systemic enquiry

There is a range of decision-making frameworks that have been applied in conservation to assist in the integration of program design, implementation, monitoring and evaluation, and re-conceptualization to test assumptions and promote learning and adaptation (e.g., adaptive management, management effectiveness evaluation, structured decision-making; see Schwartz et al. 2017). There is often a specific desire to include monitoring and evaluation within conservation programs to facilitate management and learning (Mascia et al. 2014), but these activities are often not implemented, or are implemented ineffectively (Legg and Nagy 2006; Redford et al. 2018). Common barriers include a failure to commit funding to these activities, unsupportive political contexts, limited technical and methodological capacities (particularly in developing countries; e.g., Ortega-Argueta et al. 2016), and the fear of exposing failures (Redford and Taber 2000).

This absence of monitoring and evaluation activities suggests practitioners are more concerned by the costs of publicly recognized failures than by the time and financial costs of these activities (Redford and Taber 2000). Further, it suggests that the improved practices generated by learning from failure are discounted against the discomfort of acknowledging failure. This perspective contrasts with the common rhetoric that learning is essential for effective action, as reflected by its inclusion in most evidence-based decision-making frameworks (Cook et al. 2016). Fear of failure and purported dichotomies (e.g., planning versus implementation) commonly restrict flows of information and opportunities to learn in the conservation sector, as demonstrated for spatial prioritization (Knight et al. 2008) and recovery planning (Bottrill et al. 2011).

A strong praxis for effective conservation enables simultaneous planning and implementation because it is supported by activities that, accompanied by continual reflection, generate learning that informs both theory and action. The linear model of knowledge transfer, in which academic researchers and institutions are the holders and providers of knowledge while practitioners are the users of that knowledge, is outdated and hinders praxis (Pielke 2007; Toomey et al. 2017). The knowing–doing “gap” is not usually a breach along a linear information exchange pathway but rather a “knowing–doing space” comprising the dimensions perceived by stakeholders as relevant for transforming social–ecological systems (Toomey et al. 2017). Effective learning depends upon whether the “right” questions are asked of stakeholders. Evaluative inquiry, the systematic practice-oriented process of using empirically derived and value-based data to craft and investigate questions of interest (Parsons 2009), can help, as can governance structures and dialogue platforms through which knowledge can be developed among all stakeholders. Evaluative thinking—the combination of critical thinking, creative thinking, inferential thinking, and practical thinking—can be used in complex systems to, for instance, craft contextually specific approaches that use fit-for-purpose questions to generate reasoned, evidence-based judgements about value (Vo and Archibald 2018). For example, wildlife-users occupying Wildlife Management Units in Mexico are linked by a monitoring and reporting system of ecological (e.g., species, harvest rates and uses) and socioeconomic (e.g., legal wildlife products, markets and income) indicators. Information is collated and reported to national government and donors to develop better policies, management guidelines and technical training for wildlife users (Ortega-Argueta et al. 2016).
More generally, the introduction of reporting and feedback systems by donors that require conservation organizations to clearly demonstrate learning, inclusive of learning from failures, as a pre-condition for securing and maintaining funding is one potential mechanism for promoting learning.

The portfolio of evaluative resources to support learning is expansive. Management effectiveness evaluation (e.g., Hockings et al. 2006), derived from utilization-focused evaluation (Patton 2008), and adaptive management, are familiar and put to use by a proportion of conservation professionals. Other resources in the rapidly growing collection of systems- and complexity-oriented methods, tools and approaches are not yet mainstream outside of the professional evaluation community (Table 2; Williams and Hummelbrunner 2010), suggesting stronger, more formal transdisciplinary collaborations between professional evaluators and conservation professionals are essential.

Exercise wisdom in judgement and action

People extract value from experience and learning in different ways, using different approaches and understandings. One conceptualization of this process, the DIKW (Data–Information–Knowledge–Wisdom) framework, identifies distinctions and links across this spectrum (Ackoff 1989). For example, scientists gather data (i.e., observations recorded but unprocessed) to generate information (i.e., data processed, useful for decisions and action) that is organized and applied to become knowledge (i.e., information contextualized, cause–effect relationships determined). Wisdom (i.e., the ability to think, act and utilize knowledge, experience, understanding, and insights; Ackoff 1989) is frequently neglected. However, each element is a prerequisite for the subsequent ones, magnifying its utility for effecting positive change. Conservation biology has often gathered data and information at the expense of generating knowledge and wisdom (e.g., Stuart et al. 2010), despite the often rapidly diminishing returns on such investments (e.g., Grantham et al. 2008).

Wisdom is a prerequisite for effective conservation. For example, wisdom is central to assessing the merit, worth and/or significance of the relationships between actions and outcomes in the context of human values. Whilst data and information are developed from past experiences and activities, knowledge and wisdom are focused on making judgements for the present and the future. Practical wisdom underpins choices about the next challenge to be addressed and the next actions to take, and hence the vision and design of effective conservation initiatives (Schwartz and Sharpe 2006). To build conservation wisdom, knowledge in all its different forms must be respected, articulated and accessible for use.

Conservation thinking is increasingly enriched by the diversity and depth of different knowledge systems, which bolsters its ability to gain wisdom or use the wisdom already present in a system. The incorporation of, for example, traditional ecological knowledge into conservation initiatives has improved outcomes (Berkes et al. 2000). Knowledge complementarity and interaction are now recognized within the Convention on Biological Diversity, the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) and the Sustainable Development Goals as relevant for the conservation and sustainable use of nature, while safeguarding and respecting the knowledge, innovations and practices of local communities. Inter-cultural education in Mexico and Tanzania has contributed to enhanced critical thinking and new knowledge construction for advancing conservation goals (Burford et al. 2012). Given the ongoing erosion of local and traditional knowledge, the conservation sector will benefit from accessible, useful, credible knowledge-sharing platforms that are well-matched to the complexity of its endeavors.

A variety of approaches and methods can assist in putting wisdom to work amidst complexity (Table 2). Promising examples include Adaptive Action (Eoyang and Holladay 2013), Lean Startup (Ries 2011), developmental evaluation (Patton 2011) and culturally responsive evaluation (Hood et al. 2015; Thomas and Parsons 2017). In emerging conservation contexts where software development is increasingly important, Agile methods and practices may already be familiar (Table 2; Schwaber and Beedle 2001). These approaches tend to be evaluative and systems-oriented, often based on rapid iterative cycles of visible knowledge generation and learning that facilitate transparent decision-making about wise actions.


Conclusion

Conservation organizations are investing immense effort in grappling with ‘wicked’ problems. We have argued that philosophies, theories, methods and tools drawn from the fields of systems thinking and evaluation can enrich the capacity of conservation professionals and organizations, individually and collectively, to grapple with these challenges. This begins with reconceptualizing the ways in which people define and engage with conservation challenges, looking within ourselves, our teams and our organizations, rather than simply continuing to adopt the outward-looking perspective that currently dominates conservation thinking and practice (e.g., a focus upon people imposing threatening processes upon nature). Here we have introduced the idea of a praxis for effective conservation based on four principles. These are founded on, and synthesized from, the established transdisciplines of systems thinking and evaluation, whose long histories of understanding, grappling with, and learning through ‘wicked’ problems may serve as a strong foundation for this transformation towards greater effectiveness. In developing these principles, we (The Silwood Group) have identified some of our own unchallenged assumptions, gaps in our knowledge, and limitations of our worldviews and practices. We look to engage with other professionals to enact and improve these principles, and trust that the strong sense of discomfort felt when confronting the limitations of all our practices does not deter us from reflecting upon, and enacting, positive change.



Acknowledgements

This work was supported by the Natural Environment Research Council (NERC) through the Tansley Working Groups Fund and Imperial College London. Carlyn Samuel, Jessica Bray, Aidan Keane, Sam Sinclair, E.J. Milner-Gulland and especially Bob Williams are thanked for their contributions towards developing the praxis for effective conservation approach. Rodney Hopson is thanked for providing supporting literature. We are grateful to Chris Metzner for translating our thinking, through his artwork, into the conceptual tool presented as Fig. 1.


References

  1. Ackoff RL (1989) From data to wisdom. J Appl Syst Anal 16:3–9
  2. Adams WM, Mulligan M (2003) Decolonizing nature: strategies for conservation in a post-colonial era. Earthscan, London, United Kingdom
  3. Alcoff LM (2006) Visible identities: race, gender and the self. Oxford University Press, Oxford, United Kingdom
  4. Allen W, Ogilvie S, Blackie H, Smith D, Sam S, Doherty J, McKenzie D, Ataria J, Shapiro L, MacKay J, Murphy E (2014) Bridging disciplines, knowledge systems and cultures in pest management. Env Manag 53:429–440
  5. Arter JA, McTighe J (2001) Scoring rubrics in the classroom: using performance criteria for assessing and improving student performance. Corwin Press, Thousand Oaks, California
  6. Awad EM, Ghaziri HM (2004) Knowledge management. Pearson Education International, Upper Saddle River, New Jersey
  7. Barry D, Oelschlaeger M (1996) A science for survival: values and conservation biology. Conserv Biol 10:905–911
  8. Baylis K, Honey-Rosés J, Börner J, Corbera E, Ezzine-de-Blas D, Ferraro PJ, Lapeyre R, Persson UM, Pfaff A, Wunder S (2016) Mainstreaming impact evaluation in nature conservation. Conserv Lett 9:58–64
  9. Bell S, Morse S (2013) Rich pictures: a means to explore the ‘sustainable mind’? Sust Develop 21:30–47
  10. Bennett NJ, Roth R, Klain SC, Chan KMA, Clark DA, Cullman G, Epstein G, Nelson MP, Stedman R, Teel TL, Thomas REW, Wyborn C, Curran D, Greenberg A, Sandlos J, Veríssimo D (2017) Mainstreaming the social sciences in conservation. Conserv Biol 31:56–66
  11. Berkes F, Folke C (1998) Linking social and ecological systems: management practices and social mechanisms for building resilience. Cambridge University Press, Cambridge, United Kingdom
  12. Berkes F, Colding J, Folke C (2000) Rediscovery of traditional ecological knowledge as adaptive management. Ecol Appl 10:1251–1262
  13. Biggs D, Abel N, Knight AT, Leitch A, Langston A, Ban NC (2011) The implementation crisis in conservation planning: could “mental models” help? Conserv Lett 4:169–183
  14. Bottrill MC, Joseph LN, Carwardine J, Bode M, Cook CN, Game ET, Grantham H, Kark S, Linke S, McDonald-Madden E, Pressey RL, Walker S, Wilson KA, Possingham HP (2008) Is conservation triage just smart decision making? Trends Ecol Evol 23:649–654
  15. Bottrill MC, Walsh JC, Watson JEM, Joseph LN, Ortega-Argueta A, Possingham HP (2011) Does recovery planning improve the status of threatened species? Biol Conserv 144:1595–1601
  16. Burford G, Kissmann S, Rosado-May FJ, Alvarado Dzul SH, Harder MK (2012) Indigenous participation in intercultural education: learning from Mexico and Tanzania. Ecol Soc 17(4):33
  17. Butchart SHM, Clarke M, Smith RJ, Sykes RE, Scharlemann JPW, Harfoot M, Buchanan GM, Angulo A, Balmford A, Bertzky B, Brooks TM, Carpenter KE, Comeros-Raynal MT, Cornell J, Ficetola GF, Fishpool LDC, Fuller RA, Geldmann J, Harwell H, Hilton-Taylor C, Hoffmann M, Joolia A, Joppa L, Kingston N, May I, Milam A, Polidoro B, Ralph G, Richman N, Rondinini C, Segan DB, Skolnik B, Spalding MD, Stuart SN, Symes A, Taylor J, Visconti P, Watson JEM, Wood L, Burgess ND (2015) Shortfalls and solutions for meeting national and global conservation area targets. Conserv Lett 8:329–337
  18. Checkland PB (1989) Soft systems methodology. Hum Sys Manag 8:273–289
  19. Checkland P (2000) Soft systems methodology: a thirty year retrospective. Syst Res Behav Sci 17:11–58
  20. Christie CA, Alkin MC (2012) An evaluation theory tree. In: Alkin MC (ed) Evaluation roots: a wider perspective of theorists’ views and influences, 2nd edn. Sage, Thousand Oaks, California
  21. Cilliers P (2005) Knowing complex systems. In: Richardson KA (ed) Managing organizational complexity: philosophy, theory, and application. Information Age Publications, Greenwich, pp 7–19
  22. CMP (2013) The open standards for the practice of conservation. Conservation Measures Partnership, Washington, DC
  23. Cochrane JL, Zeleny M (1973) Multiple criteria decision-making. University of South Carolina Press, Columbia, South Carolina
  24. Cook SN, Wagenaar H (2012) Navigating the eternally unfolding present: toward an epistemology of practice. Am Rev Public Adm 42:3–38
  25. Cook CN, Mascia MB, Schwartz MW, Possingham HP, Fuller RA (2013) Achieving conservation science that bridges the knowledge–action boundary. Conserv Biol 27:669–678
  26. Cook CN, de Bie K, Keith DA, Addison PFE (2016) Decision triggers are a critical part of evidence-based conservation. Biol Conserv 195:46–51
  27. Cousins BJ (2003) Utilization effects of participatory evaluation. In: Kellaghan T, Stufflebeam DL (eds) International handbook of educational evaluation, vol 9. Kluwer International Handbooks of Education, Springer, Dordrecht
  28. Cowling RM, Knight AT, Privett SDJ, Sharma GP (2010) Invest in opportunity, not inventory in hotspots. Conserv Biol 24:633–635
  29. Davidson EJ (2005) Marketing evaluation as a profession and a discipline. J MultiDisciplinary Eval 2:3–10
  30. Dewey J (1933) How we think: a restatement of the relation of reflective thinking to the educative process. DC Heath and Company, Chicago
  31. Dicks LV, Walsh JC, Sutherland WJ (2014) Organising evidence for environmental management decisions: a ‘4S’ hierarchy. Trends Ecol Evol 29:607–613
  32. Dowie M (2011) Conservation refugees: the hundred-year conflict between global conservation and native peoples. MIT Press, Cambridge, Massachusetts
  33. Dybå T, Dingsøyr T (2008) Empirical studies of agile software development: a systematic review. Inf Soft Technol 50(9–10):833–859
  34. Easterly W (2013) The tyranny of experts: economists, dictators, and the forgotten rights of the poor. Basic Books, New York City
  35. Eoyang GH (2004) Conditions for self-organizing in human systems. Futurics 28:10–59
  36. Eoyang GH (2007) Human systems dynamics: complexity-based approach for complex evaluation. In: Williams B, Imam I (eds) Systems concepts in evaluation. Edge Press of Inverness, Point Reyes, California
  37. Eoyang G, Holladay R (2013) Adaptive action: leveraging uncertainty in your organization. Stanford University Press, Stanford, California
  38. Folke C (2006) Resilience: the emergence of a perspective for social–ecological systems analyses. Global Environ Change 16:253–267
  39. Foote JL, Gregor JE, Hepil MC, Baker VE, Houston DJ, Midgley G (2007) Systemic problem structuring applied to community involvement in water conservation. J Op Res Soc 58:645–654
  40. Freire P (1970) Pedagogy of the oppressed. Continuum, New York
  41. Game ET, Meijaard E, Sheil D, McDonald-Madden E (2014) Conservation in a wicked complex world: challenges and solutions. Conserv Lett 7:271–277
  42. Geldmann J, Barnes M, Coad L, Craigie ID, Hockings M, Burgess ND (2013) Effectiveness of terrestrial protected areas in reducing habitat loss and population declines. Biol Conserv 161:230–238
  43. Gertler PJ, Martinez S, Premand P, Rawlings LB, Vermeersch CMJ (2011) Impact evaluation in practice. The World Bank, Washington, DC
  44. Giddens A (1984) The constitution of society: outline of the theory of structuration. Polity, Cambridge, United Kingdom
  45. Gill DA, Mascia MB, Ahmadia GN, Glew L, Lester SE, Barnes M, Craigie I, Darling ES, Free CM, Geldmann J, Holst S, Jensen OP, White AT, Basurto X, Coad L, Gates RD, Guannel G, Mumby PJ, Thomas H, Whitmee S, Woodley S, Fox HE (2017) Capacity shortfalls hinder the performance of marine protected areas globally. Nature 543:665–669
  46. Grantham H, Moilanen A, Wilson KA, Pressey RL, Rebelo AG, Possingham HP (2008) Diminishing return on investment for biodiversity data in conservation planning. Conserv Lett 1:190–198
  47. Gregory R, Long G (2009) Using structured decision-making to help implement a precautionary approach to endangered species management. Risk Anal 29:518–532
  48. Gregory N, Gregory R (2010) A values-based framework for community food choices. Env Val 19:99–119
  49. Gregory R, Failing L, Harstone M, Long G, McDaniels T, Ohlson D (2012) Structured decision making: a practical guide to environmental management choices. Wiley-Blackwell, West Sussex, United Kingdom
  50. Harms R (2015) Self-regulated learning, team learning and project performance in entrepreneurship education: learning in a lean startup environment. Technol Forecast Soc Chang 100:21–28
  51. Hockings M, Stolton S, Dudley N (2000) Evaluating effectiveness: a framework for assessing the management of protected areas. World Commission on Protected Areas Best Practice Guidelines. IUCN, Switzerland
  52. Hockings M, Stolton S, Leverington F, Dudley N, Courrau J (2006) Evaluating effectiveness: a framework for assessing management effectiveness of protected areas, 2nd edn. IUCN, Gland, Switzerland, Cambridge, United Kingdom
  53. Holling CS (1978) Adaptive environmental assessment and management. Wiley, Chichester, United Kingdom
  54. Holling CS (2001) Understanding the complexity of economic, ecological, and social systems. Ecosystems 4:390–405
  55. Hood S, Hopson R, Frierson H (2015) Continuing the journey to reposition culture and cultural context in evaluation theory and practice. Information Age Publishing, Charlotte, North Carolina
  56. Huang IB, Keisler J, Linkov I (2011) Multi-criteria decision analysis in environmental sciences: ten years of applications and trends. Sci Tot Env 409:3578–3594
  57. Ison R, Schlindwein SL (2015) Navigating through an ecological desert and a sociological hell: a cyber-systemic governance approach for the Anthropocene. Kybernetes 44:891–902
  58. Jachowski DS, Kesler DC (2009) Allowing extinction: should we let species go? Trends Ecol Evol 24(4):180
  59. Johnson-Laird PN (1983) Mental models: towards a cognitive science of language, inference, and consciousness. Harvard University Press, Cambridge, Massachusetts
  60. Kapos V et al (2008) Calibrating conservation: new tools for measuring success. Conserv Lett 1:155–164
  61. Keene M, Pullin AS (2011) Realizing an effectiveness revolution in environmental management. J Environ Manage 92:2130–2135
  62. Knight AT, Cowling RM, Rouget M, Balmford A, Lombard AT, Campbell BM (2008) Knowing but not doing: selecting priority conservation areas and the research–implementation gap. Conserv Biol 22:610–617
  63. Knowlton L, Phillips C (2013) The logic model guidebook: better strategies for great results. Sage, Los Angeles
  64. Le Gales C, Moatti JP (1990) Searching for consensus through multi-criteria decision analysis. Int J Tech Assess in Health Care 6:430–449Google Scholar
  65. Legg CJ, Nagy L (2006) Why most conservation monitoring is, but need not be, a waste of time. J Environ Manage 78:194–199Google Scholar
  66. Liu J, Dietz T, Carpenter SR, Alberti M, Folke C, Moran E, Pell AN, Deadman P, Kratz T, Lubchenco J, Ostrom E, Ouyang Z, Provencher W, Redman CL, Schneider SH, Taylor WW (2007) Complexity of coupled human and natural systems. Science 317:1513–1516Google Scholar
  67. Mace GM (2014) Whose conservation? Science 345:1558–1560Google Scholar
  68. Malinga R, Gordon LJ, Lindborg R, Jewitt G (2013) Using participatory scenario planning to identify ecosystem services in changing landscapes. Ecol Soc 18:10Google Scholar
  69. Margoluis R, Stem C, Salafsky N, Brown M (2009) Design alternatives for evaluating the impact of conservation projects. New Dir Eval 122:85–96Google Scholar
  70. Martin J, Runge MC, Nichols JD, Lubow BC, Kendall WL (2009) Structured decision making as a conceptual framework to identify thresholds for conservation and management. Ecol Appl 19:1079–1090Google Scholar
  71. Mascia MB, Pailler S, Thieme ML, Rowe A, Bottrill MC, Danielsen F, Geldmann J, Naidoo R, Pullin AS, Burgess ND (2014) Commonalities and complementarities among approaches to conservation monitoring and evaluation. Biol Conserv 169:258–267Google Scholar
  72. Max-Neef MA (2005) Foundations of transdisciplinarity. Ecol Econ 53:5–16Google Scholar
  73. McKeon RP (1974) Introduction to Aristotle: revised and enlarged. University of Chicago Press, ChicagoGoogle Scholar
  74. McLaughlin JA, Jordan GB (1999) Logic models: a tool for telling your programs performance story. Eval Program Plann 22:65–72Google Scholar
  75. McShane TO, Hirsch PD, Trung TC, Songorwa AN, Kinzig A, Monteferri B, Mutekanga D, Van Thang H, Dammert JL, Pulgar-Vidal M, Welch-Devineh M, Brosius JP, Coppolillo P, O’Connor S (2011) Hard choices: making trade-offs between biodiversity conservation and human well-being. Biol Conserv 144:966–972Google Scholar
  76. Meadows DH (2008) Thinking in systems. Chelsea Green Publishing, White River Junction, VermontGoogle Scholar
  77. Mezirow J (1991) Transformative dimensions of adult learning. Jossey-Bass, San FranciscoGoogle Scholar
  78. Moore M, Maland Cady J (2015) Developmental evaluation in the McKnight Foundation’s Collaborative Crop Research Program: a journey of discovery. In: Patton MQ, McKegg K, Wehipeihana N (eds) Developmental evaluation exemplars: principles in practice. Guilford Press, New York, pp 143–162Google Scholar
  79. Naidoo R, Weaver LC, Diggle RW, Matongo G, Stuart-Hill G, Thouless C (2016) Complementary benefits of tourism and hunting to communal conservancies in Namibia. Conserv Biol 30:628–638Google Scholar
  80. Nicolescu B (2002) Manifesto of transdisciplinarity. SUNY Press, New YorkGoogle Scholar
  81. Nonaka I, Takeuchi H (1995) The knowledge-creating company: how Japanese companies create the dynamics of innovation. Oxford University Press, United KingdomGoogle Scholar
  82. Nyaupane GP, Poudel S (2011) Linkages among biodiversity, livelihood, and tourism. Ann Tour Res 38:1344–1366Google Scholar
  83. Ortega-Argueta A, González-Zamora A, Contreras-Hernándezc A (2016) A framework and indicators for evaluating policies for conservation and development: the case of wildlife management units in Mexico. Environ Sci Policy 63:91–100Google Scholar
  84. Parsons B (2009) Evaluative inquiry for complex times. OD Practitioner 41:44–49Google Scholar
  85. Parsons B (2012) Using complexity science concepts when designing system interventions and evaluations. InSites, Fort Collins, Colorado.
  86. Patton MQ (2002) Qualitative research and evaluation methods, 3rd edn. Sage Publications, Thousand Oaks, CaliforniaGoogle Scholar
  87. Patton MQ (2008) Utilization-focused evaluation, 4th edn. Sage Publications, Thousand Oaks, CaliforniaGoogle Scholar
  88. Patton MQ (2010) Developmental evaluation applying complexity concepts to enhance innovation and use. Guilford Press, New YorkGoogle Scholar
  89. Patton MQ (2011) Developmental evaluation: applying complexity concepts to enhance innovation and use. Guilford Press, New YorkGoogle Scholar
  90. Pfeffer J, Sutton RI (1999) Knowing “what” to do is not enough: turning knowledge into action. Calif Manage Rev 42:83–107Google Scholar
  91. Pielke R (2007) The honest broker: making sense of science in policy and politics. Cambridge University Press, Cambridge, United KingdomGoogle Scholar
  92. Preskill H, Catsambas T (2006) Reframing evaluation through appreciative inquiry. Sage, Thousand Oaks, CaliforniaGoogle Scholar
  93. Preskill H, Gopal S (2014) Evaluating complexity: propositions for improving practice. FSG, Seattle, WashingtonGoogle Scholar
  94. Pullin AS, Knight TM (2001) Effectiveness in conservation practice: pointers from medicine and public health. Conserv Biol 15:50–54Google Scholar
  95. Redford KH, Taber S (2000) Writing the wrongs: developing a safe-fail culture in conservation. Conserv Biol 14:1567–1568Google Scholar
  96. Redford KH, Hulvey KB, Williamson MA, Schwartz MW (2018) Assessment of the conservation measures partnership’s effort to improve conservation outcomes through adaptive management. Conserv Biol 32:926–937Google Scholar
  97. Reynolds M (2011) Critical thinking and systems thinking: towards a critical literacy for systems thinking in practice. In: Horvath CP, Forte JM (eds) Critical thinking. Nova Science Publishers, New York, pp 37–68Google Scholar
  98. Reynolds M, Holwell S (2010) Introducing systems approaches. In: Reynolds M, Holwell S (eds) Systems approaches to managing change: a practical guide. Springer, London, pp 1–23Google Scholar
  99. Ries E (2011) The lean startup. Crown Publishing, New YorkGoogle Scholar
  100. Rittel HWJ, Webber MM (1973) Dilemmas in a general theory of planning. Pol Sci 4:155–169Google Scholar
  101. Romero C, Sills EO, Guariguata MR, Cerutti PO, Lescuyer G, Putz FE (2017) Evaluation of the impacts of Forest Stewardship Council (FSC) certification of natural forest management in the tropics: rigorous approach to assessment of a complex conservation intervention. Int For Rev 19:1–14Google Scholar
  102. Salafsky N, Margolius R, Redford KH, Robinson JG (2002) Improving the practice of conservation: a conceptual framework and research agenda for conservation science. Conserv Biol 16:1469–1479Google Scholar
  103. Samuels M, Ryan K (2011) Grounding evaluations in culture. Am J Eval 32:183–198Google Scholar
  104. Sayer J, Campbell B, Petheram L, Aldrich M, Perez MR, Endamana D, Burgess N (2007) Assessing environment and development outcomes in conservation landscapes. Biodivers Conserv 16:2677–2694Google Scholar
  105. Schoemaker PJH (1995) Scenario planning: a tool for strategic thinking. Sloan Manag Rev 36:25–40Google Scholar
  106. Schwaber K, Beedle M (2001) Agile software development with scrum. Prentice Hall, Upper Saddle RiverGoogle Scholar
  107. Schwartz B, Sharpe K (2006) Practical wisdom: Aristotle meets positive psychology. J Happiness Stud 7:377–395Google Scholar
  108. Schwartz MW, Cook CN, Pressey RL, Pullin AS, Runge MC, Sutherland WJ, Williamson MA (2017) Decision support frameworks and tools for conservation. Conserv Lett 11(2):1–12Google Scholar
  109. Scriven M (1991) Evaluation thesaurus. Edge Press of Inverness, Point Reyes, CaliforniaGoogle Scholar
  110. Scriven M (2007) The logic of evaluation. In: Hansen HV (ed) Dissensus and the search for common ground. University of Windsor, Ontario, pp 1–17Google Scholar
  111. Stewart MA (2016) Nurturing caring relationships through five simple rules. Engl J 105(3):22–28Google Scholar
  112. Stuart SN, Wilson EO, McNeely JA, Mittermeier RA, Rodríguez JP (2010) The barometer of life. Science 328:177Google Scholar
  113. Thomas V, Parsons B (2017) Culturally responsive evaluation meets systems-oriented evaluation. Am J Eval 38:7–28Google Scholar
  114. Thompson M, Ellis R, Wildavsky A (1990) Cultural Theory. Westview Press, Boulder, ColoradoGoogle Scholar
  115. Toomey AH, Knight AT, Barlow J (2017) Navigating the space between research and implementation in conservation. Conserv Lett 10:619–625Google Scholar
  116. Tuhiwai Smith L (2018) Indigenous insight on valuing complexity, sustaining relationships, being accountable. In: Hopson R, Cram F (eds) Tackling wicked problems in complex ecologies: the role of evaluation. Stanford University Press, Palo Alto, CA, pp 45–66Google Scholar
  117. Ulrich W (2000) Reflective practice in the civil society: the contribution of critically systemic thinking. Refl Pract 1:247–268Google Scholar
  118. Ulrich W (2003) Beyond methodology choice: critical systems thinking as critically systemic discourse. J Oper Res Soc 54:325–342Google Scholar
  119. Ulrich W, Reynolds M (2010) Critical systems heuristics. In: Reynolds M, Holwell S (eds) Systems approaches to managing change. Springer, London, pp 243–292Google Scholar
  120. Valters C (2015) Theories of change: time for a radical approach to learning in development. The Asia Foundation, Overseas Development Institute, LondonGoogle Scholar
  121. Vo AT, Archibald T (2018) New directions for evaluative thinking. In: Vo AT, Archibald T (eds) New directions for evaluation. 158, pp 139-147Google Scholar
  122. Wack P (1985) Scenarios: uncharted waters ahead. Harv Bus Rev 63:72–89Google Scholar
  123. Watkins J, Cooperrider D (2000) Appreciative inquiry: a transformative paradigm. OD Pract 32:6–12Google Scholar
  124. Wenger E (1998) Communities of practice: learning, meaning, and identity. Cambridge University Press, Cambridge, United KingdomGoogle Scholar
  125. Wickson F, Carew AL, Russell AW (2006) Transdisciplinary research: characteristics, quandaries and quality. Futures 38:1046–1059Google Scholar
  126. Wildlife Conservation Society, Bio-Era (2007) Futures of the Wild. A Project of the Wildlife Conservation Society’s Futures Group, Wildlife Conservation Society and Bio-Era, New YorkGoogle Scholar
  127. Williams B, Hummelbrunner R (2010) Systems concepts in action: a practitioner’s toolkit. Stanford University Press, CaliforniaGoogle Scholar
  128. WK Kellogg Foundation (2017) The step-by-step guide of evaluation: how to become savvy evaluation consumers. Battle Creek, Michigan, United StatesGoogle Scholar
  129. Zimmerman B, Lindberg C, Plsek PE (2008) Edge Ware: lessons from complexity science for health care leaders. VHA, Irving, TexasGoogle Scholar
  130. Zylstra MJ, Knight AT, Esler KJ, Le Grange LL (2014) Connectedness as a core conservation concern: an interdisciplinary review of theory and a call for practice. Springer Sci Rev 2:119–143Google Scholar

Copyright information

© The Author(s) 2019

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Andrew T. Knight (1, 2, 3) — corresponding author
  • Carly N. Cook (4)
  • Kent H. Redford (5, 6)
  • Duan Biggs (2, 7, 8)
  • Claudia Romero (9)
  • Alejandro Ortega-Argueta (10)
  • Cameron D. Norman (11, 12)
  • Beverly Parsons (13)
  • Martin Reynolds (14)
  • Glenda Eoyang (15)
  • Matt Keene (16)

  1. Department of Life Sciences, Imperial College London, Berkshire, United Kingdom
  2. ARC Centre of Excellence for Environmental Decisions, The University of Queensland, St. Lucia, Australia
  3. Department of Botany, Nelson Mandela University, Port Elizabeth, South Africa
  4. School of Biological Sciences, Monash University, Clayton, Australia
  5. Archipelago Consulting, Portland, USA
  6. Department of Environmental Studies, University of New England, Biddeford, USA
  7. Environmental Futures Research Institute, Griffith University, Nathan, Australia
  8. Department of Conservation Ecology and Entomology, Stellenbosch University, Matieland, South Africa
  9. Department of Biology, University of Florida, Gainesville, USA
  10. Departamento de Conservación de la Biodiversidad, El Colegio de la Frontera Sur (ECOSUR), Unidad San Cristóbal Carret, Chiapas, Mexico
  11. Dalla Lana School of Public Health, University of Toronto, Toronto, Canada
  12. Cense Ltd, Toronto, Canada
  13. InSites, Fort Collins, USA
  14. School of Engineering and Innovation, The Open University, Milton Keynes, UK
  15. Human Systems Dynamics Institute, Circle Pines, USA
  16. The Silwood Group LLC, Arlington, USA