Introduction

As human populations modify large areas of natural habitat to provide food and other services, interactions between people and wild animals are increasingly common. Interactions can also occur more frequently when conservation efforts result in an increase in population density of certain wildlife species, leading to damage to economic crops (e.g. agriculture and forestry; Sullivan and Sullivan 2008; Barrio et al. 2010), threats to human safety (Kaplan et al. 2011; Rogers 2011), and predation of commercially valuable species such as livestock (Smith et al. 2000) or game (Redpath 2001; Redpath and Thirgood 2009). In some cases, these impacts can lead to wildlife persecution (Woodroffe et al. 2005; Fitzherbert et al. 2014). Competition over resources is a common driver of these impacts (Kaplan et al. 2011) as abundant game, agricultural crops or woodland provide highly concentrated sources of food or shelter for the target species, so the likelihood of damage is higher when the surrounding natural resources are limited (Calenge et al. 2004; Ziegltrum 2004; Barrio et al. 2010). Habitat loss and changes in land use patterns can also lead to adverse wildlife–wildlife interactions and potentially increase predation of vulnerable species (Smart and Ratcliffe 2000). In response to these human–wildlife and wildlife–wildlife interactions, stakeholders often try to manage the landscape to mitigate adverse impacts either on biodiversity or on economic output. However, where multiple land use objectives or differing conservation values lead to stakeholder disagreement on the best way to mitigate adverse impacts, conservation conflicts arise (Young et al. 2010; Redpath et al. 2013).

Historically, management techniques for the alleviation of adverse environmental impacts have focused on the removal or translocation of individual animals to different areas. Culling predators, for example, can result in demographic improvements for vulnerable bird species such as increased nest success (Smith et al. 2010). These improvements, however, can depend on the maintenance of a low-density predator population (Payton et al. 1997), and the strategy has shown a mixed and often low success rate for protecting livestock (Conner et al. 1998; Berger 2006) or conserving bird populations (Cote and Sutherland 1997; Holt et al. 2008). Where the ethics of culling native predators are questioned (Witmer et al. 2000; Ziegltrum 2008), translocation is often considered a humane alternative (Massei et al. 2011). Translocation, however, may incur significantly higher costs than culling (Sillero-Zubiri et al. 2010) and benefits to prey populations may not be sustained, with the movement of translocated predators back into the area a recurring problem (Linnell et al. 1997; Bradley et al. 2005). The welfare of translocated animals must also be considered, as translocation programmes often result in high levels of stress and mortality in the target individuals (Teixeira et al. 2007).

Where target species are protected or where public acceptance of the mitigation action is necessary, it may not be possible or desirable to employ lethal control or translocation and other options are required. To address the problem of livestock depredation, for example, visual, physical and sonic deterrents, chemical repellents and ‘aversive conditioning’ by the treatment of bait with an unpleasant compound have been used on various target predators with varying levels of success. Smith (2000b) reviews these methods, suggesting that most are only effective for a limited time, on a small scale and when targeted to a specific species. An alternative approach, which is reviewed here, is the use of diversionary feeding.

Although the terms ‘diversionary’ and ‘supplementary’ are sometimes used interchangeably with respect to feeding, diversionary feeding is defined hereafter as the use of food to divert the activity or behaviour of a target species from an action that causes a negative impact, without the intention of increasing the density of the target population. In contrast, supplementary feeding is defined as the use of feeding as a conservation method to improve the population viability or density of a particular species or population (Ewen et al. 2014). In this review, we focus on diversionary feeding. Diversionary feeding has been used in mitigation for a range of wildlife impacts (e.g. raiding of human food stores by baboons Papio ursinus—Kaplan et al. 2011; predation of game species by hen harrier Circus cyaneus—Redpath 2001; forest damage by black bears Ursus americanus—Witmer et al. 2000; Ziegltrum 2008). However, to date, with the exception of evidence for strategies aimed at bird species (see Williams et al. 2013), there has been no synthesis of available evidence to assess the effectiveness of this strategy in terms of its likelihood of successfully diverting the behaviour of a target species or population from an action causing conflict, nor have any previous studies evaluated the factors influencing success or failure, or considered the effect of such action on the welfare of the target species.

The effects of diversionary feeding can be measured at three stages: the initial uptake of diversionary food; the ‘output’ or direct impact of diversionary feeding on the problem (e.g. a reduction in predation or damage); and the ‘outcome’ or overall benefit relative to management objectives (e.g. increased crop yield; Walsh et al. 2012; Fig. 1). Understanding the reasons for the success or failure of previous conservation actions and associated uncertainties can assist with deciding what outcomes could potentially be achieved given similar conditions. Uncertainty in the system and in the effectiveness of actions lends itself to an adaptive approach, whereby actions are continuously monitored and refined as knowledge of the system improves and the efficacy of management increases with time (Walters and Hilborn 1978). This approach is particularly relevant for conservation conflicts, which are, by their nature, difficult to manage due to the contrasting values of stakeholders and the need to encompass ecological facets as well as political and social ones (Turnhout et al. 2007). A transparent planning process that incorporates stakeholder objectives, current knowledge of the ecosystem and an existing evidence base of the effectiveness of the strategy (Sutherland et al. 2004) is vital for successful management and conservation in systems where wildlife and people interact (Bunnefeld et al. 2011) and will greatly improve stakeholder participation as people feel included in, and understand, the process. Further, developing clear, concise and objectives-driven conservation goals which encompass the values of resource managers and reflect acceptability to the public will increase the likelihood that scientific findings will be adopted, by ensuring the most appropriate questions are addressed (Gregory et al. 2012).

Fig. 1

Example of the sequence of events that should be considered by stakeholders when implementing diversionary feeding for four categories of conflict: predation of game species; predation of vulnerable species; damage to crops; and threats to human safety. The ‘output’ is defined as the direct impact of diversionary feeding on the problem (e.g. reduced crop damage) and the ‘outcome’ is defined as the overall benefit relative to management objectives (e.g. increased crop yield). In practice, stochastic influences on the system and unexpected behaviour of the target species make the effects of feeding more uncertain. Adapted from Walsh et al. (2012)

In order to apply an adaptive management framework to conservation challenges with multiple stakeholders and uncertainty, such as diversionary feeding, decision making needs to be grounded in decision analysis and theory (Keith et al. 2011). Structured decision-making (SDM) is an application of multiple objective decision analysis that is increasingly proposed for environmental management (e.g. Ewen et al. 2014; Tulloch et al. 2015), as it deals with the uncertainty inherent within ecological mechanisms, expected outcomes and tradeoffs between stakeholders or outcomes (Gregory et al. 2012; McCarthy 2014). The steps of SDM include definition of the problem; development of clear, measurable objectives; consideration of alternative management options; comparison of the likely consequences or outcomes of alternative management options; decision making regarding acceptable levels of uncertainty; and the implementation, monitoring and review of management efficacy. Focused and concise objectives, including qualitative ones such as social acceptance of the strategy and the welfare of the target species, are a key requirement of the SDM framework. For example, stating explicitly that harm to the target species should be minimised limits the comparison of outcomes to other non-lethal methods. In many cases, the consequences or benefits of management can be monetary (e.g. increased crop yield leading to improved economic outcomes), but often in wildlife management the benefits are non-monetary, measured simply as an increase in the size of a population after management.

When decision-makers are faced with potentially costly management options with uncertain expected returns, economic techniques such as return on investment (ROI) can be useful (Murdoch et al. 2007). Given the perceived expense of diversionary feeding techniques (Conover 2002), incorporating ROI into decision analysis could assist with evaluating the efficiency and cost-effectiveness of potential management options, by allowing managers to calculate the amount of effort required to divert a particular species and achieve an expected return (Andreassen et al. 2005; Auerbach et al. 2014). The use of ROI has been particularly successful for planning the conservation of protected areas, with an increased number of species protected compared to other priority-setting approaches (Murdoch et al. 2007; Underwood et al. 2008).
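To illustrate how such an ROI comparison can be framed, the following minimal sketch (in Python) computes the return per unit cost for two candidate actions; all monetary figures and option names are hypothetical and serve only to show the arithmetic.

```python
# Minimal sketch of a return-on-investment (ROI) comparison between two
# management options; all monetary figures are hypothetical.

def roi(benefit: float, cost: float) -> float:
    """Net benefit gained per unit of cost invested."""
    return (benefit - cost) / cost

# Hypothetical annual figures (US$) for a single site.
options = {
    "diversionary feeding": {"benefit": 12_000, "cost": 8_000},
    "vegetation clearance": {"benefit": 9_000, "cost": 4_000},
}

for name, v in options.items():
    print(f"{name}: ROI = {roi(v['benefit'], v['cost']):.2f}")
# Here feeding returns 0.50 per dollar and clearance 1.25, illustrating
# how a cheaper option can yield a higher return even with a smaller
# absolute benefit.
```

In practice the absolute benefit, the uncertainty around each estimate and any non-monetary objectives would be weighed alongside the ratio itself.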

The use of decision analysis and a priori assessment of alternative management options has proven successful for many areas of wildlife management. The manipulation of food resources as a management tool, however, is often employed without thorough evaluation of its suitability for the specific problem or justification of its use over alternative management options, leading to polarised opinions and lack of confidence in its reliability (Ewen et al. 2014). Whilst some diversionary feeding experiments have monitored progress and attempted to address the factors hindering success (Redpath 2001; Sullivan et al. 2001; Sullivan and Sullivan 2004, 2008; New et al. 2012), most provide an a posteriori review of effectiveness and are unable to isolate causal factors. Recently, a six-step SDM approach was suggested to guide decisions regarding supplementary feeding of the endangered Hihi (Notiomystis cincta) in New Zealand, and the first five steps were applied with some success (see Ewen et al. 2014). We propose here that this SDM framework could have similar success for diversionary feeding strategies, as it would enable more transparent and defensible decisions to be made regarding the suitability of diversionary feeding. We first review the literature on diversionary feeding to identify variables likely to influence its efficacy. We use these variables to develop a series of questions that can be used in a decision-making framework that helps managers develop an effective strategy. We structure the resulting framework into three components: the ‘implement’ stage looks at the effect of feeding on the wildlife population; the ‘monitor’ stage focuses on how the response to, and success of, feeding can be observed; and the ‘review’ stage aims to develop and evaluate indicators of success. These three stages reflect the sixth and final ‘Implement, monitor, review’ step of the SDM process employed by Ewen et al. (2014). We focus on this final step because, despite the importance of monitoring for informing future management (Nichols and Williams 2006), it was not explored in detail by Ewen et al. (2014), and because the preceding SDM steps of setting objectives and canvassing alternative management options would follow a similar process for diversionary feeding as for supplementary feeding (Ewen et al. 2014). As a crucial part of this step in the SDM process, we discuss the importance of ROI analyses to evaluate the likely cost-effectiveness of diversionary feeding management strategies, based on reported costs and resulting levels of success in previous studies.

Methods

References to diversionary feeding were sourced up to 2014 using the databases ISI Web of Knowledge, Science Direct and Google Scholar with the words ‘diversionary’, ‘supplementary’ and ‘supplemental’ combined with ‘feeding’ or ‘food’ and one or more of the following: ‘conservation, conflict, mitigation, non-lethal, management’. For papers found using the term ‘supplementary’, the aims were checked to see whether they fell under our definition of supplementary or diversionary feeding. Whilst we refer to studies on supplementary feeding where details of the experimental design are of relevance to diversionary feeding, the outcomes of these studies were not included in this review. The references and citations of each paper found in the original searches were checked and sourced where applicable.

Studies were grouped into four categories based on an initial review of their objectives: ‘increase population density of game species’, e.g. increase red grouse (Lagopus lagopus) population density by reducing depredation by hen harriers (Redpath 2001); ‘increase population density of at-risk species’, e.g. increase little tern (Sternula albifrons) populations by reducing depredation by the common kestrel (Falco tinnunculus; Smart and Ratcliffe 2000); ‘increase crop yield’, e.g. increase grape yield by reducing damage by European rabbits (Oryctolagus cuniculus; Barrio et al. 2010); and ‘reduce threats to human safety’, e.g. by reducing road collisions with moose (Alces alces; Andreassen et al. 2005). Whilst qualitative objectives, such as social acceptance of the strategy, were included in the literature search, none were explicitly stated or tested. Due to the wide range of objectives for which diversionary feeding is employed and heterogeneity in study designs (for both implementation and evaluation of effectiveness), it was not possible to conduct a meta-analysis or full systematic review. We therefore followed a literature synthesis approach (Prevedello and Vieira 2010). To quantify results, studies were classified as either successful or unsuccessful based on the initial study objectives compared to the results of the study, or on declarations by the author in the discussion section of each article, which were based on the significance and/or the effect size considered to represent a success. To provide the information necessary for a ROI analysis, we standardised results within each category as a percentage difference in the measured response variable before and after treatment—this value was the ‘return’, or benefit of management. For outputs, a successful result resulted in a negative percentage difference (i.e. a reduction in damage); for ease of interpretation, we have presented these successful results as positive percentages (i.e. a result of −5 % is presented as a 5 % reduction in damage). Costs were converted to US$ at the exchange rate at the time of publication and adjusted for inflation since publication. We calculated unit cost (US$) per hectare (ha), or per feeder where site size was not specified. To calculate the grand mean effect size (outputs and outcomes), we first weighted each effect size by multiplying it by the percentage of degrees of freedom in the model out of the total degrees of freedom for all models, then summed the weighted values. A weighting of one was given to the results of simulations (Ziegltrum 2006; New et al. 2012), and the sample size was used as the weight where effect sizes were based on descriptive analyses (Sahlsten et al. 2010; Table 1). The mean cost was found by dividing the summed costs of each study by the number of studies, for all categories combined.
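The standardisation and weighting described above can be summarised in a short sketch (Python). The effect sizes and degrees of freedom below are hypothetical rather than values from Table 1, and the additional rules for simulations and descriptive analyses are omitted for brevity.

```python
# Sketch of the standardisation and degrees-of-freedom weighting
# described in the text, using hypothetical study values.

studies = [
    # (percent_control, percent_treatment, degrees_of_freedom)
    (40.0, 22.0, 30),   # e.g. % of trees damaged at control vs fed sites
    (55.0, 50.0, 12),
    (10.0, 14.0, 8),    # an unsuccessful trial: damage increased
]

# Effect size: percentage difference between treatment and control.
effects = [(t - c) / c * 100 for c, t, _ in studies]

# Weight each effect by its model's share of the total degrees of freedom.
total_df = sum(df for _, _, df in studies)
weights = [df / total_df for _, _, df in studies]

grand_mean = sum(e * w for e, w in zip(effects, weights))
print(f"Weighted grand mean effect: {grand_mean:.1f} %")
# A negative value is a reduction in damage; the text reports such
# results as a positive '% reduction' for ease of interpretation.
```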

Table 1 Results from diversionary feeding studies for cost-effectiveness analysis. Consecutive rows for a species indicate multiple experiments. Output (e.g. reduced predation or nuisance reports) and outcome (e.g. increase in prey density or reduction in translocation) are given as % change or US$ saved. Results were standardised as the percentage difference between control and treatment using ((%treatment − %control)/%control)

The experiments from each study were only included in a quantitative review of efficacy if the methods included at least one of the following criteria: target individuals or treatment sites replicated; control sites included; paired or randomised site selection; cross-over of treatment and control sites; monitoring conducted before or after treatment. Details of the methodology used in each study are provided in Online Resource 1.

A total of 52 articles were initially returned for ‘diversionary feeding’ and 1628 for ‘supplementary feeding’. Papers were checked for relevance by reading the title initially, then the abstract where relevance was uncertain. Twenty papers contained experimental data and two modelled the efficacy of diversionary feeding (Ziegltrum 2006; New et al. 2012). Twenty-one of these were used for the quantitative review of efficacy, whilst one of the experimental papers (Amar et al. 2004) was excluded as direct effects of feeding were not quantified (for details of studies belonging to each category and identification of target species, see Table 1). A review of all relevant literature, including the stated objectives, the explanatory variables affecting success measured at the input, output or outcome stages, and statements pertaining to success in the discussion of each study, revealed eight common factors to be the most important influences on the success of trials during implementation. These included sufficient knowledge of the target species’ behaviour and ecology, operational details of food placement, and negative effects of feeding on the target species. Secondary impacts of diversionary feeding on the surrounding habitat and non-target species were also included here as, depending on the fundamental objectives of the study, these damaging effects may render the strategy unsuitable. We reviewed the monitoring techniques employed by each study at the input, output and outcome stages and present the most effective options within an SDM framework. Finally, we scrutinised the review process to identify whether the strategy was considered cost-effective, whether the objectives of the strategy had been met and, ultimately, whether conservation conflict had been reduced. We suggest refinements to the review process, including clear objective-setting and transparent presentation of costs and results to enable informed management decisions. We frame these as questions below, integrated into the three-part decision-making framework (implement, monitor and review; Fig. 2) constituting the final step of an SDM process.

Fig. 2

Decision-making framework for planning and implementation of a diversionary feeding strategy. Questions are grouped into implement, monitor and review stages. These stages should be part of an adaptive system where the efficacy of the strategy is re-evaluated during each iteration of the cycle

Implementation of diversionary feeding

Implementation requires knowledge of the ecology and behaviour of the species, and how these factors might change in response to alternative management actions. Knowledge of the ecological requirements of the target species will determine the type of food used in diversionary feeding, how and when it is distributed, and inform likely responses, which are needed when setting targets for evaluating achievement of objectives (e.g. expectations of a 50 % reduction in threats to human safety might be unrealistic, but an increase in crop yield of 20 % might be expected if there is evidence of this effect being possible in previous experiments). There are four key questions, outlined below, that can be used to inform choices about where and how to implement diversionary feeding (Fig. 2).

Is there sufficient information on the target species?

Is the population food-limited?

It is important that diversionary food is only provided for short time periods, as increased food resources could enhance survival, leading to increased population density (Conover 2002). Where damage occurs to agricultural crops or timber, diversionary food needs to be more nutritionally appealing (e.g. higher calorific content) than the crop (Sullivan 1979; Sullivan and Sullivan 1982). In British Columbia, a short ‘pulse’ of sunflower seeds, which have a similar nutritional content to several conifer seeds, reduced tree damage by American red squirrels (Tamiasciurus hudsonicus) by 41 % (Sullivan and Klenner 1993). In this case, damage occurred within a relatively short period (May–June), when natural food was limited. If, however, protection is needed for a more sustained period, diversionary food could be exhausted quickly with no significant long-term reduction in damage (Sullivan and Sullivan 2004), or in some cases, result in an increase in damage (Sullivan et al. 2001; Sullivan and Sullivan 2004), possibly due to a temporary increase in population size. Managers may also aim to use diversionary feeding in order to shift the predator away from sensitive areas entirely, for example to reduce predation of rare species. This shift may not be possible where alternative natural food is abundant, as was speculated to be the cause of failed attempts to prevent depredation of ground-nesting birds by racoons (Procyon lotor) in Georgia, USA (Storey 1997).

Is damage caused by a sub-set of the population?

In some cases, a particular demographic group or individual may need to be targeted. In the UK, for instance, female hen harriers mated to bigamous males depredate more grouse chicks than those mated to monogamous males. It may therefore be more efficient to target females with higher kill rates directly, rather than attempting to supply diversionary food to all individuals (Redpath 2001). Targeting selected individuals may also reduce the risk of increasing the population size (Sullivan et al. 2001; Massei et al. 2011). Although entire troops of baboons engage in activities that damage residential areas in South Africa, providing food for the alpha male alone can lead to substantial changes in troop behaviour and a reduction in time spent by the troop in urban areas (Kaplan et al. 2011). In the US Pacific Northwest, the majority of concentrated commercial timber debarking by black bears is carried out by small bears, most likely females raising cubs (Witmer et al. 2000). Placing feeding stations within known female territories is therefore likely to be the most effective strategy.

How will the target species respond to diversionary food?

Where diversionary feeding is used to reduce predation, the target species usually has a generalist diet (Storey 1997; Greenwood et al. 1998; Smart and Ratcliffe 2000; Redpath 2001), focusing on the most abundant prey available (Smart and Ratcliffe 2000). As long as natural food is limited, this trait is favourable for diversionary feeding strategies because target animals will be more likely to switch quickly to the proffered food (Conover 2002). The quantity of food provided can affect how long it takes for the target species to use it. For example, the time taken for cinereous vultures (Aegypius monachus) to visit ‘vulture restaurants’ is reduced, and the number of visiting vultures increased, with a higher quantity of food available (Moreno-Opo et al. 2010).

What spatial distribution of diversionary food should be used?

How will food be distributed?

The nature of the conflict, including social aspects of land ownership and multiple management objectives, as well as the ecology of the target species, dictates the most appropriate distribution method for diversionary food. The use of feeding stations, where a concentrated amount of food is placed, is the standard method for feeding large mammals such as black bear or moose (Ziegltrum 2004; Sahlsten et al. 2010; Rogers 2011). For birds, previous studies have placed food near nests of birds of prey (Smart and Ratcliffe 2000; Redpath 2001; New et al. 2012), used feeding stations (Hammond 1961; Knittle and Porter 1988) and sacrificial crop fields (Hammond 1961). Feeding stations offer flexibility as they can be moved. Sacrificial crop fields, on the other hand, allow more animals to access the food source simultaneously, increasing effectiveness and reducing competition (Conover 2002). On a finer scale, presenting food in long strips rather than concentrated piles may prevent dominant animals controlling access to the food (Vassant et al. 1992; Calenge et al. 2004).

Where should feeding stations be located?

The success of diversionary feeding can vary between locations and during different times of year, making it difficult for managers to make predictions based on results elsewhere. This variation could either be due to intrinsic differences in behaviour between populations, or differences in behaviour within a population during phases of the life cycle or in response to resource availability. For instance, although feeders on migratory paths to over-wintering sites in south-eastern Norway prevented moose-traffic collisions (Andreassen et al. 2005), a study in northern Norway and Sweden found that moose ignored feeding stations along the migration paths, but heavily used those placed in wintering areas (Sahlsten et al. 2010). The density of feeders can also affect their attractiveness. A high density of feeders could cause more mobile, sociable species to aggregate (Miller et al. 2003), whilst for solitary species, low feeder density may prove more effective as individuals will use the resources available in their home range (Conover 2002). The proximity of feeding stations to the focal area must also be considered carefully; whilst stations need to be close enough to capture the attention of the animals causing the problem, if placed too close to the area experiencing the problem, the aggregation of animals may result in increased damage (Geisser and Reyer 2004).

Does feeding have detrimental effects on the target species or surrounding area?

Does feeding create dependence for the target species?

Diversionary feeding strategies may create dependence on food supplied by humans that cannot be sustained by natural resources once this food source is removed or exhausted. The removal of diversionary food can lead to increased damage, as seen for black bears, where damage to conifer stands increased almost seven-fold after feeding stations were removed (Ziegltrum 2004). Feeding should coincide with periods of food limitation, but cease when natural resources increase (Witmer et al. 2000). The choice of food is also important in avoiding dependence. Black bears strip bark from economically valuable trees immediately after they emerge from hibernation as other natural food sources are scarce (Ziegltrum 2004). To prevent dependence on diversionary food once natural food abundance increased, Flowers (1987) developed a diversionary food that was more palatable than the phloem at risk, but less nutritious than wild berries, causing a switch to the wild alternative once it became available.

Is feeding detrimental to the health of the target species?

Feeding stations facilitate contact between individuals, increasing the transmission risk of infectious disease (Miller et al. 2003; Castillo et al. 2011). A greater quantity of food may exacerbate the problem as animals may spend longer feeding (Miller et al. 2003). Poor food quality can also promote disease occurrence. Spanish imperial eagle (Aquila adalberti) populations receive supplementary food to increase breeding productivity, but birds fed domestic rabbits containing high levels of antibiotics and anti-parasitic drugs show higher pathogen abundance and a depressed immune system compared to those fed with wild rabbits (Blanco et al. 2011).

Supplementary feeding is often used as a ‘quick fix’, with little attention to long-term consequences for the target species (Blanco 2006). Supplementarily fed bird populations, for example, show reduced clutch sizes in blue tits (Cyanistes caeruleus) and great tits (Parus major; Harrison et al. 2010). This reduction may be due to increased survival of genetically ‘weaker’ individuals, retaining undesirable traits in the population as these individuals continue to breed, which, in time, could reduce the population density. The possibility of retaining undesirable traits is of particular concern when the target species are protected; although practitioners may not want to increase population density, they must be careful not to decrease it either.

Have effects on habitat conditions and non-target species been evaluated?

Aggregations of social animals around a feeder can have a direct impact on the local ecosystem. Many ungulates continue to browse natural vegetation when receiving additional food, and damage to flora by white-tailed deer (Odocoileus virginianus; Cooper et al. 2005) and moose (Gundersen et al. 2004; van Beest et al. 2010) increases with proximity to feeding stations. Effects on local fauna may be more complex. For example, different passerine species vary in their response to increased moose activity around feeding stations, apparently as a result of altered arthropod food availability, with great tits having lower fledging success and pied flycatchers (Ficedula hypoleuca) showing higher success (Mathisen et al. 2012).

Indirect effects on ecosystems can occur when non-target predatory species are attracted to feeding sites. Studies using artificial bird nests suggest increased predation of ground-nesting birds adjacent to ‘vulture restaurants’ (Cortés-Avizanda et al. 2009) and deer feeders (Cooper and Ginnett 2000), with a 30 % increase in the predation of artificial ground nests in close proximity to deer feeders in Poland (Selva et al. 2014). Similarly, trials to quantify predation of artificial turtle nests showed that scavengers attracted to deer feeding stations resulted in a five-fold increase in predation rates on artificial nests (Hamilton et al. 2002). Predation of artificial ground-nests, however, often differs from that of natural ones (Burke et al. 2003) so confirmation by monitoring natural ground-nests is required.

Monitoring of diversionary feeding

Whether or not diversionary feeding is considered successful will differ depending on stakeholder objectives. As such, indicators of success and suitable monitoring techniques need to be developed to suit each case. This process will also allow knowledge transfer to potential future applications. Three key questions that managers should ask to determine how to monitor responses, as well as whether responses can be adequately monitored, are outlined below.

Can the uptake of diversionary food (the input) be monitored directly?

Different sub-groups or individuals may vary in their response to diversionary food. For instance, only 50 % of mountain hares (Lepus timidus) visited feeding stations in a population in Scotland, UK (Newey et al. 2010), whilst female hen harriers take diversionary food at a higher rate than males (Redpath 2001). Assuming individual animals are distinguishable, direct monitoring of food sources may allow observations of individual variation in response to food and alert managers if non-target animals or species deplete the food supply. Camera traps have been used successfully to monitor feeding stations (Ziegltrum 2008), although it is not always possible to identify individual animals. In some cases, individuals may be marked visually (Smart and Ratcliffe 2000), or tagged with Passive Integrated Transponders (PIT tags) and individual visits recorded by a receiver on the feeding station (Newey et al. 2009), although these methods rely on a substantial proportion of the population being tagged to provide this information.

Choosing appropriate monitoring methods is essential to highlight areas of improvement for an efficient iterative management approach. For instance, pellet counts to assess moose distribution in relation to feeding sites at the population level initially suggested that moose used areas close to feeding sites more than the surrounding area. Tracking individuals fitted with GPS transmitters revealed, however, that only one of 15 animals used the feeding site regularly, and only three used the site at all (Sahlsten et al. 2010), indicating that very few of the moose responded to diversionary feeding.

Can the direct impact of feeding (the output) be monitored?

The selection of appropriate methods to monitor outputs depends on the problem being managed. When diversionary feeding is used to mitigate habitat damage, outputs can be monitored using pre- and post-treatment habitat surveys (Sullivan and Klenner 1993; Sullivan et al. 2001; Sullivan and Sullivan 2004; Ziegltrum 2004; Barrio et al. 2010) or, where problems are highlighted by landowners, pre- and post-treatment questionnaires (Calenge et al. 2004). Regardless of the method used, appropriate control sites should also be monitored to avoid incorrectly interpreting an unrelated temporal trend as a treatment effect. For predation, monitoring is dependent upon prey type. Direct monitoring may be possible, such as radio tagging red grouse to quantify adult survival rates (Redpath 2001), or using camera traps to quantify nest predation (Summers et al. 2009). However, ground-nesting birds are often found by flushing incubating females (Greenwood et al. 1998; Redpath 2001), which may be too disruptive for vulnerable species. Camera traps can also increase predation rates by drawing attention to nests (Summers et al. 2009). Whilst the effort required to locate rare or elusive species may reduce the cost-effectiveness of diversionary feeding, any on-going monitoring (e.g. to assess population trends) could incorporate such management objectives.

Can the overall benefit in terms of management objectives (the outcomes) be monitored?

To measure whether a strategy has achieved objectives of increased densities of game or at-risk species, increased crop yield, or reductions in threats to human safety, managers need to relate the results of outcome monitoring to the management targets set during goal-setting (e.g. an objective of reducing threats to human safety by 30 %, or increasing game populations by 20 %). Achieving improvements in outputs (e.g. reduced predation) may not necessarily translate to positive outcomes such as increased prey abundance (Fig. 1). For example, despite a significant reduction in damage by the European rabbit to vineyards after the application of diversionary food (Barrio et al. 2010), increased grape yield was not observed. Likewise, reduced predation of ground-nesting birds by hen harriers (Redpath 2001) and skunks (Mephitis mephitis; Greenwood et al. 1998) did not result in observable increases in prey survival, perhaps due to compensatory predation or increased densities of other predators (Greenwood et al. 1998; Jackson 2001) attracted by the diversionary food (Redpath 2001). To enable targeted revisions within the management framework, success should be measured at each of these stages wherever possible.

Where field experiments are not feasible because of funding constraints or the scale of the problem but sufficient prior knowledge of the system exists, modelling the effects of feeding under different management and environmental conditions would allow the likely outcomes (New et al. 2012) or net costs (Ziegltrum 2006) to be assessed for each scenario. The use of experts to evaluate the options for monitoring the performance of management decisions, or to predict expected performance, may also be useful here. Bayesian inference, as used by New et al. (2012), may be invaluable in these cases (Ellison 2004) as each subsequent experiment builds upon previous knowledge, essentially creating a meta-analysis of findings, and potentially reduces uncertainty in the resulting trends or highlights those that are unfounded.
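As a minimal illustration of this sequential updating, the sketch below (Python, using a Beta-Binomial model) shows how an estimated probability that feeding meets its objective could be refined as each new trial reports a success or failure. The prior and the trial outcomes are hypothetical; this is not the model used by New et al. (2012).

```python
# Sketch of Bayesian updating of the probability that a diversionary
# feeding trial succeeds, using a conjugate Beta-Binomial model.
from scipy import stats

alpha, beta = 1.0, 1.0            # uninformative Beta(1, 1) prior
trial_outcomes = [1, 0, 1, 1, 0]  # hypothetical: 1 = trial met objective

# Each new trial updates the posterior, so evidence accumulates across
# studies, much like a running meta-analysis.
for success in trial_outcomes:
    alpha += success
    beta += 1 - success

posterior = stats.beta(alpha, beta)
lo, hi = posterior.interval(0.90)
print(f"Posterior mean P(success) = {posterior.mean():.2f}")
print(f"90 % credible interval: ({lo:.2f}, {hi:.2f})")
```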

Review of diversionary feeding

Does the strategy meet stakeholder objectives?

Despite some differences in the indicators used to measure success, the 21 diversionary feeding papers all presented quantifiable measures of the effect of diversionary feeding. Our review of these shows mixed success (Table 1). At the output stage, 10 of 15 trials for crop protection were considered a success, whilst only one of three trials involving risks to human safety, and two of five trials to reduce predation of vulnerable or game species, were reported as successful. Fewer studies report the ultimate outcomes of diversionary feeding trials and, although success is comparable to the output stage, with four of eight successful trials to reduce crop damage and one of two for risks to human safety, none of the three outcomes for predation reduction were considered a success. Due to small sample sizes, however, these results should be interpreted with caution.

The majority (76 %) of reported outputs and outcomes for all categories included a statistical measure of support based on a predetermined significance threshold of P < 0.05 (i.e. a 95 % confidence level). While most studies also present an effect size, measured as the observed magnitude of the difference between treatments, some failed to take the effect size into account in their consideration of the success of the result, stating a lack of success due to non-significance at the 95 % confidence level even where the effect sizes appear to be relatively high (Greenwood et al. 1998; Sullivan and Sullivan 2008; Table 1). Focusing only on statistical significance related to P values, without consideration of the associated effect size, may lead to the dismissal of potentially promising results (Fidler et al. 2006; Nakagawa and Cuthill 2007). Providing a level of significance, regardless of the arbitrary and precautionary ‘success’ threshold of 95 %, coupled with an effect size and associated confidence intervals, will allow practitioners to decide upon the level of certainty they are willing to accept (e.g. 90 %, or potentially lower levels of confidence in the results) depending on the nature of the problem, the potential payoffs or effect size, and the available funds.
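The sketch below illustrates this style of reporting (Python, with hypothetical nest-predation counts and a normal approximation for the difference between two proportions): the same effect size is presented with confidence intervals at two confidence levels, leaving the acceptability judgement to the practitioner.

```python
# Sketch: report an effect size with confidence intervals at more than
# one confidence level, rather than a bare P-value. Counts hypothetical.
from math import sqrt
from scipy.stats import norm

# Depredated nests out of nests monitored (hypothetical).
x_ctrl, n_ctrl = 18, 40   # control sites: 45 % predation
x_trt, n_trt = 10, 38     # fed sites: ~26 % predation

p1, p2 = x_ctrl / n_ctrl, x_trt / n_trt
diff = p1 - p2                                   # absolute reduction
se = sqrt(p1 * (1 - p1) / n_ctrl + p2 * (1 - p2) / n_trt)

for level in (0.95, 0.90):
    z = norm.ppf(1 - (1 - level) / 2)
    print(f"{level:.0%} CI for reduction: "
          f"{diff - z * se:+.2f} to {diff + z * se:+.2f}")
# With these counts the 95 % interval narrowly spans zero while the
# 90 % interval excludes it: a sizeable effect that a strict P < 0.05
# rule would dismiss, but that a manager might judge worth acting on.
```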

Have conservation conflicts been reduced?

The mitigation of crop damage was the most successful of all categories reviewed, with efforts in Washington, USA, to protect timber plantations from black bears gaining acceptance from the public, a positive response from animal rights groups, satisfaction from timber managers (Ziegltrum 2006) and continued use of the strategy (Witmer et al. 2000; Ziegltrum 2008). The strategy has also been implemented on a long-term basis by collaboration between wine growers and hunters to alleviate wild boar (Sus scrofa) damage to vineyards in southern France (Calenge et al. 2004), and has resolved a minimum of 147 complaints from landowners regarding waterfowl damage to crops over a 60-day damage season (Fairaizl and Pfeifer 1987).

The development of clear, focused objectives that incorporate stakeholder values and public opinion, as well as ecological requirements, plays a large part in determining whether positive outcomes result in the resolution or reduction of a conservation conflict (Gregory et al. 2012). Social acceptance of management strategies was noted in several papers as being an important objective (Witmer et al. 2000; Andreassen et al. 2005; Massei et al. 2011), with one paper stating that the public have a right to involvement in conservation decision making due to the amount of public funding used for this purpose (Thompson et al. 2009). However, only one study reported that public acceptance had been actively sought in the long term (Ziegltrum 2006) and none of the papers explicitly examined the level of public acceptance of diversionary feeding. When scientists define the objectives of a diversionary feeding trial without the involvement of the resource managers, the information provided may be of limited use if it does not address the concerns of the people making the decisions. A lack of targeted objectives that clearly address the concerns of resource managers could add to the uncertainty associated with the strategy. When lethal control is opposed by certain parties due to either the charismatic nature of the target animal or its conservation status, diversionary feeding is a potential alternative. Although scientific support for the strategy is currently sparse, high perceived uncertainty due to poor objective-setting may allow political and social pressures to override scientific findings even when the findings are positive, causing conflict to remain unresolved (Thirgood and Redpath 2008).

Is the strategy cost-effective?

When evaluating the efficiency of a wildlife management option, the potential benefits must be assessed carefully against the effort and investment expended to achieve them. Despite cost being of key importance to decision making, cost-effectiveness analyses are rarely reported in the literature, and only 33 % of the reviewed studies reported the overall costs of management and conservation actions (Table 1). Where analyses are provided, the cost-effectiveness of diversionary feeding receives mixed support. For the purpose of crop protection, a 2:1 benefit-cost ratio has been found for protection against waterfowl damage (Fairaizl and Pfeifer 1987), whereas for protection against wild boar damage, the cost of feeding was similar to that of replacing lost crops (Massei et al. 2011). Where possible, it is also useful to be able to compare the cost-effectiveness of diversionary feeding with other potential mitigation techniques. In a complete SDM process, this comparison would occur in the steps preceding implementation and evaluation, in which all potential management actions are first canvassed and then their consequences evaluated (Gregory et al. 2012; Ewen et al. 2014). Research on reducing moose-train collisions, for example, indicated that vegetation clearance was more cost-effective than diversionary feeding (Andreassen et al. 2005).

Return on investment analyses enable comparisons between different initial damage levels and timescales of diversionary feeding; for example, feeding to mitigate timber damage by black bears reduced income losses by between 11 % and 31 % (Ziegltrum 2006). As only five studies (n = 7 trials) reported both outputs and costs, and only six studies (n = 6 trials) reported both outcomes and costs, we were unable to estimate an ROI curve for either the outputs or outcomes of diversionary feeding. This lack of data clearly demonstrates the need for studies to report the overall costs of management strategies, as well as standardised results. In the absence of return on investment analyses, the grand means of outputs, outcomes and costs provide a measure of the overall efficacy across studies. The strategy was found to be more successful at the output stage, with a 37 % reduction in problem activity across studies, than at the outcome stage, with a 15 % increase in success relative to respective management objectives, for a mean cost of US$291.27 per hectare. An important observation is the tendency for unsuccessful studies to fail to report costs or some other comparable measure of effort expended (Storey 1997; Smart and Ratcliffe 2000; Geisser and Reyer 2004; Sahlsten et al. 2010; Table 1), and we would encourage researchers to report costs whatever the outcome of their experiment. One cost that was not reported in any study, and hence could not be evaluated here, was the cost of the monitoring itself. In many cases the cost of monitoring might be incorporated within the overall management costs, but in others it could extend well beyond the lifetime of the on-ground management, particularly where there is a time lag in species’ responses. Monitoring and its effectiveness can and should be scrutinised using decision analysis in the same way that we have evaluated management here (Nichols and Williams 2006).

The likely effects of feeding regimes are not always intuitive, making the overall cost difficult to estimate during planning. Browsing of commercial timber by brown bears (Ursus arctos), for instance, may result in compensatory growth of browsed trees, effectively reducing timber loss and reducing the cost-effectiveness of mitigation (Helgenberg 1998). The amount of food required may be higher than expected if consumption by non-target animals occurs (Conover 2002), although exact figures would be difficult to estimate prior to implementation.

In traditional economic cost-benefit analysis, costs and benefits are reported in monetary amounts and are easily compared. In circumstances where the aim of diversionary feeding is to prevent or reduce damage, managers must ensure that the cost of the strategy is lower than the cost of repair, replacement or compensation for damage. Diversionary feeding for crop protection, for example, may only be justified for crops of high economic value (Conover 2002). In some cases, such as vulnerable species protection, the intrinsic value of the subject species may be the guiding factor and benefits cannot be quantified in monetary terms. Non-monetary value can complicate decisions made by multiple stakeholders due to the difficulty of agreeing on the target levels of ‘return’ (e.g. a particular growth rate or increase in productivity) that are required or realistic for a given input. The process of objective-setting within the SDM process may help to overcome this issue as, rather than a single ‘cost-effectiveness’ objective, multiple objectives can be set, including one to minimise the cost of the strategy and those stating expected non-monetary benefits. These objectives can then be weighted depending on the preference of the stakeholders, and the outcomes of alternative actions, including diversionary feeding, calculated and compared.
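A weighted comparison of this kind can be illustrated with a short sketch (Python). The objectives, stakeholder weights and performance scores below are hypothetical placeholders for values that would be elicited from stakeholders during objective-setting.

```python
# Sketch of a weighted multi-objective comparison of management
# alternatives, as in an SDM consequence table. All values hypothetical.

objectives = {              # objective: stakeholder weight (sums to 1)
    "minimise cost": 0.3,
    "reduce damage": 0.4,
    "animal welfare": 0.2,
    "public acceptance": 0.1,
}

alternatives = {            # normalised performance (0-1, higher better)
    "diversionary feeding": {"minimise cost": 0.4, "reduce damage": 0.6,
                             "animal welfare": 0.9, "public acceptance": 0.8},
    "culling":              {"minimise cost": 0.7, "reduce damage": 0.8,
                             "animal welfare": 0.1, "public acceptance": 0.3},
}

# Weighted score for each alternative; with these weights, feeding
# (0.62) edges out culling (0.58) despite higher cost, because the
# welfare and acceptance objectives carry 30 % of the total weight.
for action, scores in alternatives.items():
    total = sum(scores[obj] * w for obj, w in objectives.items())
    print(f"{action}: weighted score = {total:.2f}")
```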

Discussion

When the persistence of the animals implicated in a human–wildlife or wildlife–wildlife conflict is a fundamental objective of management, diversionary feeding could be considered as an alternative to lethal methods of control. The success of diversionary feeding varies greatly depending on the behaviour of the target species, the distribution method of diversionary food and the effects on the surrounding habitat. Successful uptake of diversionary food is most likely where populations are food-limited (Witmer et al. 2000; Calenge et al. 2004; Ziegltrum 2004; Barrio et al. 2010). Care, however, must be taken to avoid increasing population sizes of target species (Sullivan and Klenner 1993) or their dependence on the additional food source (Ziegltrum 2004). Targeting selected individuals can reduce the risk of increasing population size and, if a sub-set of the population is responsible for the problem, targeting these individuals could increase efficiency and efficacy (Redpath 2001). Careful choice of the type of diversionary food can reduce dependency as, if food is less appealing than natural food sources (but, in cases of crop damage, more appealing than the crop at risk), animals will switch to natural sources of food once they become available (Sullivan 1979; Sullivan and Sullivan 1982; Flowers 1987). The switch between diversionary and natural food sources, as desired, is more likely when the target species is a generalist (Storey 1997; Greenwood et al. 1998; Smart and Ratcliffe 2000; Redpath 2001; Conover 2002). The most effective distribution method is species-dependent, with feeding stations being the common method used for large mammals (Ziegltrum 2004; Sahlsten et al. 2010; Rogers 2011). Presentation of food in long strips, rather than concentrated piles, improves efficacy in some cases as it prevents dominance over the food supply; lower concentrations of food may also reduce disease transmission as they reduce contact between individuals (Miller et al. 2003; Castillo et al. 2011).

Overall, diversionary feeding trials to date appear to have been more effective for the mitigation of habitat damage than for reducing predation of target species or reducing threats to human safety (Table 1). The most successful use of diversionary feeding has been in combination with other management tools such as fencing (Kaplan et al. 2011) or scare devices (Conover 2002). In these cases, diversionary food provides an accessible alternative to food sources the animals have been excluded from. The least successful programmes result in increased damage or threats rather than a reduction, such as increased crop damage by montane voles (Microtus montanus; Sullivan and Sullivan 2004) or wild boar (Geisser and Reyer 2004). In the case of wild boar, the close proximity of feeding stations to crops may have increased, rather than decreased, visits to the crops. Sullivan and Sullivan (2004) suggested that diversionary food may have been depleted quickly, resulting in a short-term increase in montane vole population size and a subsequent increase in damage. Failure to reach desired management goals may, in part, be due to the underlying variability associated with natural systems. The application of any intervention to a stochastic natural system will alter its structure in potentially unpredictable ways, so predicted outcomes based on the original state of the ecosystem might no longer be valid (Walters et al. 1990).

By placing the results of previous studies of diversionary feeding into an adaptive decision-making framework that explicitly considers management and monitoring objectives, it is possible to link objectives, inputs, outputs and outcomes (Fig. 1). The framework enables predictions of the likely effectiveness of management actions and provides a solid evidence base for efficient decision-making. Positive initial outputs, such as a reduction in damage or predation, often fail to translate into the outcomes desired by stakeholders, such as an increase in crop yield or increased population density of the prey species. A number of factors may contribute to this failure, such as a compensatory response from other species (Greenwood et al. 1998; Jackson 2001), attraction of other animals to the area by the food source, response to feeding by only a sub-set of the population (Newey et al. 2010) or the presence of other strong influences on the problem such as disease or climate (Redpath 2001). Many diversionary feeding trials monitor only the initial response to feeding and, given the inconsistency in meeting the ultimate strategy goals, this limitation may lead to unreasonable expectations as to its likely usefulness. By evaluating the objectives of monitoring as well as management using our framework, and choosing a monitoring method and timeframe that will efficiently measure likely outcomes relative to objectives, future studies can better place their results in the context of the effort expended. Further, consistent reporting of the outcomes of diversionary feeding trials at each stage, with particular reference to the overall objectives, will enable the production of the generalisable results needed for a full systematic review of its efficacy, which is not possible at present. The decision-making framework for implementing and evaluating management outlined in this review represents the final of the six steps described in the SDM approach (Gregory et al. 2012) outlined by Ewen et al. (2014). Whilst the results of the ‘implement, monitor, review’ stage outlined here will certainly inform the management process during each iteration of the decision-making process, the prior development of clear objectives and thorough evaluation of potential alternatives, as specified in an SDM approach (Gregory et al. 2012), are imperative when choosing between wildlife management actions with more predictable outcomes but that could cause direct harm to wildlife (e.g. culling) and management actions with uncertain outcomes that do not cause direct harm to wildlife (e.g. diversionary feeding) but might also be less effective.

Although diversionary feeding is considered an expensive option for management (Witmer et al. 2000; Mason and Bodenchuk 2002), detailed cost-effectiveness analyses are rarely conducted. A lack of published data means that it was not possible to conduct a ROI analysis, although with increased reporting of results, including failures, future studies might be able to include predictor terms in ROI analyses, informed by the nine questions we have outlined in this review, that account for some of the variability in response to diversionary feeding (e.g. Walsh et al. 2012).

Decisions on the use of diversionary feeding as a tool for wildlife management and conservation are essentially a trade-off between the potential benefit of the action, as described by the magnitude of effect seen through observation or model prediction; the confidence in these findings; and the costs required to obtain these potential benefits. More of the studies in the present review may have been considered successful if their effect size relative to effort expended had been evaluated, rather than relying purely on the statistical significance of the result to categorise the management as successful. By providing transparent and unbiased information to stakeholders in each of these areas, without the constraints of an absolute threshold of perceived success (i.e. the 95 % confidence limit), researchers transfer the power of decision making to the stakeholder. This unbiased approach will allow stakeholders to set realistic targets for management and, when evaluating outcomes, to declare success or failure based on whether the results are within their own acceptable level of uncertainty.