Framework Directive 2009/128/EC requires that all Member States show how their National Action Plans ensure the implementation of the eight general principles of IPM, and Article 55 of Regulation 1107/2009/EC requires that professional pesticide users comply with these principles. Beyond the legal requirement, the authors believe that the set of general principles provides valuable guidance encouraging growers along a logical process of decision-making. Intelligent application of the principles can be taken as an opportunity to both reduce dependency on pesticides and innovate.
The eight principles and their numbering follow a logical sequence, illustrated in Figure 3. Principle 1 (Prevention and suppression) comes first because it encompasses the initial design and actions undertaken at the cropping system level to reduce the severity and frequency of pest outbreaks. Principles 2 (Monitoring) and 3 (Decision-making), which come into play once the cropping system is in place, are based on the idea that in-season control measures result from a sound decision-making process that takes into account actual or predicted pest incidence. If an intervention is decided upon, Principles 4 to 7 offer a sequence of control options to be explored, starting with those of least concern. Principle 8 (Evaluation) closes the loop by ensuring that users look back and assess their actions in view of improving the entire process.
Principle 1—prevention and suppression
“Prevention is better than cure” is the first general rule in any production system. Prevention can be considered as the creation of cropping systems inherently less likely to experience significant economic losses due to the presence of pests. Suppression, understood as the reduction of the incidence of pests or of the severity of their impact, complements prevention. This principle means that the aim is not to completely eliminate pests but to prevent any single one from becoming dominant or damaging in a cropping system.
Certain aspects of prevention, namely the use of healthy and weed-free planting material and the detection of pathogens in substrates, deserve more attention, particularly in light of new technologies. Many seed-borne pathogens become the source of disease in the subsequent year, and weed seeds contaminating the harvest can likewise become a major problem in the following season. Certification of disease-free seed, seed potatoes, bulbs, and cuttings, together with new sorting technologies, is very helpful in avoiding problems, but it is important to apply measures early, prior to certification of harvested seed (Van der Wolf et al. 2013). Soil substrates, manure, and other amendments can now be screened with modern molecular multiplex technologies to qualitatively and quantitatively assess the disease situation (Van Gent-Pelzer et al. 2010; Sikora et al. 2012). Such diagnostics allow better decision-making regarding the choice of subsequent crops or cultivars. Been et al. (2005) developed a web-based tool that potato farmers can use to fine-tune their rotation strategies based on the detection of certain nematode pathotypes. For the detection of pathogens in latently infected seed and plants, however, new technologies with higher sensitivity are needed.
Plant breeding for pest resistance is recognized as an important contributor to the development of prevention strategies. The use of pest-tolerant and resistant cultivars will help to decrease dependence on pesticides in arable crops. However, absolute resistance to a specific pest in crops is not a realistic goal. Even resistance achieved by pyramiding resistance genes in one cultivar can be overcome if no other measures to reduce selection pressure are applied. To avoid such an outcome, the use of new cultivars needs to be combined with continuous monitoring of emerging virulent biotypes and pathogens carrying resistance-breaking genes. Haverkort et al. (2008) showed the feasibility of this approach against Phytophthora in potato.
Combinations of tactics and multi-pest approach
The combination of control tactics into management strategies generates more effective and sustainable results than single-tactic approaches. To create conditions that reduce the frequency and intensity of pest outbreaks, research and extension need to develop strategies integrating a range of methods. Plant genetic resistance can be exploited while addressing multiple pests, diversifying cropping systems in time and space, and integrating crop management practices and landscape effects within pest management. Even though testing such integrated strategies requires careful planning and relatively high investment, it is feasible. The FP7 PURE project (Pesticide Use-and-risk Reduction in European farming systems with Integrated Pest Management, www.pure-ipm.eu) evaluates the feasibility of such combinations in six different cropping system types. This European project has successfully tested various combinations integrating the following tactics in maize-based cropping systems: foregoing pre-emergence herbicides, establishing a false seedbed, harrowing at the 2–3 leaf stage, use of a low-dose post-emergence herbicide, hoeing combined with post-emergence band-spraying, and Trichogramma releases against the European corn borer, Ostrinia nubilalis (PURE 2013).
When feasible, control strategies take into account multiple pests, as control of one pest may affect others. In the UK, the Hortlink SCEPTRE project, a 4-year, multi-crop, multi-season, multi-pest effort, tested on-farm combinations of options against aphids, raspberry beetle, and Botrytis in protected raspberry systems, drawing from pest-resistant varieties, biocontrol agents, precision monitoring, and biopesticides (Horticultural Development Company 2012). The optimal combination in each region reduced pesticide inputs by at least 30 % and provided pest suppression as good as current pesticide-based practice. The further development of multi-pest recommendations requires that scientific research, on-farm testing, and advisor/farmer education programs move away from the compartmentalized study of weeds, insects, and pathogens. On-farm and whole-farm initiatives cutting across scientific disciplines are needed to take into account arthropod-weed-pathogen complexes and devise solutions that are workable from the farmer’s point of view.
Rotation
Spatial and temporal diversification is key to minimizing pest pressure and achieving effective prevention. In organic arable crop farming, crop rotation is the most effective agronomic alternative to synthetic pesticides (Fig. 4). In annual crops, the manipulation of crop sequence to break the life cycle of pests through rotation with crop species belonging to different families is a major lever to strengthen robustness of cropping and farming systems. In this context, robustness refers to stabilizing agronomic performance in spite of disturbances caused by the presence of pests. A diversified crop sequence prevents selection and buildup of the best-adapted pest populations. As a general rule in arable crop rotations, alternating winter and spring-summer crops is recommended as this will break the life cycle of many pests, particularly weeds, more efficiently than a rotation with just winter or summer crops. Similar principles can also be developed for vegetable cropping systems where rotation between leaf and root crops is promoted, while the frequent occurrence of crops within the same botanical family is discouraged. For many specialist fungal pathogens, the selection of different crop families within a rotation effectively reduces pressure. There are other plant pathogens characterized by broad host range that need to be treated differently, however. Such is the case of the bacteria Pseudomonas syringae (Bartoli et al. 2014; Lamichhane et al. 2014a) and Xanthomonas arboricola (Lamichhane 2014) which infect a number of botanical families.
Maize-based cropping systems offer an illustration of the importance of crop rotation. Continuous maize cultivation is widespread in Europe for grain production as well as for silage and energy production. Rotation has been demonstrated as key to reducing reliance on pesticides while allowing the successful management of the invasive Western corn rootworm Diabrotica virgifera virgifera as well as several noxious weeds (Vasileiadis et al. 2011). In Europe, the Western corn rootworm can be considered a pest of a specific maize cultivation system because the complete egg-to-adult development of the pest extends over two maize cultivation cycles. When maize is rotated with a non-maize crop, the cycle of this pest is broken and its population decreases to minimal levels. Because of the multiple on-farm uses of maize and the profits it generates, introduction of a new crop nevertheless requires careful market considerations and additional inputs in terms of knowledge and machinery (Vasileiadis et al. 2013). The rotation rule is therefore applied in a flexible way, adapting it to local conditions and allowing more frequent maize crops in a rotation and wider occurrence per farm area than theoretically optimal (Levay et al. 2014; Szalai et al. 2014). Rotating maize with a diversity of non-maize crop species helps farmers avoid the development of a variant of the Western corn rootworm referred to as “rotation resistant” because of its propensity to oviposit in non-maize crops. The maize-soybean rotation, routinely applied over large areas of the US corn belt for many years, has selected for a strain that has lost its preference for laying eggs in maize, resulting in damage in maize following soybean crops (Levine and Oloumi-Sadeghi 1996; Gray et al. 1998; Levine et al. 2002).
Crop management and ecology
Many crop management practices apparently unrelated to pest management actually have a significant impact on the vulnerability of cropping systems to pests. Fertilization is known to affect sap-sucking insects and mites (Altieri and Nicholls 2003), plant pathogenic fungi (Snoeijer et al. 2000), and bacteria (Lamichhane et al. 2013). Mechanical weeding can damage crop tissue and favor diseases (Hatcher and Melander 2003). Crop residue management can affect the overwintering capacity of pests (Sojka et al. 1991). Tillage systems often determine abundance and composition of weed communities and soil-borne diseases (Norris 2005).
Conservation tillage is referred to under Principle 1 as a desirable cultivation technique. Its role within IPM, however, is not always clear-cut. While it is true that reduced tillage does favor soil organic matter and biodiversity, and that it reduces CO2 emissions and risks of soil erosion, the supposed benefits for crop protection cannot be generalized. For example, Fusarium blight, one of the main causes of mycotoxins, is greatly favored by no-till systems where maize and wheat residues remain on the soil surface year-round (Kandhai et al. 2011). Also, no-till systems are usually associated with greater herbicide dependency—due to “chemical mowing” prior to sowing—and conditions more favorable to the evolution of herbicide resistance (Melander et al. 2012). The benefits of conservation tillage need to be assessed relative to multiple sustainability criteria generating trade-offs. Here, as is often the case in IPM, it is difficult to generate simple and general recommendations—local fine-tuning is more pertinent.
Increasing inter- and intra-specific diversity within and around the cultivated field is gaining attention as a crop protection strategy. There are many ways to increase spatial diversity, including the use of mixed cultivars, composite cross-populations (i.e., crop populations obtained by continuously exposing a population rather than individual plants to natural selection), intercropping (i.e., the spatial association of two or more crop species), living mulches, or semi-natural vegetation. Significant disease reduction was obtained by interspacing a rice cultivar susceptible to Magnaporthe oryzae—causing rice blast—with a resistant one (Zhu et al. 2000; Raboin et al. 2012). Sapoukhina et al. (2013) used modeling techniques to demonstrate that wheat cultivar mixtures containing small proportions of highly resistant cultivars could effectively reduce the severity of multiple diseases while exerting low selective pressure on pathogens. A high level of crop genetic diversity can be obtained by using many parental lines to generate composite cross-populations and by ensuring that the resulting populations are continuously adapted to local conditions. In wheat, such composite cross-populations were shown to decrease outbreaks of the leaf spot disease complex (a combination of tan spot, Pyrenophora tritici-repentis; Septoria leaf blotch, Mycosphaerella graminicola; and Stagonospora leaf blotch, Parastagonospora nodorum) relative to single commercial varieties or variety mixtures (Costanzo 2014). Similar patterns have been reported for composite cross-populations of barley (Saghai Maroof et al. 1983). Intercropping different crop species can be used to reduce the severity of diseases (Fernandez-Aparicio et al. 2010; Gao et al. 2014). In perennial systems, living mulches can be successfully established for the control of weeds (Baumgartner et al. 2008; Fourie 2010). Some plant species can act as trap crops (Aluja et al. 1997) or as sources of natural enemies providing top-down control (Paredes et al. 2013). There are many such mechanisms by which increased plant diversity can be exploited to improve the biological regulation of pests, although this approach has so far received more attention in tropical systems (Crowder and Jabbour 2014; Ratnadass et al. 2012). The “push–pull” strategy successfully implemented in Eastern Africa is an example of this approach. It is based on repelling the stem borer from maize—the main crop—by intercropping it with the leguminous plant silverleaf desmodium and simultaneously attracting the borer to a border strip planted with Napier grass, which acts as a dead-end trap crop. Silverleaf desmodium not only reduces stem borer attack on maize but also increases stem borer parasitization rates and suppresses the parasitic Striga weed via an allelopathic effect (Cook et al. 2007; Khan et al. 2010).
Preventive strategies designed to create healthy, robust cropping systems require the effective integration of multiple agronomic levers to reduce reliance on pesticide use.
Principle 2—monitoring
Beyond prevention, moving away from a pesticide-based strategy implies monitoring harmful organisms at regular intervals (Fig. 5) or upon issue of local warnings. In an ideal world, all farmers would monitor pest populations and use forecasting systems prior to making a decision regarding control. The current reality, however, is that warning and forecasting systems are not available and affordable in all countries for all crops. Some countries have nevertheless developed successful support systems. In Denmark, an extensive monitoring system linked to the farm advisory system plays a major role in placing that country among the lowest pesticide users in arable crops in the European Union (Kudsk and Jensen 2014). In Germany, an online forecasting platform (das Informationssystem Integrierte Pflanzenproduktion, www.isip.de) that integrates weather data in disease models provides regional decision-support in major crops (Racca et al. 2011). In Switzerland, farmers rely on weekly regional plant protection recommendations and online pest and disease forecasting systems and decision-support tools (www.phytopre.ch, www.agrometeo.ch, www.sopra.admin.ch) to assess risks for a variety of crops and, if need be, optimize timing of applications (Samietz et al. 2011). In France, for each region and cropping system, information on pest pressure collected by 4000 observers covering 15,400 plots is made freely available via a weekly update produced by multi-actor regional groups (DGAL 2014). This system has recently been enriched with data on non-intentional effects of pesticide use, including monitoring of biodiversity trends via four indicator groups of species as well as monitoring of pesticide resistance in 30 pest and active substance associations.
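Each of the platforms cited above relies on its own validated models. Purely to make the underlying logic tangible, the minimal Python sketch below encodes a hypothetical infection-risk rule driven by hourly temperature and relative humidity; the thresholds, trigger value, and function names are illustrative assumptions and do not reproduce the actual models behind www.isip.de, www.agrometeo.ch, or any other operational system.

```python
from dataclasses import dataclass

@dataclass
class HourlyWeather:
    temperature_c: float        # mean air temperature for the hour
    relative_humidity: float    # percent

def infection_risk_hours(records: list[HourlyWeather],
                         min_temp: float = 10.0,
                         max_temp: float = 25.0,
                         min_rh: float = 90.0) -> int:
    """Count hours favourable to infection: mild temperature with near-saturated air.
    Thresholds are placeholders, not parameters of any operational model."""
    return sum(
        1 for r in records
        if min_temp <= r.temperature_c <= max_temp and r.relative_humidity >= min_rh
    )

def regional_warning(records: list[HourlyWeather], trigger_hours: int = 10) -> str:
    """Issue a simple qualitative warning once enough favourable hours accumulate."""
    hours = infection_risk_hours(records)
    return "high risk: consider monitoring/intervention" if hours >= trigger_hours else "low risk"

# Example: a dry, warm day followed by a humid, mild night
day = [HourlyWeather(22.0, 60.0)] * 12 + [HourlyWeather(15.0, 95.0)] * 12
print(regional_warning(day))   # -> high risk: consider monitoring/intervention
```

Operational systems chain such weather-driven rules with crop phenology and inoculum information, but the principle of converting measured conditions into a regional risk signal remains the same.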
The Europe-wide monitoring of the potato late blight pathogen, Phytophthora infestans, is an example of a well-developed multi-country monitoring system. Researchers from the UK, The Netherlands, and Denmark developed a DNA-fingerprinting method based on microsatellite markers. Within the EuroBlight network (www.EuroBlight.net), pesticide companies, advisors, and farmers participate in sampling and analyzing infected leaves. This extensive sampling network makes it possible to visualize the distribution of dominant pathogen clones, their virulence, the development of fungicide resistance, and the dynamics over several years and several areas of Europe.
Farmers are often reluctant to monitor weeds because weed species look similar at the young growth stages when management decisions are needed. While the level of infestation of most pathogens and insect pests varies significantly between years, shifts in weed flora—except for herbicide-resistant biotypes—occur slowly (Walter 1996). These gradual year-to-year changes in weed populations make it possible to generate weed maps by monitoring end-of-season weeds or by establishing small untreated plots in the field. The information collected in this way can then be used to plan weed management in the subsequent crops.
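Purely as an illustration of how such end-of-season observations can be turned into a planning map, the sketch below aggregates georeferenced weed counts into grid cells and flags the patches exceeding a chosen planning threshold. The grid size, threshold, and data layout are assumptions made for the example, not a prescribed protocol.

```python
from collections import defaultdict

def weed_map(observations, cell_size=24.0, planning_threshold=5):
    """Aggregate end-of-season weed counts (x, y, plants per quadrat) into grid cells
    and flag cells at or above a planning threshold.
    Returns {(col, row): total_count} for flagged cells only."""
    totals = defaultdict(int)
    for x, y, count in observations:
        cell = (int(x // cell_size), int(y // cell_size))
        totals[cell] += count
    return {cell: n for cell, n in totals.items() if n >= planning_threshold}

# Example: scattered low counts plus one dense patch in a field corner
obs = [(5, 5, 1), (30, 40, 1), (70, 10, 2), (71, 12, 4), (73, 15, 3)]
print(weed_map(obs))   # -> {(2, 0): 9}: one patch to target in the following crop
```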
The availability and nature of monitoring, warning, and forecasting systems vary according to the type of pest and the means locally available. Researchers, advisers, and farmers are faced with the challenge of adapting to such a diversity of situations.
Principle 3—decision based on monitoring and thresholds
While sound intervention thresholds play an important role in IPM, they are not always applicable, available, or sufficient. In many cases, thresholds have not been established for weeds (Sattin et al. 1992). This is also the case for pathogens, particularly those that switch from a saprophytic to a pathogenic lifestyle depending on environmental events and climatic conditions (Underwood et al. 2007).
In the past, many IPM programs have centered on threshold-based decisions. When decision-support systems are not in place or are not appropriate, however, thresholds, and with them the concept of IPM, tend to be disregarded. It may be better in such cases to stress the importance of observation in general, of sound decision rules, and of the entire set of IPM principles.
IPM historically emerged in the area of insect pest control where the use of intervention thresholds has generated good results. The practicability of threshold-based decisions against diseases and weeds now needs to be demonstrated and reconsidered. Although there have been efforts to define economic thresholds for weeds (Keller et al. 2014), there is no consensus regarding their applicability. Developing thresholds for weeds is a challenge because they usually appear as a community of multiple species, typically have a patchy distribution, and have long-term impact through a persistent seed bank. Similarly, the pertinence of the threshold approach can be questioned in the case of polycyclic diseases, where it is often necessary to target the primary cycle while the inoculum level is very low and disease symptoms invisible. Conversely, thresholds may not apply in the case of tolerant varieties that can exhibit visible disease symptoms that do not in reality impact yield. We cannot realistically assume that robust and scientifically sound economic injury levels will be available for all major pests in all major crop varieties and cultivation environments. Complexity, regional and site specificities, emerging and invading pests, differing crop management practices, and ideally, the integration of externalities make that impossible. Principle 3, which requires growers to assess pest pressure, is important but not sufficient to ensure integration of all available measures. With respect to this last point, there may be an opportunity to develop a new generation of decision-support systems. Whereas present-day decision-support systems are usually based on real-time tactical decision-making involving one crop, one pest, and one control technique, new systems could support strategic approaches encompassing the whole range of IPM options. Instead of “spray/don’t spray” guidance, the new systems would provide insights on desired varieties, cropping sequences, combinations of direct control methods, and relevant agronomic practices. They could also provide information on expected damage and economic consequences as well as on impacts on non-target and beneficial organisms.
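Where thresholds can be parameterized, economic injury levels are classically derived from control cost, crop value, injury per pest, damage per injury, and control efficacy. The sketch below uses this textbook formulation, with an economic threshold set somewhat below the economic injury level, purely to make the “spray/don’t spray” logic concrete; all figures are invented for illustration.

```python
def economic_injury_level(control_cost, crop_value, injury_per_pest,
                          damage_per_injury, control_efficacy):
    """Textbook economic injury level (pests per sampling unit): the pest density at
    which the value of the yield loss prevented equals the cost of control.
    control_cost      : cost of the control measure per ha
    crop_value        : value of the crop per unit of yield
    injury_per_pest   : injury units caused per pest per sampling unit
    damage_per_injury : yield loss per injury unit
    control_efficacy  : proportion of injury prevented by the measure (0-1)
    """
    return control_cost / (crop_value * injury_per_pest * damage_per_injury * control_efficacy)

def intervene(observed_density, eil, safety_factor=0.8):
    """Simple decision rule: act when monitored density crosses an economic threshold
    set below the EIL (here 80 % of it, an arbitrary illustrative margin)."""
    return observed_density >= safety_factor * eil

eil = economic_injury_level(control_cost=40.0, crop_value=0.18,
                            injury_per_pest=0.9, damage_per_injury=1.5, control_efficacy=0.8)
print(round(eil, 1), intervene(observed_density=120.0, eil=eil))   # -> 205.8 False
```

The strategic decision-support systems envisaged above would wrap such single-pest rules into broader guidance covering varieties, rotations, and combinations of control methods.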
The decision-making process determining in-season control measures based on the short-term pest situation could be extended to integrate more systemic factors for longer-term strategic design.
Principle 4—non-chemical methods
Giving preference to non-chemical over chemical methods, provided they deliver satisfactory pest control, appears to be a sound and straightforward principle. The difficulty lies in how “satisfactory pest control” is determined. The authors believe that the highest level of control attainable by chemical measures is often not sustainable, creates new pest problems, and is not a proper standard against which single non-chemical tactics should be evaluated. Rather, a satisfactory yet sustainable level of pest management can be achieved via a broad IPM strategy that includes an array of protection methods. Taken separately, each alternative method, e.g., a biopesticide, may act with lower and slower biocidal power and appear more costly than synthetic pesticides if externalities are not included. Collectively, however, alternative methods should generate synergies resulting in satisfactory pest management. Their cost could become more attractive if the pesticide steering taxes currently under consideration in several European countries are introduced. More on the evaluation of protection measures is provided below under Principle 8.
There is a wide range of direct, non-chemical pest control measures, such as soil solarization or biological control, but their availability, efficacy, or pertinence varies considerably. Though various biotechnical methods have been developed, pheromone-based mating disruption is probably the most advanced and successful of such techniques (Fig. 6). Key insect pests of apple and grape such as codling moth, summer fruit tortrix, smaller fruit tortrix, and grapevine moths are effectively controlled with mating disruption. In Switzerland, the technique is in use in 50 % of the apple orchards and 60 % of the vineyards and has enabled a reduction of synthetic pesticide use by two-thirds (Samietz and Höhn 2010; Günter and Pasquier 2008).
The use of live natural enemies represents a major non-chemical IPM tool that could be further developed. Whereas biological control agents are well developed in protected crops, significant opportunities for their use still exist in other systems such as arable crops. The use of Trichogramma against the European corn borer Ostrinia nubilalis is one of the few successful examples. The target specificity of natural enemies is an environmental asset that nevertheless presents challenges for biocontrol producers, who are not assured high returns on their investment. Also, the use and handling of biological control agents require fine-tuning and specific skills best addressed via public-private research initiatives, education, and training (ENDURE 2010c). Innovative screening protocols that focus on important factors beyond mere efficacy will make it possible to tap into promising taxa of microorganisms outside the presently limited source of biodiversity (Kohl et al. 2011). Beyond the production and marketing of live biological agents, the cropping system and the landscape into which these organisms are released also need to be taken into account to optimize and sustain their efficacy. There is growing interest in better understanding ecological processes at the landscape level to achieve area-wide IPM based on the action of natural enemies. Rusch et al. (2010) have begun to study these aspects in the context of rotations involving oilseed rape.
For weeds, where biological control options seldom apply, many effective agronomic, mechanical, and physical control methods are available. They can be incorporated into Integrated Weed Management strategies to reduce the impact of weeds and the over-use of conventional herbicides over the long term. Ideally, such strategies integrate preventive, cultural, and direct chemical or non-chemical tactics. A number of non-chemical direct methods such as suppressive winter cover crops, the stale seedbed technique, pre-emergence cultivation, denser crop stands, inter-row precision weeding, and the use of hoes equipped with tools for intra-row weeding are feasible (Fig. 7). They have been successfully applied in maize-based cropping systems and other row crops without jeopardizing yield (Vasileiadis et al. 2011). There is still a need to work out strategies, based on knowledge of weed biology and the ecology of crop-weed interactions, that combine crop rotation, cultural control, non-chemical control methods, and chemical control using smart application technologies, and to adapt them to local circumstances. Direct non-chemical measures can cause undesired effects on other components of the pest-weed-disease complex. Changes in pest management should therefore be accompanied by monitoring of secondary pests.
The effective use of non-chemical alternatives requires a new mindset seeking synergies gained from the combined effect of alternative methods that may individually be less efficient or convenient than synthetic pesticides.
Principle 5—pesticide selection
IPM seeks to reduce reliance on pesticides. When prevention and alternative control methods by themselves do not yield satisfactory results, however, selective pesticides are also used. In this situation, Principles 5, 6, and 7, which presuppose pesticide use, become relevant. Sound selection of pesticides to minimize unwanted health or environmental effects (including negative effects on pest regulation) is essential (Fig. 8).
The undesired non-target impacts of broad-spectrum insecticides on arthropod natural enemies are well documented. In Switzerland, in the early 1970s, the excessive use of non-selective pesticides in orchards and vineyards nearly eradicated predatory mites and caused acaricide resistance among spider mites. Uncontrollable spider mite outbreaks could only be regulated with a pest control program specifically designed to preserve naturally occurring and reintroduced predatory mites (Stäubli 1983). To minimize disruption of the biological control of pests and improve IPM, products more compatible with beneficial arthropods are favored. Databases can be consulted online for this purpose. They include the IOBC Pest Select Database, the IPM Impact Side-effects database (available on subscription), the Pesticide Action Network-North America’s pesticideinfo.org, the University of Hertfordshire’s Pesticide Properties DataBase, and the French Ministry of Agriculture’s E-phy catalogue (in French). Commercial suppliers of biological control agents—such as Koppert or Biobest—also provide information on pesticide effects on the beneficial arthropods they deliver. Alternatives to more persistent molecules are being developed (Czaja et al. 2015; Gerwick and Sparks 2014). Selective biopesticides represent a particularly desirable alternative to chemical pesticides, but a wider range of such products needs to be made available. Some biopesticides are available on the market, but the number of bioherbicides remains low (Cantrell et al. 2012). The further development of biopesticides faces the same regulatory constraints as their synthetic counterparts, as they fall under the same regulations (Villaverde et al. 2014).
Italy’s Emilia-Romagna region has historically emphasized this principle in its agricultural development policy and obtained significant improvements during the last 25 years. IPM regulation and implementation in that region tackled both pesticide quantity and quality with the aim of reducing impact on human health and the environment while maintaining economically acceptable production. Only pesticides with a lower impact on human health and the environment were allowed in the new IPM system. As a result, 70 to 90 % of the pesticides with high acute toxicity and 40 to 95 % of those with a high chronic toxicity were excluded, and the overall quantity of pesticides used was reduced by 20–35 % between 1995 and 2005 (Galassi and Sattin 2014).
A number of existing databases and the further development of biopesticides offer desirable options for the selection of products minimizing impact on human health, the environment, and biological regulation of pests.
Principle 6—reduced pesticide use
Reducing doses, reducing application frequency, and resorting to partial application of pesticides contribute to the IPM goal of reducing or minimizing risks to human health and the environment. In fact, national pesticide plans have adopted reduced use as their overall quantitative, time-bound goal. Expressing reduction in terms of volume used automatically generates a downward trend simply through a switch to more potent products. To circumvent this artifact, Denmark pioneered the “treatment frequency index,” which takes both frequency of use and dose into account (Kudsk and Jensen 2014). Although the present authors consider reducing dose rates as secondary to reducing reliance on pesticides, we acknowledge it as a tactic along the IPM continuum that can be judiciously combined with others: the use of resistant cultivars, or the application of thresholds based on disease intensity rather than frequency combined with advanced decision-support systems. One aspect to consider when applying reduced doses is the potential influence on the risk of pesticide resistance developing in the pest population, which is the focus of the next principle.
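As a minimal illustration of the treatment-frequency-index idea, the sketch below expresses each application as a fraction of the registered standard dose, weights it by the share of the field treated, and sums the contributions over the season. The data layout and example figures are assumptions and do not reproduce the official Danish calculation rules.

```python
def treatment_frequency_index(applications):
    """Sum, over all applications on a field, of (applied dose / standard label dose),
    each weighted by the fraction of the field actually treated.
    An application at full label dose over the whole field contributes 1.0."""
    return sum(
        (applied_dose / standard_dose) * treated_fraction
        for applied_dose, standard_dose, treated_fraction in applications
    )

# Example season on one field: a half-dose herbicide on the whole area,
# a fungicide at full dose twice, and an insecticide band-sprayed on 30 % of the area.
season = [
    (0.5, 1.0, 1.0),   # herbicide at 50 % of label dose
    (2.0, 2.0, 1.0),   # fungicide, full dose
    (2.0, 2.0, 1.0),   # fungicide, full dose (second pass)
    (0.2, 0.2, 0.3),   # insecticide, full dose but band-sprayed
]
print(treatment_frequency_index(season))   # -> 2.8
```

Because the index is insensitive to product potency, it avoids the artifact noted above: switching to a more potent product at its own full label dose leaves the index unchanged.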
Reduced pesticide use, whether through lower application frequency, spot spraying, or dose reduction, is a recognized tactic along the IPM continuum that can be combined with others.
Principle 7—anti-resistance strategies
The number of pest species resistant to pesticides is increasing, jeopardizing the efficacy of many products. The resistance of insect pests to insecticides was a major initial driver for the development of IPM (Stern et al. 1959). There are now many instances of resistance among all pest categories. For example, Podosphaera xanthii, the fungus causing cucurbit powdery mildew, quickly developed resistance to demethylation inhibitor fungicides (McGrath et al. 1996), to strobilurins (McGrath and Shishkoff 2003), and more recently to cyflufenamid (Pirondi et al. 2014). This issue is particularly acute for weed management because very few new herbicidal modes of action remain available (Heap 2014; Duke 2012). The increased likelihood of over-reliance on a narrow spectrum of molecules threatens the viability of conventional cropping systems where spatial and temporal diversity is low.
There is a debate on the relationship between the use of pesticide doses lower than those recommended on the label, sublethal effects, the hormesis effect, and the evolution of resistance to pesticides. An aside to this question regards the converse situation—pesticide over-use—which probably contributes significantly to the evolution of pesticide-resistant pest biotypes. There are many situations where appropriately reduced doses can be recommended without increasing the risk of selecting for resistance. Such a situation is reported for Phytophthora control in potato, as long as information on pest incidence, phenology, susceptibility to pesticides, and canopy structure is included in decision-making (Cooke et al. 2011). In any case, there is no consistent evidence and no consensus among crop protection scientists that reduced pesticide dosages are related to resistance development. The authors believe that the debate is not precisely where it should be. Bearing in mind that no unequivocal relationship exists between pesticide dose and efficacy, a focus on efficacy levels rather than pesticide doses seems more pertinent (Kudsk 2014). The new vision of sustainable pesticide use focuses on a desirable control level that is then related to the selection pressure exerted by the biological activity and persistence of the active ingredients concerned.
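Because the dose required for a given level of control varies with conditions, dose-response studies in crop protection commonly fit a log-logistic curve and read off the dose corresponding to a target efficacy rather than applying a fixed label rate. The sketch below inverts such a curve for an illustrative target; the parameter values are invented, not measured, and the code is a generic illustration rather than any published model.

```python
def log_logistic_efficacy(dose, ed50, slope, lower=0.0, upper=100.0):
    """Four-parameter log-logistic dose-response: predicted efficacy (%) at a given dose."""
    if dose <= 0:
        return lower
    return lower + (upper - lower) / (1.0 + (dose / ed50) ** (-slope))

def dose_for_target_efficacy(target, ed50, slope, lower=0.0, upper=100.0):
    """Invert the curve: dose required to reach a target efficacy (%)."""
    if not lower < target < upper:
        raise ValueError("target must lie strictly between the lower and upper asymptotes")
    ratio = (target - lower) / (upper - target)
    return ed50 * ratio ** (1.0 / slope)

# Illustrative parameters: ED50 at 25 % of the label dose, moderately steep curve
ed50, slope = 0.25, 2.0
print(round(dose_for_target_efficacy(90.0, ed50, slope), 2))   # dose (fraction of label rate) for 90 % control -> 0.75
print(round(log_logistic_efficacy(1.0, ed50, slope), 1))       # predicted efficacy at full label rate -> 94.1
```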
Of course, combinations of chemically based tactics can help to slow the evolution of resistance. For example, combining fungicides with different modes of action, adjusting application timing, and splitting applications have led to more reliable resistance management strategies (van den Bosch et al. 2014a, b). It is also possible to monitor the occurrence of resistance to guide decision-making. This is precisely what the EuroBlight network mentioned above is engaged in: it monitors Phytophthora infestans and provides updated multi-country information on new resistant isolates.
If we take a step back to look at the larger picture, we find that the root causes of increased risk of resistance are associated with the over-simplification and intensification of cropping systems (e.g., monocultures reliant on too few crop protection measures). This has been shown for weeds in continuous cropping situations (Neve et al. 2014). Focusing on weeds, Owen et al. (2014) conclude that merely modifying herbicide use will not yield lasting solutions to herbicide resistance in weeds. But this is probably true for the whole range of pests. To reduce the selection of resistant pest biotypes and lengthen the commercial lifespan of pesticides, farmers can strive for a higher level of IPM, consider the spatial distribution of tolerant varieties or non-host crops, and make full use of preventive measures (see Principle 1).
The root causes of the increased risk of resistance evolution can be addressed by revisiting the over-simplification and excessive intensification of cropping systems.
Principle 8—evaluation
Principle 8 encourages farmers to assess the soundness of the crop protection measures they adopt, and this is an important aspect of sound management. The delicate point here regards the evaluation criteria used. Farmer interviews showed that absolute yield—irrespective of profit—and total absence of pests, i.e., “clean” fields, are the two indicators of good crop protection practice most used among farmers and advisers (Lamine et al. 2009). Such traditional assessment methods can impede the development of alternatives. IPM-compatible assessment could cover multi-season effects, trade-offs with other compartments of production and economics, as well as human health and the environment. New IPM-adapted performance criteria and standards of reference could integrate these factors at the cropping system and agroecosystem level. Many positive effects of IPM strategies are multi-year, and effective evaluation therefore covers all crops of the rotation over more than one season. This is particularly pertinent to the management of weed seed banks, the accumulation of soil-borne pathogens, resistance development in pathogens, and unpredictable insect outbreaks. As mentioned under Principle 4, the level of short-term control attained by chemical measures alone is not the standard by which “success” is gauged. A process of re-thinking and reassessing evaluation needs to be initiated, emphasizing yield, yield stability, and profit over multiple years at the cropping system level. Lechenet et al. (2014) provide an example of an approach to assess pesticide use intensity at the cropping system level while taking into account multiple trade-offs. Research and extension work at the farm community level can develop new standards of reference and performance criteria that become widely shared among farmers.
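Purely as an illustration of evaluation at the rotation rather than the single-crop level, the sketch below averages a few indicators (gross margin, relative yield stability, and treatment frequency index) over all seasons of a rotation. The indicator set and the example figures are assumptions made for the sake of the example and do not reproduce the method of Lechenet et al. (2014).

```python
from statistics import mean, pstdev

def rotation_indicators(seasons):
    """Aggregate simple performance indicators over all seasons of a rotation.
    Each season is a dict with 'gross_margin' (EUR/ha), 'yield_rel' (fraction of a
    local reference yield), and 'tfi' (treatment frequency index)."""
    margins = [s["gross_margin"] for s in seasons]
    yields = [s["yield_rel"] for s in seasons]
    return {
        "mean_gross_margin": mean(margins),
        "yield_stability": 1.0 - pstdev(yields) / mean(yields),  # 1 = perfectly stable
        "mean_tfi": mean(s["tfi"] for s in seasons),
    }

# Example: a three-year winter wheat / spring barley / oilseed rape sequence (figures invented)
rotation = [
    {"gross_margin": 620.0, "yield_rel": 0.95, "tfi": 2.1},
    {"gross_margin": 480.0, "yield_rel": 0.90, "tfi": 1.4},
    {"gross_margin": 710.0, "yield_rel": 0.88, "tfi": 2.8},
]
print(rotation_indicators(rotation))
```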
Sustainability in pest management calls for new evaluation criteria that take into account multi-season effects and a diversity of trade-offs, and can be widely shared among the farmer community.