1 Introduction

Damping-off is a historical term coined during the early nineteenth century, and represents one of the oldest worldwide nursery problems, as discussed in detail in classic nursery manuals (Hartley and Pierce 1917; Tillotson 1917; Hartley 1921). Damping-off was considered “the most serious problem encountered in raising nursery seedlings,” and consequently has been one of the most intensively studied research subjects since it was first described (Hartley and Pierce 1917). The definition of damping-off is not straightforward in the literature. Many authors refer to damping-off as a “disease” (McNew 1960; Horst 2013), while others refer to it as a “symptomatic condition” (Agrios 2005; Kemerait and Vidhyasekaran 2006). In the former case, damping-off is usually associated with soil-borne pathogens, while in the latter case seed-borne pathogens can also promote damping-off. Nevertheless, both interpretations agree that damping-off involves failure of seeds to germinate, failure of seedlings to emerge after germination, or the rotting and collapse of seedlings at the soil line.

Overall, damping-off can be caused by a number of biotic or abiotic stresses that prevent seeds from germinating or seedlings from emerging, including those caused by plant-pathogenic bacteria or insect pests, notably soil-dwelling species such as Delia spp., Agriotes spp., or Melolontha spp. (Fig. 1). As a consequence, the symptoms associated with damping-off vary widely depending on the type of stress involved and the time of its occurrence. In general, many fungal and fungus-like species (Table 1) have been reported as the most important biotic stresses weakening or destroying seeds and seedlings of almost all species, including fruit, vegetable, field, ornamental, and forestry crops (Filer and Peterson 1975; Kraft et al. 2000). However, this paper will focus on damping-off caused by Fusarium spp., Rhizoctonia spp., Pythium spp., and Phytophthora spp., since these pathogens are those most frequently associated with damping-off and are considered the most important causal agents of this problem in the literature (Table 1). Furthermore, the role of abiotic stresses will also be discussed as they indirectly affect damping-off occurrence. Abiotic conditions favorable to damping-off generally involve excessive soil moisture and excessive overhead misting, low soil temperatures before emergence, high soil temperatures after emergence, and overcrowded flats or seedbeds (Wright 1957; Papavizas and Davey 1961; Duniway 1983a; James 2012a; Starkey and Enebak 2012).

Fig. 1

Damping-off is either a disease of germinating seeds (pre-emergence—A) or of young seedlings (post-emergence—B). The latter also comprises cotyledon blight. While damping-off usually refers to diseases caused by soil-borne fungi or oomycetes, a number of abiotic stresses may contribute to damping-off symptoms (C) (adapted from Landis 2013)

Table 1 A non-exhaustive list of studies highlighting first reports of damping-off worldwide since 2001

In recent years, numerous soil-borne fungi belonging to over a dozen genera, oomycetes (Pythium and Phytophthora), and some seed-borne fungi have been reported to cause damping-off on a large number of crops (Table 1). Most of these pathogens are common in agricultural soils and can be spread via both natural processes and human activities, including water run-off through irrigation or rain (Zappia et al. 2014), soil contamination by improperly sanitized tools, introduction of infected plants (mainly in the case of seed-borne pathogens), improperly sanitized greenhouses, and the use of contaminated irrigation water (Papavizas and Davey 1961; Duniway 1983b; Schmitthenner and Canaday 1983; Huang and Kuhlman 1990; James 2012a; Starkey and Enebak 2012). Once established, damping-off pathogens are able to survive for many years in the soil, even in the absence of host plants, either as saprophytes or as resting structures capable of enduring adverse conditions (Menzies 1963). Their wide host range also aids the longevity of these fungi and fungus-like organisms.

Despite its long history and the substantial body of research devoted to it, damping-off still represents one of the most difficult problems to manage in both nurseries and fields. No country or geographic area is free of damping-off problems on economically important crops. Indeed, since the beginning of the twenty-first century alone, almost 50 new reports of damping-off diseases have been published, covering over 30 crops and over 20 countries (Table 1). This clearly suggests that the damping-off problem is multifaceted and requires further research to generate the knowledge needed for its durable and sustainable management.

Overall, the economic losses due to damping-off comprise a direct cost, due to damage to seeds or seedlings (Fig. 2), and an indirect cost, consisting of the additional cost of replanting and the lower yields that result from later planting dates (Babadoost and Islam 2003; Bacharis et al. 2010; Horst 2013). Although there is no detailed and precise estimation of the real economic impact of damping-off at the global level in monetary terms, a previous study reported that 40 million extra seedlings are planted each year in Georgia (USA) alone to counterbalance losses due to non-viable seeds and damping-off of seedlings (Huang and Kuhlman 1990). Likewise, in 2016 in Brittany (France), the grass or cereal fly Geomyza tripunctata damaged thousands of hectares of maize crops, causing significant economic losses in the region (BSV 2016). An extensive literature review showed that the incidence of damping-off may vary from 5 to 80% (Table 1).

Fig. 2

An overview of soybean (a) and pea (b) fields affected by damping-off diseases due to Pythium spp. The empty spaces along the rows indicate seeds or seedlings killed by pre- and post-emergence damping-off. The economic losses in such a situation are severe, owing to a direct cost due to damage to seeds or seedlings and an indirect cost related to the additional cost of replanting and the lower yields that result from later planting dates (panel a: photo courtesy of Martin Chilvers; panel b: photo courtesy of Lindsey J. du Toit)

In addition to its significant economic importance, damping-off has a considerable environmental impact owing to the widespread use of fungicides to manage this frequently occurring problem. For example, methyl bromide seed treatment and fumigation, a practice forbidden in the European Union (Mouttet et al. 2014), still represents one of the major practices adopted elsewhere, including in the USA, to manage damping-off diseases (Weiland et al. 2013). However, following the Montreal Protocol (UNEP 2006), this practice is declining and restrictions on soil fumigation have increased (Weiland et al. 2013). Nevertheless, other conventional fungicides play an increasingly important role in mitigating seed and seedling damage caused by damping-off pathogens. The frequent use of these fungicides has led to the development of fungicide-resistant isolates, posing additional challenges for farmers managing damping-off (Taylor et al. 2002; Moorman et al. 2002; Lamichhane et al. 2016).

In light of the high economic impact of damping-off and the negative environmental effects of conventional fungicide-based control strategies, there is a need to develop alternative and sustainable solutions to manage damping-off. Integrated pest management (IPM) exemplifies a sustainable approach to this aim as it combines preventive measures (e.g., enhancement of seed health, which lies at the core of resilient agroecosystems) and best agronomic and cultural practices first, with pesticide-based tactics as the last option. Therefore, the objectives of this work were to (i) highlight the major features of damping-off diseases, especially those caused by Fusarium spp., Rhizoctonia spp., Pythium spp., and Phytophthora spp.; (ii) report and discuss currently used disease management strategies and knowledge gaps; and (iii) suggest key challenges and future priorities for a sustainable management of damping-off diseases.

2 Symptoms of damping-off

Damping-off symptoms can be observed from seeding until the fourth to sixth week post-sowing (Horst 2013). The symptoms can be divided into two phases based on the time of their appearance.

2.1 Pre-emergence symptoms

Pre-emergence symptoms occur when seeds decay prior to emergence. This can happen (i) before seed germination or (ii) when germinating seeds are killed by biotic stresses while shoot tissues are still below ground (Fig. 3; Filer and Peterson 1975; Crous 2002; Horst 2013). In the first case, seeds become soft and rotten and fail to germinate. In the second case, the stems of germinating seeds are affected, with characteristic water-soaked lesions formed at or below the soil line (Cram 2003; Landis 2013). As the disease progresses, these lesions may darken to reddish-brown, brown, or black. Expanding lesions quickly girdle young and tender stems, and seedlings may wilt and die shortly before emergence. In general, random pockets of poor seedling emergence are an indication of pre-emergence damping-off.

Fig. 3

Characteristic symptoms of pre-emergence damping-off of pea (Pisum sativum L.) caused by Pythium spp. Despite the same sowing date, only the first three seeds on the left have emerged. Note the non-emerged seeds with or without root development. Seeds that are soft, rotten, and decayed prior to germinating, or germinating seeds killed by biotic stresses while shoot tissues are still below ground, are characteristic of pre-emergence damping-off. The sixth seed from the left had germinated, but its stem was affected by the disease, with characteristic water-soaked lesions below the soil line; this led to wilting of the seedling soon after emergence (Photo courtesy of Lindsey J. du Toit)

Abiotic stresses can be divided into two categories: chemical and physical. Chemical stress notably involves limiting (i) concentrations of carbon dioxide or ethylene (Negm and Smith 1978), (ii) pH (Foy 1984), and (iii) osmotic potential (Romo and Haferkamp 1987), as well as (iv) phytotoxicity (Wang et al. 2001). Physical stress includes (i) extreme temperatures (high or low) (Khan 1977; Wen 2015), (ii) extreme seedbed humidity (high or low) (Maraghni et al. 2010; Wen 2015), and (iii) mechanical stresses such as seedbed clods (Dürr and Aubertot 2000) or crusting at the soil surface (Aubertot et al. 2002). Other mechanical events, such as removal of mulch or soil by wind and rain, may also contribute to non-uniform seeding of containers or beds, poor seed development, and seed rot and decay (Landis 2013).

Because biotic and abiotic stresses interact with one another, it is important to distinguish which of them are associated with the observed disease symptoms.

2.2 Post-emergence symptoms

Post-emergence damping-off symptoms occur when seedlings decay, wilt, and die after emergence (Fig. 4; Boyce 1961; Horst 2013). In most cases, the symptoms result in the collapse and death of at least some seedlings in any given seedling population. In the case of soil-borne pathogens, seedlings may die in groups in roughly circular patches and may show stem lesions at ground level. Seedling stems can become thin and tough (commonly known as “wirestem”), which often leads to reduced seedling vigor. These symptoms can also be accompanied by leaf spotting, and complete root rot may occur. Overall, the symptoms on seedling stems include water-soaked, sunken lesions at or slightly below ground level, and sometimes also below the ground line (i.e., on the roots), causing the plant to fall over (Wright 1944; Filer and Peterson 1975). Surviving plants are stunted, and affected areas often show uneven growth.

Fig. 4

Characteristic symptoms of post-emergence damping-off of soybean (a and b) and corn (c and d). The succulent tissue of sprouts with aboveground shoots has collapsed, leading to wilting of part of the seedling population. Soybean seedlings show stem lesions at ground level and death of seedlings in groups (a and b). The empty spaces along the rows between corn seedlings indicate the lack of emerged seedlings due to damping-off (c and d). (Photo courtesy of Martin Chilvers)

Abiotic stresses, such as superficial soil heat, can also lead to post-emergence seedling symptoms such as whitish lesions, which are often located on only one side of the stem in the early growth stages of seedlings (Hartley 1918). Such symptoms can be distinguished from those caused by biotic stresses because the damage owing to heat lesions is generally scattered throughout nurseries/seedbeds, depending mainly on patterns of shade and heat buildup (Hartley 1921), while that caused by biotic stresses often occurs in expanding patches. Soil crusting is another important abiotic stress that often hinders seedling emergence or leads to stunted seedling growth (Fig. 5). Phytotoxicity caused by chemical fungicides is a further abiotic stress. The symptoms of phytotoxicity vary with the type of chemical used and include marginal necrosis, chlorotic patches or spots, and malformed flowers, buds, and young leaves (Dole and Wilkins 2004). For example, benzimidazole-based fungicides can cause reduced plant growth and visible damage in bedding plants (Iersel and Bugbee 1996).

Fig. 5

Lack of sugar beet (Beta vulgaris L.) seedling emergence due to soil crusting followed by drought. The crust formed at the soil surface represents a strong mechanical barrier that impedes seedling emergence. The lack of seed germination and emergence in such a field is characteristic of abiotic stresses, including stunted seedling growth without any necrosis of leaves or stems. (Photo courtesy of Carolyne Dürr)

2.3 Occurrence of damping-off symptoms

Most damping-off diseases present a single type of symptom (pre- or post-emergence). However, both types of symptoms are also reported to some extent (Table 1), although the underlying factors leading to the occurrence of each type are poorly discussed in the literature. The complexity of damping-off symptoms results from interactions between cropping practices and the production situation (Aubertot and Robin 2013), which may explain this lack of information. This complexity involves synergism among damping-off pathogens (Al-Hazmi and Al-Nadary 2015), variation of symptoms according to environmental conduciveness (Schwanck et al. 2015), direct effects of plant density (Burdon and Chilvers 1975), and many other factors that are specific to each damping-off symptom. For instance, the disease cycle components of damping-off are seldom discussed in a broader sense in the literature by comparing different diseases. Factors related to the time of disease occurrence and the timing of disease cycle components could determine whether pre- or post-emergence symptoms occur. In this sense, it is possible that for both pre- and post-emergence damping-off the infection occurs during seed germination, but a longer or shorter incubation period may lead to pre- or post-emergence damping-off. In addition, the effects of individual factors involved in the disease cycle and the host cycle (e.g., seed germination) on damping-off symptom development are rarely discussed in the literature. Moreover, several studies explore the effect of a given factor (e.g., temperature) on damping-off intensity (Ben-Yephet and Nelson 1999) without specifying whether the factor acts mainly on the pathogen (e.g., its metabolism) or on the host (e.g., slow germination increases the time seedlings are exposed underground). Further knowledge of disease cycle features and processes would help to better understand damping-off symptom occurrence. Although it was beyond the focus of this work, it is worth mentioning that our extensive literature review did not reveal any pattern in the type of damping-off symptom (pre- or post-emergence) according to region, pathogen genus, or crop species affected. Therefore, a meta-analytical approach to test hypotheses associated with damping-off diseases would be highly valuable to better explain the factors involved in damping-off symptom occurrence. To this aim, the list of damping-off diseases we provide in Table 1 constitutes a potential starting point.
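The suggested meta-analytical approach can be illustrated with a minimal sketch. The Python snippet below pools hypothetical per-study effect sizes (e.g., log ratios of damping-off incidence under two contrasting conditions extracted from studies such as those in Table 1) using a DerSimonian-Laird random-effects model; the data values and the choice of effect-size metric are purely illustrative assumptions, not results from this review.

```python
import numpy as np

# Hypothetical effect sizes (log ratios of damping-off incidence) and their
# sampling variances, as they might be extracted from studies such as those
# listed in Table 1. These numbers are placeholders, not real data.
yi = np.array([0.42, 0.10, 0.55, -0.08, 0.31])   # per-study log ratios
vi = np.array([0.04, 0.02, 0.06, 0.03, 0.05])    # per-study variances

# Fixed-effect weights and pooled estimate
wi = 1.0 / vi
y_fe = np.sum(wi * yi) / np.sum(wi)

# DerSimonian-Laird estimate of between-study variance (tau^2)
Q = np.sum(wi * (yi - y_fe) ** 2)
df = len(yi) - 1
c = np.sum(wi) - np.sum(wi ** 2) / np.sum(wi)
tau2 = max(0.0, (Q - df) / c)

# Random-effects pooled effect and its standard error
wi_re = 1.0 / (vi + tau2)
y_re = np.sum(wi_re * yi) / np.sum(wi_re)
se_re = np.sqrt(1.0 / np.sum(wi_re))

print(f"Pooled log ratio: {y_re:.3f} +/- {1.96 * se_re:.3f} (95% CI), tau^2 = {tau2:.3f}")
```

A random-effects formulation is sketched here because between-study heterogeneity (region, pathogen genus, crop species) is exactly what such an analysis of the Table 1 records would need to accommodate.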

3 Integrated management of damping-off

An effective management of damping-off requires the deployment of a number of strategies, which can be classified into the following four major groups: (i) seed treatment to enhance germination and seedling vigor, (ii) deployment of cultivars resistant or tolerant to damping-off diseases, (iii) adoption of best cropping practices, and (iv) timely treatment of seedlings with effective products (conventional pesticides as well as biopesticides and/or biocontrol agents). None of these strategies is effective in managing damping-off when applied individually, and thus all of them must be combined within the frame of IPM.

3.1 Seed treatment to enhance germination and seedling vigor

While the use of completely healthy seeds is the most effective means to prevent and/or contain damping-off diseases, seeds might not always be free from pathogens and thus would benefit from treatments. Even when there is no risk of contamination by seed-borne pathogens, seed treatments can be an effective means to increase seedling emergence, particularly when applied to seeds of low vigor and when the seed coat has been damaged (Mancini and Romanazzi 2014).

Chemical seed treatments still represent a major practice in agriculture to manage damping-off diseases (Rhodes and Myers 1989; Babadoost and Islam 2003; Howell 2007; Bradley 2007; Leisso et al. 2009; Dorrance et al. 2009; Rothrock et al. 2012; Kandel et al. 2016). Several chemicals, including bleach, hydrogen peroxide, ethanol, and fungicides, can be applied to remove pathogen inoculum from seed coats (Dumroese and James 2005; Mancini and Romanazzi 2014). Chemical treatments are generally effective, but they can also negatively affect seed germination and cause phytotoxicity (Axelrood et al. 1995; du Toit 2004), in addition to their negative impacts on human health and the environment (Lamichhane et al. 2016). Physical seed treatments can also be applied, including hot water, hot air, and electron treatments (Mancini and Romanazzi 2014). Finally, a number of biological seed treatment methods have been developed and used in recent years with a satisfactory level of damping-off suppression (Table 2).

Table 2 Examples of literature reports highlighting the efficacy of non-chemical seed treatments to suppress damping-off diseases. The tested formulations are most often reported to suppress both pre- and post-emergence damping-off, although their level of disease suppression may vary

Because seed germination and emergence are often influenced by site-specific soil and climate conditions, in-depth knowledge of the specific site in question is a prerequisite for an effective decision-making process for seed treatments. A pesticide-free agroecosystem experiment conducted in 2014 across eight experimental sites in France, with non-treated seeds, showed that emergence rates markedly differed for the same seeds across sites (Fig. 6). In particular, while the emergence rate of soft wheat was 100% at the Le Rheu and Grignon sites, it was lower at the other sites, ranging from 43% in Auzeville to 75% in Lusignan (Fig. 6). This means that while seed treatments may prove essential at some sites, owing to unfavorable soil and climatic conditions conducive to disease development, this may not be the case in other areas.

Fig. 6

Percentage of seedling emergence (non-treated seeds) observed across different experimental sites managed under the “Res0Pest” network in France in 2014. Res0Pest is a “pesticide-free” trial network launched in 2011 by the INRA/CIRAD IPM network to address the objectives of the French National Action Plan Ecophyto and to develop and demonstrate the feasibility of pesticide-free cropping systems (Deytieux et al. 2014). The eight experimental sites comprise five arable cropping systems (in brown) and three mixed crop-livestock systems (in green). DW durum wheat, SW soft wheat, MSW mixture of soft wheat varieties, SB spring barley. The different percentages of emergence across the experimental sites highlight how soil, climate, and cropping practices affect seed germination and the seedling emergence process through biotic and abiotic stresses. Low percentages of emerged seedlings are highlighted in red

3.2 Deployment of host-plant resistance and/or tolerance

Overall, host-plant resistance as a management tactic comprises two strategies: (i) deployment of resistant and/or tolerant plant varieties, which support lower pathogen populations or better tolerate the injury they cause; and (ii) integration of such varieties with other management tactics within the frame of IPM. Unfortunately, for many plant pathogens, including those causing damping-off diseases, no plant cultivar with measurable resistance is available (Babadoost and Islam 2003). Therefore, the only way to make better use of the available crop varieties with tolerance to pathogens is through their adequate integration with other disease management measures. Nevertheless, insufficient attention has been paid to date to the integration of plant resistance with other IPM tactics, and to quantifying the benefits of plant resistance in multi-tactic IPM programs (Stout and Davis 2009).

On the other hand, the breeding approach used to date to develop resistant and/or tolerant crop varieties should be revisited if the focus is to be on sustainable crop protection based on IPM. This is particularly true given that most, if not all, crop varieties bred to date follow a market-driven approach focused on high-yielding and most profitable crops. This trend has boosted the adoption of short rotations or monoculture, on the one hand, and ignored the potential that minor crops may have for IPM, on the other (Messéan et al. 2016). The limited range of available minor crop varieties has been reported as one of the major obstacles to crop diversification, thereby constraining beneficial practices such as multiple cropping or intercropping (Enjalbert et al. 2016; Messéan et al. 2016). Therefore, breeding for IPM should follow a different approach from the traditional one, given the strategic role of breeding in the competitiveness of crops and their adaptation to more diversified cropping systems (Enjalbert et al. 2016).

3.3 Adoption of best cropping practices

Once the causal agent of damping-off has been identified, all available cropping practices can be adapted to discourage the development of the pathogen. Indeed, any technique that reduces the time between seed germination and emergence helps reduce the effects of biotic stresses on seedlings. Overall, many pathogens involved in damping-off are relatively weak pathogens that require favorable environmental conditions for infection to occur (Table 1). In addition to the susceptibility of the host and the aggressiveness of pathogen populations, the severity of damping-off is highly dependent on critical factors including seedbed preparation, soil pH management, seeding date and rate, growing density, nutrition, irrigation, growing environment, crop sequence and intercropping, cover crops, soil residue management, soil solarization, and tillage (Table 3). Therefore, understanding the combined effects of abiotic and biotic stresses and the factors influencing them is a prerequisite for effective IPM strategies against damping-off. Once these critical factors, which may differ from one region to another, have been identified, best cropping practices should be put in place and adopted.

Table 3 Critical factors affecting damping-off and best cropping practices which help discourage its development

One of the most important practices for the management of damping-off and root diseases is fertilization. Adequate availability of nutrients in the soil can ensure higher vigor, with earlier emergence limiting the period of time during which pathogens can infect seeds and seedlings during the heterotrophic stage. In particular, the advantages of fertilizer placement for seed germination and seedling emergence have been previously demonstrated (Cook et al. 2000). The placement of fertilizers directly under or slightly to one side of the seed at the time of planting or sowing results in increased seed germination and emergence (Fig. 7). This is because relatively immobile nutrients, such as phosphorus, are not readily available to plants, especially to species having no or few lateral roots. Therefore, field fertilization where damping-off diseases are important requires that the nutrients be made easily accessible to the roots to increase growth rate. Although these nutrients do not always reduce seedling infection, they often enhance seed germination and seedling vigor (Smiley et al. 1990; Patterson et al. 1998). Indeed, higher seedling vigor allows seedlings to rapidly escape the soil surface even in the presence of a high soil population density of the pathogen(s).

Fig. 7

Effect of fertilizer placement on germination and emergence of oilseed rape. While the placement of nutrients (zinc and phosphorus) at the time of sowing allowed seeds to readily germinate and emerge (the three lateral sides of the plot), the lack of nutrient placement resulted in markedly reduced seed germination and emergence (the middle of the plot). The same practices were applied throughout the field, including the same sowing date and cultivar. In addition to oilseed rape, the field contains the annual weeds Poa annua and Vulpia myuros (light green color). (Photo courtesy of Jean-Pierre Sarthou)

3.4 Timely treatment interventions of seedlings with effective products

The strategies described above are mainly preventive in nature as they can be developed and adopted before the occurrence of damping-off diseases. Once infection occurs on seedlings and there is a high risk of epidemic development, growers must attempt effective control of the disease. Overall, there are two key measures available for damping-off control, as described below.

3.4.1 Biological control

Because of their adverse effects on human health and the environment, the use of conventional pesticides, including fungicides, has come under increasing public scrutiny in many countries, especially in the European Union (Bourguet and Guillemaud 2016; Lamichhane et al. 2016). In addition, increasing reports of pests developing resistance to pesticides have become an issue, raising the risk of pest management failure and potential economic losses for farmers (Onstad 2013; Bourguet and Guillemaud 2016; Lamichhane et al. 2016). Chemical fungicides can also cause phytotoxicity on crops and foliage plants, which is another drawback of their use (Dias 2012).

The application of biocontrol agents/formulations is an important substitute for conventional fungicides, with lower negative impacts. Biocontrol is widely practiced as an alternative disease management strategy to conventional fungicides, especially when the latter are not effective or cause secondary problems such as seed phytotoxicity (Burns and Benson 2000). Individual beneficial organisms used as biocontrol agents can prevent damping-off pathogens through five mechanisms (Table 4). There are dozens of biocontrol products for damping-off worldwide, most of them based on antagonistic fungi, including Trichoderma spp. and Gliocladium spp., or bacteria such as Pseudomonas spp. and Bacillus spp. (Table 5). However, not all of them are registered and marketed as biocontrol agents; some are instead sold as plant growth promoters, plant strengtheners (or biostimulants), or soil conditioners (Paulitz and Bélanger 2001). The numerous studies conducted on biocontrol over the last 15 years clearly reflect the increasing commitment of the scientific community to generating knowledge on alternatives to chemical solutions (Table 5). Most of these studies have also demonstrated good effectiveness of biocontrol products in managing the disease. Accordingly, the biocontrol industry has become very dynamic in recent years, especially in using the available scientific knowledge to develop and commercialize formulations. However, most of these formulations are based on individual biocontrol agents and target a specific pathogen.

Table 4 Key mechanisms involved in biocontrol activities and list of selected references
Table 5 List of selected studies reporting the use of microbial antagonists for biological control of major damping-off pathogens worldwide since 2001. These biological control agents were either applied to seedlings or to soil to achieve disease suppression

3.4.2 Chemical control

While alternatives to chemical control are the priority for IPM of damping-off diseases, the alternative measures available on the market are not always effective and/or their effectiveness is variable. Therefore, judicious use of fungicides may need to be combined with other IPM tactics, especially when infection has already occurred (Harman 2000).

Chemical control of damping-off by foliar application, however, is restricted to a few active ingredients owing to the high cost of fungicides and the small number of products registered for some crops, including those for ornamental use (Garzón et al. 2011). Among the most frequently used fungicides are etridiazole and metalaxyl, active against Phytophthora and Pythium spp.; benomyl and thiophanate-methyl, active against Fusarium and Rhizoctonia spp.; mancozeb and maneb, active against Fusarium and Pythium spp.; and captan, active against common damping-off pathogens. A rapid decrease in the market availability of many previously available fungicides further limits access to chemical treatments in many countries, especially in the European Union (Lamichhane et al. 2016). On the other hand, resistance to commonly used fungicides developed by several strains of pathogens has challenged the long-term sustainability of chemical control (Taylor et al. 2002; Allain-Boulé et al. 2004; Moorman and Kim 2004; Reeleder et al. 2007; Weiland et al. 2014). All these new scenarios clearly indicate that non-chemical measures will be increasingly developed and used for the management of damping-off, particularly post-emergence damping-off. This trend is also clear in the literature, where most recent research efforts focus on the development of biocontrol rather than chemical solutions (Tables 2 and 5). This is due to general concerns regarding conventional pesticides, but also because private and public sectors can design new solutions for the so-called “biocontrol market.” However, even with biocontrol solutions, diagnosis of the pathogens involved, along with analysis of the treatment opportunity, is still required.

4 Key challenges and future priorities for damping-off management

In order to tackle the complex and multifaceted nature of damping-off diseases and a range of factors that affect their occurrence and development, we propose five research priorities, which are essential towards a better understanding and management of damping-off diseases.

4.1 Correct identification of damping-off pathogens including non-secondary colonizers and anastomosis groups

An accurate identification of the causal agent(s) associated with damping-off is imperative for understanding the etiology of damping-off outbreaks and thus represents a cornerstone of the decision-making process for IPM. This involves confirming the identity of the pathogen, learning how it spreads, and then identifying critical points for its management, including the development of preventive measures based on adapted cropping practices. Most often, the specific pathogen causing damping-off cannot be determined based on visual inspection of symptoms. Therefore, correct identification is essential. It is generally performed using both culture-based and culture-independent methods. However, both of these approaches have advantages and drawbacks and hence are complementary. For example, culture-based techniques allow the characterization of important traits such as virulence or fungicide resistance. However, not only are they time consuming, but they also underestimate the true diversity of species present within a sample (Zinger et al. 2012; James 2012b; Bik et al. 2016). Culture-independent methods, such as next-generation sequencing, on the other hand, allow the overall species diversity present in a given sample to be identified, but they do not allow the virulence and fungicide resistance of the microbes associated with the disease to be determined (Lamichhane and Venturi 2015).

Although many modern PCR techniques allow rapid detection and identification of one or more specific pathogens, including those reported to cause damping-off diseases (Weiland and Sundsbak 2000; Lievens et al. 2006; Ishiguro et al. 2013), the timely identification of the overall species diversity involved in the disease occurrence process still remains a challenge. In addition, such techniques require DNA purification, more expensive and sophisticated equipment, and more highly trained technical personnel to perform the test (Schroeder et al. 2012), which strongly limits their wider adoption. Therefore, techniques that both simplify detection and are economically sustainable still need to be developed.

All four soil-borne pathogens dealt with in this paper are characterized by complexes of genetically distinct species, with a wide host range or a preference for certain hosts. For example, Rhizoctonia solani Kühn (teleomorph: Thanatephorus cucumeris (A.B. Frank) Donk) is a multinucleate species that has been divided into 14 anastomosis groups (AGs): AG1 to AG13 and AG B1 (Sneh et al. 1991; Carling and Summer 1992; Carling et al. 2002). Binucleate Rhizoctonia spp. (teleomorph: Ceratobasidium) are divided into 19 AGs (AG A to AG S). Finally, R. oryzae and R. zeae are multinucleate, with the teleomorphs Waitea circinata var. circinata and W. circinata var. zeae, respectively (Sneh et al. 1991). Given the within- and between-AG variation in virulence and host range, correct and timely identification of the specific genetic lines associated with damping-off diseases is still a challenge, which calls for further research efforts.

Fusarium spp. are characterized by wide genetic diversity, and their taxonomy has been afflicted by changing species concepts, with as few as 9 to over 1000 species being recognized by different taxonomists during the past 100 years (Summerell et al. 2003). Indeed, the complexity and recognized difficulty of rapidly identifying cultures to species level have been reported as the major reasons hindering effective disease management (Summerell et al. 2003). A further challenge within the Fusarium species complex is to determine the specific role of secondary colonizers in the occurrence and development of damping-off diseases, since they are characterized by high variability and complexity in terms of host range and virulence.

Similar problems exist for Pythium species, with most plant-pathogenic lines having a wide host range. For example, Pythium ultimum is reported to attack over 719 host plants (Farr and Rossman 2012). Other species, such as Pythium graminicola and Pythium arrhenomanes, are restricted to the Poaceae family (Schroeder et al. 2012). Traditional baiting and other culture-based techniques are still widely used for the identification of Pythium species, although culture-independent methods, such as cytochrome oxidase subunit 1 pyrosequencing, are also used (Coffua et al. 2016). The challenge is that methodological biases inherent to culture-independent methods may often lead to inconsistencies in diversity estimates of Pythium species associated with damping-off diseases. Nevertheless, culture-based techniques are the only means to demonstrate, for example, the presence of potential pathogens even in fields with no previous history of damping-off diseases. Indeed, using culture-based techniques, several studies have isolated Pythium species from symptomatic and asymptomatic plants and demonstrated their pathogenicity on a large number of plant species (Bahramisharif et al. 2013b; Coffua et al. 2016). Further, culture-based methods and morphological observations may still prove essential in confirming the presence of novel or unexpected species within a sampling location and thus have to be considered for identification purposes (Zitnick-Anderson and Nelson 2014).

The taxonomic complexity is even more accentuated for the genus Phytophthora, with many studies in recent years recognizing different Phytophthora taxa as species complexes. Often, the taxonomic status of related species is a matter of controversy, or several distinct lineages are present that perhaps represent as yet undescribed species (Safaiefarahani et al. 2015). New species of Phytophthora are constantly being proposed and the taxonomy of this genus is evolving very dynamically (Henricot et al. 2014). Consequently, the development of rapid and reliable diagnostic methods is a challenging task for this genus too.

Because most damping-off pathogens are soil- or water-borne rather than airborne, the adoption of good phytosanitary practices generally helps manage damping-off diseases. This is especially the case when the causal agent(s) are detected in a timely manner. Such detection also helps identify the critical management points at which pathogens enter the field and/or nursery. The mode of transmission may differ for each pathogen, although spread in infected soil or growing medium is common to all species (Table 6). Because most of these pathogens are common in agricultural soils, they can be spread via contaminated soil, introduction of infected plants (mainly in the case of seed-borne pathogens), improperly sanitized equipment and greenhouses, and the use of contaminated irrigation water (Zappia et al. 2014). In particular, Pythium spp. and Phytophthora spp. have motile zoospores, which are most commonly spread by water, leading to epidemic development (Hong and Moorman 2005; Zappia et al. 2014). Therefore, the potential presence of these pathogens in irrigation water should be determined in a timely manner using appropriate bioassays such as in situ baiting (Ghimire et al. 2009) or PCR techniques (Martin et al. 2012; Schroeder et al. 2012). Appropriate water treatments should be implemented if their presence in irrigation water is confirmed. Detection and management approaches for plant pathogens in irrigation water have been described previously (Hong and Moorman 2005; Stewart-Wade 2011; Zappia et al. 2014).

Table 6 Mode of transmission of major causal agents of damping-off and soil and climate conditions favorable to their development. Any IPM approach should consist of the adoption of cropping practices, including cultivar choice and chemical control, that discourage the factors favoring damping-off

4.2 Determination of potential interactions within and/or between damping-off pathogens and other living organisms

Plant disease occurrence and development are determined by numerous interactions between host, pathogen, and prevailing environmental conditions, especially the biocenosis, under the influence of cropping practices. This is especially the case for soil-borne pathogens, for which there are many possibilities for interactions with other microorganisms/agents occupying the same ecological niche. A number of recent studies have reported significant interactions within and/or between several damping-off pathogens and other pathogenic organisms (Table 7). Such studies have emphasized that co-inoculation of two or more pathogens consistently causes more detrimental effects on root development than either pathogen alone. These findings will guide future research on damping-off diseases, including studies of the genetic diversity within species, the epidemiological and ecological features of the disease, and host-pathogen interactions, and will ultimately help develop durable and sustainable damping-off management practices.

Table 7 Selection of synergistic interactions within and/or between damping-off pathogens and other pathogenic organisms reported since 2000

Although our understanding of the individual genetic lines of microbes causing damping-off has increased over the years, there is a severe knowledge gap about how synergistic interactions between two or more genetic lines belonging to the same or different fungal genera or other pathogenic agents can lead to the occurrence and spread of damping-off diseases. Therefore, a focus on understanding such interactions would be another direction for future research, which is pivotal for the development of effective disease management strategies (Lamichhane and Venturi 2015).

4.3 A better knowledge of the role of abiotic factors that predispose seeds and seedlings to damping-off diseases

Overall, while there is good knowledge in the literature concerning the role of individual abiotic factors in damping-off (especially soil moisture and temperature), little is known about how interactions between abiotic and biotic factors lead to the occurrence of such diseases. The few works performed on abiotic stresses have highlighted that a number of abiotic factors predispose seeds or seedlings to damping-off pathogens and increase the severity of infection. This is mainly due to certain soil and climate factors that restrict normal seed and root growth and development (Burke et al. 1972a, b). In particular, wet (e.g., due to poor drainage or overwatering) and cool soils, together with cool to moderate air temperatures, are particularly favorable for the development of key damping-off pathogens (Table 3). Key predisposing factors that trigger the development of Fusarium, Rhizoctonia, Pythium, and Phytophthora species are reported in Table 6.

Soil characteristics, including aggregate size and texture, markedly affect seedling emergence. Aggregate size influences how soil water content changes over time, seed-soil contact, the path of the seedling to the soil surface, and the rate of soil surface degradation by rainfall. Larger soil aggregates also represent mechanical obstacles for seedlings (Dürr and Aubertot 2000). Soil compaction is another factor causing stress in plants, especially where mechanized crop production is practiced, resulting in reduced root development (Allmaras et al. 1988; Harveson et al. 2005). Excessive soil compaction decreases porosity, degrades soil structure, and can impede water movement and root growth, thereby predisposing seeds or seedlings to biotic stresses.

Higher salinity levels have been reported to trigger damping-off diseases. A recent study found evidence of a synergistic interaction between salinity stress of seeds or seedlings and salinity-tolerant Pythium species (Al-Sadi et al. 2010). Other studies have shown enhanced disease development on a number of crops due to higher salinity levels (Rasmussen and Stanghellini 1988; Sanogo 2004; Triky-Dotan et al. 2005). Likewise, heat developing just above the ground line can stress or damage seedlings (Helgerson 1989).

4.4 Development of disease-suppressive seedbed soils with or without conservation agriculture

Suppressive soils provide an environment in which plant disease development is reduced, even in the presence of a pathogen and a susceptible host (Hadar and Papadopoulou 2012). Although several studies have reported the potential of disease-suppressive soils, their practical application is still limited. The reason is the lack of reliable prediction and quality control tools for assessing the level and specificity of the suppression effect. This is especially true given the very complex soil environment, with a high level of dynamic complexity and interactions occurring among microbes, plants, and the environment (Lemanceau et al. 2015). More specifically for damping-off, a measure that suppresses the development of one damping-off pathogen may not provide satisfactory suppression of another, thereby calling the durability of this approach into question. Indeed, a soil suppressive to one pathogen may not necessarily be suppressive to another, owing to specificity in the soil-plant-microbe interactions (Whipps 2001). Therefore, the creation of disease-suppressive seedbed environments that discourage the development of most damping-off pathogens is a challenging task for research. A previous study (Bonanomi et al. 2007) reported variable suppressive effects of organic amendments, although in most cases such materials provided effective disease suppression. Another concern is that the suppressive effects of certain amendments, such as composts, are lower and more variable when they are applied in the field than in container media (Noble and Coventry 2005).

The difficulties in evaluating the level and specificity of the suppression effect can, however, be addressed, at least to some extent, using modern methods of analyzing microbial community structures, including metagenomics. The latter allow the identification of both culturable and non-culturable microorganisms and thus provide important insights to help define the key organisms or groups of organisms that exert natural suppression of damping-off pathogens. However, there is a lack of ecological theory to guide research in microbial ecology in complex environments such as soil, which hinders hypothesis-driven research and the interpretation of metadata, especially when dealing with compost and compost-amended environments (Prosser et al. 2007; Hadar and Papadopoulou 2012). In particular, our knowledge is still poor concerning why there are numerous cases of compost-mediated disease suppression but no or rare cases of suppressive soils at local levels (i.e., under field conditions). To address this question, a recent study identified common traits that have been regarded as potential indicators of suppression (Hadar and Papadopoulou 2012). A better understanding of these indicators may help develop disease-suppressive soils, thereby contributing to damping-off management.

Because a soil suppressive to one pathogen may not always be suppressive to another, there is a need for individual evaluation of compost products for specific pathosystems and for the development of standardized compost production and storage protocols. At the same time, greater focus is needed on the development of suppressive soils under field conditions, by optimizing already existing compost-based amendments or by combining every tool and/or strategy that helps enhance a disease-suppressive soil environment. This includes manipulation of the physicochemical and microbiological environment via best management practices and biological control using organisms such as Trichoderma spp. (Tables 2, 3, and 4).

Overall, there is a paucity of information in the literature concerning how conservation agriculture may affect damping-off diseases, although predictions can be made from traditional epidemiological knowledge. Because the major damping-off pathogens discussed in this paper have a broad host range, crop residues retained on the soil surface may be a nutrient (food) source for the pathogens after harvest, and cover crops may act as a potential reservoir of these pathogens (intermediate hosts) (Bockus and Shroyer 1998; Cook 2001). For example, in areas with infected crop residues, infected seeds contribute a rather small part of the inoculum, as seeds and seedlings can be infected during their development. In addition, no-till fields maintain more surface residues than conventional-till fields, at least early in the season (Lindstrom and Onstad 1984; Govaerts et al. 2007), which means more moisture (Blevins et al. 1971; Power et al. 1986), a condition that favors the development of damping-off pathogens (Schmitthenner and Van Doran 1985). Further, while crop residues may act as a physical barrier and prevent pathogens from being spread through soil movement by wind, water, or agricultural equipment, such effects may not apply to damping-off pathogens given their soil-borne nature.

The little information available in the literature shows that conservation agriculture may have variable effects on damping-off pathogens. For instance, reduced tillage has been reported to have both negative (Dick and Van Doran 1985; Schmitthenner and Van Doran 1985; Adandonon et al. 2004) and positive (Tachibana 1983; Rovira 1986; Cook and Haglund 1991; Paulitz et al. 2002; Govaerts et al. 2007) effects on the development of damping-off pathogens. A previous study (Workneh et al. 1998) demonstrated the recovery of Phytophthora sojae at greater frequency near the soil surface in no-till fields than in conventional-till fields. This suggests that the potential development of damping-off diseases may be greater in no-till fields than in conventional-till ones. However, Schillinger et al. (2010) demonstrated that when a no-till regime was included in a conservation agriculture approach (i.e., together with a more complex rotation and permanent soil coverage), the incidence of Gaeumannomyces graminis var. tritici decreased in comparison with continuous annual winter wheat, independently of soil management. As for R. solani, no grain yield loss was observed in any of the treatments applied, although the pathogen was more pronounced in the no-till treatments. Very similar results were obtained by other authors working on several cereal pathogens (Matusinsky et al. 2009; Paulitz et al. 2009).

However, it is worth highlighting that most of these studies were based on short-term experiments, and we do not know how direct seeding affects damping-off diseases over longer periods of time. Moreover, most of the results come from research on partially applied conservation agriculture systems, whereas it is well known that the full benefits of conservation agriculture are delivered only when its three principles are applied together for several years (Farooq and Siddique 2015). Hence, more research efforts, based on long-term experiments, are needed to better elucidate the effects of conservation agriculture on these pathogens, which may differ case by case.

4.5 Modeling to help design integrated management strategies of damping-off diseases

Despite the several benefits they provide, simulation studies have rarely been performed to understand seed germination and seedling emergence. However, a model called SIMPLE (SIMulation of Plant Emergence) was previously developed and used to predict the effects of the main physical factors within the seedbed, including soil temperature and water potential, as well as mechanical obstacles to germination and emergence (Dürr et al. 2001). A few subsequent studies used the same model to evaluate the effects of sowing conditions, including sowing date, sowing depth, and seedbed preparation, or of seed lot characteristics (Dorsainvil et al. 2005; Moreau-Valancogne et al. 2008; Constantin et al. 2015). The SIMPLE model was also used to analyze the extent to which plant genetic diversity affects seedling emergence rates under a wide range of environmental conditions (Brunel-Muguet et al. 2011; Dürr et al. 2016).
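The SIMPLE model itself is not reproduced here, but the general logic of predicting germination from seedbed conditions can be illustrated with a minimal thermal-time sketch in Python; the base temperature, thermal-time target, and water-potential threshold below are hypothetical placeholder values, and the formalism is deliberately simplified relative to SIMPLE.

```python
import numpy as np

# Illustrative thermal-time calculation: germination is assumed complete once
# cumulative degree-days above a base temperature reach a target, counting only
# days when the seedbed water potential is above a germination threshold.
# All parameter values are hypothetical, not those of the SIMPLE model.
T_BASE = 3.0           # base temperature (deg C), hypothetical
TT_GERMINATION = 80.0  # thermal time to germination (deg C days), hypothetical
PSI_MIN = -1.0         # minimum water potential allowing progress (MPa), hypothetical

def day_of_germination(daily_mean_temp, daily_water_potential):
    """Return the first day on which cumulative thermal time reaches the target,
    or None if it is never reached within the simulated period."""
    thermal_time = 0.0
    for day, (temp, psi) in enumerate(zip(daily_mean_temp, daily_water_potential), start=1):
        if psi >= PSI_MIN:                        # seedbed moist enough to progress
            thermal_time += max(0.0, temp - T_BASE)
        if thermal_time >= TT_GERMINATION:
            return day
    return None

# Example: a cool seedbed (12 deg C) with one dry day per week
temps = np.full(30, 12.0)
psis = np.where(np.arange(30) % 7 == 0, -1.5, -0.3)
print(day_of_germination(temps, psis))
```

In a SIMPLE-like model, functions of this kind are combined with descriptions of seedbed structure and mechanical obstacles to simulate the full germination-to-emergence sequence.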

Little effort has so far been devoted to modeling damping-off diseases. Early epidemiological modeling approaches were conducted to describe soil-borne diseases mathematically as a function of inoculum density (Baker 1971; Grogan et al. 1980). These approaches always relied on data sets obtained from experiments in which one or, more rarely, several factors were varied. For instance, Burdon and Chilvers (1975) analyzed and modeled the impact of clumped planting patterns on epidemics of damping-off (Pythium irregulare) in cress seedlings as a function of the number of clumps per unit area. Similar modeling approaches were used to model soil suppressiveness to R. solani (Wijetunga and Baker 1979). Often, these approaches linked observed data to simple theoretical epidemiological models fitted to describe disease epidemics. Gilligan (1983) proposed a typology of the early modeling approaches in the field of soil-borne epidemiology: models for primary infection (rhizosphere models, surface density models, probability models); models for secondary infections (for three types of pathogens: unspecialized pathogens such as damping-off and non-ectotrophic root-rotting fungi, specialized ectotrophic pathogens, and specialized systemic pathogens); and models for disease progress (growth curve analysis with non-linear models such as those proposed by van der Plank (1963), epidemiological models embedding host growth, multivariate methods, and computer simulations). Otten et al. (2003) proposed a simple susceptible-infected (S-I) compartmental model of transmission rates for soil-borne epidemics as a function of primary inoculum density (R. solani) and the number of plant-to-plant contacts; it was later extended to take soil suppressiveness into account (Otten et al. 2004). More recent works have modeled the impact of crop sequence on attacks of Fusarium oxysporum f. sp. cepae (Leoni et al. 2013) and the impact of climate change on six soil-borne fungal plant pathogens, using a generic model combined with temperature-response data obtained in controlled chambers and a soil humidity model (Manici et al. 2014).
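As an illustration of the compartmental formulations cited above, the sketch below integrates a simple susceptible-infected (S-I) system in which new infections arise both from primary soil inoculum and from secondary plant-to-plant spread; parameter values are hypothetical and the structure is only loosely inspired by, not a reimplementation of, the models of Otten et al. (2003, 2004).

```python
# Minimal S-I sketch for a soil-borne damping-off epidemic: susceptible seedlings (S)
# become infected (I) either through primary infection from soil inoculum (rate rp)
# or through secondary, plant-to-plant spread (rate rs). All parameters are hypothetical.
def simulate_epidemic(n_plants=100, inoculum=5.0, rp=0.002, rs=0.01,
                      inoculum_decay=0.05, days=30, dt=0.1):
    S, I = float(n_plants), 0.0
    t = 0.0
    while t < days:
        new_infections = (rp * inoculum + rs * I) * S * dt
        new_infections = min(new_infections, S)   # cannot infect more plants than remain
        S -= new_infections
        I += new_infections
        inoculum *= (1.0 - inoculum_decay * dt)   # primary inoculum decays over time
        t += dt
    return S, I

S_end, I_end = simulate_epidemic()
print(f"Healthy seedlings after 30 days: {S_end:.1f}, infected: {I_end:.1f}")
```

Extending such a skeleton with seedbed physics (as in SIMPLE-type models) and with multiple interacting pathogens is one way the conceptual framework discussed below could be made operational.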

Because there is a lack of tools to help design integrated management strategies for damping-off diseases, frameworks derived from the conceptual model presented in Fig. 8 would be very useful. Such models should integrate the impact of cropping practices and weather on the physical and chemical components of the seedbed, along with their impact on the primary inocula of multiple damping-off pathogens and on antagonistic microorganisms. As shown in Fig. 8, interactions between cropping practices and production situations are numerous, and such models should integrate mechanisms as parsimoniously as possible. For instance, the abovementioned SIMPLE model could be used as a basis to develop such models since it already integrates the major abiotic stresses (thermal, hydric, and mechanical).

Fig. 8

Generic conceptual model that represents the impact of cropping practices and weather on biotic and abiotic stresses affecting seed germination and seedling emergence

Future research, combining experimental and modeling approaches, should focus on a better understanding of the role of abiotic stresses in damping-off diseases. In addition, diagnoses of commercial fields with various levels of damping-off symptoms could also help analyze the effects of interactions between cropping practices and production situations on the biotic and abiotic drivers of damping-off. The models developed would thus significantly improve our understanding of the critical interactions between biotic and abiotic factors that affect damping-off diseases and would help design integrated management strategies for damping-off.

5 Conclusions and perspectives

The great economic importance of damping-off diseases and the increasing interest in finding sustainable solutions to this problem imply that opportunities exist to develop IPM strategies. Achieving this outcome will require a greater understanding of the ecology, genetics, and pathogenicity of the microbes associated with the disease. Research should focus on critical niches of complexity, such as the seed, the seedbed, associated microbes, and their interfaces, for which innovative and robust experimental and modeling approaches are needed. In particular, the development and validation of new simulation models, or the improvement of existing ones, may prove useful.

Legislative pressure, fueled by public concern over the use of conventional pesticides in agriculture, requires that alternatives to conventional pesticides be developed and applied for durable and sustainable disease management. Nevertheless, management of damping-off appears to be less straightforward than one might expect. Given that several pathogenic organisms interact and cause damping-off, it is fundamental to have prior knowledge of the interactions concerned, as even a very low population density of soil-borne pathogens can lead to severe epidemic development. Consequently, the prevention or containment of one pathogen may not resolve the problem of the interaction. Therefore, there is a remarkable need for a better understanding of the interactions between plants, the environment, and resident microbial agents/communities, under the influence of cropping practices. The information reported in this paper underlines the necessity of understanding such complex relationships, which is essential for an effective decision-making process in damping-off disease management.