Introduction

Fusarium crown rot and common root rot are significant diseases of cereal crops in arid and semi-arid grain-growing regions worldwide. Fusarium crown rot is caused by numerous Fusarium species including Fusarium pseudograminearum, F. culmorum, F. cerealis (syn. F. crookwellense) and F. graminearum (Backhouse et al. 2004). The disease results in stem browning and rotting of the crown and sub-crown internode (Smiley et al. 2005). Common root rot is caused by the fungus Bipolaris sorokiniana, which produces dark lesions on the sub-crown internodes and roots of cereal plants, stunting plant growth. Epidemics of these diseases, particularly crown rot, can be a significant threat to food security in developing countries (Petronaitis et al. 2021). In Australia, the estimated annual yield losses in wheat alone exceed AUD 90 million for Fusarium crown rot and AUD 30 million for common root rot (Murray and Brennan 2009).

Despite continuous research and the development of crop protection strategies, the impacts of stubble-borne diseases of cereals in Australia have increased over the past four decades (Simpfendorfer et al. 2019). This increase has been associated with the adoption of conservation-agriculture practices, such as stubble retention (Ugalde et al. 2007). Previous management options such as tillage and burning were effective at reducing disease inoculum but are incompatible with current conservation-agriculture practices. Disease management strategies, such as inter-row sowing or non-host crop rotation, have since become industry standard (Kirkegaard et al. 2004; Verrell et al. 2017). These strategies rely on time to reduce inoculum levels, limiting new infections in the short term through non-host rotation or reduced contact with inoculum. Even so, disease often recurs because background inoculum persists or because growers return to susceptible crops and/or cultivars prematurely to maximise short-term profits. Thus, complete elimination of inoculum from cereal stubble and soil is needed to break the disease cycle.

Heat treatment is an effective method for eliminating pathogens from cereal stubble and soil, but there are currently limited options that suit conservation-agriculture systems. Solarisation can reduce the quantity of DNA of F. pseudograminearum in soil by 98.7% and reduces survival in cereal stubble to 5.8% (compared to 61.9% in non-solarised stubble) (Bottomley et al. 2017). However, the significant treatment time (e.g. 3 months) and substantial resources needed (application of plastic tarps) render it unsuitable for broadacre cropping. A faster and more convenient solution is to burn cereal stubble, which was common practice in Australia until 20–30 years ago (Ugalde et al. 2007). The negative environmental impacts of burning cereal stubble, and the associated long-term yield loss, have reduced the popularity of this practice despite it being an effective disease management strategy (Burgess et al. 2001). The potential of novel strategies, such as microwave radiation, to destroy inoculum whilst retaining cereal stubble is therefore worthy of investigation.

Materials can be heated when they interact with the alternating electric field produced by microwave radiation. Such heating has been shown to destroy plant pathogens; for example, it successfully eliminated F. pseudograminearum mycelia from infected durum wheat stubble (Petronaitis et al. 2018). Similarly, populations of various seed-borne cereal pathogens, including Fusarium species, have been reduced using microwave radiation, but at the expense of seed viability (Knox et al. 2013). Although these studies are useful as proof of concept, none has identified the exact amount of energy required to deliver a fatal dose of microwave radiation to different cereal pathogens, which is important when considering the feasibility of field application.

Fungal spores, such as the conidia of B. sorokiniana, may require varying doses of microwave radiation before viability is affected. Bipolaris sorokiniana produces large, thick-walled conidia that can survive dormant in soil, on cereal stubble and on grain for up to four years (Duczek et al. 1996; Stack 1995). Although the effect of microwave radiation on B. sorokiniana has not previously been investigated, the approach is promising because B. sorokiniana can be successfully eliminated from barley seed using other heating methods, such as dry heat treatment (Couture and Sutton 1980).

Given that microwave radiation could offer faster and more effective control of different pathogens in cereal stubble and soil, microwave dose-response experiments were conducted on conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis. We hypothesised that conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis would be susceptible to microwave radiation, as these fungi (or close relatives) can be heat-killed. If confirmed, this knowledge will be useful in designing methods for broadacre-scale microwave application against all three pathogens.

Materials and methods

Isolate collection

Three microwave dose-response experiments were conducted, one per pathogen, to determine the energy required for heat-kill of conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis using microwave radiation. The B. sorokiniana isolate (accession DAR 84144; NSW Biosecurity Collections, Orange NSW) was collected post-harvest from naturally infected wheat stubble (cv. Suntop) at Rowena, New South Wales in 2018. The F. pseudograminearum isolate (accession DAR 84148) was collected post-harvest from naturally infected wheat stubble (cv. unknown) at Temora, New South Wales in 2017. The F. cerealis isolate (accession DAR 84851) was collected post-harvest from naturally infected wheat stubble (cv. EGA Gregory) at Coolah, New South Wales in 2017. Monosporic cultures of each isolate, grown for 7 days on ¼ potato dextrose agar (PDA), were flooded with sterile deionised water, and spores (conidia or macroconidia) were gently scraped off the surface and transferred to sterile deionised water. Suspensions were mixed by pipetting up and down 10 times prior to treatment, and again prior to serial dilution, to prevent spores settling out of suspension.

Microwave radiation dose-response experiments

Spore suspensions (10 mL) in 30 mL glass beakers were placed in the centre of a commercial microwave oven (1100 W SHARP R-330E at 2450 MHz) and microwaved at full power for 4, 5, 6, 7, 8, 9 or 10 s. A control treatment consisted of non-microwaved (0 s) spore suspensions. One replicate of each treatment (microwave exposure time) was applied within a single day. Each experiment was conducted across four replicate days, with the order of treatments randomised within each replicate day. For the F. pseudograminearum and F. cerealis experiments, duplicates of the 5 and 7 s exposure times were added within each replicate day to provide increased precision in regions of the dose-response where rapid changes in spore mortality were expected based on a pilot study (data not shown); a sketch of this treatment structure is given below.
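As a minimal sketch of this design in R, for one of the Fusarium experiments (the seed and layout are illustrative only; the exposure times and duplication are taken from the text):

```r
# Treatment structure for one Fusarium experiment: exposure times in seconds,
# with the 5 s and 7 s treatments duplicated within each replicate day
treatments <- c(0, 4, 5, 5, 6, 7, 7, 8, 9, 10)

set.seed(1)  # for a reproducible example only

# Randomise the order of treatments within each of the four replicate days
design <- lapply(1:4, function(day) sample(treatments))
names(design) <- paste0("day_", 1:4)
design
```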

The temperature and weight of each spore suspension were measured immediately before and after microwave treatment. Temperature was measured using a multi-meter with probe attachment (MM400, Klein Tools, Lincolnshire USA). A 10 mL water blank (non-microwave-treated deionised water) was used as a calibration baseline by taking temperature and weight measurements at the same times as for the microwave-treated suspensions. The amount of energy (J g−1) applied to each sample and to the water blanks was calculated using the following energy balance equation:

$$\mathrm{Energy}\ (\mathrm{J\,g^{-1}}) = \frac{\left(\mathrm{mass\ of\ water\ (g)} \times 4.18 \times \Delta T\right) + \left(2260 \times \mathrm{water\ loss\ (g)}\right)}{\mathrm{mass\ of\ water\ (g)}}$$

where ∆T is the temperature difference before and after microwaving, 4.18 is the specific heat capacity of water (J g−1 °C−1), and 2260 is the latent heat of vaporisation of water (J g−1). The calculated energy of the water blank was then subtracted from the energy of each sample to give the energy per gram of treated sample (J g−1). Microwave power (W g−1) was calculated by dividing the energy per gram of treated sample (J g−1) by the heating time (seconds microwaved minus a 3 s magnetron delay).
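As an illustration, the following R sketch implements this energy balance and power calculation; all measurement values shown are hypothetical, not the study's data:

```r
# Energy balance for one microwaved sample (illustrative values only)
specific_heat <- 4.18    # specific heat capacity of water (J g^-1 degC^-1)
latent_heat   <- 2260    # latent heat of vaporisation of water (J g^-1)

energy_per_gram <- function(mass_g, delta_T, water_loss_g) {
  (mass_g * specific_heat * delta_T + latent_heat * water_loss_g) / mass_g
}

# Hypothetical 10 g suspension heated from 22 to 55 degC, losing 0.15 g water
sample_energy <- energy_per_gram(mass_g = 10, delta_T = 33, water_loss_g = 0.15)

# Hypothetical water blank measured alongside the sample
blank_energy <- energy_per_gram(mass_g = 10, delta_T = 1.5, water_loss_g = 0.01)

# Net energy absorbed by the treated sample (J g^-1)
net_energy <- sample_energy - blank_energy

# Microwave power (W g^-1): heating time is seconds microwaved
# minus the ~3 s magnetron delay (here, a 7 s exposure)
power <- net_energy / (7 - 3)
```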

Once cooled to 22 °C, spore suspensions were serially diluted and cultured on ¼ PDA plus novobiocin (10 g PDA, 15 g technical agar and 0.1 g novobiocin L−1 water), then incubated under alternating ultra-violet light (12 h light/12 h dark) at 25 °C, with plates kept in the randomised order in which the samples had been exposed to microwave radiation. After four days, spore viability was assessed by counting colony forming units (CFU).
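For context, plate counts are conventionally back-calculated through the dilution series to a concentration; a minimal sketch with hypothetical values (the dilution factors and plated volumes are not reported here):

```r
# Standard back-calculation from a countable plate to CFU per mL of the
# original suspension (hypothetical values; not reported in the text)
colonies        <- 42     # colonies counted on one plate
dilution_factor <- 1e3    # e.g. three ten-fold serial dilutions
plated_volume   <- 0.1    # mL spread per plate

cfu_per_ml <- colonies * dilution_factor / plated_volume
cfu_per_ml  # 420,000 CFU per mL of undiluted suspension
```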

Statistical analyses

Logistic dose-response curves were estimated to describe the relationship between the response variable, CFU, and each of two explanatory variables: energy per gram of treated sample and temperature of the sample after microwaving. Separate models were fit for each explanatory variable and each experiment, resulting in six models in total. In each case, a four-parameter log-logistic model was fit, enabling the estimation of parameters corresponding to the steepness of the dose-response curve, the lower and upper asymptotes of the response, and the lethal dosage (LD) LD50, which is the value of the explanatory variable that results in a 50% reduction in the response from the upper asymptote (Ritz et al. 2015).
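For reference, a standard parameterisation of the four-parameter log-logistic (the form implemented by LL.4() in the drc package cited below; the exact parameterisation used in the study is an assumption) is:

$$f(x) = c + \frac{d - c}{1 + \exp\left\{ b\left[\log(x) - \log(e)\right] \right\}}$$

where $b$ is the steepness, $c$ and $d$ are the lower and upper asymptotes, and $e$ corresponds to the LD50.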

The LD95 and LD99 values, corresponding to the values of the explanatory variable that result in a reduction in the response of 95% and 99% from the upper asymptote respectively, were estimated from each model via inverse regression. Approximate standard errors of the LD values were obtained using the delta method (Ritz et al. 2015). All models were fit, and predicted quantities estimated, using the drc package (Ritz et al. 2015) in the R statistical computing environment (R Core Team 2019).
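A minimal sketch of this workflow with the drc package is given below; the data frame and its values are hypothetical, for illustration only:

```r
library(drc)

# Hypothetical dose-response data: recovered CFU against the energy
# absorbed by each spore suspension (J g^-1)
dat <- data.frame(
  energy = c(0, 0, 50, 75, 100, 125, 150, 175, 200, 250),
  cfu    = c(480, 455, 430, 360, 220, 130, 60, 25, 8, 2)
)

# Four-parameter log-logistic fit: steepness (b), lower asymptote (c),
# upper asymptote (d) and LD50 (e)
m <- drm(cfu ~ energy, data = dat, fct = LL.4())
summary(m)

# LD50, LD95 and LD99 via inverse regression, with approximate
# standard errors from the delta method
ED(m, c(50, 95, 99), interval = "delta")

# Fitted curve with a 95% confidence band, analogous to Fig. 1
plot(m, type = "confidence")
```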

Results

Microwave radiation dose-response of B. sorokiniana

The decline of B. sorokiniana conidial viability with increasing energy (Fig. 1A) and temperature (Fig. 1B) applied via microwave radiation approximately followed a sigmoidal response. The greatest rate of change in CFU occurred over a relatively large window, where final temperatures of spore suspensions reached approximately 40 to 60 °C (equivalent to 75 to 175 J g−1 of energy applied). However, Fig. 1A and 1B show that the data from this experiment lack some conformity to the logistic model. This is reflected in the large confidence intervals surrounding the fitted dose-response curve, along with the relatively large standard errors of the temperature and energy LD values (Table 1). The greatest lack of conformity with the logistic response was observed in the recovery of CFU over the range of 0–100 J g−1 of energy applied to the spore suspensions during microwave exposure (corresponding to a temperature range of 20 to approximately 45 °C). This variability in recovered CFU was also apparent in the non-microwaved control samples (Fig. 1A and 1B).

Fig. 1

Microwave radiation dose-response curves of three cereal pathogens: Bipolaris sorokiniana conidia response to energy (A) and temperature (B), Fusarium pseudograminearum macroconidia response to energy (C) and temperature (D), and F. cerealis macroconidia response to energy (E) and temperature (F). Cereal pathogen populations are represented as colony forming units (CFU) recovered from spore solutions following microwave radiation. Blue lines correspond to fitted logistic dose-response curves, while the shaded regions correspond to the 95% confidence intervals of each response

Table 1 Lethal dosage (LD) of energy and temperature required to achieve 50% (LD50), 95% (LD95) or 99% (LD99) mortality of conidia of Bipolaris sorokiniana, and macroconidia of Fusarium pseudograminearum and F. cerealis using microwave radiation. Values in brackets represent the approximate standard error of the LD estimate

Conidia of B. sorokiniana showed an initial response to increasing temperature similar to that observed for F. cerealis macroconidia (LD50 values were similar), but this was followed by a stretched curve, such that relatively high temperatures were required to achieve LD95 (61 °C) and LD99 (70 °C) for B. sorokiniana (Table 1). Reaching these LD values required 8–9 s (LD95) and 10 s (LD99) of microwave exposure time (data not shown).

Microwave radiation dose-response of F. pseudograminearum

As with B. sorokiniana, the recovery of F. pseudograminearum CFU declined as increasing energy and temperature were applied to spore suspensions (Fig. 1C and 1D). The F. pseudograminearum CFU began to decline following relatively short microwave exposures, reflected in low LD50 values for both energy and temperature (Table 1). In the energy model this translated to a very short upper asymptote, followed by a slow decline in CFU, resulting in a relatively large window between LD50 (78 J g−1) and LD95 (185 J g−1) (Fig. 1C and Table 1). The LD99 value fell beyond the range of applied energy and was therefore estimated with large uncertainty (Table 1). Variability in CFU recovery within the lower range of applied energy (for example, between 50 and 120 J g−1) likely contributed to the broader confidence bands and the failure to reach the lower asymptote within the range of applied energy.

Temperature appeared to be a more effective predictor of the decline in CFU for F. pseudograminearum, with strong conformity to the sigmoidal shape and clearly defined upper and lower asymptotes (Fig. 1D). The sharpest decline in CFU was observed when spore suspensions reached between 35 and 50 °C (Fig. 1D). Reaching these temperatures required microwave exposure times of 5 s (to reach LD50 at 40 °C) to between 6 and 7 s (to reach LD95) (data not shown). The LD99 for temperature on F. pseudograminearum CFU was reached after approximately 7 s of microwave exposure (data not shown), when the final temperature of the spore suspensions was approximately 57 °C (Table 1; Fig. 1D).

Microwave radiation dose-response of F. cerealis

As with B. sorokiniana and F. pseudograminearum, the recovery of F. cerealis CFU declined as increasing energy and temperature were applied to spore suspensions (Fig. 1E and F). Both F. cerealis response curves conformed well to the sigmoidal shape, with clearly defined upper asymptotes in both models. This suggests that F. cerealis macroconidia are initially relatively resistant to energy and/or temperature applied via microwave radiation but have a clear tolerance threshold beyond which rapid mortality occurs (a relatively small stretch in the curves between the upper and lower asymptotes was observed). Consequently, there was a relatively short window between LD50 and LD95 for both energy (within 34 J g−1) and temperature (within 6 °C) (Fig. 1E and F, Table 1). Reaching these doses required approximately 6 s (LD50) or 7 s (LD95) of microwave exposure (data not shown), and the clear lower asymptotes that followed confirmed eradication of the pathogen well within the range of observed data. The LD99 was thereby relatively low for energy (153 J g−1) but similar to F. pseudograminearum for temperature (both ~ 56 °C) (Table 1), and was achieved following microwave exposure times of 7–8 s (data not shown). Note that a lack of observations in the temperature model around the LD50 resulted in "knee" shapes in the plotted confidence intervals, due to uncertainty in the estimate of the model steepness parameter (Fig. 1F).

Discussion

This study is the first to demonstrate that B. sorokiniana conidia and F. pseudograminearum and F. cerealis macroconidia are susceptible to microwave radiation. Significant mortality was observed after exposing all three of these important cereal pathogens to microwave radiation, supporting the hypothesis that conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis are susceptible to microwave radiation. This presents new options for improved stubble-borne disease control and warrants further development of microwave radiation for treating infected stubble.

The lethality of microwave radiation to B. sorokiniana conidia and Fusarium macroconidia is likely to be thermal. The upper limit of heat tolerance for fungi is 60–62 °C (Tansey and Brock 1972), which aligns with the present study in that macroconidia of Fusarium species were non-viable when subjected to temperatures of ~ 56 °C (LD99) and above. Minimal recovery of B. sorokiniana conidia was observed when subjected to temperatures of 61 °C (LD95) and above. Achieving these temperature thresholds with a domestic (1100 W) microwave oven, at an exposure time of 10 s or less, is a substantial improvement on other heating methods. For example, much longer times at higher temperatures (60 h at 90 °C) were required when using dry heat to eliminate B. sorokiniana conidia from contaminated seed (Couture and Sutton 1980). Additionally, microwave exposure time could be reduced even further, depending on the power available, by using a commercial-scale generator to deliver the required energy over a much shorter period. This approach would better suit field-scale microwave application.

The present study is also the first to show that B. sorokiniana and F. cerealis are susceptible to microwave radiation. The susceptibility of several other Fusarium species to microwave radiation is well documented, including a range of Fusarium species in infected grain (Knox et al. 2013); F. pseudograminearum in cereal stubble (Petronaitis et al. 2021); and F. oxysporum f. sp. melonis in microconidial suspensions (Soriano-Martín et al. 2006). Only the latter study reported the microwave energy inputs required to kill the pathogen. They estimated that 24,000 J (equal to 400 J g−1 for a 60 mL total volume) was required to achieve complete mortality of F. oxysporum f. sp. melonis microconidia, which is more than double the energy estimated to achieve an LD99 for macroconidia of F. cerealis, but within the upper confidence interval of the LD99 for F. pseudograminearum. Energy estimates from the present study are overall lower than those reported by Soriano-Martín et al. (2006), which may be because the latter study used an estimate of applied energy (microwave power multiplied by set time) rather than the energy absorbed (the heating efficiency of the solution), as in this study. Given that the energy applied to a sample is not necessarily the energy absorbed by it (Brodie 2012), the method used in the present study is preferred for calculating microwave dose responses for spore suspensions. Furthermore, the use of LD values in this study will allow future comparisons between microwave radiation and other heat treatments, as LD values are a standard approach to analysing dose responses in microbiology.

Pathogens often co-infect cereal crops, and the combination of Fusarium species with B. sorokiniana is particularly common in central and northern NSW, Australia (Simpfendorfer and McKay 2019). It may be possible to reduce or eliminate all three pathogens from stubble simultaneously (e.g., following multiple-pathogen infection); however, this would require verification in experiments with multiple pathogens in the same solution. When applying microwave radiation to infected stubble, the energy dosage would need to be guided by the pathogen most tolerant of the heat applied, which in our case is likely to be B. sorokiniana. At the very least, suspensions of each of the three pathogens within the same experiment would be required to directly compare the dosage responses of all three pathogens.

The microwave dose responses estimated for B. sorokiniana must be interpreted with caution, as these experiments lacked some conformity to the logistic model, possibly due to large variability in CFU recovery in the control treatment and at lower energy dosages (i.e., < 100 J g−1). Still, it remains possible that B. sorokiniana has a higher temperature threshold than, for example, the average threshold defined by Tansey and Brock (1972). The conidia of B. sorokiniana have thickened outer cell walls containing melanin, which protect against solar and UV radiation (Toledo et al. 2017). One may speculate that these adaptations, which can prolong pathogen survival under natural conditions, may also provide additional protection against microwave radiation. Alternatively, the geometry (shape and size) of a heated material (in this case spores) can affect its interaction with microwave radiation (Brodie 2012). Bipolaris sorokiniana conidia are large and oval, and span a broad range of sizes depending on maturity (15–20 μm wide × 60–120 μm long) (Bockus et al. 2010), whilst Fusarium macroconidia are smaller, fusiform and more uniform in size (20–50 μm long) (Aoki et al. 2015). The greater variation in measured CFU in the B. sorokiniana experiment could result from the conidial suspensions consisting of spores of variable sizes that differ in their sensitivity to microwave radiation, whereby smaller, more sensitive spores are eradicated more quickly at lower energy dosages. Conversely, the greater conformity of the CFU response to the sigmoidal shape in the Fusarium experiments could be attributed to the more uniform size of Fusarium macroconidia.

The response curves for F. pseudograminearum and F. cerealis were quite different, although the LD values were similar in some instances. While the two Fusarium species cannot be directly compared in this study, any differences in their microwave responses are unlikely to be driven by conidial morphology. Both species are closely related B-clade fusaria with fusiform, symmetrical macroconidia, although F. pseudograminearum macroconidia are slightly smaller, with a mean width of 4–4.5 μm compared with ≥ 5 μm for F. cerealis (Aoki et al. 2015).

The non-uniformity of power output, and therefore of energy delivery, which is typical of domestic microwave ovens, must be considered a potential source of variation in these experiments (Tinga and Eke 2012). The magnetron delay (i.e., the initial period when the microwave is inactive) (Meredith 1998) was estimated at approximately 3 s but may have varied between samples (e.g. between 2 and 4 s). Ultimately, an improved dose-response was achieved in the Fusarium experiments by increasing replication in regions where rapid change was anticipated (i.e., duplicating the 5 and 7 s treatments). In future, a more detailed dose-response study could be implemented using further refined time intervals in a well-controlled microwave system, for example a solid-state microwave source that allows continuous variation between 1 W and 2 kW.

Both Fusarium species and B. sorokiniana are often (sometimes exclusively) present as hyphae within cereal residues. It has already been established that hyphae of F. pseudograminearum can be eliminated from infected durum wheat stubble using microwave radiation (Petronaitis et al. 2018), so hyphae of B. sorokiniana and F. cerealis are also likely to be susceptible. Ideally, a method that can assess the lethality of microwave radiation to pathogens within cereal stubble, whilst accurately predicting the energy inputs and LDs, is needed in addition to the microwave spore-response data presented here. In our study, the spores likely imbibed additional water from the surrounding solution. This, along with external heat transfer from the solution, is likely to have accelerated heating compared with spores under field conditions, because moisture content influences a material's ability to store and convert electromagnetic (microwave) energy to heat (i.e., the dielectric properties of the material) (Nelson 2010). Further, the dielectric properties of cereal stubble need defining to assess whether microwaving is an energy-efficient method of heating stubble, particularly at different moisture contents. Treatment efficacy can be greatly improved when cereal stubble is microwaved together with soil, due to the additional moisture provided by the soil (Petronaitis et al. 2018). Determining the most efficient stubble and soil moisture combinations for heating stubble will therefore be needed to inform the suitability of broadacre application.

The present study demonstrates that microwave radiation can heat-kill populations of several pathogenic fungi of cereals, and could therefore offer a more comprehensive disease control strategy than many of those currently available to growers. Microwave radiation of infected cereal stubble would also be compatible with beneficial conservation cropping practices. Some argue, however, that broadacre-scale application of microwave radiation is unrealistic, as it has been identified as one of the most expensive methods of alternative (i.e. non-chemical) knock-down weed eradication per hectare (using an estimate of 42 GJ ha−1) (Walsh et al. 2019). In its defence, the cost of microwave radiation is largely driven by fuel (i.e., energy) costs, which will change over time, particularly with the development of renewable energy. With the optimisation of microwave delivery systems, the energy requirement for knock-down weed control can be as little as 1.3 GJ ha−1, currently equal to AUD 150 to 200 per hectare (Brodie et al. 2018). Given that microwave radiation units for the control of weeds and weed seeds in broadacre applications are in advanced stages of development (Brodie et al. 2018), pathologists may be able to adapt this existing technology to reduce development time and cost.

Eradication of pathogens from cereal stubble is not possible using the disease control strategies presently available in conservation-agriculture systems; however, microwave radiation has been shown here to heat-kill conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis using relatively small energy dosages and short exposure times (10 s or less). Although field validation will be important, this study defines the specific energy requirements needed to support further development of microwave application for disease control.