Introduction

In the last few decades, climate change has significantly altered our environment, prompting action across many aspects of human livelihood. Agriculture is one of the main drivers of the man-made contribution to global warming through direct and indirect greenhouse gas (GHG) emissions (Eyring et al. 2021). Despite the currently high agricultural productivity in Central Europe, and especially in Germany, certain combinations of crop rotations and environmental conditions carry the risk of high nitrogen losses. Of particular relevance for the global climate are direct N2O emissions from soil nitrogen turnover as well as indirect emissions after nitrogen is lost through leaching or erosion (Canadell et al. 2021; Bijay-Singh and Craswell 2021).

A commonly practiced crop rotation in Germany that poses these risks is winter oilseed rape (WOSR) followed by winter wheat (WW). WOSR, as a crop with high N demand (typically fertilized at about 180 kg N ha−1) but rather low nitrogen use efficiency (NUE), often leaves high levels of soil mineral N after harvest (Dreccer et al. 2000; Ziesemer and Lehmann 2006; Bouchet et al. 2016), because N uptake stops relatively early and WOSR crop residues are not as efficient in immobilizing mineralized soil N as cereal straw. This N supply is barely used by the following winter wheat, whose pre-winter N uptake is approx. 20–30 kg N ha−1 (Henke et al. 2008). Together with N from crop residues and from stabilized organic matter mineralized in fall, this creates a high risk of N loss through leaching during the usually precipitation-rich fall and winter (Webb et al. 2000; Sieling and Kage 2006; Rathke et al. 2006).

Extreme weather events that are becoming more frequent as a result of climate change can increase these losses (Ranasinghe et al. 2021; Seneviratne et al. 2021). Unusually heavy rainfall in fall/winter, for example, increases the amount of percolation water and therefore absolute N leaching (Di and Cameron 2002; Zhou and Butterbach-Bahl 2014). On the other hand, persistently high soil moisture (up to saturation) can also lead to oxygen depletion and thus increased denitrification, which would counteract N mineralization and reduce both mineral N and probably the N concentration in the percolation water. Unfortunately, this would be accompanied by higher N2 and N2O losses. Anoxic conditions could be further intensified by the oxygen-sapping degradation of the introduced biomass.

Climate models also predict an increase of spring and summer droughts in central Europe (Jacob et al. 2014). Dry conditions in semi-arid arable soils at times of major N uptake hamper N supply from the soil, especially when mainly mineral N fertilizer is applied (Ullah et al. 2020). Therefore, N uptake by the crop is reduced. As a result, unused mineral N remains in the soil and is at risk of being lost during the next percolation period. Although predictions differ in frequency, duration and timing of these events (Kunz et al. 2017), a strong impact on agricultural productivity is to be expected (Bönecke et al. 2020). Some regional models even conclude that drought periods are likely to be followed by increased precipitation later in the year (Kunz et al. 2017), which in turn again enhances leaching. Although some crops may even benefit from rising temperatures and increased CO2 concentrations (Shaheen et al. 2022; Wang et al. 2023), the IPCC Sixth Assessment Report highlights the need for changes in land use practices and agricultural management techniques to reduce greenhouse gas emissions and mitigate the impacts of climate change on agricultural systems (IPCC 2021).

A lever for change is the management of crop residues after harvest. The incorporation of WOSR residues in fall prior to sowing of wheat is common practice in the crop rotation mentioned above: on the one hand, alternative uses of rapeseed straw, e.g. for animal feed or biogas production, are rare nowadays (Moss 2002; Lönnqvist et al. 2013); on the other hand, this practice is already nutrient efficient, as a certain part of the excess N is microbially immobilized when the straw is decomposed. Microbes consume mineral N from the soil if N availability in the decomposed substrate is limited (C:N ratio > 25; Verberne et al. 1990). In organic form (microbial biomass, extracellular enzymes), N is protected from translocation and thus probably also from leaching (Mitchell et al. 2001; Chen et al. 2014; Masunga et al. 2016). More recalcitrant crop residues with a higher immobilization potential could increase this effect and help mitigate N leaching by immobilizing excess N in fall and at re-wetting after a drought. A delayed release of the saved N at times of higher N demand would increase nutrient efficiency.

Given this, it is more important than ever to accurately forecast how systems will react to extreme weather events and to identify countermeasures that can help maintain conditions that sustain the current level of agricultural productivity. Therefore, the aim of this study was to assess the site-specific impact of residue management options after the harvest of WOSR with regard to mitigating the environmental impact of intensive agriculture, i.e. N leaching losses.

As it is non-trivial to transfer findings from a particular site, or lab-borne results, in time and space, we developed a modelling approach to assess the effects of an alternative crop residue with higher immobilization potential (cereal straw) and its interactions with environmental conditions. Specifically, we modeled three different residue management strategies after WOSR harvest at 10 sites using 25 years of real weather data. Separate analysis of periods of extreme conditions allowed us to evaluate the impact of different abiotic conditions on N immobilization/mineralization processes and the potential benefits for the N use efficiency of WW. The study made use of a plant–soil interaction model implemented in the HUME modelling environment (Kage and Stützel 1999; Kage et al. 2003; Henke et al. 2008; Räbiger et al. 2020).

The following hypotheses were investigated:

  • Compared to the common treatment (incorporation of WOSR residues), residues with a higher immobilization potential (e.g. cereal straw) buffer post-harvest mineral N increase and reduce leaching. Residue removal should result in the opposite (Reichel et al. 2018; Rothardt and Kage 2023; Chen et al. 2023)

  • The efficiency of this loss reduction increases with the quantity of percolation, e.g. at sites with high annual precipitation and in years with above-average percolation rates

  • Following impeded plant N uptake during spring drought, persistent immobilization can reduce or prevent high residual mineral N formation

Materials and methods

Site descriptions

The sites of interest were selected to cover as many different climate and soil conditions as possible as well as the most important arable farming areas throughout Germany. Table 1 along with the climate diagrams (Appendix Fig. 1) give a brief characterization of all sites.

Table 1 Key properties of the modeled sites. Sand and clay content of the topsoil texture from the Soil Survey Map 1:200,000 (BGR 2023); World Reference Base for Soil Resources (WRB) soil classification extracted from the Harmonized World Soil Database (Wieder et al. 2014); climatic data from the E-OBS raster data set (Cornes et al. 2018)

Model

The HUME modelling environment combines numerous sub-models to cover as many aspects of a land use system as needed to derive the targeted key parameters. The aim was to assess the impact of crop residue incorporation in a wide variety of environments and over multiple years. Main components were a climate-data-driven crop growth model quantifying transpiration and N uptake (Ratjen and Kage 2015), a mineralization sub-model for C and N turnover, and combined sub-models simulating vertical water transport, water losses via evapotranspiration, and N leaching. The latter was calculated on daily timesteps based on drainage volume and the \(\text{NO}_3^-\) concentration resulting from soil \(\text{NO}_3^-\) accumulation exceeding actual plant uptake (Di and Cameron 2002). The model was initialized and parameterized site-specifically as described below. The mineralization sub-model was based on the DAISY approach (Hansen et al. 1991), which calculates the dynamics of four organic C pools (SOM, BIOM, DPM and RPM, described further below) as first-order kinetics with corresponding N compartments and pool-specific turnover parameters (e.g. decay rates, turnover efficiency; for details see Appendix Table A.1). The pools represent stabilized soil organic matter (SOM), the soil's (micro)biome (BIOM), and organic input in the form of senescent plant material. The latter is divided into two compartments: decomposable plant matter (DPM, simple carbohydrates and proteins) and resistant plant matter (RPM, lignified). The partitioning of the incorporated C and N is done using a function that depends on the C:N ratios of both the residues and the RPM (Eqs. 1–3).
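The first-order four-pool turnover described above can be sketched in a few lines. The following Python fragment is a minimal illustration of the principle only, not the HUME/DAISY implementation; all pool sizes, decay rates and the abiotic reduction factor are assumed values chosen for demonstration.

```python
# Minimal first-order C pool turnover sketch (illustrative, not the HUME
# implementation). Each pool decays as dC/dt = -k * C, integrated here
# with a daily explicit Euler step.
POOLS = {"SOM": 40000.0, "BIOM": 1200.0, "DPM": 800.0, "RPM": 2400.0}  # kg C ha-1, assumed
DECAY = {"SOM": 2.7e-6, "BIOM": 1e-2, "DPM": 5e-2, "RPM": 2e-3}        # d-1, assumed

def step_pools(pools, decay, f_abiotic=1.0):
    """Advance all pools by one day; f_abiotic (0..1) scales the turnover
    for soil temperature, moisture and clay content, as in the text."""
    return {name: c - decay[name] * f_abiotic * c for name, c in pools.items()}

state = dict(POOLS)
for _ in range(30):  # one month of daily steps under reduced turnover
    state = step_pools(state, DECAY, f_abiotic=0.6)
```

Because the pools decay exponentially, the fast DPM pool loses a substantial share of its C within weeks, while SOM is nearly inert on this timescale.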

$$C_{DPM} = C_{residues} \cdot f_{DPM}$$
(1)
$$C_{RPM} = C_{residues} \cdot \left( 1 - f_{DPM} \right)$$
(2)
$$f_{DPM} = \frac{(CN_{DPM} \cdot CN_{residues}) - (CN_{DPM} \cdot CN_{RPM})}{(CN_{DPM} - CN_{RPM}) \cdot CN_{residues}}$$
(3)

CDPM and CRPM are the initial loads of the respective organic pools after crop residue incorporation; fDPM is the proportion of decomposable plant matter in total residue carbon. Assumptions: the C:N ratio of the decomposable plant matter (CNDPM) = 6, and the C:N ratio of the recalcitrant plant matter (CNRPM) is set according to Table A.1 to avoid negative initial pool sizes.
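As a sketch of how Eqs. 1–3 close the residue N balance, consider the following Python fragment; CNDPM = 6 follows the text, whereas CNRPM = 100 and the input amounts are assumed illustrative values (the study takes CNRPM from Table A.1):

```python
def partition_residues(c_res, cn_res, cn_dpm=6.0, cn_rpm=100.0):
    """Split residue C into DPM and RPM per Eqs. 1-3. cn_dpm = 6 follows
    the text; cn_rpm = 100 is an assumed illustrative value."""
    # Eq. 3: fraction of decomposable plant matter
    f_dpm = (cn_dpm * cn_res - cn_dpm * cn_rpm) / ((cn_dpm - cn_rpm) * cn_res)
    c_dpm = c_res * f_dpm          # Eq. 1
    c_rpm = c_res * (1.0 - f_dpm)  # Eq. 2
    return f_dpm, c_dpm, c_rpm
```

The partitioning guarantees that the N held in the two pools (CDPM/CNDPM + CRPM/CNRPM) equals the residue N input (Cresidues/CNresidues); this also shows why CNRPM must be wider than CNresidues, since otherwise fDPM would turn negative.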

Model initialization

Soil texture, pH, clay and organic matter content were taken from the soil survey map BÜK200 (BGR 2023). The organic matter content, which determines the initial loading of the SOM and BIOM pools of the model, was considered constant over the years. To minimize distracting variation, initial conditions regarding residual mineral N at harvest of the preceding WOSR as well as the actual amount and C:N ratio of the tested crop residues were set uniformly for all sites and years. A total mineral N of 60 kg N ha−1 was distributed within the top 0.9 m of the soil profile as follows: 30 kg N ha−1 in the 0–0.3 m layer, 21 kg N ha−1 in the 0.3–0.6 m layer, and 9 kg N ha−1 in the 0.6–0.9 m layer. The deeper layers received 8 kg N ha−1 in the 0.9–1.2 m layer, and the remaining 0.9 m of the simulated 2.1 m profile was loaded with 24.3 kg N ha−1. However, only N that passed the 1.2 m depth boundary after the start of the simulation was considered lost by leaching. The initial residual mineral N is based on former experience from the site near Kiel after intensive WOSR cultivation with > 200 kg N ha−1 y−1 (Sieling et al. 1999; Henke et al. 2008; Rothardt et al. 2021). Such conditions can be considered characteristic after crops with a low nitrogen use efficiency or after weather conditions that prevented the expected N uptake in the previous growth period. Crop residue amounts and properties are based on empirical values from years of high yields (Table 2; Rothardt et al. 2021). Admittedly, these standardizations represent an oversimplification, but they served the purpose of bringing differences in climate and soil into the focus of the study. Initial soil water contents (SWC) were derived from a 2-month pre-simulation phase; the actual simulation started with the harvest of WOSR. Harvest dates were set year- and site-specifically according to phenological data (DWD 2023).

Table 2 Crop residue properties, uniform for all sites and years; potential immobilization (Ipot) here means the initial potential, calculated with Eq. 4

Potential immobilization (Ipot)

The potential stoichiometric N immobilization (Ipot) by incorporated organic matter was calculated with an equation derived from the C/N turnover model (Eq. 4; Verberne et al. 1990). Although this approach does not consider the temporal component, it can still give an idea of the effect duration if calculated at several timepoints. Therefore, we present the values at incorporation, after the percolation period, and after WW harvest.

$$I_{pot} = -C_{residues} \cdot \left[ \frac{N_{residues}}{C_{residues}} - E \cdot \frac{1}{CN_{BIOM}} \right]$$
(4)

Immobilization potential of incorporated crop residues (Ipot) derived from the total C and N input by the crop residues, the decomposition efficiency of plant matter to soil biome (E) and the C:N ratio of the soil's biome (CNBIOM); assumptions: E = 0.4 (Verberne et al. 1990) and CNBIOM = 8 (Hassink, unpublished, in Verberne et al. 1990).
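Numerically, Eq. 4 reduces to a one-liner. The sketch below uses the stated assumptions E = 0.4 and CNBIOM = 8; the residue amounts in the example are illustrative, not values from Table 2.

```python
def i_pot(c_residues, n_residues, e=0.4, cn_biom=8.0):
    """Potential stoichiometric N immobilization (Eq. 4); positive values
    indicate potential net immobilization, negative values net N release."""
    return -1.0 * c_residues * (n_residues / c_residues - e * (1.0 / cn_biom))

# A wide-C:N residue (e.g. 4000 kg C, 40 kg N, C:N = 100) has a positive
# immobilization potential, a narrow one (C:N = 6) a negative one.
wide = i_pot(4000.0, 40.0)
narrow = i_pot(240.0, 40.0)
```

The sign convention makes immediately visible why cereal straw (wide C:N) buffers mineral N while N-rich residues act as a net source.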

Parameterization

Site-specific climate data, soil properties, and long-term yield data were used to initialize and parameterize the model. In addition, other model-relevant parameters were taken from the literature or derived from established estimates adapted to the current sites (Appendix Table A.1). Since the authors are not aware of any comparable field trials at the sites of interest specifically related to residue management, no actual field data were used for parameterization in this study. In the following, the adjustments with respect to sites and residue decomposition are described.

As suggested by the literature (Hansen et al. 1990; Verberne et al. 1990; Mikutta et al. 2006), the C:N ratio of the stabilized soil organic matter (SOM) was adjusted depending on the sites' topsoil texture (Table 1). The first-order decomposition rate coefficient of that compartment, kSOM, was also set site-wise; Neukam et al. (2023, unpublished) estimated it on the basis of gathered field data as a function of the organic matter content (values not shown). During the modelling, turnover rates of all organic compartments are further reduced depending on the topsoil clay content and the daily abiotic conditions, i.e. soil temperature and moisture (Hansen et al. 1990; Thorburn et al. 2010; Mielenz et al. 2016).

On the treatment side, values of CNRPM and kRPM were adopted from Rothardt and Kage (2023), who performed a parameterization on the basis of field data from comparable scenarios at HSK. These adaptations refer to the residues' biochemical composition. The C:N ratio of RPM had to be increased (i.e., it has to be wider than CNresidues) in order to avoid malfunctioning of the model's organic input distribution algorithm (Eqs. 1–3). The kRPM estimation, on the other hand, follows the assumption that a higher C:N ratio indicates more stable molecules (e.g. lignin, phenols) and thus requires a lower decomposition rate.

Scenarios & treatments

In order to estimate a robust average impact of the proposed treatments on N dynamics and crop development, the model was run for every site with weather data from 25 years (1990–2015; Cornes et al. 2018). This results in a total of 750 independent scenarios. Shortly after the simulation start, crop residues were incorporated by 0.3 m ploughing the day after application (5 days after WOSR harvest). The same ploughing was applied to the treatment without residue incorporation to avoid superimposed effects of differences in soil cultivation. The ploughing depth was set uniformly for all sites, regardless of local requirements: on the one hand, practical experience shows that such amounts of straw must be incorporated deeply in order to prepare a good seedbed; on the other hand, this depth corresponds to the layer usually defined as topsoil in German agronomy, for which the mineral N content is usually determined separately (VDLUFA 2002).

Mineral N fertilization was implemented according to the German fertilization regulation (DüV 2020). The basic N requirement value for WW, 230 kg N ha−1 y−1 for a yield expectation of 8 t dry matter grain ha−1, was reduced depending on the local long-term average yield expectation and the preceding crop (for WOSR − 10 kg N ha−1 y−1). Subtracting the mineral N of the top 0.9 m at the start of the vegetation period (here uniformly the modeled value from March 15th) results in a fertilizer requirement (= maximum applicable mineral N fertilizer) that varies by site, year, and treatment. The resulting range of applied fertilizer N across all sites was 85 to 215 kg N ha−1 y−1 with an overall mean of 160 ± 24 kg N ha−1 y−1. Following common advice for quality wheat, three applications of calcium ammonium nitrate (CAN 27–0–0) were implemented. The application dates were set to specific EC development stages: the first application between EC 21 and 29 (tillering promotion and shoot strengthening, 25% of the fertilizer requirement), the second between EC 30 and 32 (shoot maintenance, 45%), and the last 30% for protein formation (EC 37–49). The site- and year-specific dates were extracted from a fertilization-independent, weather-driven run of the phenology model. Additionally, the following restrictions were applied manually based on personal experience: the first application had to take place between February 15 and March 30, and the second had to be at least 30 days later.
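The N budgeting described above can be summarized in a small sketch. It is not the DüV regulation itself; the base value, preceding-crop reduction and split percentages mirror the numbers given in the text, while `yield_adjustment` is a hypothetical site-specific input.

```python
def fertilizer_requirement(nmin_march, base=230.0, precrop_reduction=10.0,
                           yield_adjustment=0.0):
    """Sketch of the DueV-style N budget described in the text: basic value
    minus preceding-crop reduction, yield-level adjustment and spring
    mineral N. Returns the total requirement (kg N/ha) and the three CAN
    splits (25/45/30 % at EC 21-29, EC 30-32 and EC 37-49)."""
    req = max(base - precrop_reduction - yield_adjustment - nmin_march, 0.0)
    splits = (0.25 * req, 0.45 * req, 0.30 * req)
    return req, splits
```

For example, with the uniform initial 60 kg N ha−1 of spring mineral N and no yield adjustment, the sketch yields 160 kg N ha−1, matching the overall mean reported above.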

To assess the systems' response to extreme weather conditions, we looked separately at periods of two different extremes:

  • Percolation periods (August to March) with percolation water amounts above the 75% quantile of the site representing seasons with increased risk of N leaching losses

  • Growth periods (March – June) with precipitation below the 25% quantile of the site (spring drought) as seasons with limited N uptake, and hence, potentially high post-harvest residual mineral N

Statistical methods

Data exploration followed the steps described by Zuur et al. (2010). All data operations, visualizations and statistical procedures were realized with R, RStudio and mostly tools from the tidyverse package (Wickham et al. 2019; R Core Team 2020; RStudio Team 2020). With two exceptions, data are visualized using sina plots (Sidiropoulos et al. 2018), augmented with violin plots and means. The tails of the violins were trimmed to the range of the data. The combination of these two plots provides a visual representation of individual raw data points with density-based vertical jitter and a concise summary of their distribution, allowing the identification of patterns, gaps or potential outliers within the data set. Linear models for regression analysis were fitted by generalized least squares (GLS), as the data were found to be heteroscedastic among sites; the gls function from the nlme package (Pinheiro et al. 2019) was used for implementation in R. The explanatory variables were site, treatment, and their interaction. The intra-site variability is caused by the different weather conditions each year; the inter-site variability is additionally caused by the soil properties. Subsequently, multiple comparisons of group means were carried out using the multcomp package (Hothorn et al. 2008). The entire data set and the two periods of extremes were analyzed separately. Results are presented as arithmetic mean ± 1 standard deviation unless indicated otherwise in the text. The highest statistical significance level returned by the software is reported in the text; detailed statistical output can be found in the appendix. As nutrient efficiency indicator we chose the N uptake efficiency (NUpE) (Moll et al. 1982), defined as the quotient of the plants' N uptake and the N supply, which consists of the applied fertilizer, mineral N in March and net mineralized N during the growth period.
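The NUpE definition used here translates directly into code; the following is a minimal sketch with hypothetical numbers, not values from the study.

```python
def nupe(n_uptake, n_fertilizer, nmin_march, net_mineralization):
    """N uptake efficiency after Moll et al. (1982) as defined in the text:
    crop N uptake divided by the total N supply (applied fertilizer +
    mineral N in March + net mineralization during the growth period)."""
    n_supply = n_fertilizer + nmin_march + net_mineralization
    return n_uptake / n_supply

# Hypothetical season: 180 kg N/ha taken up from a supply of 160 + 40 + 25.
efficiency = nupe(180.0, 160.0, 40.0, 25.0)
```

Note that net immobilization enters the denominator as negative net mineralization, so residue-induced immobilization lowers the computed N supply.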

Results

In relation to the hypotheses described above, results for well-drained percolation periods and for growth periods with extraordinarily low precipitation sums are presented in parallel with the overall average. There was no correlation between the periods of extreme conditions and the preceding or subsequent periods, i.e. a well-drained fall/winter was not necessarily followed by a rainy growth period, and vice versa, spring droughts were not necessarily preceded by dry conditions. Therefore, the results of the extreme years are mainly described for the period primarily affected.

The goal of covering a wide range of water and nutrient budgets among the sites can be considered achieved given the contrasting modeled amounts of drainage water during the percolation period (Fig. 1). The gradient ranges from well-drained sites, with an average of up to 118 ± 71 mm in 6 months, to barely drained locations with only 6 ± 14 mm in 6 months. For the upper quartile of the percolation water sums, this range shifted to 268 ± 39 mm to 20 ± 15 mm. For each site, the amount of water percolated deeper than 1.2 m from WOSR harvest to March was significantly greater in the seven wettest years compared to the remaining 18 years (p < 0.01, for details see Appendix Tables A3 and A4). Regarding dry conditions during the vegetation period, the sum of precipitation from March to the end of July (data not shown) was significantly different for the seven driest years at GUL, SPE, IND, ASH, IHH, and MER (p < 0.01, for details see Appendix Tables A5 and A6).

Fig. 1
figure 1

Modeled amount of percolation water of the period August–March; black dots and violins represent the entire data (n = 750); red squares indicate the arithmetic mean of the entire data, red triangles indicate the arithmetic mean of the seven years with the most percolation water (n = 210), lower case letters indicate significant differences between the years in the upper quartile of percolation and the remaining years (p < 0.01)

N leaching

The Pearson correlation coefficient (r) between the sum of percolation water and N leaching losses was 0.93 (df = 747), indicating a strong positive correlation (p < 0.01). At the beginning of March, an overall average of 19 ± 16 kg N ha−1 was lost by leaching (Fig. 2). At the end of the cropping season of WW, the overall average leaching loss increased to 23 ± 18 kg N ha−1. The maximum of 78 kg N ha−1 occurred in GUL in 2002 where residues were removed. In seasons with drainage water sums well above the average, the N leaching loss by March increased on average to 33 ± 18 kg N ha−1. At harvest after spring droughts leaching losses were reduced on average to 20 ± 17 kg N ha−1.

Fig. 2
figure 2

Modeled sum of nitrate translocated deeper than 1.2 m by the end of the percolation period at the beginning of March (Nleached); colored dots and violins represent the entire data (n = 750); squares = arithmetic mean of the entire data, triangles = arithmetic mean of the seven years with the most cumulated percolation water (n = 210), lower case letters indicate significant differences between the treatments per site on average of all years, capital letters indicate significant differences between the treatments per site on average of the seven wettest percolation periods (p < 0.05)

The average effect size of the treatments was calculated as the difference in N leaching between the control (keeping the WOSR residues) and the other treatments. On average across all sites and years, WW straw application reduced leaching by 1 kg N ha−1, whereas the removal of residues led to 1.4 kg N ha−1 additional loss. Looking at the upper quartile of N leaching at harvest separately from the other years, the effect size increased: for the wettest years, the average saving with WW straw was about 2.6 kg N ha−1, and the same amount was lost on average without crop residue incorporation. The maximum treatment effect was observed at HSK and GUL, where, on average over the high-risk seasons, about 4.5 kg N ha−1 could be retained by keeping the WOSR residues. However, no significant treatment effect could be proven at any site, neither on average over all 25 years nor for the quartile with the highest risk of leaching losses (detailed statistics can be found in Appendix Tables A7 and A8).

Figure 3 visualizes the simulated mineral N dynamics, showing the site- and treatment-specific trends of mineral N in the topsoil (top 0.3 m) averaged over all simulated years. The lines are flattened compared to single-year dynamics because the dates of tillage and precipitation events vary from year to year; as a result, extremes caused by such events have little effect on the value per day of year (DOY). Due to these sources of uncertainty, a statistical analysis for specific dates was not carried out. However, it should be noted that at almost all sites (except BER) the fall mineral N increase is simulated to be lower where crop residues have been applied. The lowest peak values were observed in the wheat straw treatment, indicating the strongest simulated immobilization effect. After the general peak around October, mineral N concentrations decrease due to translocation and uptake. Depending on the site, the mineral N dynamics were more or less differentiated.

Fig. 3
figure 3

Modeled post-harvest mineral N dynamics in top 0.3 m. Solid lines represent the average value per day of year (DOY) over the 25-year period, the dashed lines indicate the upper and lower quartile. WOSR Winter oilseed rape, WW Winter wheat

Net N mineralization

The cumulated net mineralization rates of the organic compartments provide valuable information on the origin of mineral N, as shown exemplarily in Fig. 4. At all sites, the stabilized organic matter (SOM) consistently contributes to net mineralization, with site-specific average rates exceeding net immobilization by recalcitrant residue components (RPM) in all cases. Additionally, the soil's biome (BIOM) and, considerably in the case of WOSR residues, the decomposable plant parts (DPM) release N. Nevertheless, in comparison with the treatment without any organic input, the overall difference with residue incorporation can, if negative, be considered relative net immobilization. During the percolation period, WW straw immobilized on average across all sites and years 12 kg N ha−1 (significant, p < 0.01; statistical details in Appendix Tables A.9 and A.10). WOSR straw induced at 8 sites between 1.2 and 5 kg N ha−1 of immobilization (n.s., p < 0.05). Only at SPE and BER did average conditions lead to larger amounts of N mineralization after incorporation of WOSR residues than without organic input (for BER on average 8 kg N ha−1 more, p < 0.01). It is worth noting that at the end of the percolation period, WW straw had on average 82% of its initial Ipot remaining, whereas for WOSR residues already 85% had been used up. In the well-drained percolation periods, net mineralization at HSK, SPE, IND, IHH, MER and DOR was on average 2.3 ± 0.7 kg N ha−1 lower than the overall average (n.s., p < 0.05; see Appendix Tables A.11 and A.12 for details). At the other sites (namely GUL, ASH, BER and DED), the wet conditions increased net mineralization on average by 1.5 ± 1.2 kg N ha−1 (n.s., p < 0.05). Regarding the treatment effect, differences were smaller: only at 3 sites was immobilization with WW straw still significant (SPE, IND, ASH, p < 0.05).

Fig. 4
figure 4

Exemplary cumulated net mineralization rates per organic C compartment on average of the 25 years at the site near Kiel (HSK); negative rates indicate net immobilization; the numbers in the bars indicate the average total net mineralization per treatment; lower case letters indicate significant differences (p < 0.05). BIOM Soil biome, DPM Decomposable plant matter, RPM Recalcitrant plant matter, SOM Stabilized organic matter, WOSR Winter oilseed rape, WW winter wheat

Since organic matter degradation was modeled as negative exponential decay, the absolute rates decreased with time. Thus, during the growth period the net immobilization induced by residue incorporation became smaller and the difference to the control shrank. Still, between March and August, WW straw retained on average almost 6 kg N ha−1 in organic form (significant at all sites, p < 0.05; see Appendix Tables A.13 and A.14 for details), and moreover it still had the potential to immobilize a further 182 kg N ha−1 (70% of the initial Ipot) in the subsequent cropping period. In contrast, more than 5 kg N ha−1 were released on average by WOSR residues (significant at HSK, GUL, SPE, ASH, BER, and DED, p < 0.001), which were almost completely decomposed, with only 4% of the initial Ipot left. Drought conditions during the growth period decreased net mineralization at all sites except SPE, on average by 2.5 ± 1.4 kg N ha−1 (n.s.; see Appendix Tables A.15 and A.16 for details); at SPE an average increase of 2 ± 0.8 kg N ha−1 occurred (n.s.). Treatment-specific differences became less pronounced. Thus, only at 4 sites (GUL, IND, ASH, and DOR) did WW straw still immobilize significant amounts of N (on average 48.3 ± 4.9 kg N ha−1, p < 0.05), while at 4 sites (GUL, SPE, ASH, and BER) significantly more N was still released from WOSR residues than from soil without organic input (on average 61.8 ± 12.8 kg N ha−1, p < 0.05).

Soil mineral N dynamics

On average over all modeled years, mineral N levels in the top 0.9 m in March with WW straw were consistently lower than in the control (Fig. 5). Although simulated differences ranged from 1 to 23 kg N ha−1 (on average 7 ± 8 kg N ha−1) depending on site and year, this effect could only be proven significant at BER (p < 0.01; details in Appendix Tables A.17 and A.18). In contrast, the sole removal of the WOSR residues led on average to an increase in mineral N at almost all sites, with a maximum of 10 kg N ha−1 (on average 2 ± 3 kg N ha−1, n.s.). At BER, and in some years at SPE, the mineral N concentrations without crop residues were on average lower than in the control (n.s.). In the 7 years per site with the most percolation water, the general mineral N level was on average 14 kg N ha−1 lower (see Appendix Tables A.19 and A.20 for details). At GUL, IHH, MER, and DOR this difference was significant (on average 10.3 kg N ha−1, p < 0.05). However, the site-specific contrasts among the treatments remained the same as for the overall average, with no significant differences.

Fig. 5
figure 5

Mineral N levels of top 0.9 m of soil in March per site and treatment; colored dots and violins represent the entire data (n = 750); squares = arithmetic mean of the entire data, triangles = arithmetic mean of the seven years with the most cumulated percolation water (n = 210), lower case letters = significant differences between the treatments per site on average of all years, capital letters = significant differences between the treatments per site on average of the seven wettest percolation periods (p < 0.05), WOSR Winter oilseed rape, WW Winter wheat

After the WW harvest, the overall average mineral N level decreased due to crop N uptake and fertilization adjusted to the plants' demand. The highest average mineral N concentrations were generally found in the control treatment, while application of WW straw resulted in the lowest average mineral N level, with an average difference of 9 ± 7 kg N ha−1 (Fig. 6). At six sites (GUL, SPE, IND, ASH, BER, and DED) this difference was significant (on average − 8.8 kg N ha−1, p < 0.01; see Appendix Table A.22 for details). At SPE and BER, mineral N was also significantly lower without the crop residues (on average − 8 kg N ha−1, p < 0.05). Following dry conditions during the major phase of crop development, the average mineral N level in the top 0.9 m did not differ significantly from the long-term mean of all years (on average + 3.6 kg N ha−1, n.s.; see Appendix Table A.25 for details). An ANOVA did not provide any evidence of a significant interaction between site and treatment (details in Supplementary Table A.24).

Fig. 6
figure 6

Mineral N levels in top 0.9 m of soil at winter wheat harvest per site and treatment; colored dots and violins represent the entire data (n = 750); squares = arithmetic mean of the entire data, asterisks = arithmetic mean of the seven years with the driest growth periods (‘spring drought’, n = 210), lower case letters = significant differences between the treatments per site on average of all years, capital letters = significant differences between the treatments per site on average of the seven driest growth periods (p < 0.05), WOSR Winter oilseed rape, WW Winter wheat

Crop N uptake

N uptake (aboveground biomass only) by winter wheat until March ranged from 4.3 to 52 kg N ha−1 with an average of 18 ± 8 kg N ha−1. Values differed considerably per site and year, but there was no significant treatment effect at that time (p < 0.05, see Appendix Table A.27). In the rainy years, uptake was generally at the same low level with no differentiation per treatment. In contrast, average N uptake at the time of WW harvest differed depending on site and treatment (Table 3). Average N uptake ranged from 152 ± 14 kg N ha−1 at MER to 215 ± 1 kg N ha−1 at ASH. At seven sites (HSK, GUL, SPE, ASH, BER, DED, and DOR) the long-term average N uptake of the control was significantly higher (+11 ± 5 kg N ha−1, p < 0.05, see Appendix Table A.29 for details) than when the preceding crop residues had been exchanged for WW straw. Whereas at most sites sole residue removal had no significant impact, at BER N uptake was significantly lower without crop residues compared to the control (−13 ± 3 kg N ha−1, p < 0.001). Although spring droughts lowered the final N uptake by 3.8 kg N ha−1 on average, contrasts among the treatments diminished: only at BER was N uptake after WW straw application still significantly lower than in the control (on average −18.4 ± 3.8 kg N ha−1, p < 0.05, see Appendix Table A.30 for details).

Table 3 Average modeled yield parameters

N uptake efficiency

The average NUpE, calculated as the quotient of N removed with the grain yield plus N in residues and the N provided by mineral N fertilizer, mineral N, and net mineralization (= N supply), was highest at all sites after residue removal (Fig. 7). The average NUpE with WW straw was only slightly lower. Linear regression revealed significantly less efficient N uptake at DED and DOR, where the WOSR residues were incorporated (p < 0.05, see Appendix Tables A.32 to A.37 for details). Drought conditions decreased the NUpE on average by 2%, but the treatment effect did not become more pronounced.
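The NUpE ratio defined above can be sketched as follows; the function name and the numeric values are illustrative assumptions, not data or code from the study:

```python
def n_uptake_efficiency(n_uptake, n_fert, n_min_march, net_mineralization):
    """NUpE = above-ground N uptake / (fertilizer N + mineral N in March + net mineralization).

    All quantities in kg N ha-1. Values > 1 can occur when N taken up
    before March is not counted on the supply side.
    """
    n_supply = n_fert + n_min_march + net_mineralization
    return n_uptake / n_supply

# Illustrative values only (kg N ha-1), not results from the study
nupe = n_uptake_efficiency(n_uptake=190.0, n_fert=180.0,
                           n_min_march=40.0, net_mineralization=30.0)
print(round(nupe, 2))  # 0.76
```

A treatment that lowers net mineralization while uptake stays constant thus raises the computed NUpE, which is consistent with the higher efficiencies found after residue removal.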

Fig. 7
figure 7

Nitrogen uptake efficiency (NUpE = above-ground N uptake/(N fertilizer + mineral N in March + net mineralization)); colored dots and violins represent the entire data (n = 750); squares = arithmetic mean of the entire data, triangles = arithmetic mean of the seven years with the most cumulated percolation water (n = 210), asterisks = arithmetic mean of the seven years with the driest growth periods (‘spring drought’, n = 210), lower case letters = significant differences between the treatments per site on average of all years, capital letters = significant differences between the treatments per site on average of the seven wettest percolation periods, Greek letters = significant differences between the treatments per site on average of the seven driest growth periods (p < 0.05), WOSR Winter oilseed rape, WW Winter wheat

Discussion

Common practice counters the risk of N leaching losses after the harvest of WOSR, among other things, by incorporating crop residues into the soil. Microbial immobilization is intended to protect excess N, in organically bound form, from translocation to deeper soil layers or into the groundwater. The aims of this study were to investigate a possible increase of this effect by more recalcitrant residues (i.e. cereal straw) and to identify interactions of the effect with site-specific environmental factors (soil and climate) in a model-based scenario study. Extreme events that may occur more frequently in the course of climate change, such as pronounced percolation in fall/winter and spring drought, were also considered separately.

Limitations

High mineral N concentrations left in soil after the harvest of WOSR are not uncommon and have been documented in previous research (Aufhammer et al. 1992; Kaul et al. 1996; Sieling et al. 1999; Henke et al. 2008). Although these conditions can arise from general overfertilization of WOSR (Henke et al. 2008), they might occur more frequently when spring droughts negatively affect the crop N uptake. Furthermore, the actual post-harvest mineral N levels, serving as the starting point for the N loss risk in the context of this study, depend on climatic conditions influencing N uptake and mineralization. Consequently, they exhibit significant inter-annual variability. However, to enhance comparability, we standardized the initial conditions.

We are aware of the limitations of our study regarding the lack of evaluation with actual field data. Although the model as well as most of the parameters used are well established, modeling results should always be considered as estimates, taking into account all inherent constraints. It should be noted that the model is likely to overestimate mineralization and yield (Ratjen and Kage 2015). However, the presented results, being technically sound and plausible, may primarily serve as a qualitative and relative comparison of the treatments and locations.

Impact of precipitation outweighs treatment effects

Soil texture (as a proxy for field capacity) and precipitation have a considerable influence on N leaching (Vinten et al. 1994; Beaudoin et al. 2005; Wang et al. 2021), especially if the initial mineral N load is uniform. Hence, the combination of low field capacity and a high annual precipitation sum is expected to bear the highest risk of leaching losses, especially after fertilization that is not adapted to the site and current weather conditions. However, none of the modeled sites exhibited these properties. Accordingly, in the current data, precipitation had a greater impact on N leaching than soil properties.

Average modeled quantities of percolation water and leached N were within the range of comparable studies (Justes et al. 1999; Beaudoin et al. 2005; Henke et al. 2008). While the relative net immobilization of WOSR residues was also within the expected range (Jensen et al. 1997; Justes et al. 1999; Engström and Lindén 2012), the observed reduction in leaching losses by crop-residue-induced microbial immobilization, especially with the potent cereal straw, fell short of expectations. We found that the excess N susceptible to the treatment effects (comprising initial mineral N plus mineralized N in the top-soil) predominantly resides in the upper 0.9 m of the soil profile. In fact, the average amount of leached N was essentially explained by the initial mineral N concentration in the soil between 0.5 and 1.2 m. Importantly, this portion was not directly affected by the treatment at all. Even the average leaching loss of the seven most drained years was explained to 77% by the initial mineral N load of the unaffected part of the soil profile. On average, only 8 kg N ha−1 of the initial top-soil mineral N contributed to the leaching loss, which is less than 10% of what might be affected by immobilization (initial load of 30 kg N ha−1 plus approx. 60 kg N ha−1 from mineralization). However, the reduction effect with WW straw grew with absolute leaching losses, because proportionally more N affected by immobilization was in danger of being lost by leaching.
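The share calculation behind the "less than 10%" statement can be made explicit with the figures quoted above (the variable names are ours, chosen for illustration):

```python
# Figures quoted in the text (kg N ha-1):
initial_topsoil_n = 30.0       # initial mineral N load in the top-soil
mineralized_n = 60.0           # approx. N mineralized in the top-soil
leached_from_topsoil = 8.0     # average top-soil mineral N lost to leaching

# Pool that microbial immobilization could in principle protect
immobilization_affected = initial_topsoil_n + mineralized_n  # 90 kg N ha-1

share = leached_from_topsoil / immobilization_affected
print(f"{share:.1%}")  # 8.9%, i.e. less than 10%
```

This illustrates why the residue treatments could only marginally reduce total leaching: the bulk of the leached N originated from deeper soil layers that immobilization in the top-soil does not reach.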

At sites with negligible leaching losses (i.e. MER, DED, and DOR) mineralized N accumulates in the horizon accessible to wheat roots during the growth period (approx. 1.2 m). Therefore, high mineralization already in fall/winter (realized without residues and with WOSR straw) represents an advantage for the upcoming growth period at such sites.

Immobilization dynamics affected by residue decomposition

At the level of total biomass in soil, both crop residue treatments tended toward net mineralization almost immediately, which aligns well with the observations of Beaudoin et al. (2005) in extent as well as in timing. Although the high turnover rate of RPM for WOSR residues induced considerable immobilization, the large share of DPM led to net mineralization in the percolation period close to the treatment without residue incorporation. This finding is in accordance with the results of Trinsoutrot et al. (2000), where 50% of the WOSR residue-C was already mineralized in the first two months after incorporation into the top-soil. During the growth period, the re-mineralization finally exceeds the immobilizing RPM turnover. With WW straw, on the other hand, mineralization of DPM is limited, resulting in a relative net immobilization persisting throughout the entire cropping season of WW.
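The contrasting dynamics of the decomposable (DPM) and resistant (RPM) residue pools can be illustrated with a minimal two-pool first-order decay sketch; the pool sizes and rate constants below are arbitrary illustrative values, not the parameters of the model used in this study:

```python
import math

def remaining_residue_c(c0_dpm, c0_rpm, k_dpm, k_rpm, t):
    """Residue C left in the DPM and RPM pools after t days, assuming
    independent first-order decay (illustrative sketch, not the study's model)."""
    return c0_dpm * math.exp(-k_dpm * t), c0_rpm * math.exp(-k_rpm * t)

# Hypothetical WOSR-like residue: half the C in a fast-decaying DPM pool
dpm, rpm = remaining_residue_c(c0_dpm=50.0, c0_rpm=50.0,
                               k_dpm=0.04, k_rpm=0.002, t=60)
remaining = dpm + rpm
print(round(remaining, 1))  # 48.9 -> roughly half the C mineralized within two months
```

With a cereal-straw-like parameterization (small DPM share, dominant RPM pool) the same sketch yields a much slower C release, mirroring the persistent relative net immobilization described above.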

Above-average fall/winter precipitation increased or decreased mineralization rates depending on location, i.e. soil properties. Relative to the local average conditions, the increasing moisture either just reached the optimal level for mineralization or already created anoxic conditions. In the latter case, denitrification counteracts mineralization. Dryness in the growth period deteriorated mineralization conditions, with soil moisture below the optimum. This resulted in a lower N supply from the soil, which was intensified depending on the treatment, albeit not to a significant extent.

Mineral N and crop yield in dependency of treatment and precipitation

The mineral N concentration in the top 0.9 m at the beginning of March serves as the assessment base for the upcoming mineral fertilization in accordance with the German Fertilizer Application Ordinance (DüV 2020). Therefore, it is the first important benchmark for assessing the impact of the tested treatments. In consequence of the mineralization dynamics described above, average mineral N concentrations in March were significantly lower after WW straw application than in the control. In accordance with the German Fertilizer Application Ordinance, this led to higher N fertilization recommendations. However, with regard to the subsequent supply of N from the soil during the growth period, the estimation method assumes that the preceding crop residues are kept. Therefore, in the present scenarios, it was not possible to compensate for the persistent relative net immobilization even with the increased fertilizer quantity derived for the WW straw treatment. In the variant without residues, the recommended fertilizer amount was also not sufficient, due to the missing mineralization of residue N. As a result, for the two treatments that differ from common practice, these shortcomings add to the deficient N supply, which was partly reflected in the yield.
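The core of the recommendation logic described above is an Nmin-based subtraction. The sketch below is a strongly simplified illustration of that principle; it is not the literal DüV calculation (which includes further deductions, e.g. for organic fertilization and preceding crops), and all numeric values are made up for illustration:

```python
def fertilizer_recommendation(n_target, n_min_march, residue_credit=0.0):
    """Simplified Nmin method: recommended fertilizer N = crop N target
    minus mineral N measured in March minus any credit for expected N
    release from residues/soil. Illustrative only, not the DUeV formula."""
    return max(0.0, n_target - n_min_march - residue_credit)

# Lower March mineral N after WW straw incorporation raises the recommendation
print(fertilizer_recommendation(n_target=230.0, n_min_march=55.0))  # control: 175.0
print(fertilizer_recommendation(n_target=230.0, n_min_march=46.0))  # WW straw: 184.0
```

Because the scheme has no term for treatment-specific immobilization during the growth period (no negative `residue_credit`), the extra fertilizer derived from the lower March Nmin cannot fully offset the persistent immobilization by WW straw, which matches the shortfall described above.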

N uptake until March was not limited by any of the treatments. Obviously, in fall and winter, no matter if with or without crop residues incorporated, N supply exceeded N demand. In contrast, during the growth period WW with incorporated WW straw was restricted to a lower uptake than the control, which seemed to have met the N demand. The treatment effect decreased after drought conditions. Sub-optimal water supply generally reduced the N demand, which was then met in nearly every case.

The consistently low average mineral N concentrations after harvest of WW where WW straw was incorporated were another indicator of ongoing immobilization combined with inadequate N supply. In contrast, high mineral N concentrations in the control treatment pointed towards a surplus of N and a nearly depleted Ipot. At sites with frequent, extraordinarily dry growth periods (SPE, IND, ASH), extremely high average residual mineral N concentrations occurred. Even with the generally higher mineral N levels in the dry years at all sites, the treatment effects were the same as with the overall average.

The results of the N uptake efficiency evaluation were surprising: additional N from WOSR residue mineralization was expected to be beneficial for WW, especially since the implementation of the German Fertilizer Application Ordinance, which has been criticized for its subdued recommendations (Kage et al. 2022). Instead, on average, WW of the control treatment was oversupplied, and the ratio of N output by the crops to N input by soil and fertilizer turned out to be less efficient than for the other treatments. Efficiencies larger than 1, as occurred at DED and DOR, are an artifact of an already high N uptake until March, which is not considered on the input side.

The buffering of the mineral N increase in the following fall by the remaining Ipot from the wheat straw treatment should be moderate and should not cause any problems for the winter barley that usually follows. Assuming farm-internal relocation as the source of the wheat straw applied after WOSR, the subsequent barley could even benefit: due to the earlier sowing, winter barley has a higher pre-winter N uptake than wheat. If the wheat residues remain in the soil as usual, in combination with a lower initial mineral N, N limitation is likely to occur. However, if the residues were removed and used elsewhere as a treatment prior to WW, the N supply would increase due to the absent immobilization. Once the remaining Ipot of the previous year's wheat straw treatment is exhausted, residue N is additionally released in the long term. Thus, the treatment effect may add up at the crop rotation level.

Conclusion

Among the sites studied, precipitation had a greater influence on N leaching than local soil properties. We found that the site-specific reduction of N leaching losses falls short of a significant effect and that most of the initial excess N remains in the upper layers of the soil profile accessible to wheat roots during the growth period. This also applies to periods of exceptional percolation. Therefore, future research could aim to quantify the medium- to long-term effect of topsoil treatments on N concentrations in deeper layers, as these determine N leaching in subsequent cropping years. We also recommend validating the findings of this study with actual field data. Dedicated trials will probably be needed, as suitable data from the past were not available to the authors. As the German Fertilizer Application Ordinance does not offer an option to account for different residue treatments, the N limitation after incorporation of WW straw as well as without residues could not be compensated, resulting in negative yield effects. The N surplus buffered by the incorporation of WW straw after drought periods in spring also has to be considered in the N balance. We therefore suggest that residue management options should be taken into account in the regulation.

Hence, the findings suggest that the current common practice of keeping the preceding crop residues may not be the most effective option to reduce N leaching losses, but may still be the best when considering the additional workload and general resource input required to remove or replace residues. In addition, the study highlights the importance of accounting for site-specific and management-induced mineralization when estimating fertilizer demand, in order to both maximize agronomic productivity and minimize the risk of nutrient losses.