Introduction

Since 2002, European ash dieback, caused by the invasive fungus Hymenoscyphus fraxineus (T. Kowalski) Baral, Queloz & Hosoya, has been observed in Germany. The pathogen and the associated symptoms on European ash (Fraxinus excelsior L.) are now present in all regions of Germany (Enderle et al. 2017; Langer 2017). The symptoms, such as crown dieback, basal lesions (stem collar necroses), secondary infections with bark- or wood-decaying fungi, and beetle infestations, often have fatal consequences for the survival, growth, and wood quality of European ash. Consequently, stand stability and ash-associated biodiversity are severely threatened in affected forests (Hultberg et al. 2020).

European ash is of considerable importance in German forestry and present in a variety of natural broad-leaved forest communities. In floodplain and ravine forests as well as on calcareous soils, ash can outcompete other broad-leaved species (e.g. European beech) and become dominant (Fuchs 2021; Langer et al. 2022). This robust, ecologically and economically valuable tree species is considered advantageous in the face of climate change due to its high drought and heat tolerance (Schmidt 2007; Fuchs et al. 2021). For this reason, German forest policies aimed to increase the share of ash before the effects of ash dieback became apparent (Enderle et al. 2017). According to the third German National Forest Inventory in 2012, European ash covered an area of approximately 250,000 ha, or 2.4% of Germany’s total forest area (Enderle et al. 2017).

Between 1987 and 2000, survey data from the ICP Forests Level I network (International Cooperative Programme on Assessment and Monitoring of Air Pollution Effects on Forests) showed that the frequency of tree mortality due to ash dieback was moderate in the affected European countries. As in other parts of Europe, mortality of ash in Germany accelerated between 2000 and 2010 and increased significantly in the decade 2010–2020 (George et al. 2022). As a consequence of increasing mortality, decreasing vitality, and intensified timber logging, the share of European ash in Germany has been declining over the past decade (Enderle et al. 2019). This is, inter alia, evident from the sharply declining share of ash trees in the ash-containing collective of northwest German ICP Forests Level I plots over the past 10 years (Fig. 12, Appendix).

The research network FraxForFuture, funded by the Forest Climate Fund (in German: Waldklimafonds), was initiated to study the disease progression of ash dieback in Germany and to preserve European ash for wood production as well as for the protection of associated and ash-dependent ecosystem communities (Langer et al. 2022). The network has developed a monitoring programme for the continuous observation of the vitality status of ash populations. Accordingly, one of the first milestones of the research consortium was to develop a key for assessing the damage caused by ash dieback in a nationwide standardised format (Peters et al. 2021a, b). To serve as a common research platform for the network, 14 intensive monitoring plots (in German: Intensivbeobachtungsflächen, IBF) were established across the typical range of ash forest habitats in Germany (Langer et al. 2022) and assessed twice a year with the aforementioned key. This monitoring approach focuses on the observation of whole ash stands (1 hectare), on differences in disease progression depending on small-scale variation of soil conditions and stand properties, and on the interdependence of certain symptoms at tree level.

This FraxForFuture study aims at (i) summarising the results of the first six assessments over 3 years of observation, (ii) reporting the first experience with the systematic application of the assessment key, (iii) assessing the effect of tree and stand characteristics and small-scale soil conditions on ash condition and disease progress, and (iv) identifying the predisposing factors for the deadfall of ash trees.

Material and methods

Study sites and stands

In 2020, the project network FraxForFuture established a common research platform of 14 intensive monitoring plots (IBF) distributed across Germany in deciduous, broad-leaved mixed forests with a high share of European ash (for a detailed description of the sites and stands see Langer et al. 2022). Three of the 14 IBF (BY2, BY3, and TH2) differed to some extent in structure and setup and were hence excluded from the present study. The remaining eleven IBF (Fig. 1) were set up identically following a strict protocol with a core area of one hectare, in which all tree individuals with more than 7 cm diameter at breast height (DBH) were marked, their positions georeferenced, and their attributes (species, DBH) recorded. The eleven plots contained 1365 ash trees in total (> 7 cm DBH), most of which were part of the upper canopy layer of the stands. In some cases, a second layer of younger ash trees was part of the sample (compare the histograms of DBH distributions in Fig. 2). Consequently, DBH distributions and ash density differed notably between the plots. Even within plots, ash density varied notably, which is evident from visual inspection of the spatial distribution of ashes (Fig. 8, Appendix). To generate a measure of ash density at a higher resolution than plot level (e.g. ash tree density or ash basal area), an ash density index was calculated for each tree. The index is based on the Hegyi competition index (Hegyi 1974) but accounts only for ash trees: it is the sum over all ash trees within a 12 m radius around the target tree, weighted by their DBH and inverse-weighted by their distance to the target tree (Fig. 12, Appendix).
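As an illustration, a minimal sketch of this index in R, assuming the standard Hegyi formulation (neighbour DBH relative to the target DBH, divided by distance) restricted to ash neighbours; the column names are illustrative, not those of the project database:

ash_density_index <- function(target, ashes, radius = 12) {
  # Euclidean distances from the target tree to all ash trees on the plot
  d <- sqrt((ashes$x - target$x)^2 + (ashes$y - target$y)^2)
  keep <- d > 0 & d <= radius  # ash neighbours within 12 m, excluding the target itself
  # Hegyi-type sum: DBH-weighted and inverse-distance-weighted
  sum((ashes$dbh[keep] / target$dbh) / d[keep])
}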

Fig. 1

Eleven of the 14 intensive monitoring plots (IBF) of the research network FraxForFuture in Germany that were analysed in the present study (resources: QGIS 3.24 © GeoBasis-DE/BKG 2022). The letter codes are abbreviations of the respective federal states: BB—Brandenburg, BW—Baden-Wuerttemberg, BY—Bavaria, HE—Hesse, MV—Mecklenburg-Western Pomerania, NI—Lower Saxony, SN—Saxony, ST—Saxony-Anhalt, TH—Thuringia. For detailed information on the plots, see Langer et al. (2022)

Fig. 2

Histograms of the diameter at breast height (DBH) distribution of ash trees in relation to the DBH distribution of all other tree species for each plot

Detailed soil properties were determined from one soil pit per plot for a general characterisation of the research plots (e.g. Fig. 13, Appendix). For this purpose, physical and chemical soil analyses at seven depth levels down to a depth of 200 cm and a morphological soil description were conducted following the national forest soil inventory methodology (Höhle et al. 2018). The spatial heterogeneity of soil properties within each plot was mapped on a 30 × 30 m grid with a soil auger to a depth of 150 cm (10–12 points per plot). The information from the analysed soil pits was used to infer and correct the soil and site classifications at each auger position.

Based on this soil auger grid, each plot was subdivided into several sub-plots (polygons) according to coincident soil and terrain properties. In this way, 66 sub-plots were defined in total, 56 of which contained at least one ash tree. These sub-plots were demarcated following the Northwest German forest site mapping scheme (NFP 2009) on a soil nutrient index scale with 14 ordinal classes (from rich-carbonatic to poor). Additionally, each sub-plot was classified as either ‘hydromorphic’ (groundwater or stagnic conditions) or ‘non-hydromorphic’ (well drained, no indication of waterlogging/anoxic soil conditions). Available soil water capacity (AWC) was calculated from the profiles of soil texture, bulk density, and coarse fraction at each soil auger position down to 150 cm depth using a pedotransfer function (Ad-Hoc-Arbeitsgruppe Boden, KA5, 2005) and then aggregated at sub-plot level.
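As a rough illustration of the AWC calculation, a sketch assuming the pedotransfer step yields an available water capacity in vol% per layer (the actual KA5 procedure is table-based and more detailed); all column names are hypothetical:

awc_to_150cm <- function(layers) {
  # layers: one row per depth increment down to 150 cm, with
  #   awc_volpct:   available water capacity from the pedotransfer step (vol%)
  #   thickness_cm: layer thickness (cm)
  #   coarse_frac:  coarse fraction (0-1), assumed to hold no plant-available water
  with(layers, sum(awc_volpct / 100 * thickness_cm * 10 * (1 - coarse_frac)))  # result in mm
}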

Ash dieback assessments

All plots contained between 38 and 235 (~ 124 on average) ash trees (> 7 cm DBH) that were visually assessed from the ground twice a year from winter 2021 to summer 2023 according to the ash dieback assessment key by Peters et al. (2021a). The assessments include an ordinal rating of crown damage with six classes from 0 (no ash dieback at all) to 5 (dead tree), integrating vitality and mortality in one scale. Fallen/uprooted trees (deadfall during the observation period 2021–2023) were recorded as well and assigned to the additional damage class 6 (fallen tree). The crown damage classes are defined independently for summer and winter inventories and evaluate the presence and severity of a certain combination of symptoms. The winter key focuses on structural anomalies and necrotic branches, whereas the summer key evaluates a combination of defoliation and structural anomalies. In the original version of the key, discrete ranges of defoliation were assigned to each crown damage level, but this turned out to be inapplicable in the field because structural crown damage and defoliation varied independently to some degree. Therefore, defoliation was estimated separately on a percentage scale in the summer assessments of 2022 and 2023, analogous to the European forest condition assessment on Level I and II plots of the ICP Forests network (Eichhorn et al. 2016). Additionally, the winter assessments included a rating of basal lesions with three classes from 0 (no necrotic tissue visible) to 2 (large lesions) and an estimation of epicormic shoots in the crown as a percentage of the total amount of shoots/branchlets in three classes (< 5%, 5–50%, > 50%).

Two training courses were conducted in the first and second assessment year to validate and maintain the comparability of assessments by different surveyors.

Data handling and statistical analyses

The dependent variables addressed in this study are either ordered categorical (model 1: crown damage classes; model 2: basal lesions rating) or binomial (model 3: deadfall probability). To model the ordered categorical variables (models 1 and 2), mixed ordinal logistic regression models (cumulative threshold models with a logit link function) were applied, using the software R (version 4.2.3) with the function gam(family = ‘ocat’) from the package mgcv (Wood et al. 2016; Wood 2017). Please refer to Kneib and Fahrmeir (2006) for the mathematical background and to Divjak and Baayen (2017) for a detailed description and application of this model class (latent variable thresholds, link function, and estimated response-class-specific probabilities).

Following the structure of the dataset, some independent variables were included a priori: season (winter/summer), year (2021, 2022, 2023; as factor), and their interaction were included in the crown condition model to account for temporal variation and for the methodological differences between the winter and summer key.

A range of predictors (e.g. measured nutrient stocks, climate, measured microclimate, geographic position, management type) was only available at plot level (n = 11) and therefore not included individually in the model. Instead, the sum of their effects is assumed to be captured by the random effect terms at plot level (see below). Only predictors at sub-plot or tree level were considered in the subsequent analysis. The choice of predictors was based on a set of ecologically meaningful variables (hydromorphic soil conditions, AWC, soil nutrient index, ash density index, DBH, humus layer thickness) and backwards selection by means of the Akaike information criterion (AIC). Season, year, hydromorphic soil conditions, DBH, and ash density index were selected for the final model (Model 1, below).
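Backwards selection compared nested candidate models step by step; schematically, one such step could look as follows (a sketch only: m_full stands for a fitted gam containing the full candidate set, and the term names, e.g. awc_z for scaled AWC, are illustrative):

m_drop <- update(m_full, . ~ . - awc_z)  # drop one candidate term (here: AWC)
AIC(m_full, m_drop)                      # the reduced model is retained if its AIC is lower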


Model 1:

crown damage class ~ hydromorphic soil conditions (sub-plot)
                   + DBH (tree)
                   + ash density index (tree)
                   + season
                   + year
                   + season : year
                   + s(PLOT_ID, bs = ‘re’)
                   + s(PLOT_ID, inventory nr, bs = ‘re’)
                   + s(TREE_ID, bs = ‘re’)

Numeric predictors (DBH and ash density index) were centred and scaled by their standard deviations in advance to make their estimates comparable. PLOT_ID and TREE_ID were included as random intercepts to account for spatially dependent measurements at plot level and temporally repeated observations at tree level. Additionally, the centred number of the respective inventory (‘inventory nr’) was included as a random slope within each PLOT_ID to account for plot-specific shifts in crown damage over time.

For model 1, the crown damage classes 5 (dead tree) and 6 (fallen tree) were combined into one category because the order of these classes is not clearly defined. Crown damage classes 0 and 1 were combined as well, because class 0 was very rare and occurred only in a few plots in certain years (four cases in total).
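Putting these pieces together, a minimal sketch of the Model 1 fit in R/mgcv, assuming the merged response is recoded to the integers 1–5 required by the ocat family; all object and column names are illustrative, not the project’s actual data structures:

library(mgcv)

# merge classes 0/1 -> 1 and 5/6 -> 5; classes 2, 3, 4 map to 2, 3, 4
d$cdc_int <- as.integer(cut(d$crown_damage, breaks = c(-1, 1, 2, 3, 4, 6)))

m1 <- gam(cdc_int ~ hydromorphic + dbh_z + adi_z + season * year
          + s(plot_id, bs = "re")          # random intercept per plot
          + s(plot_id, inv_nr, bs = "re")  # random slope of inventory number within plots
          + s(tree_id, bs = "re"),         # random intercept per tree
          family = ocat(R = 5), data = d, method = "REML")

# plot_id and tree_id must be factors; dbh_z and adi_z are the centred and scaled
# versions of DBH and the ash density index; inv_nr is the centred inventory number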

To investigate the occurrence of basal lesions, a temporally more static model (model 2) was chosen, for two reasons: basal lesions were rated only in the winter inventories (n = 3), and correctly tracking their temporal development is difficult (fallen trees could no longer be rated, and the number of detected lesions increased over time due to an observation-time bias). For this purpose, the terms ‘season’ and ‘season:year’ and the plot-specific random slope over time were removed from the model. The AIC-based selection of predictors led to the same three variables (DBH, ash density index, and hydromorphic soil conditions):


Model 2:

basal lesions class ~ hydromorphic soil conditions (sub-plot)
                    + DBH (tree)
                    + ash density index (tree)
                    + year
                    + s(PLOT_ID, bs = ‘re’)
                    + s(TREE_ID, bs = ‘re’)

Model 3 was fitted to test whether the prevalence and severity of basal lesions at the beginning of the observation period were good predictors of the probability of deadfall of a tree. To this end, the probability of a tree falling into the class 6 category (fallen tree) within the 2.5 years of observation was modelled with a binomial model with a logit link function, again using the software R (version 4.2.3) with the function gam(family = ‘binomial’) from the package mgcv (Wood et al. 2016; Wood 2017). The basal lesions rating from the first winter inventory (2021) was set as a categorical predictor, and several ecologically meaningful predictors (DBH, AWC, absence/presence of hydromorphic soil conditions, ash density index, soil nutrient index) were included in the full model. After backwards model selection based on the AIC, only the basal lesions rating, DBH, and hydromorphic soil conditions remained in the model:


Model 3:

deadfall (yes/no) ~ basal lesions class (winter 2021)
                  + hydromorphic soil conditions (sub-plot)
                  + DBH (tree)
                  + s(PLOT_ID, bs = ‘re’)

Model 3 did not include any temporally repeated observations at tree level; thus, only a random intercept at plot level was necessary.
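A corresponding sketch for the binomial deadfall model (one row per tree; again with illustrative names):

m3 <- gam(fallen ~ lesions_w2021 + hydromorphic + dbh_z
          + s(plot_id, bs = "re"),
          family = binomial(link = "logit"), data = d_tree, method = "REML")

# fallen is 0/1 (deadfall within 2021-2023); lesions_w2021 is the factor-coded
# basal lesions rating from winter 2021; predictions on the probability scale:
p_fall <- predict(m3, type = "response")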

The deviance explained by the models was calculated according to Wood (2017) as 1 − (residual deviance / null deviance). To roughly allocate the explained deviance to either parametric or random effects, reduced models without the random effect smooths were compared to the respective full models.
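In R/mgcv terms, this corresponds to the following sketch, where m1_fixed denotes a hypothetical refit of Model 1 without the three random effect smooths:

dev_expl <- function(m) 1 - deviance(m) / m$null.deviance  # equals summary(m)$dev.expl
dev_expl(m1)                        # total explained deviance of the full model
dev_expl(m1) - dev_expl(m1_fixed)   # rough share attributable to the random effects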

To investigate whether the general crown damage trend was positive or negative per IBF, reduced mixed ordinal logistic regression models were fitted to data subsets per IBF:


Model 4:

crown damage class ~ inventory nr
                   + season
                   + inventory nr : season
                   + s(TREE_ID, bs = ‘re’)

The inventory season and its interaction with the inventory number were kept in the model to account for the seasonal differences between the winter and summer inventories besides the general trend over time.
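Schematically, these per-plot models can be fitted by splitting the data by plot (a sketch under the same naming assumptions as above):

m4_per_plot <- lapply(split(d, d$plot_id), function(dd) {
  dd <- droplevels(dd)  # restrict factor levels (e.g. tree_id) to the plot subset
  gam(cdc_int ~ inv_nr * season + s(tree_id, bs = "re"),
      family = ocat(R = 5), data = dd, method = "REML")
})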

Results

The ash dieback assessment key on the monitoring plot network

The crown damage classes of the aforementioned ash dieback assessment key were designed to partition the crown symptomatology of ashes into classes with roughly equidistant cut points. The application of the key on the monitoring plots showed that this does not imply that each class is covered by a similarly sized ash cohort. The majority of ashes were classified within the intermediate to heavily damaged classes (2–4), whereas especially the vital end of the spectrum (0–1) was severely under-represented (Fig. 3). Furthermore, the distribution of class allocation differed between winter and summer inventories: the summer key rates the same cohort of ash trees slightly better (i.e. higher proportions of class 1 and class 2 ash trees). A considerable proportion of ash trees switched by 1–3 crown damage classes between winter and summer inventories, indicating a very different appearance of certain trees depending on the season. The crown damage classes are well correlated with crown defoliation (Fig. 4) and also with the amount of epicormic shoots in the crown (Fig. 5a). Nevertheless, there is considerable variance of defoliation and of the share of epicormic shoots within the damage classes.

Fig. 3

Influence of time and season on the crown damage class distributions (all intensive monitoring plots pooled). For faceted figures per plot, refer to Figs. 9 and 10, Appendix

Fig. 4

Range of observed crown defoliation rates for each crown damage class in the summer inventory of 2023. A univariate ordinal regression (crown damage class ~ crown defoliation) yields 80.4% explained variance

Fig. 5

Shares of (a) epicormic shoot percentage classes and (b) basal lesions classes in each crown damage class

Crown condition: dependency on time, site and stand parameters

The crown damage class distributions were dependent on season and year (Table 1, Fig. 3). Generally, the summer inventories scored the crown condition significantly better than the winter inventories, and each year, the rating became significantly worse. The effect size of summer (vs winter) was comparable to the effect size of the first year (vs the last year). The significant positive interaction of ‘season_summer:year_2022’ indicates that the summer crown condition in the year of 2022 was exceptionally bad (in relation to the general trends).

Table 1 Model 1 (crown damage) summary

Furthermore, crown damage was significantly higher in small ash trees (low DBH) and in ash trees with a high ash density in their close vicinity (Table 1, Fig. 6). The effect size of ash density was twice as high as the effect size of DBH (in relation to the standard deviations of the observed distributions of DBH and the ash density index; cf. Figs. 2 and 8, Appendix). The largest effect could be attributed to hydromorphic soil conditions, where crown damage was significantly higher than under non-hydromorphic soil conditions (Table 1, Fig. 6). The AWC had little to no influence on crown damage (the parameter was deselected during model selection). Thus, stagnant and permanently available groundwater were better predictors of ash decay than the water capacity of the soil matrix. Nutrient availability (soil nutrient index) did not influence crown damage either.

Fig. 6

Model predictions (Model 1) for the occurrence probability of crown damage classes in dependence of ash density, soil water regime, and DBH (diameter at breast height). Soil water regime classes refer to the presence (‘hydromorphic’) or absence (‘non-hydromorphic’) of ground- or stagnant water. Season and year are kept constant at the levels ‘winter’ and ‘2023’. The ash density index corresponds to the values shown in Fig. 8 (Appendix). For model details, refer to Table 1 and the Material and methods section. Note: the coloured lines are slightly smoothed between damage classes for better visibility

Generally, the explained deviance of Model 1 was relatively high (60.3%), but a comparison with a reduced model without random effect smooths showed that a high share of the explained deviance could be attributed to random tree effects (47%) and another minor share to random plot effects (5%). This indicates a very high amount of crown damage variability between individual ash trees that cannot be explained by stand or site properties.

Beyond these general trends, many plots (BW1, BW2, HE1, NI1, SN1, SN2) showed a divergent drift of crown damage classes in their ash populations (Fig. 9, Appendix): over time, the share of relatively vital to mediocre individuals (classes 1–3) increased, whereas the share of extensively damaged individuals (class 4) decreased sharply. These class 4 trees either died/collapsed or improved towards better crown damage classes. Thus, the slight general trend over time towards higher crown damage classes in this study is mainly driven by the increase of dead and fallen individuals. The crown condition of the remaining living cohort of ashes did not generally deteriorate, but even improved significantly on six of eleven plots (Table 2).

Table 2 Trends of crown damage classes (Model 4) for each intensive monitoring plot (IBF)

Basal lesions on ash trunks

Basal lesions were identified on 60% of all ash trees in 2021 and on 70% and 72.8% in 2022 and 2023, respectively; their occurrence was correlated with the crown damage classes (Fig. 5b). For a detailed overview of the distributions of basal lesions classes on all monitoring plots over the years, refer to Fig. 11 (Appendix). Basal lesions were significantly more prevalent and more severe on moister sub-plots, on ashes with a higher ash density in their immediate vicinity, and on ashes with a higher DBH (Table 3).

Table 3 Model 2 (basal lesions) summary

Within the total observation period of 2.5 years, 10.4% of all ash trees collapsed due to a combination of wind action and predisposition by ash dieback and associated wood-decaying fungi. The probability of deadfall (collapse) of a tree depended significantly on its DBH and on the diagnosis of large lesions at the stem base (basal lesions class 2) at the beginning of the observation period (Fig. 7). Hydromorphic soil conditions and small basal lesions (in comparison with no lesions) had only slight, non-significant effects on the probability of deadfall.

Fig. 7

Model predictions for a tree’s risk of deadfall (collapse) in dependence of its diameter at breast height (DBH), the presence/absence of hydromorphic soil conditions, and basal lesions diagnosed at the beginning of the observation period (Model 3). The shaded areas represent 95% confidence bands. The influence of DBH and of basal lesions class 2 (‘large lesions’, in contrast to small/no lesions) is significant at the 95% level

Discussion

Condition of ash trees depending on stand, site, and time

Tree size (expressed here by DBH) was one of the strongest predictors of crown condition and deadfall probability in our study, with larger trees being less affected by the disease. The lower and slower impact of ash dieback on larger trees in comparison with pole-sized ash has frequently been reported (Husson et al. 2012; Chandelier et al. 2016; Lenz et al. 2016; Marcais et al. 2017; Klesse et al. 2021). This effect can most probably be attributed to the greater ability of larger, dominant trees to maintain photosynthesis and to replace dead twigs with epicormic shoots, thereby avoiding carbon starvation as a result of leaf loss (Marcais et al. 2017). To some degree, this effect is amplified by the higher background decay and mortality of pole-sized trees in comparison with older individuals, within stands as well as between stands. However, this cannot explain the higher degree of structural crown damage in younger ash trees. Klesse et al. (2021) likewise concluded that the disproportionate increase in damage and mortality of younger ash trees in the past decade can broadly be attributed to the presence of H. fraxineus.

Crown damage as well as the prevalence and severity of basal lesions were closely connected and depended on similar stand and site factors (ash density and soil water regimes within plots). More severe basal lesions on moister sites have been confirmed in other studies (Husson et al. 2012; Chandelier et al. 2016; Marcais et al. 2016). Worse crown condition and higher mortality were also repeatedly found on edaphically and/or climatically moister sites (Schumacher 2011; Havrdová et al. 2017; Klesse et al. 2021; George et al. 2022). The most commonly assumed driver of this relationship is the dependency of the fructification and sporulation success of H. fraxineus on periods of moist conditions in litter and topsoil during the summer months (Skovsgaard et al. 2017). Additionally, secondary infections with wood-decaying fungi, especially in basal lesions (e.g. by Armillaria species), are presumably facilitated by moister conditions (Marcais et al. 2016; Peters et al. 2023).

Ash share or ash density has also been found to increase ash dieback symptoms in the crown as well as at the stem collar in several studies (Havrdová et al. 2017; Enderle et al. 2018). Higher host density leading to more infected leaves and a higher share of ash rachises in the litter is a possible explanation for the higher sporulation success and infection pressure of H. fraxineus. This is in accordance with the conclusions of Grosdidier et al. (2020) and Klesse et al. (2021) that the disease symptoms are far less severe in forest ecosystems with low ash density and in open canopies (such as hedges and isolated trees). In contrast, George et al. (2022) found no effect of ash share on ash mortality on ICP Forests Level I monitoring plots across Europe. To some extent, this can be explained by the different study populations, since ICP Forests only includes the dominant and co-dominant layers of a stand, whereas the other mentioned studies as well as the present study included ash trees from all layers of a stand (> 7 cm DBH). Additionally, George et al. (2022) used the number of ash trees within the plots as a rough proxy for ash share or ash density, because the dataset of the ICP Forests Level I monitoring plots does not offer area-related measures of density.

By far the highest share of variance in crown damage classes was explained by the random intercept at individual tree level (inferred from the calculated differences in explained deviance between full and reduced models). All parametric terms and plot-specific random terms together explained only a small share of the total deviance. This indicates that the strongest drivers of individual differences in crown condition are most probably not small-scale site or stand properties, nor tree size (DBH), but unknown or unmeasured differences between individuals, such as genotypes and differences in phenotypic plasticity. Heritable genetic variability in susceptibility to ash dieback has been proven in clonal and progeny trials (McKinney et al. 2011, 2014; Kjær et al. 2012; Enderle et al. 2015). Our study reinforces the assumption that genetic differences explain a larger share of susceptibility than site and stand differences.

The general trend over time towards higher crown damage classes in this study is mainly driven by the increase of dead and fallen individuals. The crown condition of the remaining living cohort of ashes did not generally deteriorate, but even improved on six of eleven plots. This indicates a selection process within the respective ash populations.

Evaluation of the ash dieback assessment key

The application of the ash dieback assessment key (Peters et al. 2021a) on the IBF showed that almost no trees fell into the crown damage classes 0 and 1 (no damage and first symptoms) and that most trees were in category 2, 3, or 4. Within the first years of practical application of the assessment key, verbal feedback regarding the classification scale of crown damage already indicated that this might lead to insufficient differentiation in the intermediate classes. In some cases, assessors felt the urge to interpose intermediate classes between the categories (e.g. 2− or 3+). To some degree, this lack of differentiation leads to information loss between the visual inspection of a tree and the data analyses. Nevertheless, the decision to stick to only five classes within the spectrum of living ash trees was a compromise to ease training courses and the transfer into forest practice.

Methodologically, the two observed ash cohorts mentioned in Table 2 (constant observation of the same trees versus observation of a shrinking cohort of surviving trees) represent the extremes of a spectrum, either underestimating or overestimating the general ‘condition’ of an ash stand. Neither is the optimal solution for tracking the crown condition of stands over long periods of time, because background mortality and background succession (particularly unconsidered new individuals entering the > 7 cm DBH cohort of the stands) are not correctly accounted for. However, the number of ashes potentially entering the > 7 cm DBH cohort was extremely low on the plots because of reduced radial increment and high mortality in the younger stages over the past decade of ash dieback. Considering the relatively short observation period of 2.5 years, our approach of including dead ashes on the ordinal scale as the worst possible crown damage class and ignoring background succession may still be valid.

The good correlation of crown damage classes with defoliation rates and epicormic shoots in the crown is generally expected, as epicormic shoots and defoliation are part of the criteria for the classification of crown damage. This can be seen as a confirmation that the damage classes act as intended: as an integrative measure of several ash dieback related symptoms in the crown. This also makes sense in the light of the pathology of the disease: the onset of the disease directly leads to leaf loss and subsequently to the dieback of branches, which in turn leads to growing foliation gaps over the years and the development of epicormic shoots. The remaining variance is mainly driven by the fact that the secondary growth of epicormic shoots fills or masks the foliation gaps to some extent. This results in lower defoliation rates but increasing structural anomalies in branching patterns, which are rated negatively by the ash dieback assessment key. The latter point is also the main reason for the difference in evaluation between summer and winter assessments, as structural anomalies are partly masked by foliated epicormic shoots.

Defoliation rates are the main vitality indicator of various national forest health inventory frameworks and of the European ICP Forests monitoring programme (Eichhorn et al. 2016). The good overall agreement of defoliation rates and crown damage classes (80% explained variance) encourages the comparison of results inferred from the crown damage classes of the assessment key by Peters et al. (2021a, b) with findings inferred from defoliation rates. This includes the retrospective and prospective evaluation of defoliation rates from the ICP Forests Level I monitoring network with respect to research questions regarding ash dieback. Additionally, this finding supports the general application of remote sensing techniques for ash dieback monitoring, because the spectrum of radiation reflected from the canopy surface is strongly linked to defoliation rates (Townsend et al. 2012; Ackermann et al. 2022).

The distinctly higher probability of deadfall of ash trees with large basal lesions (class 2 in the assessment key of Peters et al. 2021a) emphasises the necessity for arborists and foresters to screen for basal lesions whenever infrastructure or working safety is affected.

Conclusion and future directions

Most of the FraxForFuture monitoring plot framework will be maintained to observe the future development of the stands. The plots BB1, NI1, ST1, and HE1 are integrated into the forest condition survey programme and will be assessed analogously to the ICP Forests Level I monitoring network. Additionally, all IBF are scheduled to be part of the potential follow-up project FraxForFuture 2.0, including continued monitoring and possibly planting and management trials.

The main findings (ash density and hydromorphic soil conditions facilitating the disease progress) should be taken into consideration for future ash stand management and possible ash re-cultivation. Pure ash stands or larger ash patches without any admixed species should be avoided in all age stages of ash stands. In forest ecosystems with an already low ash share, the remaining ashes should be fostered, as they have a higher probability of surviving even with moderate symptoms. In the past, forest management practice actively pushed ash towards the moister end of the spectrum of its water demand, because ashes grow faster under ample water supply and tolerate periodically wet sites. In the light of ash dieback, the promotion and fostering of ashes on hydromorphic soils has to be questioned. In particular, cost-intensive future reforestation trials with ashes, possibly with progenies from breeding trials with selected genotypes that are less susceptible to H. fraxineus, should avoid hydromorphic soil conditions.