Introduction

Fire regimes affect long-term vegetation dynamics, taxon extinction risks, the carbon cycle and other ecosystem processes (Clarke et al. 2010b) and shape global biomes (Chuvieco et al. 2008; Archibald et al. 2013; Pausas and Ribeiro 2013). Knowledge of fire ecology is thus fundamental to ecosystem management (Lawes and Clarke 2011). This knowledge is vital given the predicted increase in fire weather under climate change in many ecosystems (Archibald et al. 2010; Bradstock 2010; Clarke et al. 2011; Little et al. 2012; Clarke et al. 2013a). This special issue of Plant Ecology is dedicated to Peter J. Clarke, who made diverse contributions to understanding the ecological and evolutionary responses of plants to fire. The research papers in this special issue have been contributed by Peter’s colleagues and in many instances co-authored by Peter. They reflect the breadth and depth of his contribution to the field.

Understanding and characterising fire regimes

Measuring and assessing the effects of severe fires

Characterising fire regimes and determining how typical fire regimes affect the composition and ecology of different plant communities (Murphy et al. 2013) lie at the heart of fire ecology. Much of Peter’s early work focused on characterising and measuring the attributes of fire regimes and consequent habitat associations (Clarke 2002a, b; Clarke and Knox 2002; Campbell and Clarke 2006; Knox and Clarke 2006; Nano and Clarke 2008; Knox and Clarke 2011, 2012; Clarke and Lawes 2013). Here, Knox and Clarke (2016; this issue) examine the spatial complexities of fire severity and explore the relationships among common methods of measuring fire severity and the utility of these methods for predicting the effects of fire severity on plant ecological processes. Fire severities measured in the canopy, in the understorey and below ground were not strongly coupled, and Knox and Clarke (2016) recommend that fire severity be measured in all three strata to determine the overall fire severity at a site.

Variations in fire intensity, as commonly estimated via severity, can have long-term compositional and structural consequences. For example, in the mixed Eucalyptus–Callitris forests of southeastern Australia, Denham et al. (2016; this issue) showed how alternative structural states resulted from the responses of co-dominant non-resprouting conifers (Callitris spp.) and resprouting Eucalyptus spp. to contrasting burn severities. Under high-severity fires (i.e. high levels of crown scorch and consumption), the post-fire stature of these forests was reduced because resprouting Eucalyptus spp. were top-killed (i.e. resprouting was from basal lignotubers) and regeneration of Callitris spp. after high mortality was confined to seedlings. By contrast, low-severity fires caused less structural change in these two groups, owing to epicormic resprouting and to high survival conferred by thick bark, respectively. Denham et al. (2016) estimated that recovery of mixed forest structure after high-severity fire would require fire-free intervals of several decades, with interim fires of any severity inevitably leading to declines in populations of the more fire-sensitive obligate-seeding Callitris species.

Fire-mediated alternative stable states?

The juxtaposition of flammable and apparently non-flammable vegetation in fire-prone landscapes worldwide has raised questions about how vegetation structure and microclimate interact to promote or prevent fire in different habitat types. Clarke et al. (2014) examined this issue in low-flammability temperate rainforest and adjacent higher-flammability eucalypt forests. Both Just et al. (2016; this issue) and Clarke et al. (2014), studying vegetation mosaics in North America and eastern Australia, respectively, found that the contrasting pyrogenicity of adjacent vegetation types could not be explained by natural selection for more flammable foliage. Instead, a more likely mechanism involves open tree crowns and their associated microclimates, which promote the growth and drying of ground and elevated understorey fuels, keeping the vegetation in a low-moisture state that supports fire spread more often than the closed-canopy vegetation with which it co-occurs.

Knox and Clarke (2012) postulated that warm-temperate rainforest species were resilient to a relatively severe fire regime and that these forests do not represent alternative community states, driven solely by the presence or absence of fire, to the surrounding Eucalyptus-dominated forests. Contrary evidence has been reported from tropical savannas in northern Australia, where fire exclusion apparently promotes expansion of non-eucalypt ‘monsoon rainforest’ species, resulting in the eventual transition of species composition from savanna to monsoon rainforest (Woinarski et al. 2004; Lawes et al. 2011b). Unlike warm-temperate rainforests, monsoon rainforest and tropical savanna are thus likely alternative stable states driven by fire (Clarke and Lawes 2013).

Foliar flammability: adaptation or exaptation?

There is ongoing debate about the adaptive significance of plant traits linked to flammability (Bradshaw et al. 2011; Keeley et al. 2011; Bowman et al. 2014). Mason et al. (2016; this issue) explore this question in common native woody and herbaceous species of New Zealand, where fire has not been a strong selection pressure. They found that foliar flammability increased with increasing foliar surface area:volume ratio (SAV) and nutrient content of the leaves, and with decreasing tissue density, lignin and secondary metabolite concentrations. Their results suggest a general relationship between resource allocation strategies and foliar flammability, demonstrating that selective pressures other than fire can drive variation in foliar flammability. The data of Mason et al. (2016) support the interpretation of foliar flammability in New Zealand species as an exaptation (i.e. not a specific fire adaptation). However, whether flammability traits in other, more fire-prone systems are exaptations is uncertain. Clarke et al. (2014) argued that, in the case of frequently burnt eucalypt forest, foliar flammability was not an adaptation to a crown fire regime, but that eucalypt canopy architecture or the spatial arrangement and retention of fuels may very well be under direct selection from fire.

Plant responses to fire

Plant resilience and resistance fire traits

Plant functional fire traits can be broadly categorised into resilience or resistance traits. In both instances, these traits increase the likelihood of individuals surviving fire. Resilience traits enable recovery of the individual after fire, for example through a vigorous resprouting response (Clarke et al. 2013c), but may also involve fire-cued recruitment from seed (e.g. serotiny; Clarke et al. 2010a, 2013b, 2016a), which increases the long-term fitness of an individual (Causley et al. 2016; this issue). While resilience traits enable recovery after fire, resistance traits such as thick bark (Schubert et al. 2016; this issue) protect individuals from the effects of fire (Lawes et al. 2011a, 2011c), reducing the costs to individuals of recovery after fire. Thick bark has been shown to be associated with ecosystems that are under selection from fire (Lawes et al. 2013; Pausas 2015; Clarke et al. 2016a) and is a likely adaptation to frequent surface fire regimes (Schubert et al. 2016). However, thick bark may arise from selection pressures other than fire regimes (Lawes et al. 2014; Richardson et al. 2015). The utility of bark thickness as an indicator of fire history depends on how local fire regimes affect absolute bark thickness (i.e. what constitutes thick bark in the local context; see Midgley and Lawes 2016; this issue). In this issue, various aspects of plant functional fire resilience and resistance traits are examined.

Lignotubers and basal resprouting: resilience to crown fire regimes?

Lignotubers (woody swellings of the root crown) are considered an adaptive trait in ecosystems subject to frequent and severe disturbances such as fire (Paula et al. 2016; this issue). Lignotubers are associated with a basal resprouting response that enables post-fire regeneration (Burrows 2013; Clarke et al. 2013c), conferring a degree of resilience on individuals that are prone to being top-killed by fire. Species that resprout from lignotubers have a larger bud bank than those resprouting from root crowns (Clarke et al. 2013c), with associated higher energy costs. Large bud banks should therefore be selected for only when they have a significant effect on survival, as they do in fire-prone environments (Paula et al. 2016). Consequently, basal resprouting is particularly prevalent in ecosystems affected by crown fires (Knox and Clarke 2004; Paula et al. 2009; Burrows 2013; Paula et al. 2016).

Bark thickness: a fire resistance trait

Relative bark thickness (RBT) is the proportion of the stem that comprises bark and is a useful metric for determining whether plant species differ in bark thickness. Midgley and Lawes (2016) examine the measurement, analysis and interpretation of species bark thickness trends. They argue that, in contrast to current methods that use stem diameter or radius, RBT should be measured with respect to the bole diameter. In addition, they argue that comparisons of RBT should be made only among small stems (< 10 cm DBH), because the effect of fire, and hence selection for thick bark, is greatest on small stems.
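For reference, relative bark thickness is commonly expressed as the bark fraction of the stem cross-section. A minimal illustrative formulation (a sketch only, not necessarily the exact protocol recommended by Midgley and Lawes 2016) is

\[ \mathrm{RBT}_{\mathrm{stem}} = \frac{2t}{D}, \qquad \mathrm{RBT}_{\mathrm{bole}} = \frac{2t}{D - 2t}, \]

where \(t\) is the single-sided bark thickness and \(D\) is the over-bark stem diameter, so that \(D - 2t\) approximates the under-bark (bole) diameter. One consequence of the bole-based formulation is that the denominator no longer includes the bark being measured; readers should consult Midgley and Lawes (2016) for the recommended measurement and standardisation procedure.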

Few studies have examined bark thickness trends in fire-prone ecosystems other than savannas, preventing generalisation of the relationship between bark thickness and fire activity at the global scale. Schubert et al. (2016) compare standardised RBT trends in trees and shrubs across a large-scale fire-rainfall gradient from desert to dry savanna in northern Australia. Overall, across this dryland gradient, species with thick bark at the sapling stage dominated where fire was frequent. With increasing aridity, there was a shift in dominance from thicker-barked epicormic resprouter tree species to thinner-barked shrub and mallee species that either resprout basally or are killed by fire. An important finding, confirming the utility of bark thickness as an indicator of recent fire history (Pausas 2015), is that bark thickness trends showed no phylogenetic signal: pairwise congeneric species comparisons showed a consistent relationship of thicker bark under high fire activity (Schubert et al. 2016).

Resilience traits: post-fire resprouting response

Many species display resilience to fire through their ability to resprout after fire (Clarke et al. 2013c). In this issue, Schwilk et al. (2016) demonstrate that post-fire resprouting oaks exhibit plasticity in xylem vulnerability to drought. Post-fire resprouts were more vulnerable to drought than adults, and thus estimates of drought tolerance based on adult measurements may underestimate species susceptibility to desiccation. The greater susceptibility of resprouting tissues to xylem embolism could well be a trait trade-off for the rapid growth of resprouting individuals, which enables them to capture resources competitively in the post-fire environment. Although resprouters have lower tissue-specific desiccation tolerance than non-resprouters, they can still tolerate climatic drought as adults.

In tropical northern Australia, rainforest patches are embedded in a matrix of fire-prone tropical savanna. Do rainforest saplings possess traits that enable them to survive in the savanna environment, including recovering from a relatively frequent but low-intensity surface fire regime? Ondei et al. (2016; this issue) compared the post-fire resprouting response of saplings of savanna species, burnt by an ambient early dry-season (low-intensity) fire, with that of rainforest species burnt in an experimentally simulated fire. All rainforest and savanna species resprouted from aerial buds after fire, although mortality of both stems and whole plants was much higher for rainforest than for savanna individuals. After a year, savanna species had regained their original height, while rainforest plants were on average 43 % shorter than their pre-fire height. Thus, although rainforest species were able to recover from the prevailing low-intensity fire regime, they were less able to escape the ‘fire trap’ than savanna species, and may be susceptible to cumulative mortality from successive fires.

Comparative studies of post-fire seeders and sprouters

Three ecological models have been proposed to explain the relative proportions of the post-fire seeder and resprouter life-history types in an ecosystem:

  1. The gap-dependent recruitment model, which focuses on post-disturbance seedling recruitment, proposes that large post-disturbance gaps should favour seedling recruitment over resprouting (Pausas and Keeley 2014).

  2. The disturbance frequency model contends that resprouters and obligate seeders are distributed along gradients in disturbance frequency (Bellingham and Sparrow 2000; Bond and Midgley 2003; Clarke et al. 2015a). In general, as disturbance frequency and severity increase, the model proposes increasing selection for resprouting over seeding.

  3. The resource/productivity model (Clarke et al. 2005, 2013c) proposes that as resource availability increases, the intensity of competition increases, and this favours resprouters, which are presumed to be better competitors.

The findings of Ondei et al. (2016) above provide partial support for the proposition that the composition of plant communities may be driven by responses of key plant resilience traits (resprouting and reseeding) to either resource competition or disturbance regimes (Clarke et al. 2015a). Hammill et al. (2016; this issue) examined this proposition by measuring the responses of overall species richness, and of the richness of herbs and shrubs within the three most common plant resilience functional types (i.e. facultative resprouters R+P+, obligate resprouters R+P-, obligate seeders R-P+; see Clarke et al. 2013c, 2015b), to orthogonal gradients of mean annual temperature (MAT), mean annual precipitation (MAP) and fire frequency (FF) in dry sclerophyll forest in the Sydney basin (south-eastern Australia). The richness of species, of their resilience traits, and of the functional types defined by morphology and the presence of these traits was largely governed by variation in MAP and, to a lesser extent, MAT at the regional scale. Thus the responses of the proportions of species within trait and functional-type groups were inconsistent with the disturbance frequency and resource competition models of resilience variation, possibly challenging the evidentiary base on which these models were proposed (e.g. the effects of ‘productivity’ and disturbance have not been adequately disentangled in past studies).
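For readers less familiar with these trait codes, the functional types combine two binary fire responses: whether a species resprouts after fire (R) and whether it recruits from seed after fire (P). The following minimal sketch (in Python; the function name and the fourth R-P- label are hypothetical additions for illustration, not part of the published classification) shows how the labels used above map onto these two traits:

# Illustrative only: mapping two binary fire-response traits to the
# resilience functional-type labels discussed in the text
# (following the scheme attributed here to Clarke et al. 2013c, 2015b).
def resilience_functional_type(resprouts: bool, post_fire_seeder: bool) -> str:
    if resprouts and post_fire_seeder:
        return "R+P+ facultative resprouter"
    if resprouts:
        return "R+P- obligate resprouter"
    if post_fire_seeder:
        return "R-P+ obligate seeder"
    # Hypothetical fourth category, not analysed in the study summarised above
    return "R-P- neither resprouting nor fire-cued seeding"

# Example: a species that resprouts but shows no fire-cued recruitment from seed
print(resilience_functional_type(resprouts=True, post_fire_seeder=False))  # R+P- obligate resprouter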

In a similar vein, Keeley et al. (2016; this issue) examined post-fire seeder and resprouter syndromes within the large woody genus Arctostaphylos in the highly fire-prone Californian chaparral. Keeley et al. (2016) inferred that the more open communities occupied by seeders offered larger gaps for seedling recruitment after fire, which is consistent with predictions from the gap-dependent recruitment model. They found no support for either the disturbance frequency or the resource/productivity model, because there was no consistent pattern between fire frequency, between-fire intervals or limiting resources (precipitation) and the relative proportions of post-fire resprouters and seeders.

Resprouting is spatially, ecologically and phylogenetically widespread and is consequently regarded as the ancestral state in most lineages; the loss of resprouting in favour of post-fire seeding is assumed to be a derived trait (Pausas and Keeley 2014). It has been proposed that seeder species should have greater genetic diversity than resprouters because they have shorter generation times (Yue et al. 2010) and faster population turnover, which in turn result in high rates of molecular evolution and diversification (Wells 1969). Furthermore, rates of molecular evolution and diversification could be reduced in long-lived resprouters if they accumulate somatic mutations that suppress reproduction (Lamont et al. 2011). Ojeda et al. (2016; this issue) confirmed the ancestral status of the resprouter form in discrete obligate seeder and resprouter populations of Erica coccinea in fynbos in the southwestern Cape Floristic Region of South Africa. They also found comparatively higher rates of molecular evolution in populations of derived seeders than in resprouter populations. These findings contrast with those of Verdú et al. (2007), who found that seeders from a Mediterranean heath ecosystem had neither higher rates of molecular differentiation nor higher diversification than resprouters. Verdú et al. (2007) argue that the lack of differences in molecular rates in their study is due in part to the alternation of generations in plants, which may purge genetic load so that somatic mutations are rarely passed on to the germline (Dickinson and Grant-Downton 2009). The conflicting evidence reflects ongoing uncertainty and debate about the role of fire as a driver of diversification in fire-prone ecosystems.

Under climate change, rising atmospheric CO2 concentrations may alter resource allocation patterns in plants, which in turn may alter resprouting responses to disturbances and the relative proportions of resprouter species in affected plant communities (Clarke et al. 2013c). For temperate shrubby ecosystems, Clarke et al. (2016b; this issue) compared resource allocation patterns in congeneric species pairs of shrubs with contrasting resprouting abilities under ambient and elevated CO2 levels. Clarke et al. (2016b) conclude that although elevated CO2 may not affect resprouting ability directly, it may enhance other aspects of persistence, such as stem growth rates and bud protection, and thus the capacity to ‘escape’ fires; non-resprouters may also benefit by setting seed more quickly and increasing seed production, thus enhancing their recruitment after fire.

Recruitment responses to fire

Fire-stimulated recruitment from seed (in post-fire seeders) may result from several mechanisms, one of which is the delayed, fire-triggered release of seeds stored in the canopy (serotiny). Causley et al. (2016; this issue) found that fire enhanced the fitness of serotinous species more than did plant death from drought. Causley et al. (2016) concluded that serotiny/pyriscence enhances species fitness by releasing seeds into an optimal post-fire habitat, supporting the proposition that serotiny is a specific adaptation to fire.

Seed predators, fire and soil nutrients have all been suggested as selective forces influencing serotiny (Lamont and Enright 2000). Clarke et al. (2013b) tested whether protection of seeds and/or synchronised dispersal were associated with different levels of serotiny, and if resprouting ability influenced selection for strong serotiny. They compared the numbers and abundance of 146 woody species with delayed seed release among five community types, ranging from rainforests to heathlands, and varying in combinations of fire severity, fire frequency, soil fertility and seed predators. Both protection of seeds and synchronised seed release were related to fire effects in nutrient-limited environments. Serotiny was most prevalent in low nutrient shrublands characterised by severe fires, and least prevalent in high nutrient, low flammability forests. Strong serotiny was prominent in species killed by fire, whereas weak serotiny was more common in resprouting species. Clarke et al. (2013b) argued that recruitment failure in the inter-fire interval drives selection for strong maternal protection of seeds and synchronised seed dispersal in fire-prone environments, while weak serotiny is a bet-hedging strategy in species that rely mainly on resprouting after fire for population persistence.

Many plants in fire-prone environments have limited dispersal ability and thus rely on in situ mechanisms such as serotiny to persist through fire. The regenerative phases of these typically post-fire seeder species, such as seed dispersal, germination and seedling establishment, define their environmental niches and are cued to fire events. Keith and Myerscough (2016; this issue) showed that, within the widespread serotinous Banksia spinulosa group, thermal germination niches, defined by the germination response to temperature, varied among source populations, suggesting local adaptation or other mechanisms of differentiation. Variation was greater within source populations of taxa from warm climates than within those from cooler climates, suggesting varied resilience to the warming expected under climate change. In spite of the adaptive potential of these serotinous species, changes to current fire management strategies will be necessary to ensure their in situ persistence (Keith and Myerscough 2016).

In some fire-prone systems, especially dryland ecosystems where fires are infrequent but nevertheless exert significant selection pressure on plant life-history strategies, species may display a range of post-fire recruitment patterns. Wright et al. (2016; this issue) investigated variability in the post-fire recruitment of mulga (a fire-killed shrub with a soil seed bank) in relation to fire severity and soil heating during fire. Maximum germinability of mulga seeds occurred when soil was heated to between 80 and 100 °C, and these temperatures were achieved only in high-severity fires. Less severe fires produced lower post-fire recruitment, but the response was highly variable among sites. Wright et al. (2016) propose that this variable pattern of post-fire seedling recruitment may be a risk-spreading strategy in response to unpredictable fire severity, with high recruitment levels offsetting the high adult mortality that follows high-severity fire.

Fire is an important factor driving the position and stability of ecotones between fire-prone and less flammable forest types. In eastern Australia, Campbell et al. (2016; this issue) examined seedling recruitment and survival of shrub species in the wet sclerophyll forest ecotone between highly flammable dry sclerophyll forest and low-flammability rainforest. These ecotonal communities are dominated by tall open-canopy eucalypts, making them more fire-prone than rainforests. However, their understoreys may support a number of rainforest species and retain moisture for long periods, making them less prone to frequent fire than dry sclerophyll forests (Knox and Clarke 2012). For characteristic rainforest shrubs, Campbell et al. (2016) found no evidence that herbivores or desiccation regulated seedling recruitment in undisturbed wet sclerophyll forest. However, post-fire herbivory and water stress caused high rates of seedling mortality when these wet sclerophyll ecotones were burnt. In contrast, shrubs characteristic of flammable dry sclerophyll forests recruited and survived in both burnt and unburnt forests. Campbell et al. (2016) argue that, consistent with a lower risk of fire in wet sclerophyll forest, the ability of shrub species in these ecotones to recruit mainly in undisturbed conditions provides insurance against recruitment failure under post-fire conditions. This capacity to spread risks is important to maintaining boundary stability and plant diversity in these ecotonal systems.

Concluding remarks

The diverse contributions in this special issue document a rapidly advancing and broadening understanding of plant fire responses and their evolutionary drivers. Peter Clarke’s prolific research in fire ecology in no small way reflects the breadth and depth of his influence on these advances. Most notable are his expertise and active research, which span ecological processes and patterns from individual plants and their populations to plant communities and the landscapes and climates in which they occur. The authors of these papers include Peter’s long-term collaborators and generations of students. This special publication honours Peter’s lasting achievements and contributions.