Introduction

Wildfire presents risks to communities, landscapes, and watersheds and to those who respond to and attempt to manage it. Risk—broadly defined as the effect of uncertainty on objectives—is thus an inescapable aspect of wildfire management [1,2,3]. This framing of risk makes clear the need to explicitly address uncertainty, which can manifest in various types and stem from various sources (e.g., fire weather, suppression resource productivity) [4]. Further, this framing makes clear that risk is a value-laden concept predicated on defined objectives and that objectives can be positively or negatively impacted (e.g., fire can enhance or degrade forest health).

Risk management (RM)—a set of coordinated activities to direct and control an organization with regard to risk—has become something of an organizing framework for wildfire management, with applications ranging from programmatic budgeting to fire prevention, fuel reduction, community planning, and broader topics such as performance, communication, and governance [1, 5,6,7,8,9,10,11,12, 13•]. Wildfire management is rich with opportunities to apply and refine RM acumen—organizations around the globe implement RM practices as a matter of routine. As one example, the Australasian Fire Authorities Council adopted International Standard 31000 Risk management—principles and guidelines [1] as a guidepost for all firefighting operations [14]. As another, the USDA Forest Service describes RM as a required core competency for fire managers, and promulgates an RM protocol to guide assessment, analysis, communication, decision-making, review, and learning [15]. RM can help these organizations increase the likelihood of achieving objectives, establish a reliable basis for decision-making and planning, efficiently allocate and use resources, improve operational effectiveness and safety, and improve organizational learning [1,2,3].

Here, we limit our review to evaluating RM concepts and principles in the context of wildfire response, i.e., the development of a response strategy and its operational execution over the duration of an active fire incident from detection to containment. Strategies can range from full suppression to managing for ecosystem benefit, depending on a variety of factors like relevant policies, land ownership patterns, potential socioeconomic and ecological impacts, fire growth potential, and availability of resources. Important components of response strategies include mobilizing/demobilizing fire management resources, allocating and assigning resources to various tasks (e.g., line construction, structure protection, mop-up), and monitoring and updating strategies in response to changing conditions. As the complexity, duration, or size of fires increases, response strategies may increasingly blend direct and indirect tactics, mobilize a greater amount and diversity of ground and aerial resources, and require coordination of a wide variety of activities such as locating drop points and conducting burnout operations [16].

These incident response decisions can be complex, uncertain, and time-pressured, and they require balancing tradeoffs across many dimensions (e.g., fire impacts, suppression expenditures, and public and responder safety), which highlights the need for structured and timely decision support [3, 16,17,18]. Indeed, there is a long history of applying science to support wildfire response decisions, with much of the effort focusing on predicting expected wildfire behavior given existing fuels, topography, and forecasted weather [e.g., 19,20,21]. Continued advances in spatial risk assessment methodologies provide additional information on the potential consequences to highly valued resources and assets, which can help guide fire management prioritization and decision processes [22, 23]. Further, researchers are developing tools to improve situational awareness and operational responder safety [24,25,26]. However, reviews from around the globe on decision support and fire modeling collectively point to a lack of systems that provide empirically credible and operationally relevant information on the effectiveness of alternative suppression strategies and tactics [27,28,29,30,31].

For the purposes of this review, three interrelated principles of RM are particularly relevant: that RM is part of decision-making; that RM is based on the best available information; and that RM facilitates learning and continual improvement. The emphasis on decisions recognizes that performance is inextricably tied to decisions. As Blenko et al. [30] note, and with clear analogies to wildfire management, “An army’s success depends at least as much on the quality of the decisions its officers and soldiers make and execute on the ground as it does on actual firefighting power.” The emphasis on decisions is also based on the fact that, in an uncertain world, bad outcomes can result from well-made decisions, and vice versa [31]; hence the need to emphasize the quality of decisions, and further to understand how characteristics of decisions and the environment in which they are made may influence decision quality [32].

The emphasis on best available information supports risk-informed decision-making and reinforces decision quality. Three general challenges here are generating relevant and actionable information, making that information available to decision makers, and convincing them to seek it out and incorporate it into their decision processes [33, 34]. Key information for making a high-quality strategic response decision is how response alternatives differ in terms of relative safety, effectiveness, and efficiency [31], and these elements are a primary focus of this paper. Factors such as fire weather, fuel type, and total amount of resources have been shown to play a role, but relationships between actual suppression activities and the effectiveness of suppression resources remain unclear [35,39,40,41,42,43]. Plucinski [42••, 43••] comprehensively reviewed the state of knowledge regarding suppression effectiveness and identified multiple prominent gaps, including the effectiveness of different suppression resources used in different suppression techniques, operational fire behavior limits on suppression, use and productivity of all types of suppression resources, how and whether suppression resources can work synergistically, the impact of fuel management on suppression effectiveness, and the impact of protective actions on structure loss.

In practice, the limited scientific evidence and decision support functionality for generating and evaluating response alternatives mean that these decisions are largely left to expert judgment and intuition. Decisions left to expert judgment can often be excellent, but the dynamic and unpredictable nature of the fire environment, coupled with the lack of structured feedback on operational effectiveness, likely compromises the performance of even expert decision makers [44, 45]. For example, studies from Australia and the USA have demonstrated that fire managers as well as emergency managers are susceptible to a number of cognitive errors and biases, and may exhibit wide variation in risk preferences and judgment [14, 46,50,51,52,53,54]. Further, empirical observations suggest possible inefficiencies in operations due to ineffective or excessive use of resources [53,57,58,59,60,61,62].

Lastly, the emphasis on learning and continual improvement relates to developing systems to create and transfer knowledge, as well as systems of accountability to monitor performance and correct or reinforce behavior to achieve better outcomes over time. Here, the behavior of the organization, rather than that of the individual manager, is of even greater importance, requiring concrete steps to develop learning processes that generate, collect, interpret, and disseminate information [61]. One challenge concerns the organization’s capability to improve data collection, develop performance indicators, and adopt new technologies and innovations. Meeting this challenge has proven difficult. In the USA, for example, federal agencies have been criticized for insufficient data fidelity and reliability and for limited analytic capability to understand effectiveness or to inform decision-making [62].

Plucinski [43••] succinctly summarized the state of affairs, stating that there remain many gaps in the current understanding of suppression effectiveness, that the data to fill these gaps are not routinely recorded, and that adoption of technologies such as resource tracking will be essential to capture more and better data streams. In other words, limited collection, documentation, archiving, and analysis of operational data on response effectiveness over time have inhibited systematic learning. This, in turn, has limited meaningful and measurable expansion of the knowledge base on the effectiveness of alternative response strategies and tactics to inform wildfire response decision-making. The net results are decision aids of bounded operational utility, which may lead to disproportionate reliance on expert intuition over scientific and organizational evidence (with predictable decision biases and possible operational inefficiencies), and gaps in fully putting the three key RM principles into practice.

A central thesis of this paper is that a stronger emphasis on data-driven decisions and analytics would help bridge those gaps. To that end, we aim to expose readers to principles and insights from the analytics literature that are relevant to wildfire management and emergency response. We also synthesize relevant fire literature from around the globe, primarily drawing from studies published by authors in North America, the Mediterranean, and Australia.

Ultimately, we argue for a paradigm based on stronger adoption of data-driven decisions in fire management that we colloquially refer to as “Moneyball for fire” [63], inspired by the innovative use of advanced data analytics in professional baseball and other sports (Box 1). The improvements in sports analytics started from the ability to conduct more complex analyses of recorded performance data. Real-time tracking in sports evolved from these earlier successes, which has opened the door for more analysis, insight, and innovation, and has fundamentally transformed the games in unexpected ways (see https://www.ted.com/talks/rajiv_maheswaran_the_math_behind_basketball_s_wildest_moves). Although fire management organizations collect a considerable amount of data related to wildfires, robust data on fire response and suppression resource performance are lacking [28, 42••, 43••, 59]. Thus, the core analogy is not necessarily about making real-time strategic adjustments in response to a changing fire environment or an adversary’s changing game strategy, but rather about the performance gains that come from preparatory investment in real-time monitoring and analysis, which provide a better informational basis for everything from strategic planning to real-time response decisions.

The remainder of the paper is structured as follows. First, we briefly introduce some core analytics concepts and an analytics management framework, along with some observations on implementing an analytics agenda within organizations. Next, we offer our perspective on the nexus between RM and analytics in wildfire response, present a stylized figure linking an RM decision cycle to the three main types of analytics, and describe potential applications of descriptive, predictive, and prescriptive analytics in wildfire. We then attempt to ground some of these ideas by relating real-world applications of analytics to support strategic wildfire decisions, using examples from the western USA with which the authors have direct experience. We present these real-world examples in light of the aforementioned analytics management framework. Lastly, we discuss future opportunities and challenges for the fire management and science communities.

Box 1

“Moneyball” for Fire: Lessons from the Sports Analytics Revolution

Sports organizations around the world are leveraging analytics and evidence-based management to improve performance. What lessons from the sports analytics revolution can be applied to wildfire management?

Popularized by the book Moneyball: The Art of Winning an Unfair Game by Michael Lewis [63] (and the movie of the same title), the idea of “moneyball” is simply the adoption of data-driven decision-making to improve sports performance. Often the brilliance of sports analytics is in its simplicity. In baseball, for example, a key insight was to evaluate players on the basis of their on-base percentage (OBP) rather than their batting average, as it was demonstrated that OBP was a better predictor of ability to score runs. In basketball, it was the realization that the expected number of points per shot was higher for three-point shots than almost all other two-point shots, apart from those taken very close to the basket.

In a contentious scene in the movie, a scout expresses his dissatisfaction with the analytics approach, stating that it is not possible to put a team together with a computer, and that science could never replace his experience and his intuition. This is, of course, a false narrative—the whole point of sports analytics is that it is not an either/or situation—it is intended to complement, not replace, expert judgment. Furthermore, coaches still need to make game-time adjustments, and players still need to execute. Analytics can help teams better prepare for strategic decisions and their execution.

Three Main Types of Analytics Applied to Personnel Decisions in Sports, and Analogies to Fire

Player acquisition ➔ Suppression resource acquisition

Game strategy ➔ Wildfire response strategy

Training (fitness, game situations) ➔ Training (fitness, fire simulations)

Importantly, these three personnel decisions build upon and are dependent upon each other. For example, teams select personnel and design their training regimen in order to successfully implement the most effective strategies.

Key Lessons

Engaging experts and analysts to work together to solve organizational problems

Defining approachable and understandable analytics

Investing in more and better data

Keeping the human element front and center

Sources: [64, 65]

Why Analytics?

We organize our discussion around basing decisions on rigorous analysis and analytic insight over intuition, and on the key linkages between analytics, decision-making, and organizational performance [66•]. Analytics shares a common underlying purpose with operations research and management science: improving business operations and decision-making through the use of information, quantitative analysis, and technology [67]. In general, data-driven decisions tend to be better ones, and organizations with comparatively stronger analytics capabilities tend to outperform their peers and competitors [68, 69•]. Analytics can be a tool to facilitate improved decision-making, to measure performance, and even to measure improvements in performance due to adoption of analytics-based management.

Timely provision of analytics can be a key driver of success in many areas, and real-time analytics have been developed for a range of time-sensitive applications including financial market trading, military operations, smart electrical grids, intelligent transportation systems, and, most relevant here, emergency response [70, 71]. The increasing use of big data in analytics is often characterized in terms of the volume, variety, and velocity of data, and technologies are emerging that can rapidly ingest and analyze large quantities of data from sources as variable as mobile phones and call records, social media activity, wearable devices, satellites, remote sensors, and crowdsourcing platforms [72]. Location-specific data is critically important for emergency response, for instance, to track first responders with respect to changing conditions and emerging threats [73]. Machine learning in particular shows growing potential for emergency response applications due to its ability to assimilate multiple data sources and types and to generate insights from complex and dynamic environments. Hong and Akerkar [74] comprehensively review machine learning approaches for emergency response, covering a range of emergencies including earthquakes, hurricanes, floods, landslides, and wildfires, and a range of tasks including event prediction, early warning, event detection and tracking, and situational awareness. Real-time analytics are a core component of a broader effort to capitalize on state-of-the-art big data analytics and advanced technology to provide improved insights for accurate and timely emergency response decision-making [75].
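As a concrete, if simplified, illustration of the kind of location-aware analytic such systems might include, the following sketch flags responder positions that fall within a buffer of a reported hazard. The crews, coordinates, and alert radius are hypothetical; the example illustrates one small building block of situational awareness, not any operational tracking system.

# Minimal illustrative sketch (hypothetical crews, coordinates, and alert radius;
# not an operational safety system): flag responder GPS pings that fall within a
# buffer of a reported hazard location.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical stream of crew positions and a reported spot-fire location
crew_pings = {"Crew-7": (37.62, -119.85), "Engine-3": (37.55, -119.70)}
spot_fire = (37.60, -119.83)
ALERT_RADIUS_KM = 3.0

for crew, (lat, lon) in crew_pings.items():
    distance = haversine_km(lat, lon, *spot_fire)
    status = "ALERT" if distance <= ALERT_RADIUS_KM else "ok"
    print(f"{crew}: {distance:.1f} km from reported spot fire [{status}]")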

If not already apparent, key elements of embracing analytics are investing in better data and better science. Of course, the notion of needing better data and science to support wildfire response decisions is not new [27, 28, 29, 31, 42••, 43••, 59]. What is new, arguably, is embedding these issues within a coherent, principles-based framework that recognizes better data as but one of many steps towards improved decision-making and performance, i.e., an analytics management framework. Key drivers of analytics success include clear goals, focused problems to solve, quality data from multiple sources, multidisciplinary analytics teams, accessible analytics systems, data translators, collaborative decision-making processes, end users as advocates, and iteration and continuous improvement [64]. Modern analytics is also being driven by new technologies, especially advances in computer science related to analyzing large, dynamic datasets in real time, but new technologies can fail to have impact if they are not deployed in an effective organizational framework.

Table 1 presents the nine components of an analytics management framework, which can also be thought of as comprising an iterative cycle. The framework makes clear required core competencies of a successful analytics program: strategic—planning for how data will be used to help solve problems and achieve organizational goals; technical—organizing the people, processes, and technologies required to manage and analyze data; and managerial—communicating data, applying it in decision-making, and using it for continuous improvement [64]. This framework emphasizes the broader connections to people and process and even culture, as well as the path dependency of data to insight to value [66•].

Table 1 Analytics Management Framework. © 2019 Ben Shields, MIT Sloan School of Management

Translating analytics insight into action requires more than simply setting up data collection systems connected to a team of data analysts; embracing analytics may require a broader “data-driven cultural change” predicated on the existence of an analytics strategy, strong senior management support, and careful change management initiatives [76••]. That is, the value of data analytics comes not just from the technologies that enable it but also from the organizational shifts in behavior and enhanced capabilities for strategic insight and performance measurement [77]. The underpinning of this shift is an acknowledgement that analytics is needed in addition to expert judgment and experience. Similarly, Davenport [78] stresses that the right technology is but one aspect of a successful analytics initiative; the right focus, the right people, and the right culture are also essential. A final caveat, perhaps the most important one, is that better data will not necessarily lead to better decisions. Shah et al. [79•] caution that “unquestioning empiricists” who trust analysis over judgment may be no better than “visceral decision makers” who go exclusively with their gut. The challenge is to develop “informed skeptics” who possess strong analytic skills and who can effectively leverage both judgment and analysis in decision-making. Human judgment therefore remains front and center in the context of embracing analytics to improve decision-making.

The Risk Management and Analytics Nexus in Wildfire Response

In this section, we outline concepts from RM and analytics and their application to wildfire management. Boxes 2 and 3 summarize core themes of each topic, respectively. As we see it, the nexus of the two topics includes developing fluency with uncertainty and probability, emphasizing structured decision-making, committing to generate and use the best available information, monitoring performance, and iteratively improving these core competencies over time. Furthermore, this nexus is defined by an emphasis on people and culture. For our purposes here, we can contextualize and distill the nexus as follows: providing more and better operationally relevant information on the safety and effectiveness of suppression strategies and tactics, using that information formally and transparently in fire managers’ decision processes, and tracking decisions and actions comprehensively in relation to strategic response objectives and fire outcomes.

Box 2

What Is Risk Management?

Risk management (RM) is a set of coordinated processes and activities that identify, monitor, assess, prioritize, and control risks that an organization faces.

What Are the Different Levels of Risk Management?

Enterprise ➔ Strategic ➔ Operational ➔ Real-time

What Are the Main Principles of Risk Management?

Integrating RM into all organizational processes, including decision-making

Explicitly accounting for uncertainty

Addressing problems in a systematic, structured, and timely manner

Basing decisions on the best available information

Tailoring processes to context, and accounting for human and cultural factors

Promoting transparency and inclusiveness

Being dynamic, iterative, and responsive to change

Facilitating continual improvement

How Is Risk Management Different from “Business as Usual”?

Informal ➔ Formal

Implicit ➔ Explicit

Intuitive ➔ Analytical

Reactive ➔ Proactive

Short-term perspective ➔ Long-term perspective

Sources: [1,2,3]

Box 3

What is Analytics?

Analytics is the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions

How Does Using Analytics Improve Performance?

Data ➔ Insight ➔ Value

What Are the Main Principles of Analytics?

Treating fact-based decision-making as not only a best practice but also a part of culture

Recognizing the value of analytics, and making their development and maintenance a primary focus

Applying sophisticated information systems and rigorous analysis to a range of functions

Considering analytics to be so important it is managed at the enterprise level

Avidly consuming data and seizing every opportunity to generate information

Emphasizing the importance of analytics internally

Making quantitative capabilities part of the organization’s story

Creating a workforce with strong analytical skills and considering it a key to organizational success

How Do Data-Driven Organizations Act Differently?

The first question a data-driven organization asks itself is not “what do we think?” but “what do we know?”

Decision makers move away from acting solely on hunches and instinct, and move away from citing data to support decisions already made

Sources: [69•, 78]

Figure 1 presents a stylized RM decision cycle and its relationship to descriptive, predictive, and prescriptive analytics. Descriptive analytics use methods such as statistical modeling and data mining to provide insight into what happened in the past. This would include real-time and post hoc monitoring of suppression operations to gauge effectiveness and the development of performance measures related to resource use, productivity, and effectiveness [54, 56, 59]. Predictive analytics use techniques such as forecasting and machine learning to assess what might happen in the future. This would include predictions of fire weather, fire behavior, and potential control locations [80, 82, 84]. Prescriptive analytics use operations research methods such as optimization and simulation to recommend efficient solutions. This would include assigning suppression resources to tasks such as asset protection or fire line construction [83, 85, 87]. Note that some applications can entail combining multiple types of analytics, for example, comparison of observed fire size and impacts (descriptive) with simulated size and impacts in the absence of suppression (predictive) to estimate the productivity and effectiveness of suppression operations [86]. Table 2 provides an illustrative set of examples of descriptive, predictive, and prescriptive analytics in the context of wildfire response strategy and operational execution.
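As a simplified illustration of that last point, the following sketch pairs a descriptive record of an observed fire with a predictive, no-suppression simulation to derive a basic indicator of suppression effectiveness. The data structure, field names, and values are hypothetical and do not represent the method of [86].

# Minimal sketch (hypothetical fields and values): pairing an observed fire record
# (descriptive) with a no-suppression simulation (predictive) to derive a simple
# indicator of suppression effectiveness.
from dataclasses import dataclass

@dataclass
class FireRecord:
    fire_id: str
    observed_area_ha: float    # descriptive: mapped final fire size
    simulated_area_ha: float   # predictive: modeled size in the absence of suppression
    line_built_km: float       # descriptive: total fire line constructed

def area_avoided_ha(rec: FireRecord) -> float:
    """Area the fire did not burn relative to the no-suppression simulation."""
    return max(rec.simulated_area_ha - rec.observed_area_ha, 0.0)

def avoided_area_per_km_line(rec: FireRecord) -> float:
    """A crude productivity indicator: avoided area per kilometer of line built."""
    return area_avoided_ha(rec) / rec.line_built_km if rec.line_built_km > 0 else float("nan")

example = FireRecord("FIRE-001", observed_area_ha=12500.0, simulated_area_ha=31000.0, line_built_km=85.0)
print(f"Avoided area: {area_avoided_ha(example):,.0f} ha")
print(f"Avoided area per km of line: {avoided_area_per_km_line(example):,.1f} ha/km")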

Fig. 1

Stylized three-stage RM decision cycle and relationship to the three main types of analytics; adapted from [31, 67]

Table 2 Examples of analytics in the context of wildfire response strategy and operational execution, adapted from [16, 42••, 87]. Italic text indicates predictive analytics highlighted in this paper that have recently been developed and applied in the western USA. Bold text indicates analytics that are either known knowledge gaps (descriptive) or that have had limited or no application in real-world contexts (predictive and prescriptive)

The usefulness of predictive and prescriptive analytics is predicated on reliable descriptive analytics on fire and fire operations. Descriptive analytics can be used to validate predictive models, which in turn can be used to parameterize prescriptive models. Developing a road map to enhance descriptive analytics is therefore critically important. Thompson et al. [31] and Plucinski [42••] outline operational data collection needs, much of which relies on obtaining information from fire crews. However, there is a broader horizon for data capture that could capitalize on advanced technologies like the internet of things and machine-to-machine communication. These ideas are related to the Industry 4.0 initiative (referencing a fourth industrial revolution), the goals of which are to achieve higher levels of operational efficiency, productivity, and automation [88]. These ideas are already being integrated into forest operations and supply chain management, for instance, through the installation of sensors and on-board computers on harvest equipment to provide decision support and streamline operations [89]. Gingras and Charette [90] introduce a Forestry 4.0 initiative seeking to, among other things, improve communication networks in remote locations and enable real-time data exchange between operators and decision centers. Through a combination of technologies such as radio frequency identification, satellite- and drone-based remote sensing, and sensors and wireless communications added to suppression machinery, it may be possible to create a similar Fire 4.0 initiative.

In the context of supporting strategic and tactical response decisions, predictive analytics might provide the most value. Clearly, fire behavior predictions are essential, and the literature is rich with descriptions of fire behavior models, their applications, and their limitations [19, 20, 91, 93, 95]. Here, we focus instead on analytics more specifically related to operations, which in broad strokes can help assess factors related to safety and effectiveness, for instance, identifying locations to avoid for safety reasons or locations where control opportunities are most likely to succeed. Researchers have sought to incorporate factors such as fire intensity, heat exposure, safety zones, snag hazard, egress, accessibility, and mobility into these analytics. Table 3 briefly summarizes some of these recent studies.

Table 3 Illustrative set of studies that generate predictive analytics to guide and inform safe and effective response operations

As described earlier, the growing use of machine learning techniques (including classification and regression trees, artificial neural networks, evolutionary computation, and support vector machines) could provide rich opportunities for improving predictive models. Machine learning techniques have some advantages over traditional approaches like generalized linear models due to their ability to handle complex problems with multiple interacting elements and, increasingly, big data streams [99, 100]. They can also require large amounts of data, some of which may come from modern data capture technologies; accounting for extremely rare events may additionally benefit from international collaboration and data sharing.
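As a simplified illustration of this class of model, the following sketch trains a random forest classifier on synthetic data to estimate the probability that a candidate control line segment holds, given a handful of landscape predictors. The features, data, and fitted relationships are entirely hypothetical; the sketch shows the general workflow rather than any published model.

# Minimal sketch, not a validated model: train a random forest on synthetic data to
# illustrate how landscape features might be used to predict whether a candidate
# control line segment holds. Feature names and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000

# Hypothetical predictors for candidate line segments
X = np.column_stack([
    rng.uniform(0, 45, n),      # slope (degrees)
    rng.uniform(0, 60, n),      # fuel load proxy (t/ha)
    rng.uniform(0, 5000, n),    # distance to nearest road (m)
    rng.uniform(0, 1, n),       # fraction of segment in sparse fuels
])

# Synthetic "held/breached" outcome loosely tied to the predictors
logit = -0.05 * X[:, 0] - 0.03 * X[:, 1] - 0.0005 * X[:, 2] + 3.0 * X[:, 3]
y = (logit + rng.normal(0, 1, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-probability model AUC on synthetic test data: {auc:.2f}")

In practice, such a model would need observed engagement and hold outcomes of the kind discussed in the fire line effectiveness literature as training labels, along with mapped landscape predictors, rather than synthetic data.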

Table 4 presents an illustrative set of machine learning applications in wildfire. One observation is the lack of models of fine-scale suppression operations, which presents a possible opening for future machine learning applications once such data are available. Future applications could, for example, train models to predict where fire managers will build line, or where operators will locate water and retardant drops. At larger scales, models could predict emerging resource needs to support prepositioning and dispatch prioritization.

Table 4 Illustrative set of studies that apply machine learning techniques to wildfire

Because of nested dependencies, prescriptive analytics are currently perhaps the most constrained and have the least direct connection to on-the-ground fire management operations [27, 29]. Prescriptive modeling often serves the role of integrating descriptive data and predictive results through mathematical formulations that reflect operational constraints and optimally achieve management objectives. By design, complicated decision rules and logic can be built into the model to guide the search for optimal solutions, and uncertainty can be accounted for with probabilistic rather than deterministic frameworks. Such a system can be difficult for humans to track due to its complexity and the potential tradeoffs between alternative solutions. Moreover, parameterizing a prescriptive model with unreliable data can lead to infeasible, unreasonable, or suboptimal decisions. Once the system suggests a decision, it is often difficult for managers to intuitively gauge its quality and the reasons it was chosen, owing to unknown causal relationships among statistically weighted variables and, especially, to data quality concerns. A role for prescriptive models, in the near future at least, might be exploring tradeoffs, generating efficient frontiers, identifying hypotheses to potentially improve decision quality, and performing scenario and sensitivity analyses [e.g., 85, 112•]. Using prescriptive models to directly support fire management decisions not only requires improved data quality and model design but may also require that the models are easy for decision makers to understand and are not viewed as a “black box” with unknown and mysterious inner workings.
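For readers unfamiliar with this class of model, the following sketch poses control segment selection as a toy knapsack problem: choose which candidate segments to build so as to maximize expected protected value within a crew-hour budget. The segment names, values, and costs are hypothetical, and the formulation omits the spatial connectivity, fire spread, and safety considerations that real prescriptive models such as [112•] must represent.

# Minimal sketch of a prescriptive model with hypothetical inputs: select candidate
# control segments to maximize expected protected value within a crew-hour budget.
# A toy knapsack formulation, not the model of Wei et al. [112].
import pulp

segments = ["S1", "S2", "S3", "S4", "S5"]
value = {"S1": 120, "S2": 300, "S3": 80, "S4": 210, "S5": 150}  # expected protected value (hypothetical units)
hours = {"S1": 40, "S2": 90, "S3": 25, "S4": 70, "S5": 55}      # crew-hours to construct and hold
budget = 160                                                     # available crew-hours

prob = pulp.LpProblem("control_segment_selection", pulp.LpMaximize)
build = {s: pulp.LpVariable(f"build_{s}", cat="Binary") for s in segments}

prob += pulp.lpSum(value[s] * build[s] for s in segments)            # objective: protected value
prob += pulp.lpSum(hours[s] * build[s] for s in segments) <= budget  # resource constraint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [s for s in segments if build[s].value() > 0.5]
print("Selected segments:", chosen)
print("Expected protected value:", pulp.value(prob.objective))

An operational formulation would, at a minimum, also need constraints linking selected segments into closed containment boundaries and some representation of uncertainty in fire spread and resource productivity.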

Analytics Demonstration and Real-World Application: Potential Control Locations and Fire Line Effectiveness

To further ground these analytics concepts in fire operations, we consider the construction of fire containment line and present recent research and application on that topic. A descriptive model of fire line effectiveness analyzes fire perimeter and operations data to describe where line was built, whether it engaged the fire, and if so, whether it was effective at stopping fire [113•]. A predictive model analyzes historical fire perimeters in relation to environmental and landscape characteristics, and trains a machine learning algorithm to estimate the probability of a given location being a suitable control line to stop fire [82•]. A prescriptive model recommends strategies to managers that combine the most suitable locations to construct line, based on manager preferences regarding damage, cost, and firefighter safety [112•]. At present, the descriptive model is informing agency discussions regarding development of key performance indicators, the predictive model is being actively used to support real-time decision making and strategic planning throughout the western USA, and the prescriptive model is a prototype.

Figure 2 illustrates these interrelated concepts using a post hoc analysis of the 2018 Ferguson Fire (39,200 ha) that burned in the Sierra National Forest, Stanislaus National Forest, and Yosemite National Park in California, USA. Panel a displays the fire perimeter in relation to locations of constructed fire line, and summarizes fire line effectiveness per the framework of Thompson et al. [113•]. Panel b displays the fire perimeter in relation to an underlying fire control probability surface [82•], with a histogram showing the percentage of fire perimeter length in each of five control probability categories. Panel c displays results of a spatial optimization model developed by Wei et al. [112•] that outlines one possible response strategy based on landscape risk, potential control locations, and manager-specified preferences.

Fig. 2

Case study analysis of the 2018 Ferguson Fire, showing (a) descriptive analytics of fire line effectiveness, (b) predictive analytics of potential control locations in relation to the final perimeter, and (c) prescriptive analytics that recommend combinations of control locations to create overarching response strategies, here showing the results of one model run with a user-specified set of preferences regarding cost and safety. For panel (a): Tr = ratio of total length of line to perimeter; Er = ratio of engaged line to total line; HEr = ratio of held line to engaged line; HTr = ratio of held line to total line [59, 113•]
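These ratios can be computed directly from summary lengths of perimeter and fire line. The following sketch implements the four definitions given in the caption; the input values are hypothetical and are not the published Ferguson Fire figures.

# Minimal sketch: compute the fire line effectiveness ratios defined in the caption
# above from summary lengths. Input values are hypothetical.
def fire_line_effectiveness(perimeter_km: float, total_line_km: float,
                            engaged_line_km: float, held_line_km: float) -> dict:
    """Return Tr, Er, HEr, and HTr; assumes nonzero denominators."""
    return {
        "Tr": total_line_km / perimeter_km,     # total line relative to the final perimeter
        "Er": engaged_line_km / total_line_km,  # share of line the fire actually reached
        "HEr": held_line_km / engaged_line_km,  # share of engaged line that held
        "HTr": held_line_km / total_line_km,    # share of all line built that held
    }

ratios = fire_line_effectiveness(perimeter_km=120.0, total_line_km=150.0,
                                 engaged_line_km=90.0, held_line_km=70.0)
for name, value in ratios.items():
    print(f"{name} = {value:.2f}")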

Table 5 displays how the fire assessment and planning process using predictive modeling of potential control locations fits into the analytics management framework. The predictive model developed by O’Connor et al. [82•] has been widely deployed in the USA for pre-fire planning applications as well as for real-time decision support on more than two dozen wildfires during the 2017 and 2018 fire seasons. As of this writing, fire ecologists and analysts have delivered potential control location map products to National Forest System staff, partners, and stakeholders on over thirty landscapes across the western USA. The pairing of such information with quantitative risk assessments and the delineation of strategic response zones are not only enhancing cross-boundary planning processes but also helping managers achieve desirable fire outcomes that protect assets while enhancing ecosystem health [114]. Following an intensive strategic wildfire risk planning process in the spring of 2017, the Tonto National Forest in central Arizona has managed six large wildfires in accordance with pre-identified strategic wildfire response zones that help to align incident response objectives with land and resource management planning direction. The 2017 Pinal, Highline, and Brooklyn fires were managed for ecological restoration, asset protection, and wildfire process maintenance objectives, respectively. During the 2018 fire season, when milder fire weather conditions were prevalent, the Bears, Daisy, and Cholla fires were each managed for ecological restoration and future risk reduction objectives consistent with pre-identified strategic wildfire response zones and corresponding fire response objectives.

Table 5 Analytics Management Framework as applied for predictive model of potential control locations, adapted from [82•, 114]

Discussion

A core idea from this paper is the need to develop a broader analytics strategy for wildfire management, and to infuse analytics into planning, decision, and learning processes. The aims of such an effort would be to align actions with management objectives, enhance transparency and accountability, improve decision support, create learning opportunities, and ultimately improve response safety and effectiveness [115]. Doing so would help fire management organizations fulfill some of their core RM responsibilities, such as generating better information to support risk-informed decisions and continually improving.

We recognize that there is a broader range of important incident-related decisions than those considered here, such as routing detection aircraft, stationing and prepositioning suppression resources, and transferring or reassigning resources to different regions or different fires [29, 116, 118, 120]. Some of these decisions are inextricably linked to response strategies; for example, allocating scarce resources between fires is premised on some understanding of how resources would be used, and with what degree of efficacy, if assigned to different fires. We opted to focus on incident-level response strategies because these can be high-impact decisions, and because we believe there is great potential for stronger adoption of RM and analytics principles within this decision context.

Looking to the future, fire management organizations will need to modernize data collection and analysis systems associated with fire activity and wildfire suppression operations. The emphasis will be on measuring and monitoring quantitative performance metrics to evaluate operational capabilities, safety, efficiency, and effectiveness. Digital technology integration, including automated resource tracking, is likely to be a key component of new data collection systems. Operations research and industrial engineering, combined with data science, are perfectly suited to this task. These disciplines offer well-defined scientific approaches for performance measurement, process engineering, logistics, and operations management, including health and safety. As stated earlier, similar modernization is already occurring in the forest sector under a variety of precision forestry initiatives.

That many of these technologies have been around for years does raise questions of why fire management organizations have not already turned to them. For example, responders might find tracking technology that allows close monitoring of all of their actions to be invasive, or key decision makers may fear increased liability risks. Organizations are likely to face issues of data governance, privacy, and security, leading to data policy questions such as which data to make available, to whom, through what channels, and for what purposes [72, 73, 75]. Limited abilities to effectively manage and standardize the complex data streams collected from various sources decrease the utility and increase the cost of data capture [119]. Another barrier is the need to build workforce capacity to effectively use data science [120]. Beyond data concerns, organizations may also lack supportive learning environments, in which learners have the psychological safety to express disagreement or own up to mistakes, opposing ideas and competing outlooks are welcomed, and taking risks is appreciated [61].

We see a future with increased automation and prescriptive analytics recommendations; however, fire management will always be a question of human judgment. Every fire event has unique circumstances and potentially unresolvable uncertainties, rendering completely prescriptive approaches infeasible; hence the focus on decision makers. Many, but not all, aspects of expert judgment and organizational wisdom may track closely with measurable and predictable metrics of performance and risk. Embracing analytics could do much to address overreliance on expert judgment and to improve decision quality, including by decomposing the problem into manageable sub-problems and by developing operationally relevant decision aids. Through more proactive acquisition and use of more reliable and more trustworthy information, fire management organizations could help dampen some of the biases described earlier and improve fire management decision-making [33, 34].

Although challenges exist regarding the capture and interpretation of fire operations data and the advancement of scientific understanding, the bigger challenges may well be organizational and cultural. The difficulty of effecting a data-driven cultural change is a known barrier to more widespread adoption of analytics [76••]. How to convince fire managers that enhanced monitoring and data collection (i.e., descriptive analytics) will not be used for “Monday morning quarterbacking,” and how to demonstrate the value of predictive and prescriptive analytics to managers, will be among the fire science community’s challenges moving forward. Stronger collaboration between the fire science and management communities could help here. As summarized by Martell [87], predictive analytics specialists cannot develop useful predictive models unless they collaborate with fire management organizations that are willing to share their problems and data, and prescriptive analytics specialists cannot develop useful prescriptive models unless they collaborate with predictive analysts and with fire management organizations that are willing to share their problems and test their models. To the degree that analytics results in demonstrably better outcomes in terms of operational safety, efficiency, and effectiveness, it is likely to help incentivize its own adoption over time.

The emphasis on people and culture in analytics frameworks also highlights the need for stronger collaboration with social scientists. There are opportunities to convert experiential evidence into organizational and scientific evidence (i.e., informing the development of meaningful performance metrics), to improve knowledge exchange, to identify potential barriers to meaningful organizational change, and to design communication strategies in light of how information is currently shared within fire response networks [121, 122].

Beyond rolling out the analytics management framework for an expanded set of fire operations, we see a number of productive pathways forward. As a possible interim strategy, the fire community could seek to more comprehensively catalog lessons learned [86] and develop more decision support systems based on elicitation of expert judgment [123]. On the technical side, analysts can capitalize on the growing use of machine learning techniques in forest and fire modeling. The widespread use of open-source technology also presents opportunities for sharing data and code [124]. On the organizational side, performance management systems could be redesigned with risk in mind to better account for non-deterministic relationships between decisions and outcomes [125]. Performance measurement could also expand to a systems-based approach that conceptualizes effective response not just in operational terms but also in terms of factors such as evacuations, road closures, and public information [13•].

Conclusion

The translation of scientific ideas and insights into practice can be essential for improving decision quality and cultivating stronger RM acumen. A number of hurdles remain: increasing investment in data capture and analysis technology, dealing with data privacy and security issues, developing better models and decision support tools, and catalyzing organizational and cultural change. Despite these challenges, we believe there are ample opportunities for the use of descriptive, predictive, and prescriptive analytics in fire management. We provided one example of a fire line effectiveness framework and its role as a connective thread between the three types of analytics. Similar cycles of the analytics strategy could be pursued for mop-up, aviation, burnout operations, and structure protection. Developing core competencies with analytics and cultivating a “Moneyball for fire” paradigm may help fire management organizations on their RM journey.