Decision-Making: Preventing Miscommunication and Creating Shared Meaning Between Stakeholders

  • Emma E. H. Doyle
  • Douglas Paton
Open Access
Part of the Advances in Volcanology book series (VOLCAN)


Effective management of, and response to, volcanic eruptions and (often prolonged) periods of heightened unrest depends fundamentally upon effective relationships and communication between science advisors, emergency managers, and key decision makers. To optimise the scientific contribution to prediction and management decision making, it is important for science advisors and scientific advisory bodies to be cognisant of the many different perspectives, needs, and goals of the diverse organisations involved in a response. Challenges arise for scientists because they may need to be embedded members of the wider multi-agency response team, rather than independent contributors of essential information. They must therefore add to their competencies an understanding of the different roles, responsibilities, and needs of each member organisation, so that they can begin to provide information implicitly rather than only in response to explicit requests. Building this shared understanding, the team's situational awareness (understanding of the situation in time and space), and the wider team mental model (a representation of the team's functions and responsibilities) requires participating in a response environment together. Facilitating this capability has training and organizational development implications for scientific agencies, and introduces a need to develop new inter-agency relationships and liaison mechanisms well before a volcanic crisis occurs. In this chapter, we review individual and team decision making, and the role of situational awareness and mental models in creating “shared meaning” between agencies. The aim is to improve communication and information sharing, and to further the understanding of the impact that uncertainty has upon communication and of ways to manage it.
We then review personal and organisational factors that can impact response and conclude with a brief review of methods available to improve future response capability, and the importance of protocols and guidelines to assist this in a national or international context.

1 Introduction

Whether it involves a period of unrest (e.g., Long Valley, CA, in 1982), an ongoing eruption (e.g., Soufrière Hills Volcano, West Indies), or a “blue sky” eruption (e.g., Ruapehu, NZ, 2007; Mt Ontake, Japan, 2014), the response to complex volcanic crises requires the coordinated and complementary contributions of numerous organizations and agencies. The degree to which this can effectively be achieved depends on whether the quality and degree of relationship and network building conducted before, during, and after a crisis can facilitate the shared understanding required for communicated information to enhance effective decision making.

The challenge in this task is twofold. The first part relates to the need to bring representatives from diverse sources together (Paton et al. 1998; Doyle et al. 2015), including technical advisors (such as geologists, geophysicists, engineers, and social scientists), emergency management (civil defence, fire service, police, army, national and local government), lifeline organisations (lifelines companies, transport, water), as well as community organisations and special interest groups (e.g. neighbourhood support and volunteer groups, Rotary, Lions club, etc.). A major challenge to developing effective crisis management arises because these representatives bring with them different objectives, priorities, and interpretive and operational beliefs (Paton et al. 1998; Doyle et al. 2015). The second part is thus how to facilitate the ability of these representatives to collaborate and share knowledge in order to respond effectively to a crisis.

Recognition of the diverse consequences volcanic crises create can result in organisations appreciating why they need to be part of a multi-agency group response. However, this appreciation does not automatically translate into acceptance of either the need to develop new roles and responsibilities, or the fact that crisis response goals may need to be reconciled with the political or economic pressures that each representative brings to the crisis response environment. A further challenge to the scientific community arises from the need for some of its members to be embedded members of the wider multi-agency response team, rather than independent contributors of essential information. For example, to enhance interagency communications during recent hazard events and exercises in NZ, members of the GNS Science team were situated as liaison officers within the Emergency Operations Centre (EOC) and responded within the emergency management team itself. In addition, the crisis response context can introduce a need to deal with demands that would rarely, if ever, be encountered in routine work contexts and that can elevate levels of stress and interfere with decision making.

The atypical demands that can impair response during a disaster were evident in the evaluation of the multi-agency response to the Ruapehu 1995–1996 eruptions. These demands included: intense media interest or public scrutiny, resource availability and adequacy, co-ordination problems, a lack of defined responsibility for co-ordination of the response, inadequate communication with other organisations, conflict between agencies, and inadequate and changing role definition (Paton 1996; Paton et al. 1998, 1999) (Table 1). The IAVCEI Subcommittee for Crisis Protocols (IAVCEI 1999) provides additional examples of problems commonly experienced in volcanological response (see Table 2), each of which corresponds to the disaster stressors identified by Paton (1996; see Table 1).
Table 1

Potential stressors that negatively impact on response capability and personal and team performance when responding to or managing crisis events and disasters (after Paton 1996)

• Degree of warning or change in conditions (low warning times or rapid change increases physical and psychological demands)

• Degree of uncertainty from event and organizational sources

• Time of day (stress greater at night and when having to respond at the end of a working day)

• Presence of traumatic stimuli (such as sensationalised news coverage)

• Lack of opportunity for effective action (attributions about perceived response failure can be internalised rather than more accurately attributed to environmental factors outside of their control)

• Knowing victims or families

• Intense media interest or public scrutiny directed at event management and those responsible

• Higher than usual or expected responsibility

• Higher than usual physical, time and emotional demands (including cumulative stress over time)

• Contact with those affected

• Resource availability and adequacy (and how these change over time)

• Co-ordination problems

• Conflict between agencies

• Inadequate and changing role definition

• Inappropriate leadership practices

• Single versus multiple threats

Table 2

Common problems of professional interaction of volcanologists during crises, as identified in the IAVCEI subcommittee for crisis protocols (IAVCEI 1999)



Poor communication and teamwork among scientists

• Failure to value diverse scientific expertise, approach, and experience

• Overselling of new methods

• Failure to honour prior work on a volcano, and, in the reverse direction, failure to share study opportunities

• Failure to share information and scarce logistical resources

• Failure to work as a single scientific team, and thus loss of potential synergism, i.e., loss of a cooperative result that is greater than the sum of individual results

• Failure of scientists to use a single voice for public statements

• Failure of science-funding agencies, job supervisors, and promotion panels to give full credit for self-sacrifice and teamwork during volcanic crises

Leadership problems

• Leaders without leadership skills

• Failure of leaders to recognize the limits of their own technical expertise

• Confusion about team roles, policies, and procedures

• Failure to encourage those who can and wish to help

• Failure to develop (a) respect for scientific differences within a team, (b) a method for developing consensus, and (c) a means for acknowledging differences that cannot be resolved

• Failure to balance risk and rewards of dangerous field work

• Failure to recognize and minimize fatigue

Issues for visiting scientists, invited and uninvited

• Scientists who arrive at a crisis without invitation

• Invitations from other than the primary scientific team, e.g., from a competing or peripheral local group

• Unilateral foreign funding decisions

• Cultural differences regarding scientific discussion and decision making

• Public statements by visiting scientists

• Pre-emption of research and publication opportunities by visitors, while local scientists are still busy managing the crisis

Unwise and unwelcome warnings

• Warnings from pseudo-scientists

• Warnings or forecasts from scientists from other fields

• Warnings or forecasts by volcanologists working in isolation, either on-site or far from the volcano in question

• Exaggerated statements of risk, or, conversely, overly reassuring statements about safety of an area when significant risk exists

• Outdated warnings or forecasts in need of change

Poor communication between scientists and public officials

• Unfamiliarity with each other’s needs and expectations, methods, expertise, and limits

• A conscious decision to withhold or delay some hazards information

• Official scepticism of scientific advice

• Procedural failures in communication with public officials:

   – Failure to put warnings in writing, for clarity and later accountability

   – Failure to distribute warnings to all key parties

   – Failure to establish a clear “chain of communication” between scientists, public officials, and external agencies such as civil defence

   – Failure to confirm that officials truly understand our warnings

Ineffective relations with news media

• Inadequate interaction with the news media

• Premature or excessive interaction with the news media

While some potential event-related stressors reflect the dynamics of hazard impacts (e.g., volcanic ash affecting communication infrastructure), others reflect inadequacies in crisis communication systems and the expertise available to use them (Paton et al. 1998, 1999; Johnston et al. 1999). In the absence of appropriate crisis management procedures and training in crisis management, which is commonly the case, the associated negative reactions can detrimentally affect performance and decision making (e.g., physiological and psychological symptoms of anxiety and fear, “tunnel vision”, failure to prioritise, “freezing”, and loss of concentration; Flin 1996; Flin et al. 1997; Klein 1997; Paton et al. 1999). It thus becomes important to identify the management systems and procedures, and the personal and team capabilities, required to facilitate effective multi-agency response, and to use this to inform the training needs and training strategies adopted in all response agencies. Evaluation of previous volcanic crisis management experiences can provide a good starting point for this process.

Mitigating these issues prior to a crisis, particularly at the science/decision-maker interface, is important, as effective multi-organisational management needs to be built on “consensus about task goals and priorities; co-operation and team framework; a sense of group identity; a strong sense of community within the organization; and the breakdown of bureaucracy and formalities” (Paton et al. 1999, p. 17). Addressing these issues requires an appreciation of how a shared understanding of response needs can be achieved prior to a crisis, such that this multi-organisational, multi-level, multi-team response is managed effectively, and of how communication affects the quality of decision making. In the absence of real events, these capabilities can be developed through shared exercises, scenario planning, and other relationship-building activities, which have training and organizational development implications for scientific agencies.

In this chapter we review the fundamentals of decision making at the individual, team, multi-team, organisational and agency levels (Sect. 2), drawing on psychological and critical incident management research. We review the concept of mental models: an individual's representation of a situation, such as a response environment including needs, responsibilities, and interdependencies; or of a system, such as volcanic unrest, incorporating their internal, personalised, experiential, and contextual understanding of how the volcanic system operates. We discuss how these mental models contribute to a “shared meaning” between agencies (Sect. 3), how that relates to communication and information sharing, and the impact that uncertainty has upon communication and ways to manage this. We then review a number of personal and organisational factors that can impact response (Sect. 4), and conclude (in Sect. 5) with a brief review of the methods available to enhance response, and the importance of protocols and guidelines to assist this in a national or international context. Throughout this chapter we focus on the response phase of a crisis. There are, however, many other complementary approaches to enhancing risk communication with communities living with volcanic risk, which we do not consider here, including community-based disaster risk management and other participatory techniques (see review in Barclay et al. 2008; and also Williams and Dunn 2003; Cronin et al. 2004; Gaillard 2006; Cadag and Gaillard 2014).

2 Introducing Decision Making

During a volcanic crisis, several decision making styles and processes are required. Decision making itself has been studied extensively across a range of fields. Here we focus on those supported through research in crisis and risk management contexts (Lipshitz et al. 2001; Doyle and Johnston 2011).

2.1 Individual Processing Systems

Considering first the processes occurring at an individual level, the field of psychology offers an understanding of the theory of two “parallel processing systems” (Epstein 1994; Sloman 1996; Chaiken and Trope 1999; Slovic et al. 2004). The first, known as Type 1 or the affective processing system, involves rapid, unconscious, action-oriented processing, and results in people interpreting risk as an emotional state or feeling (e.g., fear, dread, anxiety; Epstein 1994; Loewenstein et al. 2001; Slovic et al. 2004; Doyle et al. 2014b), which can thus reduce or increase risk perceptions. These affective responses are assumed to be the default “unless intervened by distinctive higher order” Type 2 processes (Evans and Stanovich 2013a), or analytical processing systems (Epstein 1994), which heavily load working memory and utilise hypothetical thinking and more deliberate computational cognitive processes (and thus longer decision times). These are learnt processes that apply rules and procedures (algorithms, normative rules, and logic) to the analysis of data and to justify actions (i.e., to respond to demands rather than reacting to them).

Following Doyle et al. (2014b, p. 78), we consider that “the adoption of the affective and analytical processing systems [is] not an either-or situation, but rather a more complex balancing act influenced by the degree of uncertainty or threat in the decision context, and … relative experiences” (Keren and Schul 2009; Kruglanski and Gigerenzer 2011; Evans and Stanovich 2013a, b; Keren 2013; Osman 2013; Thompson 2013). Thus, if time permits, scientists tend to adopt the analytical process due to their formal training in data analysis and decision making, while non-scientists adopt a more affective process, dependent upon prior experience, time pressures, and operating procedures. However, when scientists are called upon to respond to atypical demands (particularly in a multi-agency context), if there are no formal or procedural rules to abide by, the affective system usually prevails (Loewenstein et al. 2001) and decision making effectiveness is compromised as a result (Weber 2006). Mitigating this problem calls for all those interacting in decision making to receive training that develops competency in different decision making styles and, importantly, to practise using them in simulated and actual crisis events.

2.2 Incident Management and Naturalistic Decision Making

Analytical (or Type 2) decision making has been identified as having four steps (Flin 1996, pp. 141–142): (1) identifying the problem; (2) generating a set of options; (3) evaluating these options; and (4) implementing the preferred option (Saaty 2008). However, this assumes a ‘perfect’ environment. In reality, most decisions are made in uncertain ‘naturalistic settings’ defined by: ill-structured problems; uncertain dynamic environments; shifting, ill-defined, or competing goals; action/feedback loops; time stress; high stakes; multiple players; and influences from organizational goals and norms (Zsambok 1997; Crichton and Flin 2001; Klein 2008; Doyle and Johnston 2011). Research into incident management has identified four distinct ‘naturalistic decision making’ processes seen in these conditions (Crego and Spinks 1997; Pascual and Henderson 1997; Crichton and Flin 2002), ordered here from highest to lowest time pressure: (1) recognition-primed, intuition-led action; (2) action based on written or memorized procedures; (3) analytical comparison of different options; and (4) creative design of a novel course of action.

Within a crisis, an individual decision maker (whether scientist or emergency manager) may move along this spectrum of decision processes depending upon the evolving conditions, and will not be limited to just one decision making style (see Martin et al. 1997, p. 283; Doyle and Johnston 2011). Those operating at a strategic level should use the analytic style to accommodate the broader perspective required under these circumstances (Paton et al. 1998, 1999; Paton and Flin 1999). For those working at a tactical/coordinating level, an analytical approach should be adopted in (relatively) high time, low risk circumstances (such as when planning courses of action between eruption episodes and identifying future eruptive scenarios). However, in an eruption phase, rapid decisions need to be taken in minutes, making adoption of naturalistic decision making styles essential. For example, in Exercise Ruaumoko (which simulated the response to the lead up to an eruption in Auckland; MCDEM 2008; McDowell 2008), on site science-liaison officers often found themselves having to give almost instantaneous responses to officials during the peak of the crisis, which would have encouraged more recognition primed decision making.

Fundamental to all of these decision making processes, and to an effective decision resulting from them, is individual and team situational awareness (SA; Endsley 1997; Martin et al. 1997): the understanding of the situation and needs in both time and space. This encompasses a capacity to use key environmental cues to comprehend the current situation (in relation to goals) and to project future status. Training in this essential competence enhances the ability of decision makers to anticipate and make proactive decisions that deal more effectively with emergent issues. Both individuals and team members play crucial roles in the initial and on-going situation assessment (Sarna 2002). As stated by Doyle and Johnston (2011, p. 75), “a decision maker may make the correct decision based on his or her perception of the situation, but if his or her situation assessment is incorrect, this may negatively influence his or her decision (Crichton and Flin 2002)”. Because the inputs into decision making can come from different professionals and/or from team members who may be geographically dispersed (e.g., in an EOC, and at the volcano), decision making training must include the development of distributed decision-making skills (where the decision-making responsibility does not lie with a single entity, but rather is distributed throughout the responding organisations; Flin 1996; Paton et al. 1998; Kapucu and Garayev 2011). For distributed decision making to work effectively, decision makers must have some shared meaning (mental model) about the event and their respective roles in defining and resolving response problems.

3 Shared Meaning in Multi-agency Response: Mental Models

An individual’s mental model of a hazard is defined by Bostrom et al. (2008, p. 308) as “how people understand and think about the hazard, and their causal beliefs”. For incident management, this represents a mental “map” of the operating environment. This must encompass event characteristics and hazard consequences and each other’s differing needs, responsibilities, roles, and demands, as well as the interdependencies that contribute to effective problem solving and decision making (Rogalski and Samurçay 1993; Flin 1996; Paton and Jackson 2002). A shared mental model allows a distributed team to share understanding of the task at hand, anticipate and proactively respond to information needs (Lipshitz et al. 2001; Pollock et al. 2003), and make shared decisions (Orasanu 1994; Salas et al. 1994).

From an incident management perspective, the major challenge is that participants need to be able to continue using their “routine” and “expert” mental models of the constituent hazard consequences and their individual response. These need to complement an over-arching or super-ordinate mental model: a cooperative model that integrates each individual's understanding of their role within their organisation, how they relate to others within their organisation, and their organisation's role within, and communication with, the wider response team (see Figs. 1 and 2). It is this super-ordinate model that facilitates communication.
Fig. 1

Example of the information flow, communication network and many agencies involved in an eruption, as observed from the response to the 1995 Ruapehu eruption by Paton et al. (1999). CAA Civil Aviation Authority; DOC Department of Conservation; ECNZ Electricity Corporation of New Zealand; GNS Institute of Geological and Nuclear Sciences (now GNS Science); MAF Ministry of Agriculture and Fisheries

Fig. 2

a Examples of the various mental models within an individual’s over-arching or super-ordinate mental model during a volcanic crisis. b A poor shared mental model between individuals. c A good shared mental model between individuals

3.1 Shared Meaning in Multi-agency Response: Communication

Information received by emergency managers from scientists will be weighed against the other demands and issues emergency managers face when considering the risk to lives, economies, and infrastructure. The communication of risk information between emergency managers and the public, or between scientists and emergency managers, is also subject to the mental models gulf (Morgan et al. 2002). This is where there is a gap between “what experts know and the plan they develop, versus what key public know and prefer” (Heath et al. 2009, p. 129; see also Doyle et al. 2014a). By minimising this gulf, communication between advisors and key decision makers can move from explicit requests for information (which can increase time delays, pressure, and stress, impairing decision making effectiveness, particularly if reformatting of that information is also required; Klein 1997; Crichton and Flin 2002) to the implicit supply of advice, as advisors recognise ahead of time what information the decision maker needs. Effective teams have been shown to be dominated by such communication styles (Paton and Flin 1999; Lipshitz et al. 2001; Paton and Jackson 2002; Kowalski-Trakofler et al. 2003). Table 3 describes the characteristics of teams that display these effective communication and advice provision styles.

Implicit communication also facilitates the maintenance of situational awareness during periods of dynamic information as it allows decision makers to focus on task management (see review in Doyle et al. 2015; Paton and Jackson 2002; Paton 2003; Wilson et al. 2007; Owen et al. 2013). For this to occur, these scientists and experts must recognise and understand the needs of the decision makers, as well as their timelines and thresholds, supplying information that is useful, useable, and used (Rovins et al. 2014).
Table 3

Characteristics of teams that display effective communication and advice provision styles during a multi-agency crisis response

Provision of science advice should involve (Paton and Jackson 2002; Paton 2003)

Effective team communication involves (Wilson et al. 2007; Owen et al. 2013)

Effective team and interagency science advice communication (Doyle et al. 2015)

Anticipation and definition of information needs

Accurate and timely information exchange, correct phraseology and ‘closed-loop’ communication techniques

Should consider the ability of diverse stakeholders to interpret data communicated, and apply to resolve response issues while operating in a collaborative environment

Organized networks between information providers and recipients

Coordinated behaviour based on shared knowledge, performance monitoring, back up and adaptability

Needs the development of a super-ordinate team identity and the ability to switch between agency and shared mental models as required

Established capability to “provide, access, collate, interpret and disseminate information compatible with decision needs and systems”

Co-operative team orientation, efficacy, trust and cohesion

May involve stakeholders who rarely interact with one another outside the context of managing a volcanic crisis, and so depends on emergent team dynamics and concepts such as “swift-trust” (Meyerson et al. 1996)

During the Ruapehu 1995–1996 eruptions, comparison of pre-existing networks with information providers revealed both incomplete networks and inconsistencies with respect to information sources, in particular with the main information provider, GNS Science (Paton et al. 1998, 1999). This resulted in agencies seeking information on an ad hoc basis (i.e. through explicit requests), which would have contributed to communication difficulties, as evidenced by the 37% of organisations that reported inadequate communication during the response (ibid). One way of mitigating this issue involves advisors developing a capacity to, where possible, anticipate others' information needs (Doyle and Johnston 2011; Doyle et al. 2015).

Advisors must recognise the very specific information and advice needs of decision makers prior to an event and have procedures in place in advance of an event to provide that information in a timely manner directly where it is needed within the organisational structure. For this to effectively occur scientists (either as on-site science advisors, or off-site expert panels) must not just be external experts to a multi-agency response team, but must be considered part of the extended and distributed team handling the emergency management response (Doyle and Johnston 2011). Integrating Science Advisory Groups (SAGs) into a wider response team offers the opportunity for technical and scientific experts to directly inform effective planning, intelligence gathering, and decision making of the emergency personnel and government officials.

The Auckland Volcanic Science Advisory Group (AVSAG), an example of such a SAG, was tested during Exercise Ruaumoko (MCDEM 2008; McDowell 2008). From this it was identified that the AVSAG process facilitated the provision of valuable advice in a clear, timely manner. A clear advantage was the presence of a science advisor in both the National Crisis Management Centre and the Auckland Civil Defence and Emergency Management (CDEM) Group EOC, maintaining shared situational awareness between the SAG and the emergency management team. However, having two on-site liaison officers at different EOCs did result in a divergence of advice in these two locations at the peak of the crisis (Cronin 2008), as shared situational awareness could not be maintained between them.

These past experiences, through both responses and exercises, have shown that the SAG approach is beneficial: it provides “one trusted source” for science information during a crisis (MCDEM 2008; Smith 2009), facilitates the integration of the wide range of expert opinions required to manage uncertainty during decision making (as recommended by Lipshitz et al. 2001), and can help combat issues that may arise from conflict between scientists (Barclay et al. 2008). It also enabled the volcanologists to speak with “a single voice” to reduce confusion, as advocated by the IAVCEI protocols (1999).

3.2 Shared Meaning: Uncertainty

If the volcanic crisis environment were perfectly predictable, developing the relationships and competencies discussed above would be a relatively straightforward task. However, volcanic crises present evolving, emergent demands and a highly uncertain response environment. The IAVCEI protocols (1999) highlight that uncertainty should always be acknowledged. This raises two issues: first, how uncertainty should be communicated; and second, how uncertainty influences the quality of the relationships between individuals.

Regarding the first, reviews by Doyle et al. (2014a, b) identified that there is much discourse as to whether revealing the uncertainties associated with a risk assessment will strengthen or weaken trust in a risk assessor and their message, and how it impacts decision-making behaviour (Miles and Frewer 2003; Wiedemann et al. 2008). On the one hand, the communication of uncertainty has been suggested to enhance credibility and trustworthiness. On the other, studies have suggested that it can decrease people's trust and the perceived credibility of the provider, as it can allow people to justify inaction or their own agenda, or to perceive the risk as being higher or lower than it is, depending on their personal attitudes (Johnson and Slovic 1995, 1998; Smithson 1999; Miles and Frewer 2003; Johnson 2003; Wiedemann et al. 2008; Doyle et al. 2014b). The role of ethics in whether or not to communicate uncertainty has also become a focus of recent discussion across disciplines, including whether communicating this uncertainty actually enhances or diminishes the autonomy of the receiver of the message, and whether it produces an overall benefit or can actually cause harm (Han 2013; Austin et al. 2015; Grasso and Markowitz 2015). Keohane et al. (2014) suggest that scientists ‘should understand their own ethical choices in using scientific information to communicate to audiences’ (p. 343), and identify five principles for scientific communication under uncertain conditions: honesty, precision, audience relevance, process transparency, and specification of uncertainty about conclusions.

To address how to manage and communicate uncertainty, many disciplines, including volcanology, climate change, and meteorology (IAVCEI 1999; Moss and Schneider 2000; Gill et al. 2008; Mastrandrea et al. 2010; see also Patt and Dessai 2005; Budescu et al. 2009; Doyle and Potter 2016), have established guidelines that advocate the clear and transparent communication of uncertainty, documentation of all processes related to uncertainty, and the use of formalised probabilistic terms and frameworks for assessment and communication (see Table 4). In volcanology, it has become increasingly popular to use probability statements in communications (Doyle et al. 2014a), which require knowledge of both the dynamical phenomena and the associated uncertainties (Sparks 2003). Further, the use of probabilistic cost-benefit analysis and Bayesian Event Trees has been driven by a desire to make objective and traceable decisions via quantitative volcanic risk metrics (Aspinall and Cooke 1998; Marzocchi and Woo 2007; Woo 2008; Lindsay et al. 2009).
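To illustrate the reasoning behind these tools, the following sketch combines a single branch of a Bayesian event tree with a cost-benefit evacuation rule of the kind discussed by Marzocchi and Woo (2007), under which protective action is justified when the probability of impact exceeds the cost/loss ratio. All probabilities, costs, and losses below are hypothetical illustration values, not estimates for any real volcano.

```python
# Sketch: one branch of a Bayesian event tree plus a cost-benefit
# evacuation threshold. All numbers are hypothetical, for illustration only.

# Conditional probabilities along one branch of the event tree:
p_unrest_is_magmatic = 0.5       # P(magmatic | unrest)
p_eruption_given_magmatic = 0.3  # P(eruption | magmatic unrest)
p_hazard_reaches_area = 0.4      # P(hazard reaches the area | eruption)

# The absolute probability of the terminal outcome is the product of the
# conditional probabilities along the branch:
p_impact = (p_unrest_is_magmatic
            * p_eruption_given_magmatic
            * p_hazard_reaches_area)

# Cost-benefit rule: call an evacuation when the probability of impact
# exceeds the ratio of evacuation cost C to the loss L it would prevent.
cost_of_evacuation = 10e6   # C (e.g., dollars)
prevented_loss = 200e6      # L, expected loss avoided by evacuating

threshold = cost_of_evacuation / prevented_loss
evacuate = p_impact > threshold

print(f"P(impact) = {p_impact:.3f}, threshold C/L = {threshold:.3f}, "
      f"evacuate: {evacuate}")
```

With these illustrative numbers, P(impact) = 0.06 exceeds the 0.05 cost/loss threshold, so evacuation would be justified; the same structure makes the decision traceable, since each conditional probability and the threshold can be documented and revised as monitoring data change.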
Table 4

A summary of the existing guidelines for communicating uncertainty from the volcanological, weather and climate change communities

Budescu et al. (2009)

• “Make every possible effort to differentiate between the ambiguity of a target event and its underlying uncertainty”

• “Specify the various sources of uncertainty underlying key events and outline their nature and magnitude, to the degree that this is possible”

• “Use both verbal terms and numerical values to communicate uncertainty”

• “Adjust the width of the numerical ranges to match the uncertainty of the target events”

World Meteorological Organization (Gill et al. 2008)

Uncertainties should be communicated:

• “For improved decision making”—especially when users have many options available to them, to weigh up contingencies

• “Helps manage user expectations”—fostering a more open, honest and effective relationship

• “Promotes user confidence”—If users understand forecasts have a degree of uncertainty… they can tune their decision-making to manage this uncertainty …

• “As it reflects the state of science”

Moss and Schneider (2000, p. 37) (as used by the IPCC; see also Patt and Dessai 2005; Mastrandrea et al. 2010)

• Identify the most important factors and uncertainties that are likely to affect the conclusions

• Document ranges and distributions in the literature [… the key causes of uncertainty …]

• Make an initial determination of the appropriate level of precision [… after considering the state of the science and the nature of the uncertainties…]

• Characterize the distribution of values that a parameter, variable, or outcome may take

• Using standard terms, rate and describe the state of scientific information on which the conclusions and/or estimates are based

• Prepare a “traceable account” of how the estimates were constructed

• Use formal probabilistic frameworks for assessing expert judgment

• Consider target audience and develop a pluralistic approach (Patt and Dessai 2005):

   – Sophisticated sections in numeric format

   – General chapters using verbal and narrative phrases

   – Formalise translation between numerical and verbal probabilities

IAVCEI (1999) subcommittee for crisis protocols

• Uncertainty should be acknowledged

• Forecasts, warnings, and other important public statements are best when written first

• Date-stamped, team-approved hazard maps, together with their assumptions, should also be entered into the formal record of warnings. Competing or uncoordinated, multiple hazards maps are confusing to the public and should be avoided

• Scientific caution in the face of uncertainty is good, but it needs to be balanced against the legitimate information needs of decision makers and the public at risk. If the data do not allow a definitive forecast, factual statements about what is known are an important step. Warnings of serious events that are known to be possible, issued before such events can be forecast as probable, may hasten precautions and save lives

• Use probabilities to calibrate qualitative assessments of risk. Avoid commonly used adjectives such as “soon” or “high-” or “low-(risk),” because they mean different things to different people. Probabilities and comparisons to familiar non-volcanic risks help to avert the misunderstanding that risk is higher or lower than it actually is

• Under no circumstances should hazard be intentionally overstated or understated. Any decision to “err on the safe side” should be a conscious, openly discussed decision. Never disregard what seems like a low-probability, “worst case” event, because such events can and do occur (e.g., Mount St. Helens and Pinatubo). Instead, estimate the probabilities of worst-case and lesser scenarios, as above, to put the “worst-case scenario” in proper perspective

Doyle et al. (2014a, b) see also the operational guidelines in Doyle and Potter (2016)

• There is a need to adopt formal numerical and verbal probability translation tables that are specific to volcanology

• If communicating time window forecasts, be consistent in the use of either “within” or “in” throughout all statements, bulletins and reports; particularly for long window forecast statements, where “within the next X days” has a statistically significantly different interpretation to “in the next X days”

• Scientists should make all possible efforts to communicate forecasts, likelihoods, and probabilities over a range of relevant time windows, including a probability forecast for a shorter immediate time window in particular (such as the first 24 h)

• Any formalised communication strategy should be accompanied by exercises, simulations, and education programs with both the decision-makers and the public to help facilitate a greater understanding of the complexities inherent in these uncertain forecasts

However, probabilistic statements, whether in numeric or linguistic formats, can commonly be misinterpreted because their framing, directionality and probabilistic format can bias people’s understanding, thereby affecting people’s action choices (see Fig. 3; e.g., Teigen and Brun 1999; Karelitz and Budescu 2004; Honda and Yamagishi 2006; Joslyn et al. 2009; Budescu et al. 2009; Lipkus 2010). This has been identified as a particular issue in volcanic crisis communications (IAVCEI Subcommittee for Crisis Protocols 1999; Cronin 2008; Haynes et al. 2008; Solana et al. 2008; McGuire et al. 2009; Doyle et al. 2014a; Doyle and Potter 2016). Thus scientists communicating in a multi-agency volcanic response must consider the best practice guidelines that have been established across a range of disciplines to address this issue (see Table 4).
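One widely adopted formalisation of the verbal-numeric translation advocated in Table 4 is the IPCC calibrated likelihood language (Mastrandrea et al. 2010), which fixes each verbal term to a probability interval. The sketch below implements such a lookup; the interval values follow the published IPCC scale, but the ordering from narrowest to broadest interval, so that the most specific term is reported, is our own illustrative choice.

```python
# IPCC calibrated likelihood scale (Mastrandrea et al. 2010): each verbal
# term corresponds to a probability interval. Listed from narrowest to
# broadest so the most specific applicable term is returned first.
IPCC_SCALE = [
    ("virtually certain",       0.99, 1.00),
    ("exceptionally unlikely",  0.00, 0.01),
    ("very likely",             0.90, 1.00),
    ("very unlikely",           0.00, 0.10),
    ("likely",                  0.66, 1.00),
    ("unlikely",                0.00, 0.33),
    ("about as likely as not",  0.33, 0.66),
]


def verbal_term(p: float) -> str:
    """Return the most specific IPCC likelihood term covering probability p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    for term, low, high in IPCC_SCALE:
        if low <= p <= high:
            return term
    return "about as likely as not"  # unreachable: the intervals cover [0, 1]


print(verbal_term(0.95))  # very likely
print(verbal_term(0.70))  # likely
print(verbal_term(0.05))  # very unlikely
```

Publishing the table alongside every statement, as the guidelines recommend, ensures the translation is fixed in advance rather than left to each reader’s interpretation; Doyle and Potter (2016) describe developing a volcanology-specific table of this kind for GeoNet.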
Fig. 3

The factors that affect the interpretation of forecasts, and the influences on resultant decision making (Doyle et al. 2014b, after the graphical abstract of Doyle et al. 2014a)

The second issue regarding uncertainty in a crisis is its influence on the quality of the relationships between responding individuals, and thus the quality of mental models and performance, discussed next.

4 People and Organizations

A common thread running through the previous sections is the crucial role shared mental models play in cooperative action. It is important to appreciate how organizational and work-family relationships influence the ease with which respective agency representatives can “participate” in a super-ordinate mental model, and the degree to which they cope with and adapt to crisis demands (i.e., how organizational factors influence susceptibility to stress and thus performance).

4.1 Organizational Characteristics

Organizational characteristics influence the mental models that agency representatives bring to the crisis management environment. Organizational socialization (the norms, customs, and ideologies of an organisation) and the organizational cultural change that occurs when organisations interact will influence the thinking and behaviour of people within an organization, as well as their mental models of communication between organizations. These processes then spill over to affect how volcanic crisis events and their consequences are responded to and interpreted (Paton et al. 2009; Paton and Norris 2014). The knowledge and interpretive processes that representatives bring to the crisis response role as a result of their organizational culture provide the foundation (capabilities represented by pre-existing mental models) for their individual contribution. However, as introduced above, the need for diverse organizational representatives to integrate their respective contributions in complementary ways requires the development of a super-ordinate mental model defined by the collective and multifaceted demands of the crisis event. Achieving this is complicated by the fact that it must be done in a climate of uncertainty. Interaction under uncertainty can result in pre-existing beliefs dominating, and these beliefs may or may not be amenable to alteration; this can prevent the development of inter-agency trust and undermine collective performance (Paton et al. 2009). When dealing with atypical, challenging crisis events, emergency management representatives, by virtue of the need for them to play complementary roles in defining and managing complex events, become more reliant on others for information and guidance about how to respond.

4.2 Trust

Faced with uncertainty, when decision makers are reliant on one another for information and decision making, trust plays a pivotal role in facilitating sustainable collaboration in multi-agency crisis response contexts (Siegrist and Cvetkovich 2000; Mayer et al. 1995). Trust influences organizational intention to collaborate and share information between stakeholders (Mohr and Spekman 1994; Kapucu 2006).

Trust was identified as a key issue for relationships between scientists and officials during the eruptions of Soufriere Hills Volcano, Montserrat, WI, where Haynes et al. (2007) identified that this trust was influenced by a number of factors including: competence; integrity; value similarity; openness; and conflicting messages of safety and danger. Trust in the scientists was based upon high perceived reliability, competence, openness and integrity; while trust in government authorities was based upon high perceived levels of competence, reliability, and fairness.

The diversity of, for example, agencies, organizational cultures and operating practices brought together into a crisis management environment, and the need for diverse volcanic hazard consequences to be managed by agency representatives that differ in levels of familiarity with each other, can threaten the degree to which crisis management activities are characterized by trust (Dietz et al. 2010). In part, this reflects the lack of familiarity between interacting agencies. It is also influenced by how organizational cultural characteristics ingrained in routine work, such as hierarchical reporting practices and levels of bureaucracy (including command and control expectations), influence the relationships that emerge in an inter-agency crisis management context (Dietz et al. 2010; Dirks and Ferrin 2001).

Scientists tend to bring experience of working in flatter, more organic organizational cultures in which information flows freely. This makes it easier for them to engage in practices that focus on sharing information and building trust. However, representatives from government departments and emergency services agencies, whose routine culture is characterized by generally higher levels of hierarchy and formal reporting, tend to be predisposed towards maintaining their own agency-based independence. This fosters an emergent culture of rivalry among organizations in ways that work against information sharing between agencies during a crisis (Waugh and Streib 2006; Iannella and Henricksen 2007; Marcus et al. 2006; Marincioni 2007). Furthermore, these predisposing cultural features can result in relationships characterized by in- and out-group differentiation. Such in- and out-groups degrade the quality of information flow and increase the likelihood of information being restricted to members of one’s own organization, or to those with whom one is familiar, rather than being shared across all stakeholders (Militello et al. 2007; Owen 2013).

These factors interact to constrain information flow and, in the process, introduce significant challenges to developing the level of trust required for effective collaboration and decision making under conditions of uncertainty (McKnight et al. 1988; Banai and Reisel 1999; Siegrist and Cvetkovich 2000). The agency representatives brought together for response needs may thus lack the mutual interaction experiences needed to forge trust in each other, with aspects such as cultural diversity adding to this challenge. There is therefore an important need to develop trust in situ while responding in a high demand environment.

If trust is absent, those working in an EOC setting are more likely to focus on task demands in ways that reflect their core expertise and normal operating practices, rather than collaborating functionally to ensure they work in complementary ways to resolve multi-faceted response problems (Pollock et al. 2003). This reduces their capacity to contribute effectively to the emerging needs of the response (Pollock et al. 2003). The dynamic, prolonged nature of volcanic crisis response thus requires different ways of ensuring the development of trust. The concept of swift trust represents an approach to trust building in situations where people must collaborate on complex, evolving volcanic crisis tasks under high risk and tight time constraints that preclude the development of trust through normal means (Goodman and Goodman 1976; Meyerson et al. 1996; Hyllengren et al. 2011; Faraj and Xiao 2006; Robert et al. 2009; Lester and Vogelgesang 2012; Crisp and Jarvenpaa 2013).

For swift trust to develop, “team” members must first be assigned specific roles (that align with key response issues and needs) in the temporary work group (Meyerson et al. 1996). This can be facilitated by establishing a super-ordinate mental model that makes clear the key contributions of all agencies to effective whole-of-incident management, and it can be developed pre-event via techniques such as cross training in organizational crisis management training (Blickensderfer et al. 1998). Second, swift trust emerges if members are informed that there is a high likelihood of future collaboration (in incident reviews, simulations) with those with whom they are collaborating (Goodman and Goodman 1976; Meyerson et al. 1996). Finally, swift trust develops by ensuring that all participants recognise that success relates to the super-ordinate management task as much as it does to how they contribute their personal expertise (Curnin et al. 2015). This developmental task focuses on how the input of different representatives is necessary to develop a holistic response to multifaceted demands that exceed the expertise of any one agency. Doing so facilitates role clarification and increases the capacity of team members to understand that each stakeholder brings their specialist skills and knowledge to the collective task, fulfilling one of several specialist roles in the multi-agency team (Kramer 1999). This, in turn, enables the development and performance of collaborative working practices and supports emergent multi-agency coordination.

4.3 Work-Family

Another element that has a significant bearing on people’s stress management and performance in high risk settings, and that has not been considered in volcanic crisis response contexts, concerns how “work-family” relationships affect how well people cope with working in high risk, high stress contexts (Paton and Norris 2014). Organizations that take steps to facilitate family involvement in the employment experience of their personnel who work in high risk settings record better communication and trust between personnel and management, and thus more effective stress management in high risk work settings. Family involvement includes, for example, providing support services for partners and children when personnel are deployed, providing regular updates for partners and other family members during deployment, providing roles (e.g., administrative roles, media liaison roles), and setting up peer support programs for partners (Paton and Kelso 1991; Paton and Norris 2014). In contrast, in organizations where these kinds of family engagement are not offered, factors such as lack of information about what is happening in the field, and the resultant increased anxiety amongst family members, will increase perceived psychological stress in partners and children and lessen their availability and effectiveness as support resources for personnel deployed to deal with volcanic crises (Paton et al. 2009). Awareness of these issues is imperative for volcanic crisis response, where responding scientists may be geographically dispersed and away from family during periods of high risk and high stress work. Under such circumstances, ensuring partners are involved as much as possible to facilitate opportunities for personnel to remain connected with family, and access social support from this quarter, can provide cost-effective stress management resources for those deployed (Paton and Kelso 1991).
This aspect of a comprehensive crisis management strategy plays a key role in supporting performance and well-being, particularly when personnel can be deployed and working in high demand, high stress contexts over prolonged periods of time.

5 Concluding Remarks: Developing Future Response Capacity

We have highlighted how good shared mental models of the response situation between individuals within and across organisations, characterised by good situational awareness, strong inter-organisational networks, and high trust between responding organisations and individuals, have been shown to enhance communication and thus decision making. However, developing and maintaining such shared mental models is itself an important task. Research has shown that shared experience, through training, can help improve the quality of such mental models (Cannon-Bowers and Bell 1997; Crego and Spinks 1997; Paton et al. 2000; Pliske et al. 2001; Borodzicz and van Haperen 2002).

Ideally, multi-organisational and multi-disciplinary planning activities, collaborative exercises and simulations should be undertaken with all team members and advisors to help in the development of similar mental models of the task (see review in Paton and Jackson 2002; Doyle and Johnston 2011; Doyle et al. 2015). A comprehensive suite of training and relationship building activities prior to an event, and a detailed analysis of event and exercise response, can help enhance this future response capability and identify areas for improvement. This is particularly important given the rarity of volcanic and other hazard events, and thus the lack of opportunity for real world experience. This training and exercising needs to develop both individual and team situational awareness and explore how and when each is appropriate for response, within evolving, dynamic response environments (Doyle et al. 2015). Team situational awareness can be developed in post-event and post-exercise reviews that treat inter-agency relationship issues as opportunities for development (and not as problems). Through the analysis of past events, lessons for successful communication, advice provision and distributed decision-making can also be learnt.

The above processes describe group learning from crises, exercises and training, which Borodzicz and van Haperen (2002) identify as occurring along three dimensions: personal, interpersonal, and institutional. Several training methods have been identified that can enhance naturalistic decision-making (Cannon-Bowers and Bell 1997), enhance decision skills (Pliske et al. 2001), train effective teams (Salas et al. 1997), and develop effective critical incident and team based simulations (Flin 1996; Crego and Spinks 1997), all of which are relevant for volcanologists and scientific advisory groups. These include cross training, positional rotation, scenario planning, collaborative exercises and simulations, shared exercise writing tasks (including co-writing, swapped writing and ‘train the trainer’ type tasks), in addition to workshops, seminars and specific knowledge sharing activities (Doyle and Johnston 2011; Doyle et al. 2015).

Adopting such an evaluative approach has greatly enhanced the response environment in New Zealand, resulting in the formation of a number of scientific advisory groups with formalised Terms of Reference, and protocols for communication and networking with emergency management and key response organisations (e.g., CPVAG 2009). These, accompanied by regular workshops and meetings to facilitate relationship building and shared understanding, help improve communication and information flow, and thus shared situational awareness, in a crisis. Being part of an exercise schedule (e.g. Exercise Ruaumoko; MCDEM 2008) also provided Auckland CDEM and the associated science agencies with a focus to develop the Auckland Volcanic Science Advisory Group structure, including arranging formal contract agreements for the participating scientists (McDowell 2008; Cronin 2008).

The development of formalised protocols, Terms of Reference and the use of established guidelines for response and communication (such as those issued by IAVCEI, WMO, and IPCC; IAVCEI 1999; Gill et al. 2008; Mastrandrea et al. 2010) can greatly enhance response processes by reducing ambiguity about ‘what to do’, ‘what to communicate’, ‘how to communicate’, and ‘who to communicate to’, particularly in high stress, high pressure, high consequence events. As stated by IAVCEI (1999), responding scientists must also identify a team plan for crisis response (Table 5), and we suggest that such plans and procedures should be tested prior to an event. Simulations should aim to reproduce reality as closely as possible, reflecting the realities of advisory processes in turbulent conditions (Rosenthal and ’t Hart 1989; Borodzicz and van Haperen 2002). However, evaluation of exercises and events must minimise the risk of creating an optimistic bias that overestimates future response preparedness and capability (Paton et al. 1998).
Table 5

Steps identified by IAVCEI (1999, p. 332) to form a team plan for volcanologists responding to a crisis

1. Clear identification of scientific, warning, and other tasks (including communications with civil defence, news media, and others)

2. Clear identification of responsibility (group or individual) for each task, including that of team leader

3. Clear identification of a mechanism for selecting team leader(s)

4. Procedures and policies on likely issues of scientific interaction, including:

   (a) Rights and responsibilities for data and sample sharing

   (b) Resolution of differences in scientific approach and/or interpretation

   (c) Preparation and release of forecasts, warnings, and other public statements

   (d) Restrictions of access to hazardous areas (and application/approval procedures for access permits)

   (e) Requirements and roles of visiting scientists

   (f) Communication, within and outside the scientific team; and

   (g) Publication of scientific results, and distribution of authorship

In conclusion, it is also important for future work to consider the role that international frameworks and initiatives will have on any protocols and procedures developed for volcanic science advisors at a local or national level. For example, the UN Office for Disaster Risk Reduction’s Hyogo Framework for Action (HFA) 2005–2015 (United Nations International Strategy for Disaster Reduction (UNISDR) 2007) has recently been reviewed. Changes include a reconsideration of the role of science and technology, the role of local science and local knowledge, and the role of international science advice mechanisms, as ratified in the Sendai Framework for Disaster Risk Reduction 2015–2030 (SFDRR: UNISDR 2015). Volcanologists must be aware of the impact of any changes to international frameworks such as the HFA and the SFDRR, as they will affect regional and local frameworks and support, including funding, resources, and the legitimacy of any formalised local approaches that may be developed to maintain “shared meaning” with stakeholders during a volcanic crisis.



EEHD was supported by a Foundation for Research Science and Technology NZ S&T Postdoctoral Fellowship MAUX0910 2010–2014, and funding from EQC and GNS Science 2014–2016.


  1. Aspinall W, Cooke R (1998) Expert judgement and the Montserrat Volcano eruption. In: Mosleh A, Bari RA (eds) Proceedings of the 4th international conference on probabilistic safety assessment and management PSAM4, September 13–18. New York, USA, pp 2113–2118Google Scholar
  2. Austin J, Gray G, Hilbert J, Poulson D (2015) The ethics of communicating scientific uncertainty. Environ Law Report 45(2):10105Google Scholar
  3. Banai M, Reisel W (1999) Would you trust your foreign manager? An empirical investigation. Int J Hum Res Manag 10(3):477–487Google Scholar
  4. Barclay J, Haynes K, Mitchell T, Solana C, Teeuw R, Darnell A, Crosweller HS, Cole P, Pyle DM, Lowe C, Fearnley C, Kelman I (2008) Framing volcanic risk communication within disaster risk reduction: finding ways for the social and physical sciences to work together. Geol Soc London, Spec Publ 305:163–177. doi: 10.1144/SP305.14CrossRefGoogle Scholar
  5. Blickensderfer E, Cannon-Bowers JA, Salas E (1998) Cross-training and team performance. In: Cannon-Bowers JA, Salas E (eds) Making decisions under stress: implications for individual and team training. American Psychological Association, Washington, D.C., USA, pp 299–311CrossRefGoogle Scholar
  6. Borodzicz E, van Haperen K (2002) Individual and group learning in crisis simulations. J Contingencies Cris Manag 10:139–147. doi: 10.1111/1468-5973.00190CrossRefGoogle Scholar
  7. Bostrom A, French S, Gottlieb S (2008) Risk assessment, modeling and decision support: strategic directions. Springer, BerlinCrossRefGoogle Scholar
  8. Budescu DV, Broomell S, Por H-H (2009) Improving communication of uncertainty in the reports of the intergovernmental panel on climate change. Psychol Sci 20(3):299–308CrossRefGoogle Scholar
  9. Cadag JR, Gaillard JC (2014) Integrating people’s capacities in disaster risk reduction through participatory mapping. In: Lopez-Carresi A, Fordham M, Wisner B, Kelman I, Gaillard JC (eds) Disaster management: international lessons in risk reduction, response and recovery. Routledge, New York, pp 269–286Google Scholar
  10. Cannon-Bowers JA, Bell HE (1997) Training decision makers for complex environments: implications of the naturalistic decision making perspective. In: Zsambok CE, Klein G (eds) Naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, NJ, pp 99–110Google Scholar
  11. Chaiken S, Trope Y (1999) Dual process theories in social psychology. Guilford Press, New YorkGoogle Scholar
  12. CPVAG (2009) Central Plateau Volcanic Advisory Group Strategy, October 2009. Report No 2010/EXT/1117. In: Morris B (ed) Horizons Regional Council, Palmerston North, NZGoogle Scholar
  13. Crego J, Spinks T (1997) Critical incident management simulation. In: Flin R, Salas E, Strub M, Martin L (eds) Decision making under stress emerging themes and applications. Ashgate Publishing Limited, Aldershot, England, pp 85–94Google Scholar
  14. Crichton M, Flin R (2001) Training for emergency management: tactical decision games. J Hazard Mater 88(2–3):255–266CrossRefGoogle Scholar
  15. Crichton M, Flin R (2002) Command decision making. In: Flin R, Arbuthnot K (eds) Incident command: tales from the hot seat. Ashgate Publishing Limited, Aldershot, England, pp 201–238Google Scholar
  16. Crisp CB, Jarvenpaa SL (2013) Swift trust in global virtual teams. J Pers Psychol 12(1):45–56CrossRefGoogle Scholar
  17. Cronin SJ (2008) The Auckland Volcano Scientific Advisory Group during Exercise Ruaumoko: observations and recommendations. In: Civil defence emergency management: exercise Ruaumoko. Auckland Regional Council, AucklandGoogle Scholar
  18. Cronin SJ, Gaylord DR, Charley D, Alloway BV, Wallez S, Esau JW (2004) Participatory methods of incorporating scientific with traditional knowledge for volcanic hazard management on Ambae Island, Vanuatu. Bull Volcanol 66:652–668CrossRefGoogle Scholar
  19. Curnin S, Owen C, Brooks B, Paton D (2015) A theoretical framework for negotiating the path of emergency management multi-agency coordination. Appl Ergon 47:300–307CrossRefGoogle Scholar
  20. Dietz G, Gillespie N, Chao G (2010) Unravelling the complexities of trust and culture. In: Saunders M, Skinner D, Dietz G, Gillespie N, Lewicki R (eds) Organizational trust: a cultural perspective Cambridge. Cambridge University Press, UK, pp 3–41CrossRefGoogle Scholar
  21. Dirks K, Ferrin D (2001) The role of trust in organizational settings. Organ Sci 12:450–467CrossRefGoogle Scholar
  22. Doyle EE, Johnston DM (2011) Science advice for critical decision-making. In: Paton D, Violanti J (eds) Working in high risk environments: developing sustained resilience. Charles C Thomas, Springfield, pp 69–92Google Scholar
  23. Doyle EEH, McClure J, Johnston DM, Paton D (2014a) Communicating likelihoods and probabilities in forecasts of volcanic eruptions. J Volcanol Geotherm Res 272:1–15. doi: 10.1016/j.jvolgeores.2013.12.006CrossRefGoogle Scholar
  24. Doyle EEH, McClure J, Paton D, Johnston DM (2014b) Uncertainty and decision making: volcanic crisis scenarios. Int J Disaster Risk Reduct 10:75–101CrossRefGoogle Scholar
  25. Doyle EEH, Paton D, Johnston DM (2015) Effective management of volcanic crises: evidence-based approaches to enhance scientific response. J Appl Volcanol 4:1CrossRefGoogle Scholar
  26. Doyle EEH, Potter HS (2016) Methodology for the development of a probability translation table for GeoNet. GNS Science Report 2015/67. GNS Science, Lower Hutt, NZGoogle Scholar
  27. Endsley MR (1997) The role of situation awareness in naturalistic decision making. In: Zsambok CE, Klein G (eds) Naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, pp 269–284Google Scholar
  28. Epstein S (1994) Integration of the cognitive and the psychodynamic unconscious. Am Psychol 49(8):709–724CrossRefGoogle Scholar
  29. Evans JSBT, Stanovich KE (2013a) Dual process theories of cognition: advancing the debate. Perspect Psychol Sci 8:223–241CrossRefGoogle Scholar
  30. Evans JSBT, Stanovich KE (2013b) Theory and metatheory in the study of dual processing: reply to comments. Perspect Psychol Sci 8:263–271
  31. Faraj S, Xiao Y (2006) Coordination in fast-response organizations. Manage Sci 52(8):1155–1169
  32. Flin R (1996) Sitting in the hot seat: leaders and teams for critical incident management. Wiley, Chichester
  33. Flin R, Salas E, Strub M, Martin L (eds) (1997) Decision making under stress: emerging themes and applications. Ashgate Publishing Limited, Aldershot
  34. Gaillard J-C (2006) Traditional communities in the face of natural hazards: the 1991 Mount Pinatubo eruption and the Aetas of the Philippines. Int J Mass Emerg Disasters 24:5–43
  35. Gill J, Rubiera H, Martin C, Cacic I, Mylne K, Dehui C et al (2008) World Meteorological Organization guidelines on communicating forecast uncertainty. World Meteorological Organization, WMO/TD No. 4122
  36. Goodman R, Goodman L (1976) Some management issues in temporary systems: a study of professional development and manpower-the theater case. Adm Sci Q 21(3):494–501
  37. Grasso M, Markowitz EM (2015) The moral complexity of climate change and the need for a multidisciplinary perspective on climate ethics. Clim Change 130:327–334
  38. Han P (2013) Conceptual, methodological, and ethical problems in communicating uncertainty in clinical evidence. Med Care Res Rev 70(1):14S–36S
  39. Haynes K, Barclay J, Pidgeon N (2007) The issue of trust and its influence on risk communication during a volcanic crisis. Bull Volcanol 70(5):605–621
  40. Haynes K, Barclay J, Pidgeon N (2008) Whose reality counts? Factors affecting the perception of volcanic risk. J Volcanol Geotherm Res 172(3–4):259–272
  41. Heath R, Lee J, Ni L (2009) Crisis and risk approaches to emergency management planning and communication: the role of similarity and sensitivity. J Public Relat Res 21(2):123–141
  42. Honda H, Yamagishi K (2006) Directional verbal probabilities. Exp Psychol 53(3):161–170
  43. Hyllengren P, Larsson G, Fors M, Sjöberg M, Eid J, Olsen OK (2011) Swift trust in leaders in temporary military groups. Team Perform Manage 17(7/8):354–368
  44. Iannella R, Henricksen K (2007) Managing information in the disaster coordination centre: lessons and opportunities. In: Van de Walle B, Burghardt P, Nieuwenhuis C (eds) Proceedings of the 4th international ISCRAM conference, May 2007, pp 1–11
  45. International Association of Volcanology and Chemistry of the Earth’s Interior (1999) IAVCEI subcommittee for crisis protocols, professional conduct of scientists during volcanic crises. Bull Volcanol 60:323–334
  46. Johnson BB (2003) Further notes on public response to uncertainty in risks and science. Risk Anal 23(4):781–789
  47. Johnson BB, Slovic P (1995) Presenting uncertainty in health risk assessment: initial studies of its effects on risk perception and trust. Risk Anal 15(4):485–494
  48. Johnson BB, Slovic P (1998) Lay views on uncertainty in environmental health risk assessment. J Risk Res 1(4):261–279
  49. Johnston DM, Paton D, Houghton BF (1999) Volcanic hazard management: promoting integration and communication. In: Ingleton J (ed) Natural disaster management. United Nations (IDNDR), Coventry, pp 243–245
  50. Joslyn SL, Nadav-Greenberg L, Taing MU, Nichols RM (2009) The effects of wording on the understanding and use of uncertainty information in a threshold forecasting decision. Appl Cognitive Psychol 23(1):55–72
  51. Kapucu N (2006) Interagency communication networks during emergencies: boundary spanners in multiagency coordination. Am Rev Publ Admin 36(2):207–225
  52. Kapucu N, Garayev V (2011) Collaborative decision-making in emergency and crisis management. Int J Publ Admin 34(6):366–375
  53. Karelitz TM, Budescu DV (2004) You say “probable” and I say “likely”: improving interpersonal communication with verbal probability phrases. J Exp Psychol: Appl 10(1):25–41
  54. Keohane R, Lane M, Oppenheimer M (2014) The ethics of scientific communication under uncertainty. Politics Philos Econ 13(4):343–367
  55. Keren G (2013) A tale of two systems: a scientific advance or a theoretical stone soup? Commentary on Evans & Stanovich (2013). Perspect Psychol Sci 8:257–262
  56. Keren G, Schul Y (2009) Two is not always better than one: a critical evaluation of two-system theories. Perspect Psychol Sci 4:533–550
  57. Klein G (1997) The current status of the naturalistic decision making framework. In: Flin R et al (eds) Decision making under stress: emerging themes and applications. Ashgate Publishing Limited, Aldershot, pp 11–28
  58. Klein G (2008) Naturalistic decision making. Hum Factors 50:456–460. doi:10.1518/001872008X288385
  59. Kowalski-Trakofler KM, Vaught C, Scharf T (2003) Judgment and decision making under stress: an overview for emergency managers. Int J Emerg Manag 1:278–289. doi:10.1504/IJEM.2003.003297
  60. Kramer R (1999) Trust and distrust in organizations: emerging perspectives, enduring questions. Annu Rev Psychol 50(1):569–598
  61. Kruglanski AW, Gigerenzer G (2011) Intuitive and deliberative judgements are based on common principles. Psychol Rev 118:97–109
  62. Lester P, Vogelgesang G (2012) Swift trust in ad hoc military organizations. In: Laurence J, Michael M (eds) The Oxford handbook of military psychology. Oxford University Press, New York, pp 176–186
  63. Lindsay J, Marzocchi W, Jolly G, Constantinescu R, Selva J, Sandri L (2009) Towards real-time eruption forecasting in the Auckland Volcanic Field: application of BET_EF during the New Zealand National Disaster Exercise “Ruaumoko”. Bull Volcanol 72:185–204. doi:10.1007/s00445-009-0311-9
  64. Lipkus IM (2010) Numeric, verbal, and visual formats of conveying health risks: suggested best practices and future recommendations. Med Decis Mak 27(5):696–713
  65. Lipshitz R, Klein G, Orasanu J, Salas E (2001) Focus article: taking stock of naturalistic decision making. J Behav Decis Mak 14:331–352. doi:10.1002/bdm.381
  66. Loewenstein GF, Weber EU, Hsee CK, Welch N (2001) Risk as feelings. Psychol Bull 127(2):267–286
  67. Marcus LJ, Dorn BC, Henderson JM (2006) Meta-leadership and national emergency preparedness: a model to build government connectivity. Biosecurity Bioterrorism: Biodefense Strategy Pract Sci 4(2):128–134
  68. Marincioni F (2007) Information technologies and the sharing of disaster knowledge: the critical role of professional culture. Disasters 31(4):459–476
  69. Martin L, Flin R, Skriver J (1997) Emergency decision making—a wider decision framework? In: Flin R, Salas E, Strub M, Martin L (eds) Decision making under stress: emerging themes and applications. Ashgate Publishing Limited, Aldershot, pp 280–290
  70. Marzocchi W, Woo G (2007) Probabilistic eruption forecasting and the call for an evacuation. Geophys Res Lett 34:1–4. doi:10.1029/2007GL031922
  71. Mastrandrea MD, Field CB, Stocker TF, Edenhofer O, Ebi KL, Frame DJ et al (2010) Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. IPCC cross-working group meeting on consistent treatment of uncertainties, Jasper Ridge, CA, USA. Last accessed 22 Oct 2014
  72. Mayer R, Davis J, Schoorman F (1995) An integrative model of organizational trust. Acad Manag Rev 20(3):709–734
  73. MCDEM (2008) Exercise Ruaumoko ’08 final exercise report. Ministry of Civil Defence and Emergency Management, Wellington, 79 pp
  74. McDowell S (2008) Exercise Ruaumoko: evaluation report. Auckland Civil Defence Emergency Management Group, Auckland, 24 pp
  75. McGuire WJ, Solana MC, Kilburn CRJ, Sanderson D (2009) Improving communication during volcanic crises on small, vulnerable islands. J Volcanol Geotherm Res 183(1–2):63–75
  76. McKnight D, Cummings L, Chervany N (1998) Initial trust formation in new organizational relationships. Acad Manag Rev 23(3):473–490
  77. Meyerson D, Weick KE, Kramer RM (1996) Swift trust and temporary groups. In: Kramer RM, Tyler TR (eds) Trust in organisations: frontiers of theory and research. Sage Publications Inc, Thousand Oaks, pp 166–195
  78. Miles S, Frewer LJ (2003) Public perception of scientific uncertainty in relation to food hazards. J Risk Res 6(3):267–283
  79. Militello L, Patterson ES, Bowman L, Wears RL (2007) Information flow during crisis management: challenges to coordination in the emergency operations center. Cognit Technol Work 9(1):25–31
  80. Mohr J, Spekman R (1994) Characteristics of partnership success: partnership attributes, communication behavior, and conflict resolution techniques. Strateg Manag J 15(2):135–152
  81. Morgan MG, Fischhoff B, Bostrom A, Atman CJ (2002) Risk communication: a mental models approach. Cambridge University Press, Cambridge
  82. Moss RH, Schneider SH (2000) Uncertainties in the IPCC TAR: recommendations to lead authors for more consistent assessment and reporting. In: Pachauri R, Taniguchi T, Tanaka K (eds) IPCC supporting material: guidance papers on the cross cutting issues of the third assessment report of the IPCC, pp 33–51
  83. Orasanu J (1994) Shared problem models and flight crew performance. In: Johnston N, McDonald N, Fuller R (eds) Aviation psychology in practice. Aldershot, England, pp 255–285
  84. Osman M (2013) A case study: dual-process theories of higher cognition—commentary on Evans & Stanovich (2013). Perspect Psychol Sci 8:248–252
  85. Owen C (2013) Gendered communication and public safety: women, men and incident management. Aust J Emerg Manag 28(2):3–10
  86. Owen C, Campus SB, Brooks B, Chapman J, Paton D, Hossain L (2013) Developing a research framework for complex multi-team coordination in emergency management. Int J Emerg Manag 9:1–17
  87. Pascual R, Henderson S (1997) Evidence of naturalistic decision making in military command and control. In: Zsambok CE, Klein G (eds) Naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, pp 217–226
  88. Paton D (1996) Training disaster workers: promoting wellbeing and operational effectiveness. Disaster Prev Manag 5(5):11–18
  89. Paton D (2003) Stress in disaster response: a risk management approach. Disaster Prev Manag 12:203–209. doi:10.1108/09653560310480677
  90. Paton D, Flin R (1999) Disaster stress: an emergency management perspective. Disaster Prev Manag 8(4):261–267
  91. Paton D, Jackson D (2002) Developing disaster management capability: an assessment centre approach. Disaster Prev Manag 11:115–122. doi:10.1108/09653560210426795
  92. Paton D, Kelso BA (1991) Disaster stress: the impact on the wives and family. Counselling Psychol Quart 4:221–227
  93. Paton D, Norris K (2014) Vulnerability to work-related posttraumatic stress: family and organizational influences. In: Violanti JM (ed) Dying for the Job: police work exposure and health. Charles C. Thomas, Springfield, pp 126–141
  94. Paton D, Johnston DM, Houghton BF (1998) Organisational response to a volcanic eruption. Disaster Prev Manag 7:5–13. doi:10.1108/09653569810206226
  95. Paton D, Johnston DM, Houghton BF, Flin R, Ronan K, Scott B (1999) Managing natural hazard consequences: planning for information management and decision making. J Am Soc Prof Emerg Plan 6:37–47
  96. Paton D, Ronan KR, Johnston DM, Houghton BF, Pezzullo L (2000) La Riduzione del Rischio Vulcanico: Integrare le prospettive psicologiche e geologiche [Volcanic risk reduction: integrating psychological and geological perspectives]. Psychomedia, Mental Health and Communication 5 (May) [online serial]. Accessed 14 Oct 2014
  97. Paton D, Violanti JM, Burke K, Gherke A (2009) Traumatic stress in police officers: a career length assessment from recruitment to retirement. Charles C Thomas, Springfield
  98. Patt A, Dessai S (2005) Communicating uncertainty: lessons learned and suggestions for climate change assessment. Comptes Rendus Geosci 337(4):425–441
  99. Pliske RM, McCloskey MJ, Klein G (2001) Decision skills training: facilitating learning from experience. In: Salas E, Klein G (eds) Linking expertise and naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, pp 37–53
  100. Pollock C, Paton D, Smith D, Violanti J (2003) Team resilience. In: Paton D, Violanti J, Smith L (eds) Promoting capabilities to manage posttraumatic stress: perspectives on resilience. Charles C. Thomas, Springfield, pp 74–88
  101. Robert LP, Denis AR, Hung Y-TC (2009) Individual swift trust and knowledge-based trust in face-to-face and virtual team members. J Manage Inf Syst 26(2):241–279
  102. Rogalski J, Samurçay R (1993) A method for tactical reasoning (MTR) in emergency management: analysis of individual acquisition and collective implementation. In: Rasmussen B, Brehmer B, Leplat J (eds) Distributed decision making: cognitive models for co-operative work. Wiley, New York, pp 287–298
  103. Rosenthal U, ’t Hart P (1989) Managing terrorism: the South Moluccan hostage takings. In: Rosenthal U, Charles MT, ’t Hart P (eds) Coping with crises: the management of disasters, riots and terrorism. Charles C Thomas, Springfield, pp 340–366
  104. Rovins JE, Doyle EEH, Huggins TJ (2014) 2nd integrated research on disaster risk conference—integrated disaster risk science: a tool for sustainability. Planet@Risk 2(5), Special issue for the post-2015 framework for DRR. Global Risk Forum GRF Davos, Davos, pp 332–336
  105. Saaty TL (2008) Decision making with the analytic hierarchy process. Int J Serv Sci 1:83–98
  106. Salas E, Stout RJ, Cannon-Bowers JA (1994) The role of shared mental models in developing shared situational awareness. In: Gilson RD, Garland DJ, Koonce JM (eds) Situational awareness in complex systems: proceedings of a CAHFA conference. Embry-Riddle Aeronautical University Press, Daytona Beach, pp 298–304
  107. Salas E, Cannon-Bowers JA, Johnston JH (1997) How can you turn a team of experts into an expert team? Emerging training strategies. In: Zsambok CE, Klein G (eds) Naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, pp 359–370
  108. Sarna P (2002) Managing the spike: the command perspective in critical incidents. In: Flin R, Arbuthnot K (eds) Incident command: tales from the hot seat. Ashgate Publishing Limited, Aldershot, pp 32–57
  109. Siegrist M, Cvetkovich G (2000) Perception of hazards: the role of social trust and knowledge. Risk Anal 20:713–719
  110. Sloman SA (1996) The empirical case for two systems of reasoning. Psychol Bull 119(1):3–22
  111. Slovic P, Finucane M, Peters E, MacGregor DG (2004) Risk as analysis and risk as feelings: some thoughts about affect, reason, risk, and rationality. Risk Anal 24(2):311–322
  112. Smith R (2009) Research, science and emergency management: partnering for resilience. In: Tephra, community resilience: research, planning and civil defence emergency management. Ministry of Civil Defence & Emergency Management, Wellington, pp 71–78
  113. Smithson M (1999) Conflict aversion: preference for ambiguity vs conflict in sources and evidence. Organ Behav Hum Dec Processes 79(3):179–198
  114. Solana MC, Kilburn CRJ, Rolandi G (2008) Communicating eruption and hazard forecasts on Vesuvius, Southern Italy. J Volcanol Geotherm Res 172(3–4):308–314
  115. Sparks RSJ (2003) Forecasting volcanic eruptions. Earth Planet Sci Lett 210(1–2):1–15
  116. Teigen KH, Brun W (1999) The directionality of verbal probability expressions: effects on decisions, predictions, and probabilistic reasoning. Organ Behav Hum Dec Processes 80(2):155–190
  117. Thompson VA (2013) Why it matters: the implications of autonomous processes for dual-process theories—commentary on Evans & Stanovich (2013). Perspect Psychol Sci 8:253–256
  118. United Nations International Strategy for Disaster Reduction (2007) Hyogo framework for action 2005–2015: building the resilience of nations and communities to disasters. Extract from the final report of the world conference on disaster reduction (A/CONF.206/6). UNISDR Secretariat, Geneva. Last accessed 22 Oct 2014
  119. United Nations International Strategy for Disaster Reduction (2015) Sendai framework for disaster risk reduction 2015–2030. UNISDR Secretariat, Geneva. Last accessed 21 Dec 2015
  120. Waugh WLJ, Streib G (2006) Collaboration and leadership for effective emergency management. Public Adm Rev 66(s1):131–140
  121. Weber EU (2006) Experience-based and description-based perceptions of long-term risk: why global warming does not scare us (yet). Clim Change 77(1–2):103–120
  122. Wiedemann P, Borner F, Schultz H (2008) Lessons learned: recommendations for communicating conflicting evidence for risk characterization. In: Wiedemann PM, Schultz H (eds) The role of evidence in risk characterisation: making sense of conflicting data. Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim, pp 205–213
  123. Williams C, Dunn CE (2003) GIS in participatory research: assessing the impact of landmines on communities in North-west Cambodia. Trans Geo Inf Syst 7:393–410
  124. Wilson KA, Salas E, Priest HA, Andrews D (2007) Errors in the heat of battle: taking a closer look at shared cognition breakdowns through teamwork. Hum Factors 49:243–256
  125. Woo G (2008) Probabilistic criteria for volcano evacuation decisions. Nat Hazards 45(1):87–97
  126. Zsambok C (1997) Naturalistic decision making: where are we now? In: Zsambok CE, Klein G (eds) Naturalistic decision making. Lawrence Erlbaum Associates, Mahwah, pp 3–16

Copyright information

© The Author(s) 2017

Open Access    This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Joint Centre for Disaster Research, Massey University, Wellington, New Zealand
  2. Faculty of Engineering, Health, Science and the Environment, School of Psychological and Clinical Sciences, Charles Darwin University, Darwin, Australia