1 Introduction

Since the late 1990s important advances have taken place in the development of methodologies for estimating the potential impacts of climate change on human and natural systems. The standard approach of climate scenario-driven impact studies has been expanded to a range of different approaches such as the assessment of current and future adaptation capacity, social vulnerability and policy, among others (Carter et al. 2007).

The Fourth Assessment Report (AR4, WGII-IPCC 2007; Carter et al. 2007) briefly reviews the new assessment methods for climate change impact, adaptation and vulnerability (CCIAV). The IPCC distinguishes five approaches for CCIAV, four of which are considered conventional: impact assessment, adaptation assessment, vulnerability assessment and integrated assessment. The fifth approach, risk management, has evolved as CCIAV studies have begun to be used in mainstream policy-making (Carter et al. 2007). These studies are oriented towards decision-making activities such as planning, management and policy-making, and are suitable for a wide range of scales: from global to local and from sector to case study.

The development of models that combine physical and socioeconomic determinants has also been of particular importance, since it has allowed the integrated assessment of potential impacts, providing more realistic estimates that consider multiple stressors. Nevertheless, other issues, such as uncertainty and variability, remain pervasive in CCIAV studies, and there is a lack of methodologies to formally address them. For example, even in the UNFCCC’s Compendium on methods and tools to evaluate impacts of, and vulnerability to, climate change (UNFCCC 2008a) there are few methods that integrate uncertainty and variability in their estimations.

Most current assessments of the potential impacts of climate change have been carried out using methodologies that do not allow the inclusion of the variability and uncertainty inherent to climate and socioeconomic variables. In consequence, such methodologies, being based on a single realization of a combination of different stochastic processes, can only provide a very limited (and potentially biased) assessment of the risk that the systems under study would face under climate change conditions.

In the AR4, the Working Group II of the IPCC stresses the importance of producing new methods to address uncertainty in climate change studies in order to inform decision-making under uncertainty, and provides examples of studies that have suggested methods to do this (see for example, Toth et al. 2003a, b; Jones 2001; Willows and Connell 2003; UNDP, 2005; more recent studies include, for example, Tebaldi and Lobell 2008).

The methodology proposed in this paper builds on the risk management framework and aims to inform decision-making under uncertainty by producing probabilistic assessments of the potential impacts of climate change that: consider the uncertainty in climate and socioeconomic variables as well as their inherent variability; promote stakeholder involvement by including their subjective information; and produce tailor-made risk measures as well as estimates of the date on which critical thresholds defined by the stakeholder would be reached.

1.1 Uncertainty in climate scenarios

In recent years, advances in the understanding and modeling of the climate system have made it possible to increase the confidence in, and the complexity of, climate models, as well as to further improve the spatial and temporal resolution of climate change scenarios (IPCC-WGI 2007). It is now possible to estimate uncertainty ranges for the future climate at global and regional levels and to construct probabilistic climate change scenarios using different storylines and models (Meehl et al. 2007; Christensen et al. 2007).

Unfortunately, due to the lack of appropriate methodologies for handling uncertainty in climate change scenarios, a large portion of the available information is being poorly exploited for CCIAV and therefore, climate change science is failing to fully accomplish one of its main objectives: assisting decision-making and promoting optimal or “best response” decision-making.

The vast majority of the literature on impact assessment is based on recommendations such as using at least two climate change scenarios (the highest and lowest, or those corresponding to two development scenarios that, based on “expert” subjective information, are thought to be most likely) to address uncertainty in climate projections (UNFCCC 2008a, b). This is one of the most common recommendations and has been followed in important large-scale studies such as National Communications (see, for example, Gobierno de la República de Argentina 2008; Secretaría de Medio Ambiente y Recursos Naturales/Instituto Nacional de Ecología 2006; Government of Japan 2006; Government of the Federal Republic of Germany 2006). Nevertheless, results obtained following these recommendations are usually hard to interpret, provide no measure of the probability of the different outcomes, make it difficult to communicate risk to stakeholders and decision-makers, and do not make full use of the available information/uncertainty.

For over a decade, new methodologies for impact assessment (including those that use Monte Carlo simulation) that are more suitable for dealing with uncertainty have been developed and implemented (see, for example, Titus and Narayanan 1996; Yohe and Schlesinger 1998; Jones 2000; New and Hulme 2000; Preston 2006; Nawaz and Adeloye 2006; Gay et al. 2006c). These methodologies can take advantage of the recently available possibility of constructing probabilistic climate change scenarios, but they depend heavily on the assumptions needed to express uncertainty in terms of a probability distribution. One of the major pitfalls is that these assumptions (probability distributions) are commonly presented as “objective” facts because they are based on a frequentist approach, when it should be clearly stated that they are all subjective representations of beliefs (Moss and Schneider 2000; Ahmad and Warrick 2001; Gay and Estrada 2010). Subjective beliefs should be brought forward and clearly stated, and probability distributions should be a meaningful expression of the decision-maker’s beliefs and not some impersonal, one-size-fits-all, statistically inadequate device.

Dealing with uncertainty in climate change scenarios has become an issue of intense debate in the scientific community (see, for example, Schneider 2001, 2002; Allen 2003; Pittock et al. 2001; Grübler and Nakicenovic 2001; IPCC-WGI 2007) and has led to the proposal of different methods. The contribution of the Working Group I (WGI) to the IPCC’s AR4 provides a review of some of the advances that have taken place in the last decade regarding probabilistic scenarios and uncertainty management.

In the AR4, the WGI takes on the issue of trying to assess model uncertainty and provides best estimates or multi-model averages, likely ranges, as well as other types of probabilistic scenarios for the six marker scenarios included in the Special Report on Emissions Scenarios (SRES, Nakicenovic et al. 2000). Some of the shortcomings of the approaches included in the WGI contribution to the AR4 for producing probabilistic climate change scenarios have been discussed by the authors in a previous paper (Gay and Estrada 2010). We believe that one of the basic problems is that, in general, the approaches proposed are based on forecast methodologies. These methodologies are designed to address aleatory uncertainty and not epistemic uncertainty, which is dominant in climate change scenarios. In this manner, forecast methodologies cannot resolve epistemic uncertainty and therefore they may not be adequate for supporting decision-making.

Furthermore, the IPCC continues to avoid emissions uncertainty; these probabilistic scenarios are therefore conditional on emission scenarios for which we have no information regarding their probability of occurrence. Consequently, decision-makers are left, once again, to carry on with their responsibilities using only probability estimates that are conditional on something to which no probability of occurrence has been assigned. Clearly, such estimates are of very limited practical interest for decision-making.

Other methods that have become popular are based on weighting models according to their accuracy in reproducing the observed climate (see, for example, IPCC-TGICA 1999, 2007; Wigley 2008; Carter et al. 2007). This type of method might provide important information that could be used to reduce uncertainty. Nevertheless, there is no guarantee that the models that provide an accurate representation of the current climate will continue to do so in the future, and therefore this method could lead to trading uncertainty by ignorance (Schneider 2003). Also, the diversity of models could be at stake, and models could become less and less independent (Allen 2003).

All these methodologies represent important advances towards making climate change science increasingly useful for decision-making. Nevertheless, all of them still greatly dismiss uncertainty and therefore do not provide an adequate representation of the current state of knowledge in climate modeling. It is important to realize that the complexity of the problem ensures that the available information for decision-making will always be incomplete and that therefore scientific or objective information will have to be complemented with subjective expert information. As discussed by the authors in a previous paper (Gay and Estrada 2010), objective probabilities in climate change scenarios are not attainable and therefore no “true or objective” impact scenario can be constructed, even in a probabilistic framework.

This paper uses the methodology for constructing probabilistic climate change scenarios based on the Maximum Entropy Principle proposed in Gay and Estrada (2010). The resulting estimates have desirable properties: they are the least biased estimates possible given the available information; they maximize the uncertainty (entropy) subject to the partial information that is given; the maximum entropy distribution assigns a positive probability to every event that is not excluded by the given information; and no quantifiable possibility is ignored. The probabilities obtained in this manner are the best predictions possible with the state of knowledge and subjective information that is available.
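
As a schematic sketch of the kind of constrained optimization involved (a textbook formulation, not a reproduction of the derivation in Gay and Estrada 2010), the maximum entropy distribution over a bounded range of scenario values [a, b], constrained to have a prescribed average mean change μ, solves

$$ \max_{p}\; H\left[ p \right] = - \int_a^b p(x)\ln p(x)\,dx \quad \text{subject to} \quad \int_a^b p(x)\,dx = 1, \qquad \int_a^b x\,p(x)\,dx = \mu $$

whose solution has the exponential form \( p(x) = e^{-\lambda_0 - \lambda_1 x} \) on [a, b] and reduces to the uniform distribution when only the range constraint is imposed (\( \lambda_1 = 0 \)).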

1.2 Natural climate variability

Estimations regarding how climate variability could evolve under climate change conditions are commonly based on simulations from physical climate models (General Circulation Models, for example). The two main approaches are based on performing basic statistical analysis on one single run or on ensembles of runs of the same climate model.

For example, the UK Climate Impacts Programme document Handling UncertaintiesFootnote 1 recommends estimating future climate variability by creating an ensemble of three simulations of the same model under the same emissions scenario and parameterizations but with slightly different initial conditions. The variability shown by these simulations is assumed to adequately represent natural climate variability under a particular climate change scenario. On the other hand, the MAGICC-SCENGEN software (developed by the Climatic Research Unit and the National Center for Atmospheric Research; Wigley 1994, 2003, 2008; Hulme et al. 2000), which is widely used for investigating future climate change at global and regional levels, uses 20-year samples of single simulations to calculate the standard deviations used to approximate future climate variability (Wigley 1994, 2008).

Both approaches have important drawbacks. On the one hand, although the ability of climate models to simulate the climate system at global and regional scales has greatly improved, climate models are currently unable to fully reproduce observed climate variability (Meehl et al. 2007; Christensen et al. 2007). In addition, using a limited number of model runs (three, for example) will hardly produce an adequate representation of even the model’s internal variability, much less an approximation of present, let alone future, natural climate variability.

On the other hand, using fixed sample sizes of climate model simulations (20 years in the case of the MAGICC-SCENGEN software), or for that matter observed sub-samples of 30 years (WMO 1983), for estimating climate variability will clearly produce deficient estimations. There are two main reasons for this. First, by definition, under climate change climate variables will be (are) non-stationary processes, so if the time-series properties of these processes are not considered in the statistical analysis (even for calculating a simple standard deviation), we are very likely to find spurious changes in the moments of their distributions (Gay et al. 2007). In this case, climate variability could wrongly be thought to be time-dependent and therefore be expected to increase (or decrease) in the future, producing poor estimations of potential future climate risk. Limiting the sample size to 20, 30 or any other fixed number of years (as is currently done and recommended by the WMO) will still produce inadequate estimations of climate variability. Second, if the process turns out to be stationary, limiting the sample size to a sub-sample of the available data will only produce inefficient estimations of its variability.

Recent studies (Gay et al. 2007; 2009; Estrada et al. 2010; IPCC-WGI, 2007) suggest that climate change will possibly manifest as a change-in-the-mean of (some) climate variables, without altering other distribution moments, at least in the case of monthly and annual means.

For these reasons, it is proposed that the best estimate of future climate variability (with the available information) is the best possible estimate of current natural variability, given that the latter also provides information on how climate variables have responded to the observed climate change. For this purpose, statistically adequate time-series modelsFootnote 2 will be estimated in this paper to infer the data generating processes.
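
The following Python sketch (with placeholder data, not the analysis performed in this paper) illustrates the point made above: the raw standard deviation of a trend-contaminated series overstates its natural variability, while the residuals of a statistically adequate model give a cleaner estimate.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import adfuller

    # Placeholder series: a deterministic trend plus stationary noise, standing in
    # for an observed temperature record (illustrative data only).
    rng = np.random.default_rng(1)
    t = np.arange(60)
    series = 20.0 + 0.02 * t + 0.5 * rng.standard_normal(t.size)

    # Unit-root test allowing for a constant and a deterministic trend.
    adf_stat, pvalue, *_ = adfuller(series, regression="ct")
    print(f"ADF p-value (trend-stationary alternative): {pvalue:.3f}")

    # The raw standard deviation is inflated by the trend; the residuals of a
    # regression on time give a cleaner estimate of the natural variability.
    raw_sd = series.std(ddof=1)
    resid = sm.OLS(series, sm.add_constant(t)).fit().resid
    print(f"raw sd = {raw_sd:.3f}, detrended sd = {resid.std(ddof=1):.3f}")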

2 Methodology description and simulation procedures

The simulation methodology presented here considers two possible alternatives for generating probabilistic scenarios of potential impacts: a) static, which generates a potential impact distribution for a particular time horizon (for example 2020 or 2050); and b) dynamic, which provides the evolution of the potential impacts’ distribution over a time period (from the present to 2050, for example). The methodology for both alternatives can be described in the following steps:

  1) Define potential input and output variables for the model and simulation.

  2) Build a statistically adequate model for the dependent variable being simulated (i.e., the activity or sector under study). This simulation methodology could also be applied to other types of models, such as physically based crop simulators, water-balance models or any other type of model that can be adapted to produce a large number of simulations.

  3) Obtain a comprehensive range of climate change scenarios for each of the climate variables used as input variables for simulation. Obtain, as well, future scenarios for the non-climate variables included as input variables.

  4) Obtain the maximum entropy distributions for climate and non-climate variables. This step requires choosing an arbitrary average mean change. As described in Gay and Estrada (2010), two main approaches can be used for selecting this arbitrary average value:

    a) The first approach makes use of the agent’s attitude towards uncertainty (see, for example, Mukerji 2000; Kelsey and Eichberger 2009; Stein and Segal 2006; Wakker 2001). When decisions are taken under subjective, deep or epistemic uncertainty, the decision-maker has limited knowledge or information to assign probabilities to the various possible outcomes. As stated by Ellsberg (1961), "what is at issue might be called the ambiguity of this information, a quality depending on the amount, type, reliability and 'unanimity' of information, and giving rise to one's degree of 'confidence' in an estimate of relative likelihoods".Footnote 3 In this case, an uncertainty-averse or ambiguity-averse decision-maker adjusts his probability distribution on the side of caution in response to his imprecise knowledge of the odds (Mukerji 2000). In Gay and Estrada (2010) three types of attitudes towards uncertainty are used: cautious, neutral and reckless.

      The term cautious agent is used for those agents that, when facing an uncertain situation, will assign a higher subjective probability of occurrence to the least favorable outcomes than a neutral agent would. This type of decision-maker will choose an average mean change higher than the central value of the distribution (defined as the sum of the lowest and highest values divided by two). A neutral decision-maker would choose a uniform distribution, which is a maximum entropy distribution in the absence of additional information; his attitude towards uncertainty does not lead him to assign higher or lower weights to any possible outcome. He will therefore choose the central value of the distribution as the average mean change. A reckless decision-maker will tend to select an average mean change lower than the central value of the distribution, assigning a lower probability of occurrence to the least favorable outcomes and showing a lower level of concern for the possibility of underestimating the probabilities of occurrence of these outcomes.

    b) The second approach consists in choosing “high” and “low” average change values, as is currently recommended for other methodologies used for impact assessment.

      The relative entropy and the information index can be used to keep track of how much the probability assignment depends on the included subjective information.

  5) Build statistically adequate time-series models in order to infer the properties of the data generating processes of the input variables. With this information, produce stationary time-series of these variables and find the probability distribution that provides the best fit. Look for possible correlation among input variables.Footnote 4 For input variables for which information is not available or is judged to be non-representative of their future evolution, choose non-informative probability distributions (such as a uniform distribution). Recall that a uniform distribution is a maximum entropy distribution when no additional information is available.

  6) a) If the objective is to generate a realization for a given year (2050, for example), proceed as follows. Generate a random number from one of the maximum entropy distributions found in step 4 and use it as the mean of the corresponding probability distribution found in step 5. Generate a realization from this distribution. Repeat for each variable. b) If the objective is to generate a dynamic simulation for a time period of length n, such that t = 1, 2,…,n, proceed as follows. Generate a realization from the maximum entropy distribution and divide it by n (the length of the time period). This will be the slope of a linear trend of length n. Using the probability distribution found in step 5, generate a realization of size n of the data generating process (white noise, autoregressive (AR), moving average (MA), ARMA, etc.) and add it to the linear trend. Repeat for every variable (the dynamic case is formalized in the expression shown after this list).

  7) Define the evolution of the uncertainty in the variables represented with non-informative distributions. For example, it could be reasonable to represent our increasing uncertainty regarding the value that a price variable could take in the future by means of a uniform distribution with a time-dependent support. Repeat for every variable for which information is not available or is judged to be non-representative of its future evolution.

  8) Introduce these simulations as input variables in the model defined in step 2. Save the realizations of the desired output variables.

  9) Repeat steps 6, 7 and 8 in order to obtain the desired number of simulations.
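
For the simple case in which the data generating process identified in step 5 reduces to the variable’s current mean plus a stationary stochastic component, step 6b amounts to simulating each input variable as

$$ X_t = \mu + \frac{\Delta C}{n}\,t + \varepsilon_t, \qquad t = 1, 2, \ldots, n $$

where \( \Delta C \) is a realization drawn from the maximum entropy distribution of step 4, \( \mu \) is the current mean of the variable and \( \varepsilon_t \) is a realization of the stationary process identified in step 5 (white noise or an AR process, for example). Each repetition in step 9 draws a new \( \Delta C \) and a new path of \( \varepsilon_t \).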

This methodology offers the advantage of integrating the user’s beliefs and expert judgment into the probabilistic assessment of the potential impacts of climate change in a quantitative and objective way. In this manner, tailor-made information is obtained for the decision-maker, and it also becomes possible to offer time-dependent information for decision-making concerning the potential impacts of climate variability and change that is consistent with the current state of knowledge and the subjective “expert” information available. Producing time-dependent information is crucial for planning, policy-making and for building possible adaptation strategies.

3 Methodology illustration: a case study of coffee production in Veracruz, México

In this section, an illustration of this methodology is presented by means of the simple coffee production model shown in Gay et al. (2006a). The objective is to generate production and income probability distributions for the average coffee producer, conditional on “expert” subjective information and the available state of knowledge, and to estimate the risk that climate variability and change represent (present and future) for this activity. Some representative measures of risk, such as the VaR (value at risk), the probability of positive (or greater than a given threshold) income values, as well as some measures of variability and central tendency of production and income and their evolution in time, are reported.

3.1 General information regarding coffee production in Veracruz

Agriculture in Veracruz represents 7.9% of the state’s GDP and employs 31.7% of the state’s labor force, and coffee production contributes notably to these figures. Veracruz ranks as the second largest national coffee producer. According to the 1992 Coffee Census (Consejo Mexicano del Café 1996), there are 67,000 producers and 153,000 hectares devoted to coffee production in 82 municipalities, generating 300,000 permanent jobs and 30 million daily wages a year. Coffee production in the state is very labor intensive, with labor amounting to 80% of the activity’s production costs (Consejo Mexicano del Café 2001).

Several national and international factors already make this activity vulnerable. 73% of coffee producers in the state are small-scale, owning 2 hectares or less, and the dominant type of production is “rustic”: coffee grown inside the forest in small plantations, with a low level of technology and a large traditional component.

Considering that in 2001 about half of the municipalities of the state were classified as having very high or high poverty levels,Footnote 5 and that the income of most municipalities depends on agriculture, and in particular on coffee production, this activity is clearly of great socioeconomic relevance for the state of Veracruz.

Since the 1990s, a combination of national and international factors has led coffee production in Mexico to a critical situation, compromising the economic viability of this activity for a large number of producers. Currently, producers face strong low-price competition brought about by the entrance of low-quality production into the international and national coffee markets. This has had a great impact on local producers because, even though Mexican coffee has better quality, processing companies prefer buying cheaper, low-quality coffee and improving its taste using chemicals. In recent years, national and international coffee prices have reached levels so low that a large fraction of producers are not even able to fully cover production costs. A study by TechnoServe (2003, in collaboration with McKinsey & Company) reveals that the current crisis is different from all previous coffee price crises because not only is the price volatile, but in the last 10 years the structure of the coffee industry has changed with the entrance of cost-efficient competitors, innovations and the increasing demand for Robusta coffee (which has lower production costs than Arabica). This means that, while coffee prices will recover from their current historic lows, the long-term coffee price level will remain below its historical averages and will make this activity unprofitable for many producers.

The prevailing socioeconomic conditions, the limited economic and technical resources available to the vast majority of producers and the lack of institutional support are factors that already make this activity vulnerable and potentially limit adaptation capacities. In addition, coffee production has an important component of inertia caused mainly by the following factors: increasing or decreasing production levels is a long-term decision (a coffee plant takes up to 6 years to become productive); coffee has to be harvested to maintain plantation health; and market distortions (such as subsidies) make coffee supply very inelastic in the short term to changes in coffee prices, slowing adaptation to changing market conditions (Gay et al. 2006a). As a result, although coffee prices have fallen dramatically during the past decade, coffee production in the state has remained constant or has even increased. When coffee prices fall, coffee producers partially absorb the losses and are compensated to some extent by government subsidies. This has generated a great dependence on government subsidies and has increased coffee producers’ vulnerability to changes in government policy - a particularly important factor in the current context of market liberalization. Not surprisingly, coffee prices and changes in government policies are the factors that coffee producers in the state identify as most threatening for this activity, while climate is perceived as a much lesser threat (Conde et al. 2007). Nevertheless, recent studies conducted in other parts of the world have shown that coffee production can be very sensitive to changes in climate variables,Footnote 6 and this is also true for Veracruz (Eakin et al. 2005). Climate change could be a determinant factor for the physical and economic viability of coffee production in the state that has not been taken into account by coffee producers and government decision-makers.

3.2 Simulation model

To simulate coffee production in Veracruz we adopt the coffee production model shown in Gay et al. (2006a). This is a simple, yet statistically adequate, model that considers as independent variables spring precipitation, winter and summer temperatures, and the minimum wage paid in the state. The model (Eq. 1) adopts a linear functional form for spring precipitation and the minimum wage paid in the state, and a quadratic functional form for the temperature variables.

$$ \begin{gathered} Prod_{coffee} = - 35965262 + 2296270\left( T_{summ} \right) - 46298.67\left( T_{summ} \right)^2 + 658.01618\left( P_{spr} \right) \\ + 813976.3\left( T_{win} \right) - 20318.27\left( T_{win} \right)^2 - 3549.71\left( MINWAGE \right) \end{gathered} $$
(1)

where:

  • T summ is the average temperature during summer;
  • P spr is the average precipitation during spring;
  • T win is the average temperature during winter;
  • MINWAGE is the real minimum wage.

The authors use elasticities as a measure of the sensitivity of coffee production to changes in spring precipitation and in the minimum wage. These estimates reveal that production is inelastic to changes in these variables; production has a negative relation with the minimum wage and is very sensitive to changes in this variableFootnote 7 (for a 1% increase in the minimum wage, coffee production decreases 0.37%), while coffee production holds a positive relation with spring precipitation but of a much lower magnitude (a 1% increase in spring precipitation produces an increase of 0.15% in coffee production). The linear functional form obtained for spring precipitation permits inferring that, for the current values of this variable in the state, the relation is positive and fairly monotonic; therefore, the inflection point at which spring precipitation starts to damage coffee production has not been reached (see Gay et al. 2006a).
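
Assuming, as is standard, that these are point elasticities evaluated at the sample means (denoted here by overbars; the means themselves are not reproduced in this paper), they follow directly from the linear coefficients of Eq. 1:

$$ \varepsilon_{MINWAGE} = \frac{\partial Prod_{coffee}}{\partial MINWAGE}\cdot\frac{\overline{MINWAGE}}{\overline{Prod}_{coffee}} = -3549.71\,\frac{\overline{MINWAGE}}{\overline{Prod}_{coffee}} \approx -0.37, \qquad \varepsilon_{P_{spr}} = 658.01618\,\frac{\overline{P}_{spr}}{\overline{Prod}_{coffee}} \approx 0.15 $$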

The quadratic form in the temperature variables captures the non-monotonic effects that these variables have on coffee production and permits obtaining optimal temperature values for coffee production. Figure 1 illustrates the sensitivity of production to changes in the temperature variables implied by Eq. 1, for fixed values of the minimum wage and spring precipitation. The maximum potential production (the 100% value in Fig. 1) is attained when summer and winter temperatures are optimal for coffee production (24.79°C and 20.03°C, respectively). For any other values and combinations of summer and winter temperatures, the potential production will be smaller than 100%. For example, for a summer temperature of 22.5°C and a winter temperature of 17.5°C, the potential production will be about 50% smaller than the one obtained with optimal temperature values. Furthermore, if the summer temperature were 22.5°C, for example, potential production would be at most 60% no matter what the winter temperature was.
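
The optimal temperatures quoted above follow directly from the first-order conditions of Eq. 1 (the vertex of each concave quadratic term):

$$ T_{summ}^{*} = \frac{2296270}{2 \times 46298.67} \approx 24.79\,^{\circ}\text{C}, \qquad T_{win}^{*} = \frac{813976.3}{2 \times 20318.27} \approx 20.03\,^{\circ}\text{C} $$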

Fig. 1 Sensitivity of coffee production to changes in summer and winter temperatures

Statistically adequate time series models for each variable in the coffee production model were estimated in order to approximate the data generating processes, including their deterministic and stochastic components, dynamic structure and distribution. This permits correctly simulating the variability shown by the series. Results show that summer temperature can be adequately described as a stationary AR(1) process around a constant, winter temperature can be represented by a deterministic trend plus a white noise process, and for spring precipitation a constant plus a white noise process provides an adequate description of the series. Normality tests (Jarque-Bera) were conducted on the stochastic component of the climate variables (i.e., the residuals of the time series models) and the null hypothesis of normality cannot be rejected at the 5% significance level. Therefore, a normal distribution with zero mean and a specific variance for each variable will be used to represent the variability of the climate variables around a deterministic component. Misspecification tests (not shown here) indicate that the models fitted to the climate series are statistically adequate and therefore provide a good approximation to their data generating processes (Spanos and Mcguirk 2002).
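
The following Python sketch (with placeholder data and a plausible, not the authors’, implementation) shows how models of this kind could be estimated and their residuals tested for normality:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.arima.model import ARIMA
    from scipy import stats

    # Placeholder annual series; in an actual application these would be the
    # observed records for Veracruz.
    rng = np.random.default_rng(0)
    years = np.arange(1961, 2004)
    t_summer = 24.9 + 0.5 * rng.standard_normal(years.size)
    t_winter = 21.0 + 0.01 * (years - years[0]) + 0.7 * rng.standard_normal(years.size)
    p_spring = 81.0 + 29.0 * rng.standard_normal(years.size)

    # Summer temperature: stationary AR(1) around a constant.
    ar1 = ARIMA(t_summer, order=(1, 0, 0), trend="c").fit()
    # Winter temperature: deterministic linear trend plus white noise.
    trend_fit = sm.OLS(t_winter, sm.add_constant(years - years[0])).fit()
    # Spring precipitation: constant plus white noise.
    const_fit = sm.OLS(p_spring, np.ones_like(p_spring)).fit()

    # Jarque-Bera normality tests on the stochastic components (model residuals).
    for name, resid in [("summer", ar1.resid), ("winter", trend_fit.resid),
                        ("spring", const_fit.resid)]:
        jb_stat, jb_p = stats.jarque_bera(resid)
        print(f"{name}: JB p-value = {jb_p:.3f}, residual std = {resid.std(ddof=1):.2f}")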

The time series models estimated for the minimum wage indicate the presence of a unit root with a structural change in 1976 and two outliers occurring in 1983 and 1988, all of which correspond to economic crises and presidential changes. Unit root processes are hardly predictable because they contain stochastic trends and their variance increases with time. Taking into account the time series properties of this type of process and the time horizon of interest for this study (2050), we consider that the observed series does not necessarily convey relevant information about its future evolution.Footnote 8 This variable is better simulated using a uniform distribution with a time-dependent support in order to describe how its uncertainty increases with time. Recall that this is a maximum entropy distribution when no additional information is included.

For calculating the income of the average producer, the total number of coffee producers will be assumed to remain constant at its current value of 67,000, each producer having 2.23 hectares devoted to this activity. Field studies conducted in the state (Gay et al. 2006b; Eakin 2003) showed that a producer faces an average cost of $8,000 pesos or $727 US dollarsFootnote 9 a year per hectare, while the average producer receives government subsidies for an amount of about $750 pesos ($68.2 dollars) a year per hectare. The income of the average producer will be calculated as the gross product, minus production costs, plus government subsidies. All economic variables needed to estimate the income of the average coffee producer will be simulated using uniform distributions and are expressed in real terms.
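
One way to write the net income of the average producer that is consistent with this description (the even allocation of state production across the 67,000 producers is an assumption of this sketch, and the symbols are introduced here only for illustration) is

$$ Income_{avg} = \frac{Prod_{coffee}}{67{,}000}\,P_{coffee} - 2.23\left( C_{ha} - S_{ha} \right) $$

where \( P_{coffee} \) is the coffee price per ton, \( C_{ha} \) the production cost per hectare and \( S_{ha} \) the subsidy per hectare, all in real terms.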

3.3 Climate change scenarios for 2050

A total of 32 climate change scenarios of the mean value of each climate variable in the production model were obtained for the Veracruz region for year 2050 using the Pacific Climate Impacts Consortium Regional Analysis Tool,Footnote 10 considering 7 climate models (CGCM2, HADCM3, ECHAM4, CCSRNIES, GFDLR30, NCARPCM and CSIROMK2B; note that some of them have more than one simulation for a specific emissions scenario) and the four emission scenario families (A1, A2, B1, B2).

Figure 2a, b and c show the histograms of the changes in spring precipitation (in %) and in summer and winter mean temperatures (in °C) for 2050, respectively. Histograms are shown for descriptive purposes; it is important to note that frequencies should not be interpreted as objective probabilities (Jaynes 1957, 1962; Gay and Estrada 2010). Uncertainty in the climate change scenarios is particularly large for winter temperature, with a range of 3.8°C extending from 0.9°C to 4.7°C, and for spring precipitation, for which the different scenarios span a range of possible values from increases of 56% to decreases of 41%. These scenarios clearly illustrate the need for methodologies that can deal with such large ranges of uncertainty in order to make climate change science useful for decision-making. For summer temperature the range of uncertainty in the climate change scenarios is smaller but still very large, amounting to 1.5°C.

Fig. 2 a Histogram of spring precipitation changes for 2050 in Veracruz, México. b Histogram of summer temperature changes for 2050 in Veracruz, México. c Histogram of winter temperature changes for 2050 in Veracruz, México

4 Results

4.1 Simulation under current climate and economic conditions

A baseline simulation was carried out to approximate the current probability distributions of the state’s coffee production and of the income of the average producer. This permits estimating some measures of the risk that the average producer faces under current conditions and comparing them with the estimates obtained using different probabilistic climate change scenarios. The probability distributions of the input variables for the simulation model were parameterized as follows (a simulation sketch based on these assumptions is shown after the list):

  • T summer  ~ N(24.96, 0.54) Footnote 11

  • T winter  ~ N(21.6, 0.68)Footnote 12

  • P spring  ~ N(81.35, 28.93)

  • MINWAGE (daily) ~ U(43, 53)

  • Coffee price (per ton) ~ U(2,300, 3,200)

  • Production costs (per hectare) ~ U(7,900, 8,100)

  • Subsidy (per hectare) ~ U(700, 750)Footnote 13
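
The following Python sketch (not the code used by the authors) illustrates this baseline simulation: the input variables are drawn from the distributions listed above, state production is computed with Eq. 1, and the income of the average producer is derived as described in Section 3.2. The even allocation of state production across the 67,000 producers and the truncation of precipitation at zero are assumptions of the sketch.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Input variables drawn from the distributions listed above.
    t_sum = rng.normal(24.96, 0.54, n)                      # summer temperature (deg C)
    t_win = rng.normal(21.60, 0.68, n)                      # winter temperature (deg C)
    p_spr = np.clip(rng.normal(81.35, 28.93, n), 0, None)   # spring precipitation, kept non-negative
    wage = rng.uniform(43, 53, n)                           # daily real minimum wage (pesos)
    price = rng.uniform(2_300, 3_200, n)                    # coffee price per ton (pesos)
    cost = rng.uniform(7_900, 8_100, n)                     # production cost per hectare (pesos)
    subsidy = rng.uniform(700, 750, n)                      # subsidy per hectare (pesos)

    # State coffee production from Eq. 1 (tons).
    prod = (-35965262 + 2296270 * t_sum - 46298.67 * t_sum**2 + 658.01618 * p_spr
            + 813976.3 * t_win - 20318.27 * t_win**2 - 3549.71 * wage)

    # Net income of the average producer (pesos): gross product minus costs plus
    # subsidies, with state production assumed to be shared by 67,000 producers
    # holding 2.23 hectares each (an assumption of this sketch).
    income = prod / 67_000 * price - 2.23 * (cost - subsidy)

    print(f"production: mean = {prod.mean():,.0f} tons, CV = {prod.std() / prod.mean():.1%}")
    print(f"income: mean = {income.mean():,.1f} pesos, P(income > 0) = {(income > 0).mean():.1%}")
    print(f"VaR(99%), read as the 1st percentile of income: {np.percentile(income, 1):,.1f} pesos")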

Table 1 presents some descriptive statistics of a simulation of 10,000 realizations of coffee production in Veracruz. These statistics show that coffee production has a small variation with respect to its mean level, as indicated by its coefficient of variation of less than 12%, which implies a low risk in the production level. Under current conditions, coffee producers can expect a stable level of production, with a minimum state production greater than 160,000 tons. Figure 3 shows the histogram of the simulated coffee production under current conditions. The probability density function of coffee production shows that much of the probability mass is concentrated around its mean value, with a small standard deviation, but that the distribution is leptokurtic and asymmetric, having a long tail extending towards lower values of coffee production. As can be seen from Fig. 3, production levels of less than 300,000 tons have low probabilities of occurring.

Table 1 Descriptive statistics of simulated coffee production under current climate conditions and uncertain minimum wage
Fig. 3 Histogram of coffee production in Veracruz under current conditions (uncertainty on prices, wages, production costs and subsidies). 10,000 realizations

The simulations of the net income of the average producer (Table 2 and Fig. 4) show that, even if its mean value is small ($2,546.2 pesos or $231.50 dollars), 82.51% of the time the net income is positive, and the probability of an income greater than a threshold of $5,000 pesos ($455 dollars) is 19.58%. Under current climate conditions and this set of (arbitrary but reasonable) economic conditions, the average producer will seldom have losses, and his maximum expected loss, as measured by the VaR (value at risk) at the 99% confidence level, is $4,832.68 ($439.33 dollars). Nevertheless, it is important to notice that, given the uncertainty in prices, production costs and subsidies, the net income of the average producer is highly variable, as shown by its coefficient of variation (110%).Footnote 14 The range of the net income of the average producer amounts to $21,136.70 (1,121.52 dollars), providing another measure of its wide variability. These descriptive statistics show that, even without climate change and with a stable coffee production, coffee producers currently face a high level of risk regarding their income.

Table 2 Descriptive statistics of simulated net income of the average producer under current climate conditions and the assumptions of economic variables described in the text (figures are in pesos)
Fig. 4 Histogram of the net income of the average producer in Veracruz under current climate conditions and the assumptions of economic conditions described in the text

4.2 Dynamic simulation of state coffee production and of the average net income of the average producer under climate change conditions

Seven dynamic simulations are presented in this section, all representing different beliefs regarding future climate change. A control simulation of state coffee production and of the income of the average producer was carried out under the assumption of no climate change; that is, climate variables retain their observed distributions and moments, and uncertainty is introduced only in the economic variables. This simulation could also be interpreted as the product of the (subjective) judgment of a decision-maker/stakeholder who ignores climate change.

A second dynamic simulation of coffee production and of the income of the average producer was conducted using a single model (CGCM2) and the B2 emission scenario (Nakicenovic et al. 2000). It is important to notice that using a single emission scenario and model implies that the analyst/decision-maker has an enormous quantity of information, allowing him to dismiss all uncertainty and to assign zero probability to all other scenarios (the probability distribution is degenerate at one point). The main objective of these first two simulations is to serve as a comparison for the simulations under uncertain climate change scenarios and to provide a rough estimate of the contribution of the uncertainty in economic variables alone, and of the uncertainty in both economic and climate variables, to the time-dependent conditional probability distributions of coffee production and income.

The next five simulations were carried out using the Maximum Entropy methodology shown in Gay and Estrada (2010). They correspond to different beliefs, ranging from different degrees of “reckless”, through “neutral”, to different degrees of “cautious”. Notice that these scenarios are tailor-made for the individual decision-maker and use his personal beliefs and uncertainty aversion.

As revealed by the information index (see Table 3), the more the subjectively chosen mean change differs from the central value of the ensemble of scenarios, the more information the decision-maker is assuming to have, and the more uncertainty (as measured by the relative entropy) is dismissed. As an example, consider the decision-maker defined as “reckless 1”: here the highest levels of information and the lowest levels of uncertainty are assumed (the information index ranges from 0.46 for summer temperature to 0.66 for spring precipitation). This decision-maker is assuming to have a great amount of information regarding climate change in order to subjectively dismiss such a large part of the uncertainty; therefore, his probabilistic scenarios depend to a great extent on his subjective (arbitrary) assumptions.

Table 3 Average mean changes for summer and winter temperatures and spring precipitation

On the other hand, the choice of the average mean change of the “neutral” agent in Table 3 is the central value of the distribution of the ensemble. This mean value translates into the principle of Insufficient Reason, which consists in assigning the same probability to each of the possible outcomes; this agent has no subjective reason to do otherwise. The agent’s uncertainty aversion does not lead him to give a greater weight to any of the known possible outcomes. In this case, the information index is zero and the relative entropy reaches its maximum value (1). The information index provides a measure of how much the probability distribution depends on the subjective information provided by the agent, ranging from zero (non-informative subjective judgment) to 1, which represents a distribution degenerate at one point and therefore completely dependent on the subjective information provided by the agent.

Table 3 also shows a linear reckless/cautious (uncertainty aversion) index, which provides a measure of the level of reckless or cautious attitude that a particular average mean change represents. This is a linear function whose domain is the range of values obtained from the collection of climate change scenarios and whose range is −1 to 1. The value of −1 represents the most reckless attitude, while +1 represents the most cautious.

In the case of winter and summer temperatures, all scenarios project temperatures to increase, implying that coffee production would be negatively affected and therefore entailing higher risk for coffee producers (see Eq. 1 and Fig. 1). A reckless attitude is associated with selecting mean temperature changes smaller than the central value of the distribution of the ensemble, and a cautious attitude with choosing larger increments. In the case of spring precipitation, the scenarios show that both increments and decrements are possible. Equation 1 shows that there is a direct and linear relation between coffee production and spring precipitation; consequently, reductions in the amount of spring precipitation lead to lower production levels and therefore represent higher risks for coffee producers. In this case, a reckless attitude is associated with selecting mean values higher than the central value of the ensemble.

In all cases a neutral attitude implies a zero value of the linear reckless/cautious index.
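
The exact functional form of the index is not given here; one linear mapping consistent with the properties just described is, for the temperature variables,

$$ I\left( \Delta \right) = \frac{2\Delta - \left( \Delta_{\min} + \Delta_{\max} \right)}{\Delta_{\max} - \Delta_{\min}} $$

where Δ is the chosen average mean change and Δ min and Δ max are the smallest and largest changes in the collection of scenarios; it equals −1 at Δ min (most reckless), +1 at Δ max (most cautious) and 0 at the central value (neutral). For spring precipitation the sign is reversed, since there the least favorable outcomes are the smallest changes.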

All the dynamic simulations presented in this section share the same arbitrary (but reasonable) scenarios for the evolution of economic variables. Uncertainty in economic variables is represented by a maximum entropy distribution with no additional information (that is, a uniform distribution) with a linearly increasing support which is expressed in the simulation model as follows:

  • Minimum wage (daily) ~ U(\( 43 - \sum\limits_{{i = 1}}^t s, \,53 + \sum\limits_{{i = 1}}^t s \)); s = 10/50

  • Coffee price (ton) ~ U(\( 2,300 - \sum\limits_{{i = 1}}^t p, \,3,200 + \sum\limits_{{i = 1}}^t p \)); p = 1,000/50

  • Costs (hectare) ~ U(\( 7,900 - \sum\limits_{{i = 1}}^t c, \,8,100 + \sum\limits_{{i = 1}}^t c \)); c = 500/50

  • Subsidies (hectare) ~ U(\( 700 - \sum\limits_{{i = 1}}^t g, 750 + \sum\limits_{{i = 1}}^t g \)); g = 100/50

For each climate variable, the simulation was carried out as follows:

  1) A realization of the corresponding maximum entropy distribution was generated (∆C).Footnote 15

  2) The mean change obtained, ∆C, was divided by the number of periods to be simulated (50 years), and a linear trend was constructed using this value as its slope.

  3) The climate variable was generated by obtaining realizations from C_t ~ N(μ + T_t, σ), where T_t is the linear trend generated in step 2), and μ and σ are the current mean value and standard deviation of the observed variable. In the case of spring precipitation the normal distribution was truncated at zero to avoid the possible occurrence of negative values.

  4) For each variable these three steps were repeated 10,000 times.
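
A minimal Python sketch of these steps for a single climate variable and a single replication is shown below (the scenario bounds are placeholders, and a neutral, i.e. uniform, maximum entropy distribution is assumed for the mean change, whereas the simulations reported here use the distributions of Gay and Estrada 2010):

    import numpy as np

    rng = np.random.default_rng(7)
    n_years = 50
    mu, sigma = 24.96, 0.54        # current mean and standard deviation (baseline values above)
    delta_lo, delta_hi = 0.9, 2.4  # placeholder bounds for the 2050 mean change (deg C)

    delta_c = rng.uniform(delta_lo, delta_hi)              # step 1: realization of the mean change
    trend = delta_c / n_years * np.arange(1, n_years + 1)  # step 2: linear trend with slope delta_c / n
    path = rng.normal(mu + trend, sigma)                   # step 3: C_t ~ N(mu + T_t, sigma)
    # Steps 1)-3) are then repeated 10,000 times per variable; for spring
    # precipitation the normal distribution is truncated at zero.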

Tables 4 and 5 show, as an example, some of the risk measures that can be obtained by means of this methodology. It is important to notice that, given that this methodology provides the conditional distributions of coffee production and of the income of the average producer, a wide range of risk measures can be produced. The relevant risk measures, and their thresholds, are to be defined according to the informational needs of the agent/stakeholder in order to effectively aid the planning and decision-making processes. These can include risk measures such as the mean, variance and VaR, which are typical of “investment” analysis, or more sophisticated ones such as the median, the interquartile range and estimates of the probability of reaching a predetermined threshold value. The idea is that crossing each of these thresholds would trigger an action, also to be defined by the stakeholder; taken as a whole, these threshold-action pairs produce a time-line for strategic planning.

Table 4 Risk measures obtained for state coffee production
Table 5 Risk measures obtained for income of the average coffee producer in Veracruz

For example, assume that the relevant risk measures and thresholds for two particular decision-makers (government agencies, for example) are decreases of 10 and 30% in state production, probabilities of 10 and 50% of state production being less than 250,000 tons, standard deviations of state coffee production of 60,000 and 80,000 tons, probabilities of 70 and 60% of a positive net income for the average producer, and a value for the largest expected loss for the average producer (as measured by the VaR) of -$10,000 pesos ($909 dollars). Also assume that each of these decision-makers holds different beliefs (reckless and cautious) about the evolution of climate variables.Footnote 16 Each of these thresholds would trigger some specific action defined by the decision-maker in order to adapt. This methodology provides an estimate of the date on which each of these thresholds will be reached according to the subjective beliefs of the decision-maker and therefore permits planning which actions would have to be implemented by these dates.

This is a sequential process: it is important to bear in mind that new simulations should be carried out as new information becomes available. In the light of this new information, beliefs are to be revised and updated as a clearer picture of climate change and its impacts is obtained. The relevance of the proposed risk measures, thresholds and actions should also be regularly evaluated.

Tables 4 and 5 show that for the “cautious” decision-maker (line labeled Cautious 2) the first threshold to be reached is a standard deviation of 60,000 tons in state coffee production, in the fifth year, followed by a 70% probability of the average producer having a positive income in the sixth year and 60% in the ninth year. In the 11th year, mean state coffee production reaches the threshold of a 10% decrease and its standard deviation reaches 80,000 tons, while the Value at Risk (99%) for the average producer also reaches its pre-defined threshold of -$10,000 pesos ($909 dollars). The probability of state coffee production being smaller than 250,000 tons reaches the pre-defined thresholds of 10% and 50% in the 14th and 26th years, respectively, while the reduction of 30% in mean state coffee production occurs in the 21st year.

The uncertainty in production (as measured by the standard deviation) generates greater variability (and uncertainty) in the expected income of the average producer. Therefore, lower probabilities of positive income for the average producer follow shortly, as do increments in the value of the VaR. It is also interesting to notice that, for this type of belief, the rate of decrease in mean state coffee production accelerates quickly over time, as does the probability of coffee production falling below 250,000 tons.

For the “reckless” decision-maker (line labeled Reckless 2) the timing of occurrence of the pre-defined critical thresholds is very different. The first threshold to be reached is again a standard deviation of 60,000 tons in state coffee production, in the 10th year, followed in the 11th year by a 70% probability of a positive income for the average producer, and a 60% probability for this same risk measure by the 18th year. In year 23, the standard deviation of state coffee production reaches its second pre-defined threshold, and a year later the threshold for the VaR is crossed. These are followed by the threshold of a 10% decrease in mean state coffee production in the 25th year, a 10% probability of state production being below 250,000 tons in year 35, and a 30% reduction in mean state coffee production in year 49. The threshold of a 50% probability of state production being below 250,000 tons is never reached. Notice that for this type of belief the rate of change of the risk measures is much smaller than for the case of cautious beliefs, most of them being almost linear functions of time.

Tables 4 and 5 also show how production would develop if no climate change occurs, and therefore provide a baseline for inferring the costs of climate change for this activity. As can be seen from these tables, when no climate change is considered, the economic scenario chosen for all simulations implies a stable production, very similar to the production that can be achieved under current conditions, with a very small increase in variability in the last years of the simulation. The mean/median income of the average producer also remains fairly constant, while its variability increases almost linearly, reflecting the uncertainty in prices and production costs. This no-climate-change scenario produces a slow decrease in the probability of a positive income for the average producer, while the probability of having an income increase also rises to more than 35% (a direct result of the uncertainty in prices and costs). Under this scenario, the maximum expected loss (VaR) for the average producer never reaches -$10,000 pesos ($909 dollars). Panel h of Table 5 presents the estimates of the mean economic losses for this activity in Veracruz. The total accumulated present value of the losses up to 2050 is in the range of 3,000 to 14,000 million pesos (273 to 1,273 million dollars), depending on the beliefs assumed by the decision-maker, using an annual discount rate of 3%. These losses represent from 3 to 14 times the current annual value of the state’s coffee production.
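
With the stated 3% annual discount rate, the accumulated present value reported in panel h can be read as the usual discounted sum (with \( L_t \) the mean loss relative to the no-climate-change baseline in year t, and n the number of years up to 2050):

$$ PV = \sum\limits_{t = 1}^{n} \frac{L_t}{\left( 1.03 \right)^t} $$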

Climate change, through its direct and indirect effects, has two important types of costs: a reduction of production and of the associated income of the producer, and an increase in the financial and planning costs caused by a rise in the risk of this activity due to the large uncertainty in climate scenarios.

Lastly, Tables 4 and 5 also show a simulation (labeled CGCM2 B2) using just one model (CGCM2) and one emission scenario (B2). This simulation represents the impacts assuming probability distributions degenerate at the values obtained from this run; that is, as if there were no uncertainty in the climate change projections. Although the results of this simulation are similar, in terms of central tendency measures, to those of the simulation labeled Reckless 1, there are some fundamental differences. First of all, the CGCM2 B2 simulation discards all other possible climate change scenarios (outcomes): it assumes that all the information is available (no uncertainty, zero entropy), which corresponds to an information index value of 1, while the Reckless 1 simulation uses all possible outcomes and therefore has information index values ranging from 0.46 to 0.66. The results of the CGCM2 B2 simulation are completely dependent on the subjective information (beliefs) of the decision-maker and do not reflect the available state of knowledge regarding climate change scenarios for the region. The CGCM2 B2 simulation provides a false “confidence” in the estimates, as revealed by its low range of uncertainty as measured by all dispersion estimates, and necessarily provides a poor estimation of risk (because it is based on a single realization), entailing greater dangers for decision-making. This amounts to reducing uncertainty in an unjustifiable manner: trading uncertainty by ignorance (Schneider 2003). Furthermore, considering a single outcome (one model and one emission scenario) without the context of the full range of scenarios cannot tell the decision-maker whether the scenario he chose represents a reckless, cautious (and by how much) or neutral attitude.

5 Conclusions

One of the main conclusions of this paper is that, given that objective probabilities in climate change scenarios are not attainable, no “true” impact scenario can be constructed even in a probabilistic framework (see Gay and Estrada 2010). Uncertainty must therefore be preserved as much as possible in order not to discard in an unjustifiable manner any available knowledge (outcome) to be used for constructing impact scenarios.

Some of the drawbacks of the methods used for integrating uncertainty in impact assessment are discussed. One of the major pitfalls of the commonly used frequentist approach is that the resulting probability distributions are presented as “objective” facts, when it should be clearly stated that they are all subjective representations of beliefs (Gay and Estrada 2010). It is argued that subjective beliefs should be brought forward, be clearly stated and that probability distributions should be a meaningful expression of the decision-maker beliefs and not some impersonal, one-size fits all, statistically inadequate device.

This paper presents a new approach for generating climate change impact scenarios that integrates uncertainty and variability, as well as the agent’s beliefs or expert judgment, in order to produce tailor-made information for supporting decision-making and planning. It is also shown that time-dependent impact probability distributions (conditional on the available state of knowledge and subjective beliefs) can be constructed and that a great variety of risk measures can be obtained. These risk measures should be aimed to fulfill the information needs of the decision-maker.

The methodology is based on the Maximum Entropy Principle as presented in Gay and Estrada (2010), on time series modeling for approximating the true data generating processes, and on Monte Carlo simulation. The proposed methodology for constructing probabilistic climate change scenarios produces estimates with desirable properties: they are the least biased estimates possible given the available information; they maximize the uncertainty (entropy) subject to the partial information that is given; the maximum entropy distribution assigns a positive probability to every event that is not excluded by the given information; no quantifiable possibility is ignored; and the probabilities obtained in this manner are the best predictions possible with the state of knowledge and subjective information that is available.

The methodology is illustrated with a case study of coffee production in Veracruz (see Gay et al. 2006a). Seven simulations are presented using arbitrary, but reasonable, assumptions for the economic variables and various maximum entropy probability distributions which represent different subjective beliefs of different possible decision-makers. The time evolution of several possible risk measures and the date on which they would reach arbitrary thresholds are presented.

The costs of climate change for coffee production are estimated to have a present value in the range of approximately 3,000 to 14,000 million pesos (273 to 1,273 million dollars), depending on the beliefs assumed by the decision-maker. It is important to notice, however, that no adaptation activity is included and the number of producers is fixed (there are neither entries into nor exits from the activity).

It is also argued that climate change, through its direct and indirect effects, has two important types of costs: a reduction of production and of the associated income of the producer; and an increase in the financial and planning costs caused by a rise in the risk of this activity due to the large uncertainty in climate scenarios.