Do probabilistic expert elicitations capture scientists’ uncertainty about climate change?

  • Letter
  • Published in Climatic Change

Abstract

Expert elicitation studies have become important barometers of scientific knowledge about future climate change (Morgan and Keith, Environ Sci Technol 29(10), 1995; Reilly et al., Science 293(5529):430–433, 2001; Morgan et al., Climatic Change 75(1–2):195–214, 2006; Zickfeld et al., Climatic Change 82(3–4):235–265, 2007, Proc Natl Acad Sci 2010; Kriegler et al., Proc Natl Acad Sci 106(13):5041–5046, 2009). Elicitations incorporate experts’ understanding of known flaws in climate models, thus potentially providing a more comprehensive picture of uncertainty than model-driven methods. The goal of standard elicitation procedures is to determine experts’ subjective probabilities for the values of key climate variables. These methods assume that experts’ knowledge can be captured by subjective probabilities; however, foundational work in decision theory has demonstrated that this need not be the case when their information is ambiguous (Ellsberg, Q J Econ 75(4):643–669, 1961). We show that existing elicitation studies may qualitatively understate the extent of experts’ uncertainty about climate change. We designed a choice experiment that allows us to determine empirically whether experts’ knowledge about climate sensitivity (the equilibrium surface warming that results from a doubling of the atmospheric CO2 concentration) can be captured by subjective probabilities. Our results show that, even for this much-studied and well-understood quantity, a non-negligible proportion of climate scientists violate the choice axioms that must be satisfied for subjective probabilities to adequately describe their beliefs. Moreover, the cause of these violations is the ambiguity in their knowledge. We expect these results to hold to a greater extent for less well-understood climate variables, calling into question the reliability of previous elicitations for these quantities.
Our experimental design provides an instrument for detecting ambiguity, a valuable new source of information when linking climate science and climate policy, which can help policy makers select decision tools appropriate to our true state of knowledge.


Fig. 1
Fig. 2
Fig. 3


Notes

  1. In fact, one of Ellsberg’s choice problems (see Fig. 1 below) rules out preference representations much more general than subjective expected utility (SEU). The choices described in Fig. 1 are inconsistent with any probabilistically sophisticated preference representation (Machina and Schmeidler 1992).

  2. Kriegler et al. (2009) is an exception; however, its results are difficult to interpret, since it prompted experts for a range of probabilities (taking the existence of imprecise probabilities for granted) instead of inferring the non-existence of subjective probabilities from observed choices.

  3. The precise definition of climate sensitivity we used is quoted in the Supplementary Information, and is also available on the survey website.

  4. We computed a two-sided Wilcoxon rank-sum test at each of the three percentiles to test the hypothesis that the percentile estimates in our study and in Zickfeld et al. (2010) were drawn from the same sampling distribution. All of the P-values from the three tests exceed the Bonferroni-corrected 5 % threshold.

  5. Experts in our sample were less likely to violate SEU on the Ellsberg Problem than in most other published studies, where SEU violation rates can be up to 80 % (Slovic and Tversky 1974; Camerer and Weber 1992). This most likely reflects the scientists’ mathematical training, and suggests that the SEU violations we do observe are likely not due to unfamiliarity with the rules of probability theory.

  6. Note that we need only show that SEU is violated once in order to conclude that an expert’s knowledge cannot be described by subjective probabilities. One might argue that it is easy to observe at least one violation of SEU simply by asking experts to make bets on a large number of values of S; however, the larger the number of bets, the less power the experimental design has to detect correlations between behavior on the Ellsberg Problem and on the Climate Problem. The current design achieves a balance between detecting SEU violations and preserving sufficient statistical power to allow us to ascribe them to the presence of ambiguity.

  7. We use a one-sided test as our hypothesis is that ambiguous beliefs about climate sensitivity cause SEU violations on the Climate Problem to be more likely amongst those who violate SEU on the Ellsberg Problem than amongst those who do not. Thus our alternative hypothesis is ‘positive dependence’ between SEU violations on the Ellsberg and Climate Problems.
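The inconsistency described in Note 1 can be checked mechanically. Below is a minimal sketch, assuming the classic one-urn Ellsberg setup (30 red balls, 60 balls that are black or yellow in unknown proportion) and the modal choice pattern of betting on Red rather than Black, but on Black-or-Yellow rather than Red-or-Yellow; the variable names are illustrative, not taken from the paper:

```python
# Sketch: no single subjective probability vector (p_red, p_black, p_yellow)
# rationalizes the modal Ellsberg choice pattern under SEU.

def rationalizes_pattern(p_red, p_black, p_yellow, tol=1e-12):
    """True if this probability vector makes both modal choices SEU-consistent."""
    choice_1 = p_red > p_black + tol                        # Red over Black
    choice_2 = p_black + p_yellow > p_red + p_yellow + tol  # B-or-Y over R-or-Y
    return choice_1 and choice_2

# Brute-force search over a grid of probability vectors summing to 1.
n = 200
found = any(
    rationalizes_pattern(i / n, j / n, 1.0 - i / n - j / n)
    for i in range(n + 1)
    for j in range(n + 1 - i)
)
print(found)  # False: the pattern is inconsistent with every subjective probability
```

The search comes up empty because the two choices require both p_red > p_black and p_black > p_red, which is the contradiction at the heart of the Ellsberg paradox.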
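The comparison in Note 4 can be sketched with a normal-approximation Wilcoxon rank-sum test and a Bonferroni correction. The samples below are hypothetical placeholders (the study's actual percentile estimates are in the paper and its Supplementary Information), and the implementation assumes no ties in the combined sample:

```python
import math

def wilcoxon_rank_sum_p(x, y):
    """Two-sided rank-sum p-value via the normal approximation (no ties assumed)."""
    combined = sorted(x + y)
    ranks = {v: i + 1 for i, v in enumerate(combined)}  # 1-based ranks
    w = sum(ranks[v] for v in x)                        # rank sum of sample x
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    # p = 2 * (1 - Phi(|z|)), with Phi computed from erf
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical 50th-percentile climate sensitivity estimates (deg C).
this_study = [2.5, 3.0, 2.8, 2.9, 3.1, 3.2, 2.7]
zickfeld_2010 = [2.6, 3.3, 2.85, 3.05, 2.95, 3.15]

n_tests = 3             # one test per elicited percentile (5th, 50th, 95th)
alpha = 0.05 / n_tests  # Bonferroni-corrected threshold

p_value = wilcoxon_rank_sum_p(this_study, zickfeld_2010)
print(p_value > alpha)  # True for this toy data: 'same distribution' not rejected
```

With three tests, only p-values below 0.05/3 ≈ 0.0167 would reject; the paper reports that all three exceeded this threshold.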
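The one-sided test of positive dependence in Note 7 concerns a 2×2 table of SEU violations (Ellsberg Problem × Climate Problem); the reference list cites Barnard (1945), whose unconditional exact test suits such tables. As a simpler conditional analogue, a one-sided Fisher exact test can be sketched as follows; all counts here are hypothetical, not the paper's data:

```python
import math

def fisher_one_sided_p(a, b, c, d):
    """One-sided Fisher exact p-value for positive dependence in the 2x2 table
    [[a, b], [c, d]]: P(top-left count >= a) under the hypergeometric null."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        if row1 - k > n - col1:  # infeasible table, zero probability
            continue
        p += (math.comb(col1, k) * math.comb(n - col1, row1 - k)
              / math.comb(n, row1))
    return p

# Hypothetical counts for 29 experts:
# rows: violated SEU on the Ellsberg Problem (yes/no);
# cols: violated SEU on the Climate Problem (yes/no).
p = fisher_one_sided_p(6, 2, 4, 17)
print(p < 0.05)  # True for these toy counts: positive dependence detected
```

Barnard's test is typically more powerful for small 2×2 tables because it does not condition on both margins, which is presumably why the paper cites it.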

References

  • Allen M et al (2006) Observational constraints on climate sensitivity. In: Schellnhuber H, Cramer W, Nakicenovic N, Wigley T, Yohe G (eds) Avoiding dangerous climate change. Cambridge University Press, Cambridge, UK, p 406
  • Barnard GA (1945) A new test for 2 × 2 tables. Nature 156(3954):177
  • Binmore K (2009) Rational decisions. Princeton University Press
  • Camerer C, Weber M (1992) Recent developments in modeling preferences: uncertainty and ambiguity. J Risk Uncertain 5(4):325–370
  • De Finetti B (1937) La prévision : ses lois logiques, ses sources subjectives. Annales de l’Institut Henri Poincaré. Institut Henri Poincaré, Paris
  • Ellsberg D (1961) Risk, ambiguity, and the Savage axioms. Q J Econ 75(4):643–669
  • Ellsberg D (2001) Risk, ambiguity and decision. Routledge
  • Gilboa I (2009) Theory of decision under uncertainty, 1st edn. Cambridge University Press
  • Gilboa I et al (2009) Is it always rational to satisfy Savage’s axioms? Econ Philos 25(3):285–296
  • Heath C, Tversky A (1991) Preference and belief: ambiguity and competence in choice under uncertainty. J Risk Uncertain 4(1):5–28
  • Kahneman D et al (1982) Judgment under uncertainty: heuristics and biases. Cambridge University Press
  • Knutti R (2008) Should we believe model predictions of future climate change? Philos Trans R Soc A: Math Phys Sci 366(1885):4647–4664
  • Kriegler E et al (2009) Imprecise probability assessment of tipping points in the climate system. Proc Natl Acad Sci 106(13):5041–5046
  • Lemoine D, Traeger CP (2012) Tipping points and ambiguity in the economics of climate change. NBER working paper no 18230
  • Machina MJ, Schmeidler D (1992) A more robust definition of subjective probability. Econometrica 60(4):745–780
  • Meinshausen M et al (2009) Greenhouse-gas emission targets for limiting global warming to 2 °C. Nature 458(7242):1158–1162
  • Millner A et al (2012) Scientific ambiguity and climate policy. Environ Resour Econ (forthcoming). doi:10.1007/s10640-012-9612-0
  • Morgan MG et al (2006) Elicitation of expert judgments of aerosol forcing. Climatic Change 75(1–2):195–214
  • Morgan MG, Henrion M (1992) Uncertainty: a guide to dealing with uncertainty in quantitative risk and policy analysis. Cambridge University Press
  • Morgan MG, Keith DW (1995) Subjective judgements by climate experts. Environ Sci Technol 29(10):468A–476A
  • Nordhaus WD (2008) A question of balance. Yale University Press
  • Ramsey F (1931) Truth and probability. In: Braithwaite R (ed) The foundations of mathematics and other logical essays. Kegan Paul, Trench, Trubner & Co., London; Harcourt, Brace and Company, New York
  • Reilly J et al (2001) Uncertainty and climate change assessments. Science 293(5529):430–433
  • Resnik MD (1987) Choices: an introduction to decision theory. University of Minnesota Press
  • Roe GH, Baker MB (2007) Why is climate sensitivity so unpredictable? Science 318(5850):629–632
  • Savage LJ (1954) The foundations of statistics. Wiley
  • Slovic P, Tversky A (1974) Who accepts Savage’s axiom? Behav Sci 19(6):368–373
  • Smith LA (2002) What might we learn from climate forecasts? Proc Natl Acad Sci USA 99(Suppl 1):2487–2492
  • Smith LA (2007) Chaos: a very short introduction, vol 159. Oxford University Press, Oxford
  • Stainforth D et al (2007) Confidence, uncertainty and decision-support relevance in climate predictions. Philos Trans R Soc A: Math Phys Sci 365(1857):2145–2161
  • Stern NH (2007) The economics of climate change: the Stern review. Cambridge University Press, Cambridge
  • Walley P (1990) Statistical reasoning with imprecise probabilities. Monographs on statistics and applied probability, vol 42. Chapman & Hall
  • Woodward RT, Bishop RC (1997) How to decide when experts disagree: uncertainty-based choice rules in environmental policy. Land Econ 73(4):492–507
  • Zickfeld K et al (2007) Expert judgements on the response of the Atlantic meridional overturning circulation to climate change. Climatic Change 82(3–4):235–265
  • Zickfeld K et al (2010) Expert judgments about transient climate response to alternative future trajectories of radiative forcing. Proc Natl Acad Sci 107(28):12451–12456


Acknowledgements

AM was supported by a Ciriacy-Wantrup postdoctoral fellowship at UC Berkeley during the course of this work. RC is supported by the UK Economic and Social Research Council (ESRC) and the Jan Wallander and Tom Hedelius Foundation. DAS acknowledges the support of the LSE’s Grantham Research Institute on Climate Change and the Environment and the ESRC Centre for Climate Change Economics and Policy, funded by the Economic and Social Research Council and Munich Re. We thank Rachel Denison for advice and comments.

Author information


Corresponding author

Correspondence to Antony Millner.

Electronic Supplementary Material

Supplementary Information (PDF 155 KB)

Appendices

Appendix A: Methods summary

Participants were recruited by e-mail and word of mouth over a period of 3 months beginning in December 2010. Forty-two respondents completed the survey (available in full online at http://www.climate.websperiment.org). All respondents consented to be identified as participants, and were informed that their responses would be anonymized. Thirteen respondents were removed from the sample because they either stated that they were not familiar with the literature on climate sensitivity estimation, or were not primarily engaged in climate science research at the time of the survey.

The 29 experts in our sample were: Gab Abramowitz, James Annan, Kyle Armour, David Easterling, Seita Emori, John Fasullo, Chris Folland, Chris Forest, Piers Forster, John Harte, Gabriele Hegerl, Gregory Jones, Reto Knutti, Gerald Meehl, James Murphy, Falk Niehoerster, Geert Jan van Oldenborgh, John Reilly, Gerard Roe, Ben Sanderson, Stephen Schwartz, Carolyn Snyder, Andrei Sokolov, Claudia Tebaldi, Simon Tett, Warren Washington, Andrew Weaver, Rob Wilby, Carl Wunsch. All reported results have been anonymized.

Each expert’s hypothesized 5th, 50th, and 95th percentiles of the distribution for S were initially elicited using standard probabilistic elicitation methods. Experts then completed four sets of betting tasks: three on the Climate Problem (one at each of the elicited percentiles of S), and one on the Ellsberg Problem. Full details of these betting tasks are available in the Supplementary Information. Participants could move back and forth through the survey at any time, and had access to help boxes on each screen with reminders about quantity definitions and judgmental biases to be aware of when forming their answers. They could also change their answers at any time. We used data only from those experts who completed the survey in full. There was no time limit on the survey, and experts were informed that they should take as much time as they needed to form their best judgments. The Ellsberg Problem was presented at the very end of the survey, so as not to prime participants to think in terms of ambiguity.

Appendix B: Author contributions

AM conceived of the research. RC and AM designed the experiment. GM implemented the online survey. DAS provided guidance on the formulation of the survey questions. RC, AM, and DAS recruited participants and ran the experiment. RC analyzed the data, and AM wrote the paper.


About this article

Cite this article

Millner, A., Calel, R., Stainforth, D.A. et al. Do probabilistic expert elicitations capture scientists’ uncertainty about climate change? Climatic Change 116, 427–436 (2013). https://doi.org/10.1007/s10584-012-0620-4
