An assessment of the foundational assumptions in high-resolution climate projections: the case of UKCP09


Abstract

The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used and then urge some caution. Given the acknowledged systematic, shared errors of all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate change, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy probabilistic projections at high resolution out to the end of the century.


Notes

  1. The existence of a widespread consensus is documented in Oreskes (2007); the evidence for the warming being anthropogenic is documented in the most recent IPCC report (Stocker et al. 2013); a shorter summary is Dessler (2011, Chap. 3).

  2. Knowing even roughly what is likely to happen may be reason enough not to go there.

  3. ‘UKCP’ stands for United Kingdom Climate Projections and ‘09’ indicates that it was launched for public use in 2009. The project’s broad outline is documented in the briefing report (Jenkins et al. 2009), of which a revised version was published in 2010; the science report (Murphy et al. 2010) and two recent papers (Sexton et al. 2012; Sexton and Murphy 2012) provide a detailed exposition.

  4. A ‘projection’ is the ‘response of the climate system to emission or concentration scenarios of greenhouse gases and aerosols, or radiative forcing scenarios [...]’ (Solomon et al. 2007, p. 943). Unlike predictions or forecasts, projections ‘depend upon the emission/concentration/radiative forcing scenario used, which are based on assumptions concerning, for example, future socioeconomic and technological developments that may or may not be realised and are therefore subject to substantial uncertainty’ (ibid.).

  5. ‘IPCC’ refers to the Intergovernmental Panel on Climate Change, the international body for the assessment of climate change established by the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) in 1988. The panel’s findings are documented in its assessment reports. The 4th assessment report was published in 2007 (Solomon et al. 2007), and the 5th was released in phases between September 2013 and October 2014.

  6. Similar projects include Cal-Adapt (http://www.cal-adapt.org/precip/decadal/), Climate Wizard (http://www.climatewizard.org/), and ClimateImpactsOnline (http://www.climateimpactsonline.com/).

  7. In this paper we shall use the word ‘trustworthy’ to denote probability forecasts that one might rationally employ for decision-making purposes, using probability theory in the standard way. Such probability forecasts are expected to be robust and reliable, the kind a good Bayesian would make. There may be many justifiable and interesting scientific reasons to construct probability forecasts; our criticism in this paper concerns only their direct use in decision support (as illustrated, for instance, in the worked examples of UKCP09).

  8. Uncertainty in climate modelling has been given considerable attention by, among others, Parker (2010a, 2013), Winsberg and Biddle (2010), and Winsberg (2012). Our discussion has a different focus in that it deals specifically with local climate projection and concentrates on post-processing.

  9. We take this phrase to refer to the median of the probability distribution.

  10. The full set of UKCP09 predictions is at http://www.ukclimateprojections.defra.gov.uk/.

  11. Our account of the method is based on Murphy et al. (2010, Chap. 3) and Sexton et al. (2012).

  12. See http://www.metoffice.gov.uk/research/modelling-systems/unified-model/climate-models/hadcm3 (information retrieved on 23 March 2014). Further information about HadCM3 can be found at http://www.badc.nerc.ac.uk/view/badc.nerc.ac.uk__ATOM__dpt_1162913571289262.

  13. A ‘run’ is the calculation of the value of \(x\) at some particular future instant of time given a certain \(x_0\) and a set of specific values for \(\alpha \). It is synonymous with the term ‘simulation’. With today’s climate models a run of 100 years may take anywhere from hours to months, depending on the model’s complexity and resolution and on the computing hardware utilised. (A toy illustration of a ‘run’ in this sense is sketched after these notes.)

  14. Going from HadCM3 to HadSM3 roughly doubles the speed of the model.

  15. We note that the notion of a vector representing the world’s climate raises many serious questions. Which variables ought to be included? At what time and length scales should its components be defined? And, more fundamentally, how is climate (as opposed to weather) to be defined in the first place? The documentation of UKCP09 provides little information about how these issues have been resolved. Since nothing in our discussion depends on the definition of \(y\), we do not pursue this issue further.

  16. Given our provisos about the definition of \(y\), selecting observations as indicators of the climate vector is an equally difficult task. UKCP09 uses so-called ‘pseudo-observations’: ‘We obtain these by using two or three alternative data sets for each observed quantity, from which we generate 100 pseudo-observations made by adding random linear combinations (where coefficients sum to one) of the different data sources [...] regridded onto the HadSM3 grid’ (Sexton et al. 2012, p. 2517). Again, nothing in the discussion to follow depends on how exactly observations are treated, and so we set this issue aside. (A toy sketch of this construction is given after these notes.)

  17. Of course these distributional assumptions are often questionable; for example they cannot hold for precipitation which is positive definite.

  18. Arguably parameters of an imperfect model are not uncertain but rather indeterminate, as there is no ideal set of parameter values which will make the model adequate for all predictive purposes, as would be the case if the model structure was perfect and the values of the parameters were well defined but simply unknown (Du and Smith 2012).

  19. This assumption is controversial. Smith (2006) argues that for imperfect models appropriate values (leading to trustworthy forecasts) may not exist. For want of space we set these worries aside and proceed as if the question were one of uncertainty rather than indeterminacy; for more on this point see Smith and Stern (2011).

  20. We note in passing the lack of unanimity on whether the second ‘P’ of PPE stands for ‘parameter’, ‘parameterization’, or ‘physics’.

  21. For a discussion of what ‘best’ might mean see also Parker (2010b).

  22. Note the difference between the range of reasonable model-parameter values within the model and the uncertainty in the value of the corresponding physical parameter, when such a thing exists.

  23. There are a host of challenges here, as climate is a distribution, the model climate and the real climate are not in the same state space, whatever notion of ‘best’ is taken the simulation model will not be ‘best’ for all target variants, and so on. These points raise important questions about how the distance is measured and what the discrepancy is intended to represent in practice in the climate case.

  24. UKCP09 expresses this by writing (in our notation) \(y=\varphi (a^{*}) + \varepsilon \); see Eq. (1) in Sexton et al. (2012, p. 2521). However, this is not a correct formal expression of the concept of a discrepancy term because, as noted above, \(y\) and \(x\) are not members of the same vector space and hence cannot be added. For a discussion of ‘subtractability’, see Smith (2006) and references therein. (A formulation that makes the mapping between the two spaces explicit is sketched after these notes.)

  25. See also Murphy et al. (2010, pp. 63–64).

  26. For details see Sexton et al. (2012, pp. 2525–2527).

  27. This is Eq. (5) in Sexton et al. (2012, p. 2523); for a discussion of the derivation see Rougier (2007).

  28. Note that emulators may elsewhere be called ‘surrogate models’, ‘meta-models’, or ‘models of models’. A general introduction to emulation (unrelated to UKCP09) can be found at http://www.mucm.aston.ac.uk/MUCM/MUCMToolkit/index.php?page=MetaFirstExample.html; a further toy sketch is given after these notes.

  29. UKCP09 does so indirectly in the sense that it emulates the coefficients of a set of basis vectors for the output space in question. Nothing in the discussion that follows depends on this.

  30. Furthermore, there are limitations to our scientific understanding of the climate system, and there may be relevant factors and processes that we are simply unaware of; there may be unknown unknowns which would lead us to alter the equations of the model even under our current computational constraints.

  31. Even today’s best climate models do not simulate blocking realistically. For a discussion of this point see Smith and Stern (2011) and Hoskins’ review of UKCP09 (available at http://www.ukclimateprojections.metoffice.gov.uk/23173).

  32. What good science should report for lead times beyond those on which quantitative guidance is informative is a separate question.

  33. An example is Murphy et al. (2010, pp. 63–69).

  34. The challenges to smoothness posed by computations on a digital computer are ignored below. We note in passing that such challenges exist when, for example, a change in the least significant bit of \(x_0\) yields significant changes in the target variable (Lorenz 1968). (A toy illustration of such sensitivity is sketched after these notes.)

  35. The principle and its problems are well documented in the philosophical literature on probability; see, for instance, Salmon et al. (1992, pp. 74–77).

  36. See Stainforth et al. (2005, 2007) for a discussion of the ice fall rate.

  37. Initial conditions are mentioned again on p. 129, but no information beyond what has been said on pp. 26–27 is provided.

  38. Initial conditions are briefly mentioned but not discussed in Sexton et al. (2012) and in the Briefing Report (Jenkins et al. 2009). The Science Report (Murphy et al. 2010), a document of over 190 pages, dedicates three pages in the introductory part to the problem of initial condition uncertainty.

  39. The multi-year averages in Stainforth et al. (2007) were 8 years long, compared to 30 years in UKCP09. However, the problem pinpointed by Stainforth et al. is unlikely to disappear by moving from 8-year averages to 30-year averages.

  40. A summary of the review is available at http://www.ukclimateprojections.metoffice.gov.uk/23173. The above quotation was retrieved on 7 March 2014.

  41. We thank an anonymous referee for raising this concern.

  42. Sometimes this observation comes in the guise of there being a cascade of uncertainty, with moderate confidence at the continental scale and less confidence at the local scale; see for instance (Jenkins et al. 2009, pp. 6 and 22). How exactly the point that local projections are uncertain is expressed is immaterial to the dialectic in this section.

  43. These alternative readings are also discussed in Parker (2014).

  44. See http://www.ukclimateprojections.metoffice.gov.uk/23102 for details.

  45. Discussion of the translation between model variables and real world variables with similar names can be found in Smith (2000) and Smith (2002).

  46. That is, expected to change even without a deeper scientific understanding of the phenomena, or new observations. Scientific projects are always subject to change when our basic understanding of science changes; the question is whether they are mature conditioned on everything we know today.
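
A minimal sketch, in Python, of the notion of a ‘run’ discussed in note 13: a made-up toy model (not HadCM3), stepped forward from an initial condition \(x_0\) under one fixed setting of the parameters \(\alpha\). The model equations and parameter names here are ours, chosen purely for illustration.

```python
import numpy as np

def toy_model_step(x, alpha, dt=0.01):
    """One time step of a purely illustrative (made-up) model:
    a damped, forced update of the state vector x with parameters alpha."""
    forcing, damping = alpha
    return x + dt * (forcing - damping * x)

def run(x0, alpha, n_steps):
    """A 'run' (simulation): iterate the model forward from the initial
    condition x0 under one fixed setting of the parameters alpha."""
    x = np.asarray(x0, dtype=float)
    trajectory = [x.copy()]
    for _ in range(n_steps):
        x = toy_model_step(x, alpha)
        trajectory.append(x.copy())
    return np.array(trajectory)

# Same x0 and alpha always yield the same trajectory; a perturbed-parameter
# ensemble would repeat this with different settings of alpha.
trajectory = run(x0=[15.0, 10.0], alpha=(20.0, 0.5), n_steps=1000)
print(trajectory[-1])
```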
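
A toy sketch of the pseudo-observation construction quoted in note 16: random linear combinations, with coefficients summing to one, of a few alternative data sets for the same observed quantity. The data sets below are synthetic stand-ins, and we draw the coefficients from a Dirichlet distribution (which makes them non-negative as well as summing to one); the UKCP09 papers do not specify the distribution used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three synthetic stand-ins for alternative gridded data sets of the same
# observed quantity, already regridded onto a common grid (a flat array here).
datasets = np.stack([
    rng.normal(loc=14.0, scale=0.3, size=50),
    rng.normal(loc=14.1, scale=0.3, size=50),
    rng.normal(loc=13.9, scale=0.3, size=50),
])

def pseudo_observations(datasets, n=100, rng=rng):
    """Random linear combinations of the data sets with coefficients that
    sum to one (here Dirichlet-distributed, i.e. also non-negative)."""
    weights = rng.dirichlet(np.ones(len(datasets)), size=n)  # rows sum to 1
    return weights @ datasets  # shape (n, number of grid boxes)

pseudo_obs = pseudo_observations(datasets)
print(pseudo_obs.shape)  # (100, 50)
```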
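
One standard way of making the point in note 24 explicit, in the spirit of Kennedy and O’Hagan (2001), is to introduce an observation operator that maps model output into the space of the observables; the operator \(H\) below is our notation, not UKCP09’s.

```latex
% Model output \varphi(a^{*}) lives in the model's state space; H maps it
% into the space in which the observations y live; \delta is the model
% discrepancy and \varepsilon the observational error.
\[
  y \;=\; H\bigl(\varphi(a^{*})\bigr) \;+\; \delta \;+\; \varepsilon .
\]
```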
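
A minimal illustration of the emulation idea mentioned in note 28, unrelated to the actual UKCP09 emulator (which is Gaussian-process based but emulates coefficients of basis vectors; see note 29): fit a cheap statistical surrogate to a handful of runs of an expensive simulator, then predict, with an uncertainty estimate, at parameter settings that were never run. The toy simulator, kernel choice, and parameter values are ours.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def toy_simulator(alpha):
    """Stand-in for an expensive model run at parameter setting alpha."""
    return np.sin(3.0 * alpha) + 0.5 * alpha

# A small 'perturbed parameter ensemble': a few expensive runs.
alphas_run = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
outputs_run = toy_simulator(alphas_run).ravel()

# The emulator: a cheap statistical surrogate fitted to those runs.
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
emulator.fit(alphas_run, outputs_run)

# Predict (with an uncertainty estimate) at settings never actually run.
alphas_new = np.array([[0.37], [1.23], [1.91]])
mean, std = emulator.predict(alphas_new, return_std=True)
print(mean, std)
```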
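
A toy illustration of the sensitivity mentioned in note 34, using the Lorenz 1963 equations (a low-dimensional chaotic system, not a climate model and not the system discussed in Lorenz 1968): a perturbation of the initial condition comparable in size to the least significant bits of a double-precision number leads, after a while, to a markedly different state.

```python
import numpy as np

def lorenz63_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz 1963 system (toy example only)."""
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

def run(x0, n_steps=10000):
    """Iterate the toy system forward from the initial condition x0."""
    state = np.array(x0, dtype=float)
    for _ in range(n_steps):
        state = lorenz63_step(state)
    return state

reference = run([1.0, 1.0, 1.0])
# Perturb the initial condition by one part in 10^15, roughly the scale of
# the least significant bits of a double near 1.0.
perturbed = run([1.0 + 1e-15, 1.0, 1.0])
print(reference)
print(perturbed)  # ends up far from the reference end state
```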

References

  • Allen, M. R., & Stainforth, D. A. (2002). Towards objective probabilistic climate forecasting. Nature, 419(6903), 228.

  • Beven, K. (2012). Causal models as multiple working hypotheses about environmental processes. Comptes Rendus Geoscience, 344, 77–88.

  • Bishop, C. H., & Abramowitz, G. (2013). Climate model dependence and the replicate earth paradigm. Climate Dynamics, 41, 885–900.

  • Daron, J. D., & Stainforth, D. A. (2013). On predicting climate under climate change. Environmental Research Letters, 8, 1–8.

  • Deser, C., Knutti, R., Solomon, S., & Phillips, A. S. (2012). Communication of the role of natural variability in future North American climate. Nature Climate Change, 2(November), 775–779.

  • Dessler, A. (2011). Introduction to modern climate change. Cambridge: Cambridge University Press.

  • Du, H., & Smith, L. A. (2012). Parameter estimation through ignorance. Physical Review E, 86(1), 016213.

  • Frigg, R., Bradley, S., Du, H., & Smith, L. A. (2014). The adventures of Laplace’s demon and his apprentices. Philosophy of Science, 81(1), 31–59.

  • Jenkins, G., Murphy, J., Sexton, D., Lowe, J., & Jones, P. (2009). UK climate projections: briefing report, DEFRA. Met Office Hadley Centre, Exeter.

  • Judd, K., & Smith, L. A. (2004). Indistinguishable states II: The imperfect model scenario. Physica D, 196, 224–242.

  • Jun, M. Y., Knutti, R., & Nychka, D. W. (2008a). Spatial analysis to quantify numerical model bias and dependence: How many climate models are there? Journal of the American Statistical Association, 103, 934–947.

  • Jun, M. Y., Knutti, R., & Nychka, D. W. (2008b). Local eigenvalue analysis of CMIP3 climate model errors. Tellus A: Dynamic Meteorology and Oceanography, 60, 992–1000.

  • Kennedy, M. C., & O’Hagan, A. (2001). Bayesian calibration of computer models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 63(3), 425–464.

  • Knutti, R., Furrer, R., Tebaldi, C., Cermak, J., & Meehl, G. A. (2010). Challenges in combining projections from multiple climate models. Journal of Climate, 23, 2739–2758.

  • Lorenz, E. (1968). Climate determinism. Meteorological Monographs, 8(30), 1–3.

  • McWilliams, J. C. (2007). Irreducible imprecision in atmospheric and oceanic simulations. Proceedings of the National Academy of Sciences, 104(21), 8709–8713.

  • Meehl, G. A., Goddard, L., Murphy, J., Stouffer, R. J., Boer, G., Danabasoglu, G., et al. (2009). Decadal prediction: Can it be skillful? Bulletin of the American Meteorological Society, 90, 1467–1485.

  • Murphy, J., Sexton, D., Jenkins, G., Boorman, P., Booth, B., Brown, K., et al. (2010). UK climate projections science report: Climate change projections. Version 3, updated December 2010. http://www.ukclimateprojections.defra.gov.uk/22544. Met Office Hadley Centre, Exeter.

  • Murphy, J. M., Booth, B. B. B., Collins, M., Harris, G. R., Sexton, D. M. H., & Webb, M. J. (2007). A methodology for probabilistic predictions of regional climate change for perturbed physics ensembles. Philosophical Transactions of the Royal Society A, 365, 1993–2028.

  • Murphy, J. M., Sexton, D. M. H., Barnett, D. N., Jones, G. S., Webb, M. J., Collins, M., et al. (2004). Quantification of modelling uncertainties in a large ensemble of climate change simulations. Nature, 430(12 August), 768–772.

  • Oreskes, N. (2007). The scientific consensus on climate change: How do we know we’re not wrong? In J. F. C. DiMento & P. Doughman (Eds.), Climate change: What it means for us, our children, and our grandchildren (pp. 65–99). Boston: MIT Press.

  • Oreskes, N., Stainforth, D. A., & Smith, L. A. (2010). Adaptation to global warming: Do climate models tell us what we need to know? Philosophy of Science, 77(5), 1012–1028.

  • Parker, W. (2010a). Predicting weather and climate: Uncertainty, ensembles and probability. Studies in History and Philosophy of Modern Physics, 41(3), 263–272.

  • Parker, W. (2010b). Whose probabilities? Predicting climate change with ensembles of models. Philosophy of Science, 77(5), 985–997.

  • Parker, W. (2013). Ensemble modeling, uncertainty and robust predictions. Wiley Interdisciplinary Reviews: Climate Change, 4(3), 213–223.

  • Parker, W. S. (2014). Values and uncertainties in climate prediction, revisited. Studies in History and Philosophy of Science, 46, 24–30.

  • Reichler, T., & Kim, J. (2008). How well do coupled models simulate today’s climate? Bulletin of the American Meteorological Society, 89(3), 303–311.

  • Rougier, J. (2007). Probabilistic inference for future climate using an ensemble of climate model evaluations. Climatic Change, 81, 247–264.

  • Rougier, J. (2008). Efficient emulators for multivariate deterministic functions. Journal of Computational and Graphical Statistics, 17(4), 827–843.

  • Salmon, M., Earman, J., Glymour, C., Lennox, J. G., Machamer, P., McGuire, J. E., et al. (1992). Introduction to the philosophy of science. Indianapolis and Cambridge: Hackett.

  • Seager, R., Kushnir, Y., Ting, M. F., Cane, M., Naik, N., & Miller, J. (2008). Would advance knowledge of 1930s SSTs have allowed prediction of the Dust Bowl drought? Journal of Climate, 21, 3261–3281.

  • Sexton, D. M. H., & Murphy, J. M. (2012). Multivariate probabilistic projections using imperfect climate models part II: Robustness of methodological choices and consequences for climate sensitivity. Climate Dynamics, 38, 2543–2558.

  • Sexton, D. M. H., Murphy, J. M., Collins, M., & Webb, M. J. (2012). Multivariate probabilistic projections using imperfect climate models part I: Outline of methodology. Climate Dynamics, 38, 2513–2542.

  • Smith, L. A. (2000). Disentangling uncertainty and error: on the predictability of nonlinear systems. In A. I. Mees (Ed.), Nonlinear Dynamics and Statistics (pp. 31–64). Boston: Birkhauser.

  • Smith, L. A. (2002). What might we learn from climate forecasts? Proceedings of the National Academy of Sciences, USA, 99, 2487–2492.

  • Smith, L. A. (2006). Predictability past predictability present. In T. Palmer & R. Hagedorn (Eds.), Predictability of weather and climate (pp. 217–250). Cambridge: Cambridge University Press.

  • Smith, L. A., Du, H., Suckling, E. B., & Niehörster, F. (2014). Probabilistic skill in ensemble seasonal forecasts. Quarterly Journal of the Royal Meteorological Society. doi:10.1002/qj.2403.

  • Smith, L. A., & Stern, N. (2011). Uncertainty in science and its role in climate policy. Philosophical Transactions of the Royal Society A, 369, 1–24.

  • Solomon, S., Qin, D., & Manning, M. (Eds.). (2007). Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge: Cambridge University Press.

  • Stainforth, D. A., Aina, T., Christensen, C., Collins, M., Faull, N., Frame, D. J., et al. (2005). Uncertainty in predictions of the climate response to rising levels of greenhouse gases. Nature, 433(7024), 403–406.

  • Stainforth, D. A., Allen, M. R., Tredger, E. R., & Smith, L. A. (2007). Confidence, uncertainty and decision-support relevance in climate predictions. Philosophical Transactions of the Royal Society A, 365(1857), 2145–2161.

  • Stocker, T. F., Qin, D., Plattner, G.-K., Tignor, M. M. B., Allen, S. K., Boschung, J., et al. (Eds.). (2013). Climate change 2013: The physical science basis. Working Group I contribution to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge: Cambridge University Press.

  • Tang, S., & Dessai, S. (2012). Usable science? The UK climate projections 2009 and decision support for adaptation planning. Forthcoming in Weather, Climate, and Society.

  • Thompson, E. L. (2013). Modelling North Atlantic storms in a changing climate. Ph.D. Thesis. Imperial College, London.

  • Winsberg, E. (2012). Values and uncertainties in the predictions of global climate models. Kennedy Institute of Ethics Journal, 22(2), 111–137.

  • Winsberg, E., & Biddle, J. (2010). Value judgements and the estimation of uncertainty in climate modeling. In P. D. Magnus & J. B. Busch (Eds.), New waves in philosophy of science (pp. 172–197). London: Palgrave Macmillan.

Acknowledgments

Work for this paper has been supported by the LSE’s Grantham Research Institute on Climate Change and the Environment and the Centre for Climate Change Economics and Policy, funded by the Economic and Social Research Council and Munich Re. Frigg further acknowledges financial support from the AHRC-funded ‘Managing Severe Uncertainty’ Project and Grant FFI2012-37354 of the Spanish Ministry of Science and Innovation (MICINN). Smith would like to acknowledge continuing support from Pembroke College, Oxford, from the Blue-Green Cities Research Consortium funded by EPSRC (Grant EP/K013661/1), and from RDCEP via NSF grant No. 0951576. We would like to thank Wendy Parker, Erica Thompson, and Charlotte Werndl for comments on earlier drafts and/or helpful discussions.

Author information

Correspondence to Roman Frigg.

Cite this article

Frigg, R., Smith, L.A. & Stainforth, D.A. An assessment of the foundational assumptions in high-resolution climate projections: the case of UKCP09. Synthese 192, 3979–4008 (2015). https://doi.org/10.1007/s11229-015-0739-8
