Advances in Atmospheric Sciences, Volume 35, Issue 9, pp 1101–1113

Characterizing the Relative Importance Assigned to Physical Variables by Climate Scientists when Assessing Atmospheric Climate Model Fidelity

  • Susannah M. Burrows
  • Aritra Dasgupta
  • Sarah Reehl
  • Lisa Bramer
  • Po-Lun Ma
  • Philip J. Rasch
  • Yun Qian
Open Access
Original Paper

Abstract

Evaluating a climate model’s fidelity (its ability to simulate the observed climate) is a critical step in establishing confidence in the model’s suitability for future climate projections, and in tuning climate model parameters. Model developers use their judgement to determine which trade-offs between different aspects of model fidelity are acceptable. However, little is known about the degree of consensus in these evaluations, or about whether experts apply the same criteria when different scientific objectives are defined. Here, we report results from a broad community survey of expert assessments of the relative importance of different output variables when evaluating a global atmospheric model’s mean climate. We find that experts adjust their ratings of variable importance in response to the scientific objective; for instance, scientists rate surface wind stress as significantly more important for Southern Ocean climate than for the water cycle in the Asian watershed. There is greater consensus on the importance of certain variables (e.g., shortwave cloud forcing) than of others (e.g., aerosol optical depth). We find few differences in expert consensus between respondents with more or less climate modeling experience, and no statistically significant differences between the responses of climate model developers and users. The concise variable lists and community ratings reported here provide baseline descriptive data on current expert understanding of certain aspects of model evaluation, and can serve as a starting point for further investigation, as well as for developing more sophisticated evaluation and scoring criteria with respect to specific scientific objectives.
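As a rough illustration only (not the authors' analysis code), the sketch below shows one way survey responses of this kind could be examined in Python: a Mann-Whitney U test for whether ratings of a single variable differ between two scientific objectives, and Kendall's coefficient of concordance W as a simple measure of consensus across respondents. All ratings, sample sizes, and variable names here are hypothetical.

    # Illustrative sketch with hypothetical survey data; the paper's actual
    # statistical methodology is described in the full text and supplement.
    import numpy as np
    from scipy.stats import mannwhitneyu, rankdata

    # Hypothetical importance ratings (1 = unimportant ... 5 = essential) given by
    # six respondents to "surface wind stress" under two scientific objectives.
    wind_stress_southern_ocean = np.array([5, 4, 5, 5, 4, 5])
    wind_stress_asian_watershed = np.array([2, 3, 2, 1, 3, 2])

    # Two-sided Mann-Whitney U test: do the rating distributions differ by objective?
    u_stat, p_value = mannwhitneyu(wind_stress_southern_ocean,
                                   wind_stress_asian_watershed,
                                   alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")

    def kendalls_w(ratings):
        """Kendall's coefficient of concordance W for an (n_raters, n_items) array.

        W ranges from 0 (no agreement among raters) to 1 (complete agreement);
        the simple form below omits the correction for tied ranks.
        """
        n_raters, n_items = ratings.shape
        # Rank the items within each rater (average ranks for ties).
        ranks = np.apply_along_axis(rankdata, 1, ratings)
        rank_sums = ranks.sum(axis=0)
        s = ((rank_sums - rank_sums.mean()) ** 2).sum()
        return 12.0 * s / (n_raters ** 2 * (n_items ** 3 - n_items))

    # Hypothetical ratings of four output variables (columns) by six respondents (rows).
    ratings = np.array([[5, 4, 2, 3],
                        [5, 5, 3, 2],
                        [4, 5, 2, 3],
                        [5, 4, 1, 2],
                        [5, 5, 2, 4],
                        [4, 4, 3, 2]])
    print(f"Kendall's W (consensus across raters) = {kendalls_w(ratings):.2f}")

The printed values depend entirely on the hypothetical inputs; the sketch is meant only to make concrete what "consensus" and "objective-dependent importance" could look like numerically.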

Key words

climate, climate model, model evaluation, numerical model skill, expert elicitation


Acknowledgements

The authors would like to express their sincere gratitude to everyone who participated in the survey described in this paper. While privacy restrictions prevent us from publishing their identities, we greatly appreciate the time that many busy individuals have taken, voluntarily, to contribute to this research. We would like to thank Hui WAN, Ben KRAVITZ, Hansi SINGH, and Benjamin WAGMAN for helpful comments and discussions that helped to inform this work. This research was conducted under the Laboratory Directed Research and Development Program at PNNL, a multi-program national laboratory operated by Battelle for the U.S. Department of Energy under Contract DE-AC05-76RL01830.

Supplementary material

376_2018_7300_MOESM1_ESM.pdf (227 kb)
Electronic Supplementary Material to: Characterizing the Relative Importance Assigned to Physical Variables by Climate Scientists when Assessing Atmospheric Climate Model Fidelity


Copyright information

© The Authors 2018

Open Access. This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

Authors and Affiliations

  • Susannah M. Burrows¹
  • Aritra Dasgupta¹
  • Sarah Reehl¹
  • Lisa Bramer¹
  • Po-Lun Ma¹
  • Philip J. Rasch¹
  • Yun Qian¹

  1. Pacific Northwest National Laboratory, Richland, USA