Validation of Climate Models: An Essential Practice

  • Richard B. Rood
Part of the Simulation Foundations, Methods and Applications book series (SFMA)


This chapter describes a structure for climate model verification and validation. The construction of models from components and subcomponents is discussed, and that construction is related to verification and validation. It is argued that, in addition to quantitative measures of mean, bias, and variability, physical consistency must be informed by correlative behavior that follows from underlying physical theory. The more qualitative attributes of validation are also discussed. Consideration of these issues leads to the need for deliberative, expert evaluation as part of the validation process. The narrative maintains that a written validation plan is needed, one that describes the validation criteria and metrics and establishes the protocols for the essential deliberations. The validation plan also sets the foundations for independence, transparency, and objectivity. These values support both scientific methodology and integrity in the public forum.
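The quantitative measures named above (mean, bias, variability, and correlative behavior) can be illustrated with a minimal sketch. This is not the chapter's methodology, only a generic example of the kinds of statistics involved; the array names and synthetic values are invented for illustration.

```python
import numpy as np

# Synthetic "observations" and a "model" with a deliberate warm bias,
# standing in for any paired model/observation time series.
rng = np.random.default_rng(0)
obs = rng.normal(loc=288.0, scale=1.5, size=120)     # e.g. monthly means (K)
model = obs + 0.4 + rng.normal(scale=0.5, size=120)  # offset plus noise

bias = model.mean() - obs.mean()             # systematic offset from obs
variability_ratio = model.std() / obs.std()  # spread relative to obs
correlation = np.corrcoef(model, obs)[0, 1]  # correlative behavior

print(f"bias = {bias:+.2f} K")
print(f"variability ratio = {variability_ratio:.2f}")
print(f"correlation = {correlation:.2f}")
```

As the chapter argues, such numbers are necessary but not sufficient: a high correlation alone does not establish that the model reproduces the observed behavior for physically consistent reasons.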


Climate modeling · Verification · Validation · Science · Society · Quantitative · Qualitative · Community



I thank the editors, Claus Beisbart and Nicole J. Saam, for the opportunity to contribute this chapter and for their efforts in putting together this volume. I thank Cecelia DeLuca for reading an early version of the manuscript, for many discussions on modeling infrastructure and on verification and validation, and for insights into modeling culture.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Climate and Space Sciences and Engineering, University of Michigan, Ann Arbor, USA
