Climate Dynamics

Volume 44, Issue 1–2, pp 95–114

An evaluation of the CMIP3 and CMIP5 simulations in their skill of simulating the spatial structure of SST variability

  • Gang Wang
  • Dietmar Dommenget
  • Claudia Frauen


Abstract

The natural sea surface temperature (SST) variability in the global oceans is evaluated in simulations of the Coupled Model Intercomparison Project Phase 3 (CMIP3) and CMIP5 models. In this evaluation, we examine how well the spatial structure of the SST variability in the simulations matches that of the observations, on the basis of their leading empirical orthogonal function (EOF) modes. We focus on high-pass filtered monthly mean time scales and on longer, 5-year running mean time scales. We compare the models and observations against simple null hypotheses, such as isotropic diffusion (spatially red noise) or a slab ocean model, to illustrate the models' skill in simulating realistic patterns of variability. Some models show good skill in simulating the observed spatial structure of SST variability in the tropical domains, and less so in the extra-tropical domains. However, most models show substantial deviations from the observations and from each other in most domains, particularly in the North Atlantic and Southern Ocean on the longer (5-year running mean) time scale. In many cases the simple spatial red noise null hypothesis is closer to the observed structure than most models, even though the observed SST variability itself deviates significantly from this null hypothesis. The CMIP models tend to largely overestimate the effective number of spatial degrees of freedom, simulating too strongly localized patterns of SST variability at the wrong locations and with structures that differ from the observed ones. However, the CMIP5 ensemble shows some improvement over the CMIP3 ensemble, mostly in the tropical domains. Further, the spatial structure of the SST modes of the CMIP3 and CMIP5 super ensemble is more realistic than that of any single model if the relative explained variances of these modes are scaled by the observed eigenvalues.
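The two diagnostics at the heart of the abstract can be sketched in a few lines: the EOF eigenvalue spectrum of a (time × space) anomaly field, and the effective number of spatial degrees of freedom of Bretherton et al. (1999), N_eff = (Σλ)² / Σλ². The code below is a toy illustration on synthetic data (all field sizes and the diffusion strength are hypothetical, not taken from the paper); it shows that spatially smoothing a white-noise field, mimicking the isotropic-diffusion red-noise null hypothesis, concentrates variance in large-scale modes and lowers N_eff.

```python
import numpy as np

def n_eff(x):
    """Effective number of spatial degrees of freedom of a (time x space)
    anomaly field (Bretherton et al. 1999): the squared sum of the EOF
    eigenvalues divided by the sum of the squared eigenvalues."""
    x = x - x.mean(axis=0)                           # anomalies about the time mean
    eig = np.linalg.svd(x, compute_uv=False) ** 2 / (x.shape[0] - 1)
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
white = rng.standard_normal((600, 64))               # hypothetical uncorrelated field

# Spatially red noise: smooth the same noise on a periodic domain via FFT,
# mimicking isotropic diffusion; neighbouring points become correlated.
k = np.fft.fftfreq(64) * 64
kernel = np.exp(-(k / 8.0) ** 2)                     # hypothetical diffusion strength
red = np.fft.ifft(np.fft.fft(white, axis=1) * kernel, axis=1).real

print(n_eff(white))   # close to the number of grid points
print(n_eff(red))     # much smaller: variance sits in a few large-scale modes
```

In this picture, the paper's finding that the CMIP models "overestimate the effective number of spatial degrees of freedom" means their eigenvalue spectra are flatter (variance spread over more, smaller-scale modes) than observed.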


Keywords: CMIP · Climate variability · Model evaluation · Eigenvalue projection



Acknowledgements

We would like to thank Tobias Bayr, Johanna Baehr, Katja Lorbacher and Timofej Woyzichowzki for fruitful discussions and comments. The comments of two anonymous referees have helped to improve the presentation of this study substantially. This study was supported by the ARC Centre of Excellence for Climate System Science (CE110001028) and the Deutsche Forschungsgemeinschaft (DO1038/5-1). The slab ocean model simulations were computed on the National Computational Infrastructure in Canberra.

Supplementary material

382_2014_2154_MOESM1_ESM.pdf (1.1 MB)
First three leading EOF patterns, as in Fig. 10, but of high-pass SSTA in the Tropical Indian Ocean and Pacific for a selection of model simulations and observations. In addition to the observations and the CMIP3 and CMIP5 super models, the models with the largest (GISS-AOM) and smallest (ECHAM5/MPI-OM) RMSE_EOF values are shown.
382_2014_2154_MOESM2_ESM.pdf (750 KB)
First three leading EOF patterns, as in Fig. 10, but of high-pass SSTA in the North Atlantic for a selection of model simulations and observations. In addition to the observations and the CMIP3 and CMIP5 super models, the models with the largest (CSIRO-Mk3.0) and smallest (CGCM3.1 (T47)) RMSE_EOF values are shown.
382_2014_2154_MOESM3_ESM.pdf (490 KB)
First three leading EOF patterns, as in Fig. 10, but of high-pass SSTA in the Tropical Atlantic for a selection of model simulations and observations. In addition to the observations and the CMIP3 and CMIP5 super models, the models with the largest (INMCM4) and smallest (CCSM4) RMSE_EOF values are shown.
382_2014_2154_MOESM4_ESM.pdf (999 KB)
First three leading EOF patterns, as in Fig. 10, but of high-pass SSTA in the Southern Ocean for a selection of model simulations and observations. In addition to the observations and the CMIP3 and CMIP5 super models, the models with the largest (INMCM4) and smallest (CCSM4) RMSE_EOF values are shown.
382_2014_2154_MOESM5_ESM.pdf (92 KB)
N_spatial values for all models as shown in Fig. 4.
382_2014_2154_MOESM6_ESM.pdf (82 KB)
RMSE_EOF values relative to the observations as shown in Fig. 9 for all models.
382_2014_2154_MOESM7_ESM.pdf (83 KB)
RMSE_EOF values of the pairwise model comparison as shown in Fig. 11.
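The RMSE_EOF values in these supplements follow the EOF-based comparison of Bayr and Dommenget (2014). As a rough, simplified stand-in (not the exact metric of that paper), one can compare the leading EOF patterns of two datasets by a pattern root-mean-square error, after fixing the arbitrary sign of each mode; everything below, including the field sizes and number of modes, is a hypothetical illustration.

```python
import numpy as np

def leading_eofs(x, n_modes=3):
    """Return the first n_modes spatial EOF patterns of a (time x space)
    anomaly field, scaled by the square root of their eigenvalues so that
    the pattern amplitude reflects the variance each mode explains."""
    x = x - x.mean(axis=0)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    scale = s[:n_modes] / np.sqrt(x.shape[0] - 1)
    return vt[:n_modes] * scale[:, None]

def pattern_rmse(eofs_a, eofs_b):
    """RMSE between EOF patterns matched by rank; the sign of each mode in
    the second set is flipped if needed, since EOF signs are arbitrary."""
    total = 0.0
    for a, b in zip(eofs_a, eofs_b):
        if np.dot(a, b) < 0:
            b = -b
        total += np.mean((a - b) ** 2)
    return np.sqrt(total / len(eofs_a))

rng = np.random.default_rng(2)
obs = rng.standard_normal((200, 40))                  # hypothetical "observed" anomalies
model = obs + 0.5 * rng.standard_normal((200, 40))    # hypothetical "model" anomalies
print(pattern_rmse(leading_eofs(obs), leading_eofs(model)))
```

A dataset compared against itself yields zero by construction, and larger values indicate increasingly dissimilar leading patterns of variability.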



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. School of Mathematical Sciences, Monash University, Clayton, Australia
