Quantifying the agreement between observed and simulated extratropical modes of interannual variability

Abstract

Using historical simulations from phase 5 of the Coupled Model Intercomparison Project (CMIP5) and multiple observationally based datasets, we employ skill metrics to analyze the fidelity of the simulated Northern Annular Mode, the North Atlantic Oscillation, the Pacific–North America pattern, the Southern Annular Mode, the Pacific Decadal Oscillation, the North Pacific Oscillation, and the North Pacific Gyre Oscillation. We assess the benefits of a unified approach to evaluating these modes of variability, which we call the common basis function (CBF) approach, based on projecting model anomalies onto observed empirical orthogonal functions (EOFs). The CBF approach circumvents issues with conventional EOF analysis, eliminating, for example, corrections for the arbitrarily assigned, but inconsistent, signs of the EOFs/PCs being compared. It also avoids the problem that the first observed EOF is sometimes more similar to a higher-order model EOF, particularly when the simulated EOFs are not well separated. Compared to conventional EOF analysis of models, the CBF approach indicates that models agree with observations significantly better, in terms of pattern correlation and root-mean-square error (RMSE), than heretofore suggested. In many cases, models do a credible job of capturing the observationally based estimates of the patterns; however, errors in simulated amplitudes can be large and more egregious than pattern errors. In the context of the broad distribution of errors in the CMIP5 ensemble, sensitivity tests demonstrate that our results are relatively insensitive to methodological considerations (CBF vs. conventional approach), observational uncertainties in pattern (as determined by using multiple datasets), and internal variability (when multiple realizations from the same model are compared). The skill metrics proposed in this study provide a useful summary of the ability of models to reproduce the observed EOF patterns and amplitudes.
Additionally, the skill metrics can be used as a tool to objectively highlight where potential model improvements might be made. We advocate more systematic and objective testing of simulated extratropical variability, especially during the non-dominant seasons of each mode, when many models perform relatively poorly.

References

  1. AchutaRao K, Sperber KR (2006) ENSO simulation in coupled ocean–atmosphere models: are the current models better? Clim Dyn 27:1–15. https://doi.org/10.1007/s00382-006-0119-7

  2. Allan R, Ansell T (2006) A new globally complete monthly historical gridded mean sea-level pressure dataset (HadSLP2): 1850–2004. J Clim 19:5816–5842. https://doi.org/10.1175/JCLI3937.1

  3. Ambaum MHP, Hoskins BJ, Stephenson DB (2001) Arctic Oscillation or North Atlantic Oscillation. J Clim 14:3495–3507

  4. Baldwin MP (2001) Annular modes in global daily surface pressure. Geophys Res Lett 28:4115–4118. https://doi.org/10.1029/2001GL013564

  5. Barnston AG, Livezey RE (1987) Classification, seasonality and persistence of low-frequency atmospheric circulation patterns. Mon Weather Rev 115:1083–1126

  6. Bayr T, Dommenget D (2014) Comparing the spatial structure of variability in two datasets against each other on the basis of EOF modes. Clim Dyn 42:1631–1648

  7. Berrisford P, Kållberg P, Kobayashi S, Dee D, Uppala S, Simmons AJ, Poli P, Sato H (2011) Atmospheric conservation properties in ERA-Interim. Q J R Meteorol Soc 137:1381–1399. https://doi.org/10.1002/qj.864

  8. Bond NA, Overland JE, Spillane M, Stabeno P (2003) Recent shifts in the state of the North Pacific. Geophys Res Lett. https://doi.org/10.1029/2003GL018597

  9. Bonfils C, Santer BD (2011) Investigating the possibility of a human component in various Pacific Decadal Oscillation indices. Clim Dyn 37:1457–1468. https://doi.org/10.1007/s00382-010-0920-1

  10. Bonfils CJW, Santer BD, Phillips TJ, Marvel K, Ruby Leung L, Doutriaux C, Capotondi A (2015) Relative contributions of mean-state shifts and ENSO-driven variability to precipitation changes in a warming climate. J Clim 28:9997–10013. https://doi.org/10.1175/JCLI-D-15-0341.1

  11. Cattiaux J, Cassou C (2013) Opposite CMIP3/CMIP5 trends in the wintertime Northern Annular Mode explained by combined local sea ice and remote tropical influences. Geophys Res Lett 40:3682–3687. https://doi.org/10.1002/grl.50643

  12. Charlton-Perez AJ, Baldwin MP, Birner T, Black RX, Butler AH, Calvo N, Davis NA, Gerber EP, Gillett N, Hardiman S, Kim J, Krüger K, Lee YY, Manzini E, McDaniel BA, Polvani L, Reichler T, Shaw TA, Sigmond M, Son SW, Toohey M, Wilcox L, Yoden S, Christiansen B, Lott F, Shindell D, Yukimoto S, Watanabe S (2013) On the lack of stratospheric dynamical variability in low-top versions of the CMIP5 models. J Geophys Res Atmos 118:2494–2505. https://doi.org/10.1002/jgrd.50125

  13. Chen S, Wu R (2017) Impacts of winter NPO on subsequent winter ENSO: sensitivity to the definition of NPO index. Clim Dyn. https://doi.org/10.1007/s00382-017-3615-z

  14. Chen Z, Gan B, Wu L, Jia F (2017) Pacific-North American teleconnection and North Pacific Oscillation: historical simulation and future projection in CMIP5 models. Clim Dyn. https://doi.org/10.1007/s00382-017-3881-9

  15. Choi J, Son SW, Ham YG, Lee JY, Kim HM (2016) Seasonal-to-interannual prediction skills of near-surface air temperature in the CMIP5 decadal hindcast experiments. J Clim 29:1511–1527. https://doi.org/10.1175/JCLI-D-15-0182.1

  16. Chronis T, Raitsos DE, Kassis D, Sarantopoulos A (2011) The summer North Atlantic Oscillation influence on the eastern Mediterranean. J Clim 24:5584–5596. https://doi.org/10.1175/2011JCLI3839.1

  17. Cohen J, Barlow M (2005) The NAO, the AO, and global warming: how closely related? J Clim 18:4498–4513. https://doi.org/10.1175/JCLI3530.1

  18. Cohen J, Frei A, Rosen RD (2005) The role of boundary conditions in AMIP-2 simulations of the NAO. J Clim 18:973–981. https://doi.org/10.1175/JCLI-3305.1

  19. Compo GP, Whitaker JS, Sardeshmukh PD (2006) Feasibility of a 100-year reanalysis using only surface pressure data. Bull Am Meteorol Soc 87:175–190. https://doi.org/10.1175/BAMS-87-2-175

  20. Compo GP, Whitaker JS, Sardeshmukh PD, Matsui N, Allan RJ, Yin X, Gleason BE, Vose RS, Rutledge G, Bessemoulin P, Brönnimann S, Brunet M, Crouthamel RI, Grant AN, Groisman PY, Jones PD, Kruk MC, Kruger AC, Marshall GJ, Maugeri M, Mok HY, Nordli O, Ross TF, Trigo RM, Wang XL, Woodruff SD, Worley SJ (2011) The twentieth century reanalysis project. Q J R Meteorol Soc 137:1–28

  21. Dawson A (2016) eofs: a library for EOF analysis of meteorological, oceanographic, and climate data. J Open Res Softw 4:e14. https://doi.org/10.5334/jors.122

  22. Dee DP, Uppala SM, Simmons AJ, Berrisford P, Poli P, Kobayashi S, Andrae U, Balmaseda MA, Balsamo G, Bauer P, Bechtold P, Beljaars ACM, van de Berg L, Bidlot J, Bormann N, Delsol C, Dragani R, Fuentes M, Geer AJ, Haimberger L, Healy SB, Hersbach H, Hólm EV, Isaksen L, Kållberg P, Köhler M, Matricardi M, Mcnally AP, Monge-Sanz BM, Morcrette JJ, Park BK, Peubey C, de Rosnay P, Tavolato C, Thépaut JN, Vitart F (2011) The ERA-Interim reanalysis: configuration and performance of the data assimilation system. Q J R Meteorol Soc 137:553–597. https://doi.org/10.1002/qj.828

  23. Deser C (2000) On the teleconnectivity of the “Arctic Oscillation”. Geophys Res Lett 27:779–782

  24. Deser C, Alexander MA, Xie S-P, Phillips AS (2010) Sea surface temperature variability: patterns and mechanisms. Annu Rev Mar Sci 2:115–143. https://doi.org/10.1146/annurev-marine-120408-151453

  25. Di Lorenzo E, Schneider N, Cobb KM, Franks PJ, Chhak K, Miller AJ, McWilliams JC, Bograd SJ, Arango H, Curchitser E, Powell TM (2008) North Pacific Gyre Oscillation links ocean climate and ecosystem change. Geophys Res Lett. https://doi.org/10.1029/2007GL032838

  26. Di Giuseppe F, Molteni F, Tompkins AM (2013) A rainfall calibration methodology for impacts modelling based on spatial mapping. Q J R Meteorol Soc 139:1389–1401. https://doi.org/10.1002/qj.2019

  27. Dommenget D (2007) Evaluating EOF modes against a stochastic null hypothesis. Clim Dyn 28:517–531

  28. Doutriaux C, Williams DN, Nadeau D, Lipsa D, Chaudhary A, Durack PJ, Lee J, Shaheen Z, Maxwell T, Brown E (2017) UV-CDAT/uvcdat: UV-CDAT 2.12. Zenodo, Geneva. https://doi.org/10.5281/zenodo.886621

  29. Eyring V, Gleckler PJ, Heinze C, Stouffer RJ, Taylor KE, Balaji V, Guilyardi E, Joussaume S, Kindermann S, Lawrence BN, Meehl GA, Righi M, Williams DN (2016a) Towards improved and more routine Earth system model evaluation in CMIP. Earth Syst Dyn 7:813–830. https://doi.org/10.5194/esd-7-813-2016

  30. Eyring V, Righi M, Lauer A, Evaldsson M, Wenzel S, Jones C, Anav A, Andrews O, Cionni I, Davin EL, Deser C, Ehbrecht C, Friedlingstein P, Gleckler P, Gottschaldt KD, Hagemann S, Juckes M, Kindermann S, Krasting J, Kunert D, Levine R, Loew A, Mäkelä J, Martin G, Mason E, Phillips AS, Read S, Rio C, Roehrig R, Senftleben D, Sterl A, Van Ulft LH, Walton J, Wang S, Williams KD (2016b) ESMValTool (v1.0)—a community diagnostic and performance metrics tool for routine evaluation of Earth system models in CMIP. Geosci Model Dev 9:1747–1802. https://doi.org/10.5194/gmd-9-1747-2016

  31. Feldstein SB (2007) The dynamics of the North Atlantic Oscillation during the summer season. Q J R Meteorol Soc 133:1509–1518. https://doi.org/10.1002/qj.107

  32. Feser F, Barcikowska M, Krueger O, Schenk F, Weisse R, Xia L (2015) Storminess over the North Atlantic and northwestern Europe—a review. Q J R Meteorol Soc 141:350–382

  33. Fogt RL, Perlwitz J, Pawson S, Olsen MA (2009) Intra-annual relationships between polar ozone and the SAM. Geophys Res Lett 36:L04707. https://doi.org/10.1029/2008GL036627

  34. Folland CK, Knight J, Linderholm HW, Fereday D, Ineson S, Hurrell JW (2009) The summer North Atlantic Oscillation: past, present, and future. J Clim 22:1082–1103. https://doi.org/10.1175/2008JCLI2459.1

  35. Furtado JC, Di Lorenzo E, Schneider N, Bond NA (2011) North pacific decadal variability and climate change in the IPCC AR4 models. J Clim 24:3049–3067. https://doi.org/10.1175/2010JCLI3584.1

  36. Gates WL, Henderson-Sellers A, Boer GJ, Folland CK, Kitoh A, McAvaney BJ, Semazzi F, Smith N, Weaver AJ, Zeng Q-C (1996) Climate models—evaluation. In: Houghton JT, Meira Filho LG, Callander BA, Harris N, Kattenberg A, Maskell K (eds) Climate change 1995: the science of climate change, chap 5. Contribution of Working Group I to the Second Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, New York, pp 228–284

  37. Gillett NP, Fyfe JC (2013) Annular mode changes in the CMIP5 simulations. Geophys Res Lett 40:1189–1193. https://doi.org/10.1002/grl.50249

  38. Gleckler PJ, Taylor KE, Doutriaux C (2008) Performance metrics for climate models. J Geophys Res Atmos. https://doi.org/10.1029/2007JD008972

  39. Gleckler P, Doutriaux C, Durack P, Taylor K, Zhang Y, Williams D, Mason E, Servonnat J (2016) A more powerful reality test for climate models. Eos (Washington, DC) 97:1–8. https://doi.org/10.1029/2016EO051663

  40. Gong G, Entekhabi D, Cohen J (2002) A large-ensemble model study of the wintertime AO-NAO and the role of interannual snow perturbations. J Clim 15:3488–3499. https://doi.org/10.1175/1520-0442(2002)015%3C3488:ALEMSO%3E2.0.CO;2

  41. Gong H, Wang L, Chen W, Chen X, Nath D (2016) Biases of the wintertime Arctic Oscillation in CMIP5 models. Environ Res Lett 12:014001. https://doi.org/10.1088/1748-9326/12/1/014001

  42. Gottschalck J, Wheeler M, Weickmann K, Vitart F, Savage N, Lin H, Hendon H, Waliser D, Sperber K, Nakagawa M, Prestrelo C, Flatau M, Higgins W (2010) A framework for assessing operational Madden-Julian oscillation forecasts: a clivar MJO working group project. Bull Am Meteorol Soc 91:1247–1258. https://doi.org/10.1175/2010BAMS2816.1

  43. Hanna E, Cappelen J, Allan R, Jónsson T, Le Blanco F, Lillington T, Hickey K (2008) New insights into North European and North Atlantic surface pressure variability, storminess, and related climatic change since 1830. J Clim 21:6739–6766. https://doi.org/10.1175/2008JCLI2296.1

  44. Hannachi A, Jolliffe IT, Stephenson DB (2007) Empirical orthogonal functions and related techniques in atmospheric science: a review. Int J Climatol 27:1119–1152

  45. Hawkins E, Sutton R (2009) The potential to narrow uncertainty in regional climate predictions. Bull Am Meteorol Soc 90:1095–1107. https://doi.org/10.1175/2009BAMS2607.1

  46. Hawkins E, Sutton R (2011) The potential to narrow uncertainty in projections of regional precipitation change. Clim Dyn 37:407–418. https://doi.org/10.1007/s00382-010-0810-6

  47. Hurrell JW, Deser C (2009) North Atlantic climate variability: the role of the North Atlantic Oscillation. J Mar Syst 78:28–41. https://doi.org/10.1016/j.jmarsys.2008.11.026

  48. Hurrell JW, Kushnir Y, Ottersen G, Visbeck M (2003) An overview of the North Atlantic Oscillation. In: Hurrell JW, Kushnir Y, Ottersen G, Visbeck M (eds) The North Atlantic Oscillation: climatic significance and environmental impact. American Geophysical Union, Washington, DC, pp 1–35. https://doi.org/10.1029/134GM01

  49. Irving D, Simmonds I (2016) A new method for identifying the Pacific–South American pattern and its influence on regional climate variability. J Clim 29:6109–6125. https://doi.org/10.1175/JCLI-D-15-0843.1

  50. Jerez S, Jimenez-Guerrero P, Montávez JP, Trigo RM (2013) Impact of the North Atlantic Oscillation on European aerosol ground levels through local processes: a seasonal model-based assessment using fixed anthropogenic emissions. Atmos Chem Phys 13:11195–11207. https://doi.org/10.5194/acp-13-11195-2013

  51. Kay JE, Deser C, Phillips A, Mai A, Hannay C, Strand G, Arblaster JM, Bates SC, Danabasoglu G, Edwards J, Holland M, Kushner P, Lamarque JF, Lawrence D, Lindsay K, Middleton A, Munoz E, Neale R, Oleson K, Polvani L, Vertenstein M (2015) The Community Earth System Model (CESM) large ensemble project: a community resource for studying climate change in the presence of internal climate variability. Bull Am Meteorol Soc 96:1333–1349. https://doi.org/10.1175/BAMS-D-13-00255.1

  52. Keeley SP, Collins M, Thorpe AJ (2008) Northern hemisphere winter atmospheric climate: modes of natural variability and climate change. Clim Dyn 31:195–211. https://doi.org/10.1007/s00382-007-0346-6

  53. Kidson JW (1999) Principal modes of Southern Hemisphere low-frequency variability obtained from NCEP-NCAR reanalyses. J Clim 12:2808–2830. https://doi.org/10.1175/1520-0442(1999)012%3C2808:PMOSHL%3E2.0.CO;2

  54. Kim HM, Webster PJ, Curry JA (2012) Evaluation of short-term climate change prediction in multi-model CMIP5 decadal hindcasts. Geophys Res Lett. https://doi.org/10.1029/2012GL051644

  55. Kirtman B, Power SB, Adedoyin JA, Boer GJ, Bojariu R, Camilloni I, Doblas-Reyes FJ, Fiore AM, Kimoto M, Meehl GA, Prather M, Sarr A, Schär C, Sutton R, van Oldenborgh GJ, Vecchi G, Wang HJ (2013) Near-term climate change: projections and predictability. In: Stocker TF, Qin D, Plattner G-K, Tignor M, Allen SK, Boschung J, Nauels A, Xia Y, Bex V, Midgley PM (eds) Climate change 2013: the physical science basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge

  56. Lauer A, Eyring V, Righi M, Buchwitz M, Defourny P, Evaldsson M, Friedlingstein P, de Jeu R, de Leeuw G, Loew A, Merchant CJ, Müller B, Popp T, Reuter M, Sandven S, Senftleben D, Stengel M, Van Roozendael M, Wenzel S, Willén U (2017) Benchmarking CMIP5 models with a subset of ESA CCI Phase 2 data using the ESMValTool. Remote Sens Environ. https://doi.org/10.1016/j.rse.2017.01.007 (available online)

  57. Lee YY, Black RX (2013) Boreal winter low-frequency variability in CMIP5 models. J Geophys Res Atmos 118:6891–6904. https://doi.org/10.1002/jgrd.50493

  58. Lee YY, Black RX (2015) The structure and dynamics of the stratospheric Northern Annular Mode in CMIP5 simulations. J Clim 28:86–107

  59. Li XF, Pietrafesa L, Lan SF, Xie LA (2000) Significance test for empirical orthogonal function (EOF) analysis of meteorological and oceanic data. Chin J Oceanol Limnol 18:10–17. https://doi.org/10.1007/BF02842536

  60. Linkin ME, Nigam S (2008) The North Pacific Oscillation–West Pacific teleconnection pattern: mature-phase structure and winter impacts. J Clim 21:1979–1997. https://doi.org/10.1175/2007JCLI2048.1

  61. Lorenz EN (1956) Empirical orthogonal functions and statistical weather prediction. Statistical Forecasting Project report 1, MIT Department of Meteorology

  62. Mantua NJ, Hare SR (2002) The Pacific Decadal Oscillation. J Oceanogr 58:35–44. https://doi.org/10.1023/A:1015820616384

  63. Mantua NJ, Hare SR, Zhang Y, Wallace JM, Francis RC (1997) A Pacific Interdecadal Climate Oscillation with impacts on salmon production. Bull Am Meteorol Soc 78:1069–1079. https://doi.org/10.1175/1520-0477(1997)078%3C1069:APICOW%3E2.0.CO;2

  64. Meehl GA, Hu A, Teng H (2016) Initialized decadal prediction for transition to positive phase of the Interdecadal Pacific Oscillation. Nat Commun 7:11718. https://doi.org/10.1038/ncomms11718

  65. Monahan AH, Fyfe JC, Ambaum MH, Stephenson DB, North GR (2009) Empirical orthogonal functions: the medium is the message. J Clim 22:6501–6514

  66. Newman M, Alexander MA, Ault TR, Cobb KM, Deser C, Di Lorenzo E, Mantua NJ, Miller AJ, Minobe S, Nakamura H, Schneider N, Vimont DJ, Phillips AS, Scott JD, Smith CA (2016) The Pacific Decadal Oscillation, revisited. J Clim 29:4399–4427. https://doi.org/10.1175/JCLI-D-15-0508.1

  67. North GR, Bell TL, Cahalan RF (1982) Sampling errors in the estimation of empirical orthogonal functions. Mon Weather Rev 110:699–706

  68. Oshima K, Tanimoto Y (2009) An evaluation of reproducibility of the Pacific Decadal Oscillation in the CMIP3 simulations. J Meteorol Soc Jpn 87:755–770. https://doi.org/10.2151/jmsj.87.755

  69. Overland JE, Wang M (2007) Future climate of the North Pacific Ocean. Eos Trans Am Geophys Union 88:178,182

  70. Park JH, An SI, Yeh SW, Schneider N (2013a) Quantitative assessment of the climate components driving the Pacific Decadal Oscillation in climate models. Theor Appl Climatol 112:431–445. https://doi.org/10.1007/s00704-012-0730-y

  71. Park JY, Yeh SW, Kug JS, Yoon J (2013b) Favorable connections between seasonal footprinting mechanism and El Niño. Clim Dyn 40:1169–1181. https://doi.org/10.1007/s00382-012-1477-y

  72. Perlwitz J, Pawson S, Fogt RL, Nielsen JE, Neff WD (2008) Impact of stratospheric ozone hole recovery on Antarctic climate. Geophys Res Lett 35:L08714. https://doi.org/10.1029/2008GL033317

  73. Phillips AS, Deser C, Fasullo J (2014) Evaluating modes of variability in climate models. Eos Trans AGU 95:453–455. https://doi.org/10.1002/2014EO490002

  74. Pohl B, Fauchereau N (2012) The Southern Annular Mode seen through weather regimes. J Clim 25:3336–3354. https://doi.org/10.1175/JCLI-D-11-00160.1

  75. Polade SD, Gershunov A, Cayan DR, Dettinger MD, Pierce DW (2013) Natural climate variability and teleconnections to precipitation over the Pacific-North American region in CMIP3 and CMIP5 models. Geophys Res Lett 40:2296–2301. https://doi.org/10.1002/grl.50491

  76. Poli P, Hersbach H, Dee DP, Berrisford P, Simmons AJ, Vitart F, Laloyaux P, Tan DG, Peubey C, Thépaut JN, Trémolet Y (2016) ERA-20C: an atmospheric reanalysis of the twentieth century. J Clim 29:4083–4097. https://doi.org/10.1175/JCLI-D-15-0556.1

  77. Rayner NA, Parker DE, Horton EB, Folland CK, Alexander LV, Rowell DP, Kent EC, Kaplan A (2003) Global analyses of sea surface temperature, sea ice, and night marine air temperature since the late nineteenth century. J Geophys Res 108:4407. https://doi.org/10.1029/2002JD002670

  78. Reboita MS, Ambrizzi T, Rocha RPD (2009) Relationship between the Southern Annular Mode and Southern Hemisphere atmospheric systems. Revista Brasileira de Meteorologia 24:48–55

  79. Rogers JC (1981) The North Pacific Oscillation. J Climatol 1:39–57. https://doi.org/10.1002/joc.3370010106

  80. Rogers JC, McHugh MJ (2002) On the separability of the North Atlantic Oscillation and the Arctic Oscillation. Clim Dyn 19:599–608

  81. Rogers JC, van Loon H (1982) Spatial variability of sea level pressure and 500 mb height anomalies over the Southern Hemisphere. Mon Weather Rev 110:1375–1392

  82. Salinger MJ (2005) Climate variability and change: past, present and future—an overview. Clim Chang 70:9–29. https://doi.org/10.1007/s10584-005-5936-x

  83. Santer BD, Mears C, Wentz FJ, Taylor KE, Gleckler PJ, Wigley TML, Barnett TP, Boyle JS, Brüggemann W, Gillett NP, Klein SA, Meehl GA, Nozawa T, Pierce DW, Stott PA, Washington WM, Wehner MF (2007) Identification of human-induced changes in atmospheric moisture content. Proc Natl Acad Sci USA 104:15248–15253. https://doi.org/10.1073/pnas.0702872104

  84. Sheffield J, Barrett AP, Colle B, Fernando DN, Fu R, Geil KL, Hu Q, Kinter J, Kumar S, Langenbrunner B, Lombardo K, Long LN, Maloney E, Mariotti A, Meyerson JE, Mo KC, Neelin JD, Nigam S, Pan Z, Ren T, Ruiz-Barradas A, Serra YL, Seth A, Thibeault JM, Stroeve JC, Yang Z, Yin L (2013a) North American climate in CMIP5 experiments. Part I: evaluation of historical simulations of continental and regional climatology. J Clim 26:9209–9245. https://doi.org/10.1175/JCLI-D-12-00592.1

  85. Sheffield J, Camargo SJ, Fu R, Hu Q, Jiang X, Johnson N, Karnauskas KB, Kim ST, Kinter J, Kumar S, Langenbrunner B (2013b) North American climate in CMIP5 experiments. Part II: evaluation of historical simulations of intraseasonal to decadal variability. J Clim 26:9247–9290

  86. Sillmann J, Kharin VV, Zhang X, Zwiers FW, Bronaugh D (2013) Climate extremes indices in the CMIP5 multimodel ensemble: part 1. Model evaluation in the present climate. J Geophys Res Atmos 118:1716–1733. https://doi.org/10.1002/jgrd.50203

  87. Smith TM, Reynolds RW, Peterson TC, Lawrimore J (2008) Improvements to NOAA’s historical merged land-ocean surface temperature analysis (1880–2006). J Clim 21:2283–2296. https://doi.org/10.1175/2007JCLI2100.1

  88. Son SW, Polvani LM, Waugh DW, Akiyoshi H, Garcia R, Kinnison D, Pawson S, Rozanov E, Shepherd TG, Shibata K (2008) The impact of stratospheric ozone recovery on the Southern Hemisphere westerly jet. Science 320:1486–1489

  89. Sperber KR (2004) Madden-Julian variability in NCAR CAM2.0 and CCSM2.0. Clim Dyn 23:259–278. https://doi.org/10.1007/s00382-004-0447-4

  90. Sperber KR, Annamalai H (2008) Coupled model simulations of boreal summer intraseasonal (30–50 day) variability, part 1: systematic errors and caution on use of metrics. Clim Dyn 31:345–372. https://doi.org/10.1007/s00382-008-0367-9

  91. Sperber KR, Gualdi S, Legutke S, Gayler V (2005) The Madden–Julian oscillation in ECHAM4 coupled and uncoupled general circulation models. Clim Dyn 25:117–140. https://doi.org/10.1007/s00382-005-0026-3

  92. Sperber KR, Annamalai H, Kang I-S, Kitoh A, Moise A, Turner A, Wang B, Zhou T (2013) The Asian summer monsoon: an intercomparison of CMIP5 vs. CMIP3 simulations of the late 20th century. Clim Dyn 41:2771–2774. https://doi.org/10.1007/s00382-012-1607-6

  93. Stephenson DB, Pavan V, Collins M, Junge MM, Quadrelli R (2006) North Atlantic Oscillation response to transient greenhouse gas forcing and the impact on European winter climate: a CMIP2 multi-model assessment. Clim Dyn 27:401–420. https://doi.org/10.1007/s00382-006-0140-x

  94. Stoner AMK, Hayhoe K, Wuebbles DJ (2009) Assessing general circulation model simulations of atmospheric teleconnection patterns. J Clim 22:4348–4372. https://doi.org/10.1175/2009JCLI2577.1

  95. Sung MK, An SI, Kim BM, Woo SH (2014) A physical mechanism of the precipitation dipole in the western United States based on PDO-storm track relationship. Geophys Res Lett 16:4719–4726. https://doi.org/10.1002/2014GL060711

  96. Taylor KE (2001) Summarizing multiple aspects of model performance in a single diagram. J Geophys Res 106:7183–7192. https://doi.org/10.1029/2000JD900719

  97. Taylor KE, Stouffer RJ, Meehl GA (2012a) An overview of CMIP5 and the experiment design. Bull Am Meteorol Soc 93:485–498

  98. Taylor KE, Balaji V, Hankin S, Juckes M, Lawrence B, Pascoe S (2012b) CMIP5 data reference syntax (DRS) and controlled vocabularies. https://cmip.llnl.gov/cmip5/docs/cmip5_data_reference_syntax.pdf

  99. Thompson DWJ, Wallace JM (1998) The Arctic oscillation signature in the wintertime geopotential height and temperature fields. Geophys Res Lett 25:1297. https://doi.org/10.1029/98GL00950

  100. Thompson DWJ, Wallace JM (2000) Annular modes in the extratropical circulation. Part I: month-to-month variability. J Clim 13:1000–1016

  101. Thompson DWJ, Wallace JM (2001) Regional climate impacts of the Northern Hemisphere annular mode. Science 293:85–89. https://doi.org/10.1126/science.1058958

  102. Titchner HA, Rayner NA (2014) The Met Office Hadley Centre sea ice and sea surface temperature data set, version 2: 1. Sea ice concentrations. J Geophys Res Atmos 119:2864–2889. https://doi.org/10.1002/2013JD020316

  103. von Storch H, Zwiers FW (1999) Statistical analysis in climate research. Cambridge University Press, Cambridge (ISBN: 0 521 45071 3)

  104. Walker GT, Bliss EW (1932) World weather V. Mem R Meteorol Soc 4:53

  105. Wallace JM, Gutzler DS (1981) Teleconnections in the geopotential height field during the Northern Hemisphere winter. Mon Weather Rev 109:784–812

  106. Wang G, Dommenget D (2016) The leading modes of decadal SST variability in the Southern Ocean in CMIP5 simulations. Clim Dyn 47:1775–1792

  107. Wang L, Chen W, Huang R (2007) Changes in the variability of North Pacific Oscillation around 1975/1976 and its relationship with East Asian winter climate. J Geophys Res. https://doi.org/10.1029/2006JD008054

  108. Wang G, Dommenget D, Frauen C (2015) An evaluation of the CMIP3 and CMIP5 simulations in their skill of simulating the spatial structure of SST variability. Clim Dyn 44:95–114. https://doi.org/10.1007/s00382-014-2154-0

  109. Wheeler MC, Hendon HH (2004) An all-season real-time multivariate MJO index: development of an index for monitoring and prediction. Mon Weather Rev 132:1917–1932

  110. Williams DN (2014) Visualization and analysis tools for ultrascale climate data. Eos Trans AGU 95:377–378. https://doi.org/10.1002/2014EO420002

  111. Williams DN, Balaji V, Cinquini L, Denvil S, Duffy D, Evans B, Ferraro R, Hansen R, Lautenschlager M, Trenham C (2016a) A global repository for planet-sized experiments and observations. Bull Am Meteorol Soc 97:803–816

  112. Williams DN, Doutriaux C, Chaudhary A, Fries S, Lipsa D, Jhaveri S, Durack PJ, Painter J, Nadeau D, Maxwell T, Harris M, Beezley J (2016b) UV-CDAT v2.4.0. Zenodo, Geneva. https://doi.org/10.5281/zenodo.45136

  113. Wittenberg AT (2009) Are historical records sufficient to constrain ENSO simulations? Geophys Res Lett 36:L12702. https://doi.org/10.1029/2009GL038710

  114. Xue Y, Smith TM, Reynolds RW (2003) Interdecadal changes of 30-Yr SST normals during 1871–2000. J Clim 16:1601–1612

  115. Yim BY, Kwon MH, Min HS, Kug JS (2015) Pacific Decadal Oscillation and its relation to the extratropical atmospheric variation in CMIP5. Clim Dyn 44:1521–1540. https://doi.org/10.1007/s00382-014-2349-4

  116. Yu JY, Kim ST (2011) Relationships between extratropical sea level pressure variations and the central Pacific and eastern Pacific types of ENSO. J Clim 24:708–720. https://doi.org/10.1175/2010JCLI3688.1

  117. Zhang Y, Wallace JM, Battisti DS (1997) ENSO-like interdecadal variability: 1900–93. J Clim 10:1004–1020

  118. Zuo J-Q, Li W-J, Ren H-L (2013) Representation of the Arctic Oscillation in the CMIP5 models. Adv Clim Change Res 4:242–249. https://doi.org/10.3724/SP.J.1248.2013.242

Acknowledgements

This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. The efforts of the authors are supported by the Regional and Global Climate Modeling Program of the United States Department of Energy’s Office of Science. The authors thank Ben Santer for helpful discussions and suggesting the use of tcor2 as one of our EOF swapping methods. We acknowledge the efforts of Paul Durack, Sasha Ames, Jeff Painter and Cameron Harr for maintaining the CMIP database, and Dean Williams, Charles Doutriaux, Denis Nadeau and their team for developing and maintaining the CDAT analysis package and ESGF. We thank reviewers for their comments. We acknowledge the World Climate Research Programme’s Working Group on Coupled Modelling, which is responsible for CMIP, and we thank the climate modelling groups for producing and making available their model output. The CMIP data is available at ESGF. The Twentieth Century Reanalysis (20CR), HadSLP2r, and ERSSTv3b data are provided by the NOAA/Earth System Research Laboratory (ESRL)/Physical Sciences Division (PSD) from their website at http://www.esrl.noaa.gov/psd/. Support for the 20CR Project dataset is provided by the U.S. Department of Energy, Office of Science Innovative and Novel Computational Impact on Theory and Experiment (DOE INCITE) program, and Office of Biological and Environmental Research (BER), and by the National Oceanic and Atmospheric Administration Climate Program Office. The ERA Interim and ERA-20C data sets are available through ECMWF’s website at http://www.ecmwf.int/en/research/climate-reanalysis. The HadISST data is available through UK Met Office’s website at http://www.metoffice.gov.uk/hadobs/hadisst/.

Author information

Corresponding author

Correspondence to Jiwoo Lee.

Ethics declarations

Conflict of interest

This document was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor Lawrence Livermore National Security, LLC, nor any of their employees makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States government or Lawrence Livermore National Security, LLC. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States government or Lawrence Livermore National Security, LLC, and shall not be used for advertising or product endorsement purposes.

Appendix: Analysis methodology

EOF analysis forms the basis of our approach to model evaluation. Two analysis methods are used, a conventional EOF analysis and evaluation using a Common Basis Function (CBF).

For the conventional EOF analysis we have used the open-source Python EOF routine eofs (Dawson 2016), which has been implemented in the Climate Data Analysis Tools (CDAT) (Williams 2014; Williams et al. 2016b; Doutriaux et al. 2017) and is used in the climate community (e.g. Irving and Simmonds 2016). The anomalies input to the EOF routine are weighted by the square root of the grid-cell area normalized by the total grid area of the domain. A scaling option has been selected that results in an EOF pattern normalized to unit variance, with the PC time series left unnormalized. Thus, the standard deviation of the PC time series provides a measure of the interannual variability that can be compared across realizations and observations. The unit variance EOF pattern is then multiplied by the standard deviation of the PC time series to give a “representative” pattern of anomalies in the units of the input data. This representative pattern of anomalies is consistent with that obtained using a PC-based linear regression (see the discussion of the CBF method, below). In order to calculate skill metrics, such as the area-weighted pattern correlation and root-mean-square error (RMSE), this representative pattern of anomalies from each realization is interpolated to the corresponding observational grid.
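The scaling convention described above can be sketched in plain NumPy. This is an illustrative reimplementation under stated assumptions, not the paper's actual code (which uses the eofs package via CDAT); the function and variable names are hypothetical.

```python
import numpy as np

def eof1_representative_pattern(anoms, area):
    """Sketch of the EOF-1 scaling convention (hypothetical names).

    anoms : (nt, ns) temporal anomalies at each grid point
    area  : (ns,) grid-cell areas

    Returns EOF-1 scaled to unit area-weighted spatial variance, the
    unnormalized PC-1 (whose standard deviation measures amplitude),
    and the "representative" anomaly pattern eof1 * std(pc1).
    """
    w = np.sqrt(area / area.sum())      # sqrt of normalized cell area
    u, s, vt = np.linalg.svd(anoms * w, full_matrices=False)
    eof1 = vt[0] / w                    # physical-space pattern; its
                                        # area-weighted mean square is 1
    pc1 = u[:, 0] * s[0]                # unnormalized PC time series
    rep = eof1 * pc1.std(ddof=0)        # representative anomaly pattern
    return eof1, pc1, rep
```

Because the anomalies are pre-multiplied by the square root of the normalized cell area, dividing the leading singular vector by the same weights yields a pattern whose area-weighted spatial variance is exactly one, matching the scaling option described above.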

The NAM, NAO, SAM, PNA, and the PDO are often defined in the context of EOF-1 in observations (Fig. 1). For the models, however, we retain EOF’s 1–3 since in model validation there is not always a one-to-one correspondence between the observed and simulated EOF’s (Di Giuseppe et al. 2013; Keeley et al. 2008). For proper comparison, if the sign of the pattern correlation between the simulated mode and observed EOF-1 is negative, we change the sign of the simulated EOF/PC to ensure consistency between the simulated and observed fields. Then, for cases in which the model’s second or third EOF better matches the observed EOF-1, we “swap” the model EOF’s to enable a fair comparison. More problematic are cases in which the observed variability is expressed as a combination of EOF modes in a model. In such cases, we will show that different approaches to determining when to swap EOFs can result in selection of different model EOFs to compare with observations.

In the conventional EOF analysis of the models four approaches for swapping have been explored. Two of these relate to the agreement of the simulated and observed spatial patterns, and two are based on the similarity of observed and simulated PC time series. For each realization the four options for selecting which model EOF to compare with observed EOF-1 are:

  • Option 1: Use the simulated EOF with the largest pattern correlation with observed EOF-1.

  • Option 2: Use the simulated EOF with the smallest RMSE with observed EOF-1.

  • Option 3: Project the simulation anomalies onto the observed EOF-1. In this way the observations and models are evaluated using a common basis function. For a model we obtain a pseudo-PC time series (referred to hereafter as the CBF PC-1, see below). We select the model EOF whose PC time series has the largest temporal correlation with the model CBF PC-1. For swapping, this statistic is referred to as tcor1.

  • Option 4: Project the observed anomalies onto each of the three leading simulated EOF’s to obtain three observed pseudo-PC time series. Select the model EOF whose observed pseudo-PC has the largest temporal correlation with observed PC-1. For swapping, this statistic is referred to as tcor2.
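Option 1 (with the sign correction described above) can be sketched as follows. This is a minimal illustration with hypothetical names, assuming the model EOFs have already been interpolated to the observational grid; the other options follow the same structure with RMSE or temporal correlation as the selection statistic.

```python
import numpy as np

def pick_eof_by_pattern_corr(model_eofs, obs_eof1, area):
    """Option 1 (sketch): choose the model EOF with the largest absolute
    area-weighted pattern correlation with observed EOF-1, and return the
    sign needed to make that correlation positive."""
    w = area / area.sum()

    def wcorr(a, b):
        # area-weighted pattern correlation about the weighted means
        a = a - np.sum(w * a)
        b = b - np.sum(w * b)
        return np.sum(w * a * b) / np.sqrt(np.sum(w * a * a) * np.sum(w * b * b))

    corrs = [wcorr(e, obs_eof1) for e in model_eofs]
    i = int(np.argmax(np.abs(corrs)))          # EOF to "swap" in
    sign = 1.0 if corrs[i] >= 0 else -1.0      # flip EOF/PC if anticorrelated
    return i, sign
```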

As an alternative to conventional EOF analysis of the models we use what we refer to as the common basis function (CBF) method. As discussed above, an observed mode of variability may be compromised and/or spread across multiple modes in model EOF space. This can be especially problematic if the modes are not well separated, either in observations or models. The CBF method allows us to compare models and observations using a consistent diagnostic framework. Building on EOF theory (e.g., von Storch and Zwiers 1999), here we summarize the steps we use to apply the CBF method to the model anomalies from each simulation. Step 1 applies to observations, whereas steps 2–5 apply to each model simulation.

  • Step 1: Use the conventional EOF approach to calculate the observed EOF, normalized to unit variance.

  • Step 2: For each time sample, calculate the dot product between the simulated spatial pattern of anomalies and the observed time invariant EOF pattern. This projection results in an unnormalized CBF PC time series for each model simulation.

  • Step 3: At each model gridpoint, compute the linear regression between the CBF PC time series and the temporal anomalies, which yields the slope and the y-intercept. Note: the y-intercept = 0 given our calculation of the temporal anomalies.

  • Step 4: Construct the model’s 3-D space–time representation of the mode by multiplying the slopes from step 3 by the value of the CBF PC at each time point. This maximizes the variance associated with the simulated expression of the observed pattern. The area-weighted mean of the temporal variance at each gridpoint then gives the percent of total variance explained by the mode.

  • Step 5: Calculate the representative pattern of anomalies by multiplying the slopes from step 3 by the standard deviation of the CBF PC time series. This representative pattern of anomalies is used for the calculation of skill metrics.
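Steps 2–5 can be summarized in a short sketch. Names are hypothetical, and the area weighting of the step-2 dot product is an assumption on our part; the anomalies are assumed to be temporal anomalies (zero time mean), so the regression intercept vanishes as noted in step 3.

```python
import numpy as np

def cbf_mode(model_anoms, obs_eof1, area):
    """Sketch of CBF steps 2-5 (hypothetical names).

    model_anoms : (nt, ns) model temporal anomalies on the obs grid
    obs_eof1    : (ns,) observed EOF-1 (unit-variance scaling)
    area        : (ns,) grid-cell areas

    Returns the CBF PC time series, the representative anomaly pattern,
    and the percent of total variance associated with the mode.
    """
    w = area / area.sum()
    # Step 2: project each time sample onto the fixed observed pattern
    cbf_pc = model_anoms @ (w * obs_eof1)
    # Step 3: per-gridpoint regression slope of anomalies on the CBF PC
    # (zero intercept because the anomalies have zero time mean)
    slope = (cbf_pc @ model_anoms) / np.sum(cbf_pc ** 2)
    # Step 4: space-time reconstruction and explained-variance percentage
    recon = np.outer(cbf_pc, slope)
    frac = 100.0 * np.sum(w * recon.var(axis=0)) / np.sum(w * model_anoms.var(axis=0))
    # Step 5: representative anomaly pattern for the skill metrics
    rep = slope * cbf_pc.std(ddof=0)
    return cbf_pc, rep, frac
```

If the model anomalies were exactly a rank-1 expression of the observed pattern, the reconstruction would recover them perfectly and the explained variance would be 100%.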

Both the conventional EOF analysis and the patterns obtained using linear regression with the PCs are linear mathematical frameworks that yield consistent results. If we linearly regress the observed, unnormalized PC against the original observed anomalies, the spatial pattern of the regression slopes is identical to the unit-variance normalized EOF pattern from eofs. On a technical note, the pattern and amplitude of anomalies obtained via linear regression are insensitive to whether or not the PC is normalized to unit variance, since a scaling of the PC simply results in an inversely proportional change to the regression slope, such that the product of the slope and the PC is conserved.
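This scaling invariance is quick to verify numerically on synthetic data (hypothetical names): rescaling the PC by a constant c divides the regression slope by c, so the slope-times-PC reconstruction is unchanged.

```python
import numpy as np

rng = np.random.default_rng(42)
pc = rng.standard_normal(100)
pc -= pc.mean()                           # temporal anomaly PC (zero mean)
# synthetic anomalies: a fixed pattern times the PC, plus noise
field = np.outer(pc, np.array([1.5, -0.7, 2.0])) \
        + 0.1 * rng.standard_normal((100, 3))

def slopes(p, y):
    # least-squares regression slope with zero intercept at each gridpoint
    return (p @ y) / np.sum(p ** 2)

s1 = slopes(pc, field)
s2 = slopes(3.0 * pc, field)              # rescale the PC by c = 3
# the product slope * PC is conserved under the rescaling
assert np.allclose(np.outer(pc, s1), np.outer(3.0 * pc, s2))
```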

The representative pattern of model anomalies obtained from the linear regression CBF PC fits will not be identical to the pattern associated with the mode of observed variability, but in realistic simulations they should be similar. Differences between them reflect both differences in the amplitude of variability associated with the mode and also structural differences in the simulated spatial pattern. Thus, the CBF approach allows us to address the question: How well does a model simulate the observed mode of variability? A major benefit of this approach is that the difficulties of the conventional EOF approach are circumvented, such that (1) we do not have to correct for arbitrary sign differences of a model mode compared to observations, (2) we do not have to develop a swapping protocol to try to ensure that the most applicable model mode is compared to the observed mode, and (3) the issue of an observed EOF mode being split across the model’s multiple EOF’s is moot. Thus, in addition to the practical considerations mentioned above, the CBF approach provides a consistent framework to compare how well different models agree with observations.


Cite this article

Lee, J., Sperber, K.R., Gleckler, P.J. et al. Quantifying the agreement between observed and simulated extratropical modes of interannual variability. Clim Dyn 52, 4057–4089 (2019). https://doi.org/10.1007/s00382-018-4355-4


Keywords

  • CMIP5 model evaluation
  • Modes of variability
  • EOF
  • Metrics
  • Common basis function