Climate Dynamics, Volume 48, Issue 9–10, pp 3283–3307

MJO prediction using the sub-seasonal to seasonal forecast model of Beijing Climate Center

  • Xiangwen Liu
  • Tongwen Wu
  • Song Yang
  • Tim Li
  • Weihua Jie
  • Li Zhang
  • Zaizhi Wang
  • Xiaoyun Liang
  • Qiaoping Li
  • Yanjie Cheng
  • Hongli Ren
  • Yongjie Fang
  • Suping Nie
Open Access


By conducting several sets of hindcast experiments with the Beijing Climate Center Climate System Model, which participates in the Sub-seasonal to Seasonal (S2S) Prediction Project, we systematically evaluate the model’s capability in forecasting the MJO and identify its main deficiencies. In the original S2S hindcast set, MJO forecast skill is about 16 days. This skill shows significant seasonal-to-interannual variations. The model’s MJO forecast skill is more strongly correlated with the Indian Ocean Dipole (IOD) than with the El Niño–Southern Oscillation: the highest skill is achieved in autumn, when the IOD attains its maturity, and extended skill is found when the IOD is in its positive phase. The close association between MJO forecast skill and the IOD arises partly because the relationship between MJO amplitude and IOD intensity strengthens quickly as lead time increases to about 15 days, beyond which it weakens rapidly. This transition may cause the forecast skill to decrease quickly with lead time, and is related to unrealistic amplitude and phase evolutions of the predicted MJO over or near the equatorial Indian Ocean during anomalous IOD phases, suggesting a possible influence of exaggerated IOD variability in the model. These results imply that the upper limit of intraseasonal predictability is modulated by the large-scale external forcing background state in the tropical Indian Ocean. Two additional sets of hindcast experiments with improved atmosphere and ocean initial conditions (referred to as S2S_IEXP1 and S2S_IEXP2, respectively) show that the overall MJO forecast skill increases to 21–22 days. The optimization of the initial sea surface temperature condition largely accounts for this increase, although the improved initial atmosphere conditions also play a role.
For the DYNAMO/CINDY field campaign period, the forecast skill increases to 27 days in S2S_IEXP2. Nevertheless, even with improved initialization, the model still has difficulty predicting MJO propagation across the western hemisphere–western Indian Ocean area and across the eastern Indian Ocean–Maritime Continent area. In particular, MJO prediction is limited by various interrelated deficiencies (e.g., an overestimated IOD, a shorter-than-observed MJO life cycle, and a Maritime Continent prediction barrier), due possibly to model bias in the background moisture field over the eastern Indian Ocean and Maritime Continent. Thus, more efforts are needed to correct the deficiencies in model physics in this region in order to overcome the well-known Maritime Continent predictability barrier.


Keywords: MJO forecast skill · Indian Ocean Dipole · Improved initial condition · Model deficiency

1 Introduction

Short-term climate prediction is important for both research and operations because of its close relationship with the national economy and people’s livelihoods. In the past decades, climate prediction based on dynamical methods has become a common practice in many countries.

Based on the hypothesis that seasonal predictability is mainly attributable to the underlying boundary forcing (Charney and Shukla 1981), seasonal-to-interannual forecasting has achieved continuous success. From the earlier two-tier approach using atmosphere-only models to the current one-tier approach using multi-component coupled models (e.g., Palmer et al. 2004; Weisheimer et al. 2009; Kirtman et al. 2014), the effects of the underlying surface and its interaction with the atmosphere have been described with increasing realism, leading to an obvious improvement in seasonal forecast skill (e.g., Wang et al. 2005; Kug et al. 2008; Zhu and Shukla 2013). With sustained efforts to improve initial conditions, numerical models and forecast methods, state-of-the-art dynamical models are capable of capturing numerous aspects of seasonal climate, especially those closely related to strong ocean–atmosphere events such as the El Niño–Southern Oscillation (ENSO), although many limitations remain (e.g., Wang et al. 2008; Yang et al. 2008; Lee et al. 2010, 2011; Kim et al. 2012; Jiang et al. 2013).

Compared to seasonal-to-interannual forecasting, sub-seasonal forecasting is more challenging because it targets the extended range of time scales between weather phenomena and seasonal means, which is strongly influenced by both atmospheric initial conditions and boundary forcing. At present, sub-seasonal forecasting based on comprehensive ocean–land–atmosphere–ice coupled models is an important activity at many agencies, despite the limitations it encounters, such as significant uncertainty in initial conditions, rapid growth of forecast errors, and limited skill in forecasting regional features (e.g., Pegion and Sardeshmukh 2011; Abhilash et al. 2014; Liu et al. 2013, 2014b, 2015b). To further promote scientific research and operational applications in sub-seasonal forecasting, the World Weather Research Program (WWRP) and the World Climate Research Program (WCRP) launched the Sub-seasonal to Seasonal (S2S) Prediction Project in recent years. By establishing an extensive database of up-to-60-day forecasts, the S2S Prediction Project serves the initiative of bridging the gap between the weather and short-term climate communities and works toward a seamless prediction of weather and climate (Hurrell et al. 2009; Brunet et al. 2010). Presently, about 11 operational or research centers are providing, or about to provide, S2S hindcast and real-time forecast products, which are archived on data servers at the European Centre for Medium-Range Weather Forecasts (ECMWF) and the China Meteorological Administration (CMA) (Vitart et al. 2016).

One of the key issues for the S2S Prediction Project to address is the Madden–Julian Oscillation (MJO; Madden and Julian 1971), a well-known tropical phenomenon with significant intraseasonal variability. The MJO is thought to be a major source of sub-seasonal predictability of tropical and extratropical climate (Waliser et al. 2003; Pegion and Sardeshmukh 2011), and its forecast has been an important task at many research and operational centers. Numerous statistical methods have been used in MJO forecasting, achieving prediction skills of 2 weeks or more (Seo et al. 2009; Kang and Kim 2010; Cavanaugh et al. 2015). In particular, the low-order stochastic model with the “past-noise forecasting” method (Kondrashov et al. 2013) and the spatial–temporal projection statistical model (Hsu et al. 2015; Zhu et al. 2015) can provide useful skill of 25–30 days. Dynamical models are another popular tool for MJO forecasting, given that the MJO’s main characteristics, such as intensity, structure, spectrum, and propagation, have been reasonably captured by state-of-the-art climate models (e.g., Kim et al. 2009; Hung et al. 2013; Jiang et al. 2015). In recent years, useful skill at lead times beyond 2–3 weeks has been found for many dynamical forecast models based on the same verification criterion (Lin et al. 2008; Vitart and Molteni 2010; Rashid et al. 2011; Wang et al. 2014; Kim et al. 2014b; Vitart 2014; Xiang et al. 2015). A multi-model ensemble approach can further improve MJO forecast skill to about 4 weeks (Zhang et al. 2013). Nevertheless, MJO prediction remains challenging because of its highly model-dependent performance (Fu et al. 2013; Neena et al. 2014), its strong sensitivity to initial conditions and air–sea coupling (Vitart et al. 2007; Fu et al. 2011, 2013), and its apparent dependence on the occurrence time, initial phase, and amplitude of the MJO event itself (e.g., Lin et al. 2008; Rashid et al. 2011).
Thus, continually improving MJO prediction is an important mission for most operational and research centers, and steady progress has been made in the past decade (Zhang et al. 2013). In particular, the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) exhibited useful skill of 10–15 days in its version 1 (Seo et al. 2009) and about 20–21 days in its version 2 (Kim et al. 2014b; Wang et al. 2014). The Predictive Ocean–Atmosphere Model for Australia (POAMA) can skillfully predict the MJO out to 21 days (Rashid et al. 2011). The ECMWF forecast system has shown a gradual increase in skill, with an average gain of about 1 day per year since 2002, and can now provide skillful MJO forecasts up to 27 days (Kim et al. 2014b; Vitart 2014). Recently, a Geophysical Fluid Dynamics Laboratory (GFDL) coupled model also achieved a 27-day skill for the MJO in boreal winter (Xiang et al. 2015). In this context, assessing the capability of climate models in forecasting the MJO and searching for improvements remain necessary and meaningful, not only because the MJO is considered a major source of sub-seasonal predictability but also because its forecast skill is a key measure of dynamical climate prediction capability.

The Beijing Climate Center (BCC), one of the participants in the Coupled Model Intercomparison Project phase 5 (CMIP5), has contributed numerous experiments on long-term climate simulations and projections based on two versions of its climate system model, whose outputs have been widely used by the international community in the past 5 years. Also, as a participant in the S2S Prediction Project, the BCC has conducted comprehensive S2S reforecast experiments using a coupled model since 2014. Before its participation in the S2S Project, only atmosphere-only models were used at the BCC for sub-seasonal forecasts, with integrations shorter than 50 days, while coupled models were applied only in seasonal climate prediction, where they showed reasonable skill in predicting the Asian monsoon and ENSO (Liu et al. 2014a, 2015a). Because of insufficient practical experience and the lack of a high-quality assimilation analysis, the initial S2S reforecast experiments showed limited skill in capturing the sub-seasonal variability of climate phenomena, especially the MJO. Thus, an initial objective of this study was to explore the deficiencies of the MJO forecasts and their possible causes, and to further improve the forecast skill based on different sets of hindcast experiments with a BCC model. Through the diagnostic and improvement processes, the following questions are addressed: (1) What caused the obvious weakness in predicting the MJO in the original experiments? What roles might model deficiencies and initial errors play in this low skill? (2) What aspects of MJO prediction are sensitive to the upgrading of the model’s initialization scheme? What are the differences between the impacts of atmospheric and oceanic initial conditions? (3) What is the best performance in MJO forecasting for the model with improved schemes? What deficiencies should be considered first in the further development of the forecast model?

The rest of this paper is organized as follows. Details of the model, S2S prediction schemes and the design of the reforecast experiments are provided in Sect. 2. Validation data and methodology are described in Sect. 3. Sections 4–6 present the MJO characteristics in the free run of the model, the MJO forecast skill in the original reforecast experiments, and the improvement of MJO forecasts, respectively. A summary and discussion are given in Sect. 7.

2 Prediction schemes and experiments

2.1 Model

The model used in this study is the BCC Climate System Model (BCC_CSM). In 2005, the BCC began to design a global climate system model for use in climate change simulation and short-term climate prediction. Through a decade-long research effort, several versions of a global atmospheric general circulation model (AGCM) and a global climate system model have been built (Wu et al. 2014). The BCC_AGCM, based on the Community Atmosphere Model version 3 of the National Center for Atmospheric Research, incorporates many modifications to the dynamical framework and physics parameterizations, such as mass-flux-type cumulus convection, adiabatic adjustment, snow cover, and ocean surface heat flux (Wu et al. 2008, 2010; Wu 2012). The BCC_CSM, with the inclusion of ocean–land–atmosphere–ice coupling and the carbon cycle, has coarse- and moderate-resolution versions that participated in CMIP5 and provided reasonable simulations of climate change (Wu et al. 2013, 2014). The moderate-resolution BCC_CSM has also been used in seasonal prediction and exhibited reliable performance (Liu et al. 2014a, 2015a).

In this study, we adopt the moderate-resolution BCC_CSM version 1.1, one of the models participating in the S2S Prediction Project. The atmospheric component of this model is the BCC_AGCM version 2, which uses T106 triangular truncation in the horizontal and 40 hybrid sigma/pressure layers in the vertical, with the top level at 0.5 hPa. The land component is the BCC Atmosphere and Vegetation Interaction Model version 1.0 with T106 horizontal resolution. The ocean component is the National Oceanic and Atmospheric Administration (NOAA) GFDL Modular Ocean Model version 4 (MOM4; Griffies et al. 2005) with a tripolar horizontal grid, whose resolution is 1° × 1° poleward of 30°N and 30°S, with the meridional resolution gradually increasing to 1/3° toward the equator. The sea ice component is GFDL’s Sea Ice Simulator (Winton 2000), whose horizontal resolution is the same as that of the ocean component. The components are coupled directly, without any flux adjustment.

2.2 Initialization scheme and observational data

To conduct S2S prediction, initializations for various climate components of the model are devised as follows.
  1. Initialization of the atmospheric component. Atmospheric model initial fields are taken from the NCEP Reanalysis 1 (NCEP-R1; Kalnay et al. 1996), which has 2.5° × 2.5° horizontal resolution, 17 isobaric levels from 1000 to 10 hPa, and a four-times-daily frequency. Three multi-level variables (air temperature, zonal wind and meridional wind) and one surface variable (surface pressure) are used in this study. To reduce the initial shock and increase the compatibility between model and observational states, we adopt a fast nudging strategy, in which the model integration step and the relaxation time scale are both 450 s. A similar scheme was used by Jie et al. (2014) to initialize the BCC_AGCM, yielding reliable initial conditions for short-term climate prediction.

  2. Initialization of the ocean component. Ocean model initial fields are provided by the BCC Global Ocean Data Assimilation System (BCC_GODAS) version 2, which is built on the MOM4 with a three-dimensional variational method (Zhou et al. 2016). With a 10-day assimilation window, the BCC_GODAS assimilates multi-source observational data, including satellite altimeter-derived sea level anomalies from TOPEX/Poseidon, Jason-1/2, ERS-1/2, and Envisat, sea surface temperature (SST) from the Advanced Very High Resolution Radiometer (AVHRR) merged product, and temperature and salinity profiles from the Global Temperature and Salinity Profile Project (GTSPP) and the Array for Real-time Geostrophic Oceanography (ARGO). The assimilation system is now operational and produces real-time ocean analyses for short-term climate prediction.

  3. Initialization of near-surface atmospheric forcing fields. To reduce climate drift caused by the interaction of errors among the components of the coupled model, we introduce an initialization of the near-surface atmospheric state, which provides a reasonable representation of the atmospheric forcing of the land, ocean and ice surfaces. The initialized variables include 2-m air temperature, 10-m meridional wind, 10-m zonal wind, surface pressure, sea level pressure, and downward surface longwave and shortwave radiation from the NCEP-R1, as well as precipitation rate from the BCC merged precipitation dataset with 0.5° × 0.5° horizontal resolution and 3-hourly temporal frequency (Nie et al. 2015). A nudging scheme similar to that used for the atmospheric component is adopted.


As for the land and sea ice components, there are currently no direct initializations of their forecast variables; this gap will be gradually addressed by the in-progress construction of a global land data assimilation system and a sea ice concentration initialization module for the BCC model.

In the practical execution of the above schemes, a sufficiently long nudging process is necessary to bring the model variables as close as possible to the observed fields and to ensure compatibility among the various variables and components. Thus, a long-term coupled initialization experiment, integrated from 1 January 1990 and rolled forward with continual updating by reanalysis and observational data, was conducted beforehand; its intermediate outputs are used as the restart initial fields of the reforecast experiments.
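The fast nudging used in the initialization schemes above relaxes a model field toward the analysis at every time step; when the relaxation time scale equals the integration step (both 450 s here), the model state is effectively replaced by the analysis in a single step. The following minimal Python sketch illustrates the idea only; the variable names are ours and this is not the operational BCC code:

```python
import numpy as np

def nudge(model_state, analysis, dt=450.0, tau=450.0):
    """One explicit nudging step: x <- x + (dt / tau) * (analysis - x).

    With tau == dt the model state is fully replaced by the analysis
    in a single step (the 'fast nudging' limit used in the text).
    """
    return model_state + (dt / tau) * (analysis - model_state)

# Illustrative synthetic fields (e.g., air temperature in K)
model = np.array([300.0, 290.0, 280.0])
obs = np.array([301.0, 289.5, 280.5])

nudged = nudge(model, obs)              # tau == dt: full replacement
gentle = nudge(model, obs, tau=4500.0)  # weaker relaxation, 10% per step
```

With a longer relaxation time scale, the same update moves the model only a fraction of the way toward the analysis each step, which is the conventional, gentler form of nudging.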

2.3 Experimental design

For a better representation of uncertainty in the initial fields, an ensemble scheme is adopted. Each forecast consists of a 60-day integration for each of four ensemble members, produced by the lagged average forecasting (LAF) strategy with a 6-h interval between atmospheric initial conditions. For instance, for the forecast on 1 June 2014, the four members use initial conditions at 0000 UTC on 1 June, and at 1800, 1200 and 0600 UTC on 31 May 2014, respectively. After these runs finish, the original model outputs are converted into standard files with 1.5° × 1.5° horizontal resolution, principal isobaric levels, and the GRIB2 data format required by the S2S Prediction Project.
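The lagged initial times for the four LAF members can be generated programmatically. The sketch below (illustrative only; function and parameter names are our own) reproduces the example in the text:

```python
from datetime import datetime, timedelta

def laf_init_times(forecast_date, n_members=4, lag_hours=6):
    """Return lagged-average-forecast (LAF) initial times.

    Member 0 starts at 0000 UTC on the forecast day; each subsequent
    member starts lag_hours earlier than the previous one.
    """
    base = datetime(forecast_date.year, forecast_date.month, forecast_date.day)
    return [base - timedelta(hours=lag_hours * m) for m in range(n_members)]

times = laf_init_times(datetime(2014, 6, 1))
# -> 0000 UTC 1 June, then 1800, 1200 and 0600 UTC 31 May 2014
```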

With similar basic set-ups but different details, the following three sets of experiments are conducted in this study:
  1. S2S hindcast experiments (S2S_HST), conducted for the S2S Prediction Project. With the above-mentioned initialization and ensemble forecast schemes, S2S_HST start on every day from 1 January 1994 to 31 December 2013 and integrate for 60 days. Each hindcast consists of four LAF ensemble members, initialized at 0000 UTC on the hindcast day and at 1800, 1200 and 0600 UTC on the previous day, respectively. S2S_HST outputs have been submitted to the S2S Prediction Project, and a real-time daily rolling forecast with the same configuration has become quasi-operational.

  2. Improved experiments I (S2S_IEXP1). S2S_IEXP1 use schemes similar to those of S2S_HST, but the NCEP-R1 multi-level atmospheric fields in the initial conditions are replaced by the NCEP FNL (Final) Operational Model Global Tropospheric Analyses. Given the computational burden of high-frequency experiment sampling and the relatively short record of the NCEP FNL, S2S_IEXP1 are conducted on the 1st, 6th, 11th, 16th, 21st, and 26th of each month during 2000–2013, i.e., at a quasi-5-day interval. Each hindcast also comprises four members and a 60-day integration. Compared to S2S_HST, the main aim of S2S_IEXP1 is to improve the atmospheric initial conditions by introducing a more reliable atmospheric reanalysis.

  3. Improved experiments II (S2S_IEXP2). On the basis of S2S_IEXP1, S2S_IEXP2 further introduce the NOAA 1/4° daily Optimum Interpolation Sea Surface Temperature version 2 (OISST; Reynolds et al. 2007; Reynolds 2009) to replace the SST initial condition. From the sea surface to a subsurface depth of 30 m, the OISST initial fields transition linearly to the BCC_GODAS initial fields, so as to maintain the vertical continuity of ocean temperature and to extend the downward influence of the surface initial information. At greater depths, the ocean initial fields are still from the BCC_GODAS. Before the initialization, the high-resolution OISST data were interpolated onto the ocean model grid with an area-weighted interpolation method. S2S_IEXP2 adopt a reforecast strategy similar to that of S2S_IEXP1; their main aim is to improve the ocean initial conditions through high spatial- and temporal-resolution surface observational data.


In addition, given that a prerequisite for reliable dynamical forecasting is the model’s reasonable capability in simulating general climate phenomena, a 30-year free run was conducted as the control experiment to reveal the representation of the MJO in the model. In both the simulation and the reforecast experiments, the greenhouse-gas external forcing is the same as that used in the CMIP5 historical simulation.
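The surface-to-30-m linear transition from OISST to BCC_GODAS described in item 3 above can be illustrated with a simple depth-weighted blend. This is a sketch under our own assumption of a linear weight in depth; the actual BCC implementation may differ in detail:

```python
import numpy as np

def blend_sst_profile(oisst_sfc, godas_profile, depths, blend_depth=30.0):
    """Blend an OISST surface value into a GODAS temperature profile.

    The OISST weight decreases linearly from 1 at the surface to 0 at
    blend_depth; below that depth the GODAS profile is used unchanged.
    """
    w = np.clip(1.0 - np.asarray(depths) / blend_depth, 0.0, 1.0)
    return w * oisst_sfc + (1.0 - w) * np.asarray(godas_profile)

depths = np.array([0.0, 10.0, 20.0, 30.0, 50.0])   # metres
godas = np.array([28.0, 27.8, 27.5, 27.0, 25.0])   # degC, assimilation analysis
oisst = 28.6                                        # degC, observed SST

blended = blend_sst_profile(oisst, godas, depths)
# surface value equals OISST; values at 30 m and below equal GODAS
```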

3 Validation data and methods

The observational data used to verify the MJO simulation and forecast include the NOAA daily outgoing longwave radiation (OLR; Liebmann and Smith 1996), daily wind field from NCEP/Department of Energy (DOE) Reanalysis 2 (Kanamitsu et al. 2002), daily and monthly OISST (Reynolds et al. 2002, 2007; Reynolds 2009), and monthly precipitation from the Global Precipitation Climatology Project (Adler et al. 2003).

Using observed daily OLR, 850-hPa zonal wind (U850) and 200-hPa zonal wind (U200) from 1994 to 2013, several steps are conducted to extract the MJO signals following the technique of Wheeler and Hendon (2004): (1) the daily climatology of OLR, U850 or U200 is defined as the annual mean plus the first three annual harmonics of the 20-year average; (2) for a given date, the raw daily anomaly is calculated as the departure of the total field from the daily climatology, and the intraseasonal anomaly is computed as the raw anomaly minus the interannual anomaly, the latter defined as the average of the raw anomaly over the previous 120 days; (3) multivariate Empirical Orthogonal Functions (EOFs) are calculated for the intraseasonal anomalies of combined OLR, U850 and U200 averaged over 15°S–15°N, with each field normalized by its respective standard deviation. The principal components (PCs) are then obtained by projecting the respective intraseasonal anomalies onto the multivariate EOFs and normalizing by the respective standard deviations; (4) the MJO structure, amplitude and phase angle are defined by the two leading EOF spatial modes, (PC1² + PC2²)^(1/2), and tan⁻¹(PC2/PC1), respectively. These procedures are almost the same as in Wheeler and Hendon (2004), except that we removed interannual variability only by subtracting the previous 120-day average, neglecting the more complicated step of removing the ENSO-related part. This practice was also employed by Lin et al. (2008), Gottschalck et al. (2010) and Rashid et al. (2011), and has been shown to be sufficient for the removal of interannual variability. The resulting PC1 and PC2 closely match the real-time multivariate MJO indices 1 and 2 defined by Wheeler and Hendon (2004), with correlation coefficients of about 0.94 and 0.97, respectively.
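Steps (2)–(4) above can be sketched in Python. The example below uses a synthetic single-variable field in place of the multivariate, standard-deviation-normalized combined fields, so it is a simplified illustration of the procedure rather than a reproduction of it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily anomaly series at 40 longitudes over 1000 days,
# standing in for the 15S-15N-averaged combined fields.
ndays, nlon = 1000, 40
raw = rng.standard_normal((ndays, nlon)).cumsum(axis=0) * 0.05

# Step (2): intraseasonal anomaly = raw anomaly minus the mean of the
# previous 120 days (removes interannual variability).
def remove_interannual(raw_anom, window=120):
    out = np.full_like(raw_anom, np.nan)
    for t in range(window, raw_anom.shape[0]):
        out[t] = raw_anom[t] - raw_anom[t - window:t].mean(axis=0)
    return out

intra = remove_interannual(raw)[120:]   # drop the 120-day spin-up

# Step (3): EOFs via SVD of the anomaly matrix; PCs normalized by
# their standard deviation.
u, s, vt = np.linalg.svd(intra, full_matrices=False)
eofs = vt[:2]                           # two leading spatial modes
pcs = intra @ eofs.T
pcs /= pcs.std(axis=0)

# Step (4): MJO amplitude and phase angle from the two PCs.
amplitude = np.sqrt(pcs[:, 0]**2 + pcs[:, 1]**2)
phase_angle = np.arctan2(pcs[:, 1], pcs[:, 0])
```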

For S2S_HST, the experiments are conducted every day over the 20 years from 1994 to 2013, producing a continuous set of forecasts for all dates in that period at all lead times from 1 to 60 days. Thus, at any fixed lead time, approximately the same procedures as those applied to the observations are used to extract the MJO in S2S_HST, except that the forecast PCs are computed by projecting the predicted intraseasonal anomalies onto the observed EOFs. In this way, the forecast climatology and variability are extracted in the same manner as the observations, and the forecast can be normalized with respect to the model state itself, removing model systematic biases or keeping them to a minimum. The same processes for observations and fixed-lead-time S2S_HST are adopted in Sect. 5; they are generally similar to those in Wang et al. (2014).

For S2S_IEXP1 and S2S_IEXP2, a different pre-processing method is needed because their data lengths and experimental sampling differ from those of S2S_HST. In particular, to compare S2S_HST, S2S_IEXP1 and S2S_IEXP2 in Sect. 6, the following pretreatments are applied to all the experiments. (1) The observed EOFs are used, but their PCs are recomputed by projecting intraseasonal anomalies defined with respect to the 2000–2013 climatology. This produces slightly different observed PCs based on a 14-year climatology, which is more comparable to the setting of the hindcast experiments. (2) The forecast climatology of OLR, U850 or U200 is defined as the 14-year average of the 60-day forecasts started on a given calendar date. The raw daily anomaly is calculated as the total forecast field minus this climatology, and the interannual variability is then subtracted from the daily anomaly to obtain the forecast intraseasonal anomaly. Here, the interannual anomaly is defined as the average of the raw anomaly over the previous 120 days, in which the raw anomalies before the forecast start date are supplied by the corresponding observed anomalies. (3) The forecast PCs are computed by projecting the predicted intraseasonal anomalies of the combined fields onto the observed EOFs, with the anomalies and PCs normalized by the respective observed standard deviations. (4) The MJO structure, amplitude and phase angle are calculated in the same way as described in the second paragraph of Sect. 3.
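The subtlety in step (2), where the 120-day running mean at each forecast lead draws on observed anomalies before the start date and forecast anomalies after it, can be sketched as follows (a minimal one-dimensional illustration with our own names):

```python
import numpy as np

def forecast_intraseasonal(fcst_anom, obs_anom_before, window=120):
    """Remove interannual variability from a raw forecast anomaly series.

    fcst_anom: raw forecast anomalies, shape (lead_days,)
    obs_anom_before: observed raw anomalies for the `window` days
        preceding the forecast start, shape (window,)
    The running `window`-day mean at each lead uses observed anomalies
    before day 0 and forecast anomalies afterwards.
    """
    full = np.concatenate([obs_anom_before, fcst_anom])
    out = np.empty_like(fcst_anom)
    for lead in range(fcst_anom.shape[0]):
        t = window + lead
        out[lead] = full[t] - full[t - window:t].mean()
    return out

# Toy case: zero observed anomalies, constant forecast anomaly of 1.
obs = np.zeros(120)
fcst = np.ones(60)
intra = forecast_intraseasonal(fcst, obs)
```

In this toy case, the removed interannual mean grows as forecast days enter the 120-day window, so the intraseasonal anomaly shrinks from 1 at day 1 toward 1 − 59/120 at day 60.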

In addition, for a fair comparison between observations and the model free-run outputs, the data-processing steps in Sect. 4 are generally similar to those described above for extracting the observed MJO, except in two respects: (1) for better filtering of MJO signals from the daily time series, the intraseasonal anomalies are derived by applying a 20–100-day Lanczos filter to the raw anomalies; (2) the two leading spatial modes and PCs are computed for simulations and observations separately, based on EOF expansions of the respective combined variables.

Three main indices are used in this study to measure MJO forecast skill: the bivariate anomaly correlation (AC) and bivariate root-mean-square error (RMSE) defined by Lin et al. (2008), and the phase angle error used by Rashid et al. (2011).
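Following Lin et al. (2008), the bivariate AC and RMSE at a given lead time combine the observed and forecast pairs of RMM-style PCs over all verification times. A minimal sketch (variable names are our own):

```python
import numpy as np

def bivariate_ac(obs_pc1, obs_pc2, fc_pc1, fc_pc2):
    """Bivariate anomaly correlation (Lin et al. 2008)."""
    num = np.sum(obs_pc1 * fc_pc1 + obs_pc2 * fc_pc2)
    den = (np.sqrt(np.sum(obs_pc1**2 + obs_pc2**2)) *
           np.sqrt(np.sum(fc_pc1**2 + fc_pc2**2)))
    return num / den

def bivariate_rmse(obs_pc1, obs_pc2, fc_pc1, fc_pc2):
    """Bivariate RMSE; approaches sqrt(2) ~ 1.414 when forecast and
    observed unit-variance PCs are uncorrelated."""
    n = obs_pc1.size
    return np.sqrt(np.sum((obs_pc1 - fc_pc1)**2 +
                          (obs_pc2 - fc_pc2)**2) / n)

# Sanity check with synthetic PCs: a perfect forecast gives AC = 1, RMSE = 0.
rng = np.random.default_rng(1)
p1, p2 = rng.standard_normal(500), rng.standard_normal(500)
ac_perfect = bivariate_ac(p1, p2, p1, p2)
rmse_perfect = bivariate_rmse(p1, p2, p1, p2)
```

The RMSE threshold of 1.414 used in Sect. 5 corresponds to √2, the value reached when the forecast carries no useful MJO phase information.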

4 MJO characteristics in free run of model

The model’s capability in simulating the key characteristics of MJO to some extent determines the performance of its MJO forecast. Thus, in this section we explore the basic features of MJO in the uninitialized free run of the model.

Figure 1 shows the observed and simulated climatological fields of OLR, U850, SST, and precipitation in boreal winter. The observations exhibit robust precipitation from the tropical Indian Ocean to the western Pacific, coupled with low OLR and high SST over that region. The mean state of U850 is characterized by an eastward extension of westerly wind from the western Indian Ocean to just west of the dateline, corresponding to the active convection area (Fig. 1a, b). The model captures the general distribution of convection, but the center of convection is shifted eastward over the western Pacific. The tropical westerly wind belt in the simulation splits into a rather small center over the western Indian Ocean and a wide area from the eastern Indian Ocean to the dateline. Correspondingly, an apparent dry bias over the eastern Indian Ocean and wet biases over the southwestern Pacific and the western equatorial Indian Ocean can be found, denoting a weaker-than-observed Indian Ocean intertropical convergence zone (ITCZ) and a stronger-than-observed South Pacific convergence zone (SPCZ). Also, a clear cold bias of SST appears over the eastern Indian Ocean, producing a positive west-to-east SST gradient bias over the equatorial Indian Ocean and thus indicating a positive-phase bias of the Indian Ocean Dipole (IOD) in the model climatology (Fig. 1c, d). Meanwhile, colder-than-observed SST is found over the central-eastern equatorial Pacific, implying a negative-phase bias of ENSO. Further, the standard deviation of the seasonal-to-interannual variability of SST is shown in Fig. 2: the observed distribution is generally reproduced by the model, but with an overall smaller magnitude over most of the tropics except the eastern Indian Ocean.
In terms of the variability of the IOD and ENSO indices, defined respectively as the difference in SST averages between (10°N–10°S, 50°–70°E) and (0°–10°S, 90°–110°E) and as the SST average over the Niño 3.4 region (5°S–5°N, 170°–120°W), the model overestimates the IOD standard deviation (σ = 0.51 vs. 0.41 in observations) and underestimates the ENSO standard deviation (σ = 0.49 vs. 0.88 in observations).
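The two index definitions above are simple area averages over latitude–longitude boxes. The sketch below uses a synthetic uniform SST field on a regular grid; the cosine-latitude area weighting is our own assumption, and the function names are illustrative:

```python
import numpy as np

def region_mean(sst, lats, lons, lat_bounds, lon_bounds):
    """Cosine-latitude-weighted SST average over a lat-lon box."""
    la = (lats >= lat_bounds[0]) & (lats <= lat_bounds[1])
    lo = (lons >= lon_bounds[0]) & (lons <= lon_bounds[1])
    box = sst[np.ix_(la, lo)]
    w = np.cos(np.deg2rad(lats[la]))[:, None] * np.ones(lo.sum())
    return float((box * w).sum() / w.sum())

lats = np.arange(-30.0, 30.1, 1.0)
lons = np.arange(0.0, 360.0, 1.0)
sst = 27.0 + np.zeros((lats.size, lons.size))   # synthetic uniform field

# IOD index: west box (10N-10S, 50-70E) minus east box (0-10S, 90-110E)
iod = (region_mean(sst, lats, lons, (-10, 10), (50, 70)) -
       region_mean(sst, lats, lons, (-10, 0), (90, 110)))

# Nino 3.4 index: SST average over (5S-5N, 170-120W), i.e. 190-240 degE
nino34 = region_mean(sst, lats, lons, (-5, 5), (190, 240))
```

In practice these indices would be computed from anomaly fields and their standard deviations compared between model and observations, as in the paragraph above.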
Fig. 1

November–April mean fields of outgoing longwave radiation (shadings in a and c; units W/m2), 850-hPa zonal wind (contours in a and c; units m/s), sea surface temperature (shadings in b and d; units °C), and precipitation (contours in b and d; units mm/day). a, b Observations; c, d model simulations

Fig. 2

Standard deviation of seasonal-to-interannual variation of sea surface temperature (units °C) in a observation and b simulation

The distribution of variance of the 20–100-day filtered intraseasonal variability of 850-hPa zonal wind was also examined (figure not shown). In the observation, strong intraseasonal variability prevails over the equatorial Indian Ocean and western-central Pacific in boreal winter, with a maximum over the Maritime Continent. The simulation agrees well with the observation in terms of the distribution of the major centers, but the magnitude is obviously underestimated by the model. The ratio of intraseasonal variance to total variance over the main activity centers of intraseasonal variability in the simulation is broadly similar to that in the observation, except over the equatorial Indian Ocean.

Given that eastward propagation is a fundamental characteristic of the MJO, Fig. 3 shows lag–longitude diagrams of intraseasonal OLR and U850 correlated against OLR over an Indian Ocean reference region (10°S–5°N, 75°–100°E) for boreal winter. An obvious eastward propagation from Africa and the Indian Ocean to the Pacific and the western hemisphere is seen in the observation. In the model, however, the eastward-propagating signal is weaker and faster than observed, and the intraseasonal OLR anomaly decays very quickly on approaching the dateline. Meanwhile, the propagation from the western hemisphere to the Indian Ocean almost disappears in the simulation. These results imply that the model may have a shorter-than-observed MJO life cycle.
Fig. 3

November–April lag–longitude diagram of 10°N–10°S-averaged intraseasonal outgoing longwave radiation anomalies (shadings) and intraseasonal 850-hPa zonal wind anomalies (contours) correlated against intraseasonal outgoing longwave radiation at the Indian Ocean reference region (10°S–5°N, 75°–100°E). a Observation; b model simulation

For the MJO spatial structure over the tropics (15°S–15°N), the two leading EOFs of the 20–100-day filtered anomalies of combined OLR, U850 and U200 are shown in Fig. 4. In the observation, the first EOF mode shows enhanced convection over the Maritime Continent and suppressed convection from the eastern Pacific to Africa, while the second mode is characterized by enhanced and suppressed convection over the western Pacific and Indian Ocean, respectively. Accordingly, the baroclinic structures of upper- and lower-level zonal winds match the distribution of convection. With a total explained variance of about 42%, these two modes represent an eastward-propagating MJO signal, because the maximum positive (negative) correlation between the corresponding PCs is found when PC1 leads (lags) PC2 by 10 days (figure not shown). The simulation, however, shows a reversed mode order and an apparently smaller-than-observed total variance for the first two modes. The pattern correlation between the simulated EOF1 and the observed EOF2 is 0.83, and that between the simulated EOF2 and the observed EOF1 is 0.94, indicating that the MJO structure is, overall, reasonably captured by the model despite some deficiencies.
Fig. 4

Two leading EOFs of intraseasonal anomalies of combined outgoing longwave radiation, 850-hPa zonal wind and 200-hPa zonal wind. a, b Observations; c, d model simulation results. The variance explained by each mode is given at the top right of each panel

Figure 5 further presents the composite MJO life cycle using OLR and U850 anomalies. The observed and simulated MJO phases, defined as tan−1(PC2/PC1), are computed separately, and the simulated PC1 and PC2 have been swapped to keep the phase sequence consistent with observations. For each phase, only the days whose MJO phase angles fall within that phase and whose MJO amplitudes exceed one are used for averaging. In the observation, the MJO convection, initiated over Africa and the western Indian Ocean, propagates eastward from the Indian Ocean across the Maritime Continent to the western Pacific, and finally dissipates in the western hemisphere. The simulation also captures an eastward propagation of convection from the Indian Ocean to the Pacific as the MJO phase evolves. However, compared to the observation, the simulated MJO convection develops and decays quickly near the Maritime Continent and does not show a well-organized structure or a step-by-step migration from the Indian Ocean to the western Pacific, which is generally consistent with the results shown in Fig. 3.
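The amplitude and phase computation described above can be sketched as follows. The mapping of angle octants to phases 1–8 follows the standard Wheeler–Hendon convention (phase 1 starting at 180° and phases advancing counterclockwise) and is an assumption about the exact labeling used here:

```python
import numpy as np

def mjo_phase_amplitude(pc1, pc2):
    """MJO amplitude and phase (1-8) from the two leading PCs.
    The phase angle is atan2(pc2, pc1); the octant labeling assumes
    the Wheeler-Hendon convention."""
    amp = np.hypot(pc1, pc2)                      # sqrt(pc1^2 + pc2^2)
    angle = np.arctan2(pc2, pc1)                  # quadrant-aware angle
    # Shift so phase 1 starts at 180 deg; each phase spans 45 deg
    phase = (np.floor((angle + np.pi) / (np.pi / 4)).astype(int) % 8) + 1
    return amp, phase
```

Compositing then keeps only the days with `amp > 1` whose `phase` equals the target phase, exactly as described for Fig. 5.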
Fig. 5

November–April composite intraseasonal anomalies of outgoing longwave radiation (shadings; units W/m2) and 850-hPa zonal wind (contours; units m/s) as a function of MJO phase in a observation and b simulation. The composite is made using the days when the MJO amplitude is larger than 1. The number of days used to generate the composite for each phase is shown to the right of each panel

5 MJO forecast skill in S2S_HST

Using the comprehensive S2S_HST outputs from 1994 to 2013, the overall MJO forecast skill is measured by the bivariate AC and RMSE following the definitions of Lin et al. (2008). As shown in Fig. 6, taking AC = 0.5 as the threshold of useful skill, the MJO can be predicted out to a 16-day lead time. The variation of RMSE also indicates that the MJO forecast becomes unskillful beyond a lead time of 16 days, when the RMSE exceeds 1.414 (i.e., √2, the saturation error of a climatological forecast of the normalized indices). The agreement between AC and RMSE in the maximum lead time of useful skill is indicated in Fig. 6, similar to the results of Rashid et al. (2011) and Wang et al. (2014).
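A minimal sketch of these two metrics, written from the Lin et al. (2008) definitions (the array names are illustrative; `a1, a2` are the observed RMM-like indices and `b1, b2` the forecasted ones, pooled over all forecasts at one lead time):

```python
import numpy as np

def bivariate_skill(a1, a2, b1, b2):
    """Bivariate anomaly correlation (AC) and RMSE between observed
    (a1, a2) and forecasted (b1, b2) MJO indices at a fixed lead time,
    following Lin et al. (2008)."""
    ac = (np.sum(a1 * b1 + a2 * b2)
          / (np.sqrt(np.sum(a1**2 + a2**2))
             * np.sqrt(np.sum(b1**2 + b2**2))))
    rmse = np.sqrt(np.mean((a1 - b1)**2 + (a2 - b2)**2))
    return ac, rmse
```

For a forecast that is systematically rotated 90° in phase from the observation, the AC is 0 and the RMSE reaches √2 ≈ 1.414, which is why that value marks the unskillful threshold.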
Fig. 6

a Bivariate anomaly correlation and b bivariate root mean square error between observations and forecasts as a function of lead time. The dashed lines in a and b have the values of 0.5 and 1.414, respectively

Given that the MJO forecast skill is seasonally dependent, the AC is computed for a 91-day moving window centered on each calendar date, and its temporal evolution is given in Fig. 7a. Taking AC = 0.5 as the threshold of useful skill, relatively low skill during July–September and high skill during October–March are found, with a minimum of 13 days around August and a maximum of 19 days around November. In general, owing mainly to the seasonal difference in MJO amplitude, the MJO is most predictable in the boreal winter and least predictable in the boreal summer, as noted in previous studies (Rashid et al. 2011; Wang et al. 2014). Somewhat unusually, the forecast skill of the BCC_CSM attains a maximum at the beginning of November (i.e., the seasonal window from mid-September to mid-December), showing that the MJO is more predictable in autumn than in winter in the model.
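The moving-window calculation behind Fig. 7a can be sketched as follows, assuming a simplified 365-day calendar and a pooled sample of forecasts at one lead time (function and variable names are illustrative):

```python
import numpy as np

def seasonal_ac(doy, a1, a2, b1, b2, window=91):
    """Bivariate AC within a `window`-day moving window centred on
    each day of year (1..365); `doy` gives each forecast's start day.
    Illustrative sketch of the calculation behind Fig. 7a."""
    half = window // 2
    out = np.full(365, np.nan)
    for d in range(1, 366):
        dist = np.abs(doy - d)
        dist = np.minimum(dist, 365 - dist)       # wrap around the year
        m = dist <= half
        if m.sum() < 2:
            continue
        num = np.sum(a1[m] * b1[m] + a2[m] * b2[m])
        den = (np.sqrt(np.sum(a1[m]**2 + a2[m]**2))
               * np.sqrt(np.sum(b1[m]**2 + b2[m]**2)))
        out[d - 1] = num / den
    return out
```

The interannual diagnostic of Fig. 7b is the same calculation with the window centered on each actual date from 1994 to 2013 rather than on a climatological day of year.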
Fig. 7

Bivariate correlation between observations and forecasts as a function of lead time and calendar date. Skills are computed using forecast experiments within a seasonal window centered on each calendar day from 1 January to 31 December in a and on each day from 1 January 1994 to 31 December 2013 in b. The running window is 91 days for the daily-rolling hindcasts in S2S_HST (shading; correlation at multiple levels) and 3 months for the 5-day-interval hindcasts in S2S_IEXP2 (contours; correlation coefficient of 0.5)

The seasonal-to-interannual variation of forecast skill is shown in Fig. 7b, using the AC calculated for a 91-day moving window centered on each day from 1 January 1994 to 31 December 2013. Remarkable interannual differences in forecast skill are found. In particular, in the boreal springs and autumns of 1997 and 1998, some forecasts remain skillful at lead times beyond 40 days, whereas in several summers, such as those of 1995, 2005 and 2007, the predictions are often unskillful beyond a 10-day lead time. The mean skill during lead times of 1–15 days is significantly correlated with the 91-day running average of observed MJO amplitude (R = 0.66), confirming again the close relationship between forecast skill and MJO strength.

Since the MJO prevails in the tropics and is naturally modulated by large-scale tropical atmosphere–ocean events, especially ENSO and the IOD, the connections of MJO forecast skill with the seasonal-to-interannual variations of the Niño 3.4 and IOD indices are also explored. The ACs for the MJO forecasts within moving windows centered on the middle of each month are compared with the 3-month running averages of the Niño 3.4 and IOD indices from January 1994 to December 2013 (a total of 240 months). The mean skill of MJO prediction during lead days 1–15 is insignificantly correlated with the Niño 3.4 index (R = 0.10; below the 90 % confidence level), but is somewhat connected with the IOD index (R = 0.26; above the 99 % confidence level). This suggests that the interannual variation of MJO forecast skill is influenced more by the IOD than by ENSO in the model.

To further investigate the influences of ENSO and the IOD on the seasonal dependence of MJO prediction in the model, MJO forecast skill and amplitude variations during anomalous ENSO and IOD phases are examined in Fig. 8. Based on observed 3-month running-mean anomalies of the Niño 3.4 and IOD indices during 1994–2013, warm (cold) ENSO episodes are defined when the Niño 3.4 index exceeds 0.5 °C (falls below −0.5 °C) for at least five consecutive overlapping seasons, and positive (negative) IOD events are identified when the IOD index is above 0.4 °C (below −0.4 °C), i.e., one standard deviation. The difference in MJO forecast skill between positive and negative IOD phases is initially tiny, gradually becomes pronounced as the lead time increases to 15 days, and then decreases again toward an extremely small value at longer lead times (Fig. 8a). Also, overall higher skill within the 15-day lead time is found during El Niño than during La Niña, although this difference is smaller than that between the IOD phases (Fig. 8a). These results are associated with differences in MJO amplitude. In the observation, the seasonally averaged MJO amplitude is nearly uncorrelated with the IOD index (R = 0.001) but weakly positively correlated with the Niño 3.4 index (R = 0.11); the model free run shows almost the opposite, with a more significant correlation of MJO amplitude with the IOD index (R = 0.19) than with the Niño 3.4 index (R = 0.01) (Fig. 8b). In the forecasts, the relationship between MJO amplitude and the Niño 3.4 index maintains a close-to-observed significance level before the 15-day lead time and then drops quickly at longer lead times. In contrast, as lead time increases, the connection between MJO amplitude and the IOD index quickly deviates from its realistic initial state, increasing gradually to a maximum at day 15 and decreasing thereafter (Fig. 8b).
The fast strengthening of the relationship between MJO amplitude and the IOD at short lead times should be partially responsible for the apparent diversity of MJO forecast skill between IOD phases, as well as for the seasonal dependence of MJO predictability, given that both the model's MJO forecast skill and the IOD attain maximum values in the boreal autumn. Moreover, the transition of the relationship beyond the 15-day lead time is bound to exert an important influence on the skill differences between ENSO and IOD phases, and further contributes to the significant decline of the overall MJO forecast skill.
Fig. 8

a Bivariate anomaly correlation for MJO cases during positive (red solid) and negative (red dashed) ENSO phases, and positive (blue solid) and negative (blue dashed) IOD phases. b Variation with lead time of the correlation between predicted seasonally averaged MJO amplitude and the observed 3-month running-mean Niño 3.4/IOD index. Solid and hollow circles (referring only to the y-coordinate) show the correlation coefficients between MJO amplitude and the Niño 3.4/IOD (red/blue) index in the observation and the free model run, respectively. c Composite of MJO amplitude as a function of lead time for MJO cases (initial amplitude > 1.0) during positive (red) and negative (blue) ENSO phases in observations (solid) and predictions (dashed). d Same as (c), except for the composite MJO amplitudes during different IOD phases. The black lines in c and d show the composite of all MJO cases (initial amplitude > 1.0) during 1994–2013

To explore the cause of the above-mentioned transition at about the 15-day lead time, the composite evolutions of MJO amplitude during various ENSO and IOD phases are given in Fig. 8c, d. Because consecutive forecast sampling removes most of the systematic bias and the 20-year period provides sufficient composite cases, the composite of all MJO cases in the forecast shows a fairly realistic weakening process compared to the observation, even though at longer lead times the forecasted MJO amplitude is slightly stronger than the observed one (Fig. 8c, d). For the MJO variations during various ENSO and IOD phases, however, the differences between forecasts and observations become remarkable. The observed MJO tends to be stronger in the warm than in the cold episode of ENSO, as well as in the positive than in the negative phase of the IOD. The forecasts basically reproduce this behavior at short lead times but show the opposite beyond the 20-day lead time, owing to an abnormally strong increase of MJO amplitude at longer lead times in the negative ENSO or IOD phase. In addition, except during El Niño, when the forecasted and observed MJO evolutions are relatively consistent with each other, the forecasted MJO amplitude is often distinctly larger than the observed one at longer lead times during anomalous ENSO and IOD periods (Fig. 8c, d). Given that some ENSO and IOD events coincide, the MJO evolutions during non-coincident ENSO and IOD phases are also examined separately, and similarly marked increases of MJO amplitude at longer lead times are found (figure not shown), suggesting that the forecasted transition of the MJO from weakening to strengthening during negative ENSO or IOD phases does not depend on event coincidence.

Further, composite phase diagrams for MJO cases during various ENSO and IOD phases are presented in Fig. 9. MJO propagation depends on the large-scale ocean–atmosphere background state. In the observation, the MJO is stronger than normal during El Niño and weaker than normal during La Niña when propagating from the eastern Indian Ocean and the Maritime Continent into the western Pacific (Fig. 9c, d). Also, the MJO propagates more quickly and farther under La Niña than under El Niño conditions when developing from its initial phase over the western hemisphere (Fig. 9a, d). Besides, the MJO tends to be stronger in the positive than in the negative IOD phase when it propagates from the western Pacific across the western hemisphere to Africa, whereas it can propagate notably farther in the negative than in the positive IOD phase when starting from its initial phase over the western hemisphere and Africa (Fig. 9e–h). The forecasts basically reproduce these features, although the errors in MJO phase and amplitude become gradually pronounced as the lead time increases.
Fig. 9

Composite phase diagrams for initially strong MJO cases (initial amplitude > 1.0) during ad ENSO and eh IOD phases. Red, blue and black lines show composites of MJO cases during positive ENSO/IOD phases, negative ENSO/IOD phases and all years from 1994 to 2013, respectively. Solid lines indicate observations and dashed lines are the results of S2S_HST. The dots denote every 5 days from the forecast starting date. Integers shown in the phase space are the numbers of cases used to composite the curves with corresponding colors

The composite phase diagrams do not necessarily provide a definite answer to the amplitude variations shown in Fig. 8, but they help illuminate the possible consequences of phase differences between forecasts and observations. During La Niña, the maximum difference appears in the MJO evolution from initial phase 8, in which an obvious overestimation of amplitude forms quickly and is maintained at most lead times in the forecast (Fig. 9d). Also, MJO trajectories evolving from initial phases 2 and 3 seem to suffer a propagation slowdown over the Maritime Continent during La Niña, which is especially notable in the forecast, leading to a nearly stationary phase angle and correspondingly larger-than-observed amplitude at longer lead times (Fig. 9b, c). These features may partially explain the overestimated MJO amplitude during La Niña in the model depicted in Fig. 8c.

The MJO variation features in different IOD phases are also examined. In the positive IOD phase, the MJO evolutions starting from initial phases 5–8 in the forecast often suffer an overestimation of amplitude at longer lead times, showing that the forecasted MJO convection tends to become stronger than observed when it propagates across the eastern Pacific and Africa to the Indian Ocean (Fig. 9e–h). In the negative IOD phase, the forecasted MJO trajectories starting from phases 5–7 appear to encounter a barrier when approaching Africa and the Indian Ocean, and thus display pseudo retreats of MJO convection as well as redevelopments of MJO amplitude in the phase diagram (Fig. 9e–g). Also, compared to the observation, MJO signals starting from phase 8 exhibit apparently stronger amplitude at most lead times (Fig. 9h). Moreover, for the trajectories starting from phases 1–3 during the negative IOD phase, faster propagation or larger amplitude is often found at longer lead times when the MJO crosses the eastern Indian Ocean and the Maritime Continent (Fig. 9e–g). These features should all contribute, to a certain extent, to the extremely significant growth of the MJO amplitude error shown in Fig. 8d.

Thus, the amplitude growth in the model when the MJO propagates from the western hemisphere to the Indian Ocean, as well as the propagation slowdown when the MJO approaches the Maritime Continent during La Niña, reveals possible problems in describing the MJO amplitude evolution over the tropical Indian Ocean. Further, the amplitude enhancement during the positive IOD phase and the propagation barrier during the negative IOD phase when the MJO approaches Africa and the western Indian Ocean, as well as the accelerated pace or increased amplitude when the MJO propagates over the eastern Indian Ocean and the Maritime Continent in the negative IOD phase, indicate possible influences of exaggerated IOD variability in the model. These results imply that the transition of the relationships between MJO amplitude and ENSO/IOD at longer lead times is related to the distinct representation of the MJO during various ENSO and IOD phases in the model, partially owing to an especially unrealistic description of the oceanic and atmospheric background states over the tropical Indian Ocean.

6 Improvement of MJO forecast

Given the limited forecast skill in S2S_HST, two additional sets of hindcast experiments (i.e., S2S_IEXP1 and S2S_IEXP2) are conducted to improve the MJO forecast by upgrading the initial conditions of the atmosphere or ocean. As described in Sect. 3, the same preprocessing is applied to S2S_HST, S2S_IEXP1 and S2S_IEXP2 to ensure a fair comparison. Verification focuses mainly on the overall skill, the skill's dependence on phase, and the amplitude and phase angle errors of the MJO.

Figure 10 shows the overall forecast skill and potential predictability of the MJO from all experiments. Based on a perfect-model assumption, the potential predictability is estimated by computing the bivariate AC between each ensemble member and the ensemble mean of the other three members, and then averaging the ACs over all ensemble subsamples. When using a small number of forecast cases (Fig. 10a), the MJO is predicted out to 16 days in S2S_HST (the same as shown in Sect. 5 for the comprehensive reforecasts), 18 days in S2S_IEXP1 and 22 days in S2S_IEXP2, taking AC = 0.5 as the threshold of useful skill. For a 5-day-interval experiment sampling, the MJO can be skillfully forecasted at lead times of 15, 16 and 21 days in S2S_HST, S2S_IEXP1 and S2S_IEXP2, respectively (Fig. 10b). These results show a 6-day increase of skill from S2S_HST to S2S_IEXP2, implying that the improved predictions may be capable of skillfully capturing the MJO at lead times of 21–22 days. As for MJO predictability, potential useful skill is found within a lead time of about 30 days, indicating a skill gap of more than one week that needs to be closed by reducing forecast error in the current prediction scheme. Nevertheless, as the lead time increases beyond 30 days, the potential predictability often drops quickly, and only tiny differences among S2S_HST, S2S_IEXP1 and S2S_IEXP2 are found, showing that at longer lead times the improved initialization does not lead to better potential predictability although the actual forecast skill increases considerably. It is unclear what governs the common upper limit of predictability at longer lead times for the three sets of experiments, although uncertainty in the predictability estimate due to the limited number of ensemble members cannot be excluded.
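The perfect-model estimate described above can be sketched as follows, assuming each forecast case supplies a pair of RMM-like indices per ensemble member at a fixed lead time (array shapes and names are illustrative):

```python
import numpy as np

def potential_predictability(members):
    """Perfect-model potential predictability: the bivariate AC between
    each ensemble member (treated as pseudo-observation) and the mean of
    the remaining members, averaged over all members. `members` has
    shape (n_member, n_case, 2), holding the two MJO indices per case."""
    n = members.shape[0]
    acs = []
    for i in range(n):
        truth = members[i]                              # pseudo-observation
        others = np.delete(members, i, axis=0).mean(axis=0)
        num = np.sum(truth * others)
        den = np.sqrt(np.sum(truth**2)) * np.sqrt(np.sum(others**2))
        acs.append(num / den)
    return np.mean(acs)
```

With four members, each subsample pairs one member against the mean of the other three, matching the description in the text.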
In addition, the comparison between S2S_IEXP1 and S2S_IEXP2 reveals that the impact of the SST initial condition appears and gradually strengthens from the 5-day lead time onward, underlining the importance of a reliable SST forecast for MJO prediction.
Fig. 10

MJO forecast skill (bivariate anomaly correlation) as a function of lead time for a once-per-month and b quasi-5-day-interval forecast sampling in S2S_HST (black), S2S_IEXP1 (blue) and S2S_IEXP2 (red). The solid and dashed curves represent actual skill and potential skill based on perfect-model assumption, respectively. The horizontal dashed line represents the useful skill of 0.5

Besides the enhancement of overall skill, the MJO forecasts in most seasons during 2000–2013 are also generally improved in S2S_IEXP2, as shown in Fig. 7. The maximum and minimum skills of about 27 and 16 days appear within the seasonal windows centered in early November and July, respectively (Fig. 7a). From S2S_HST to S2S_IEXP2, the MJO remains more predictable in the boreal autumn than in the other seasons, suggesting that the unique seasonal dependence of MJO forecast skill shown in this study is model-dependent, although it arises from the interaction between model errors and initial errors. Significant interannual differences in skill are also found. During the Dynamics of the MJO/Cooperative Indian Ocean Experiment on Intraseasonal Variability in Year 2011 (DYNAMO/CINDY) period (1 September 2011 to 31 March 2012) and the DYNAMO intensive observing period (1 September 2011 to 15 January 2012), the overall MJO forecast skills in S2S_IEXP2 are 27 and 25 days, respectively, basically comparable to the model results of Fu et al. (2013).

MJO forecast skill largely depends on the initial/target phase (Kim et al. 2014b; Wang et al. 2014; Xiang et al. 2015); thus, in this section bivariate ACs are computed for each of the eight MJO phases using the cases for which the initial/target MJO phase angle is within that phase and the initial/target MJO amplitude is larger than one. Figure 11 shows the prediction skill of the MJO as a function of lead time and initial/target phase. For S2S_HST, the differences in AC among the initial phases become appreciable beyond a lead time of about 10 days, featured by a peak for initial phase 5 and relatively low skill near initial phases 8 and 2 (Fig. 11a). This implies that the MJO is easier to predict when it is initially located over the Maritime Continent and western Pacific, but difficult to reproduce when it starts from the western hemisphere and western Indian Ocean. As for the variation of forecast skill with target phase (Fig. 11d), during lead times of 10–20 days, especially high skill is found for forecasts targeting phase 7, corresponding to forecasts from initial phases 4 and 5. In S2S_IEXP1, the lead time at which the forecast becomes unskillful is generally extended for all initial phases except phase 7, with apparent peaks for initial phases 2 and 5 and lows for initial phases 3 and 7 (Fig. 11b). These features are almost retained in S2S_IEXP2, although with a further increase of skill (Fig. 11c). Correspondingly, the forecast skills beyond the 15-day lead time are minimal for target phase 2 and relatively small for target phase 5, but apparently high for target phases 3 and 7 (Fig. 11e, f). This shows that the MJO is difficult to capture when it propagates from the western hemisphere to the western Indian Ocean and from the eastern Indian Ocean to the Maritime Continent in both S2S_IEXP1 and S2S_IEXP2.
Thus, from S2S_HST to S2S_IEXP2, in contrast to the small changes for predictions initialized from phase 7, significant improvements are found for predictions initialized from phase 2, suggesting that with the optimized atmosphere and ocean initial fields, the prediction of MJO propagation over the Indian Ocean is obviously improved, although difficulties remain in forecasting MJO propagation from the eastern Indian Ocean to the Maritime Continent. It should be stressed that the skill features differ somewhat between S2S_HST and S2S_IEXP1/S2S_IEXP2, implying that the forecasted skill–phase relationship partially depends on the model initial conditions. In fact, owing to distinct initial conditions and model physics, the phase dependence of skill varies from model to model: for instance, the MJO is especially difficult to predict from initial phase 2 in the NCEP CFS version 2 (Wang et al. 2014), from initial phase 5 in the ECMWF forecast system (Kim et al. 2014b), and from initial phases 7 and 8 in the POAMA (Rashid et al. 2011).
Fig. 11

Bivariate anomaly correlation between observations and forecasts as a function of lead time and MJO phase for initial/target strong cases (amplitude > 1.0) in S2S_HST (a, d), S2S_IEXP1 (b, e) and S2S_IEXP2 (c, f). The forecast skill is stratified by initial phase (top panels) and target phase (bottom panels)

MJO predictability and its difference from prediction skill are further shown in Fig. 12. The predictability of the MJO is not sensitive to the initial phase until around the 20-day lead time, beyond which two peaks centered on phases 3 and 6 become prominent in most experiments. The differences between potential predictability and actual prediction skill are especially remarkable at longer lead times for initial phases 3 and 6–7 (Fig. 12a–c), indicating that more effort in reducing model errors and improving initial conditions is needed for predictions initialized from these phases. To a certain extent, this suggests that model deficiencies contribute to the difficulties in forecasting MJO propagation across the western hemisphere–western Indian Ocean and the eastern Indian Ocean–Maritime Continent regions. In contrast, particularly in S2S_IEXP1 and S2S_IEXP2, the prediction skill is close to the predictability beyond the 20-day lead time for initial phases 8–2 and 4–5. Correspondingly, small differences between predictability and prediction skill are found for target phases 2–3 and 6–7 (Fig. 12e, f). From S2S_HST to S2S_IEXP1 and S2S_IEXP2, with improved initialization, the gap between potential and actual forecast skill is gradually narrowed, although the potential predictability drops rapidly at longer lead times, similar to the behavior shown in Fig. 10.
Fig. 12

Potential predictability (bivariate anomaly correlation; contours) and differences between predictability and forecast skill (bivariate anomaly correlation; shadings) as a function of lead time and MJO phase for initial/target strong cases (amplitude > 1.0) in S2S_HST (a, b), S2S_IEXP1 (c, d) and S2S_IEXP2 (e, f). The forecast skill is stratified by initial phase (top panels) and target phase (bottom panels)

Given the significant skill enhancement for initial phase 2 and the little improvement for initial phase 7, composites of OLR and U850 anomalies for these two initial phases are compared in Fig. 13. In the observation, the MJO evolution from initial phase 2 features the propagation of a negative OLR anomaly (i.e., enhanced convection) from the Indian Ocean to the western Pacific along the boundary between westerly and easterly winds; in S2S_HST, however, a faster propagation is predicted, and the negative OLR anomaly nearly stretches into the easterly wind zone beyond the 10-day lead time. In S2S_IEXP1 and S2S_IEXP2, the eastward propagation along the zero contour of zonal wind is basically captured within the 15-day lead time, and a closer-to-observed anomaly amplitude is found at shorter lead times. It should also be noted that, compared to S2S_HST, the initial strong dry anomaly over the western Pacific is better captured by S2S_IEXP1 and S2S_IEXP2. Kim et al. (2014a) pointed out that the dry anomaly over the western Pacific plays a dynamically active role in MJO propagation over the Indian Ocean and Maritime Continent through the Rossby wave response, and Kim et al. (2016) further showed that a strongly suppressed convective anomaly over the western Pacific is key to the high-skill MJO events in the ECMWF hindcasts. Thus, a better description of the initial dry anomaly over the western Pacific in this study may also be a crucial factor in the enhancement of forecast skill for MJO propagation from the Indian Ocean. In contrast to initial phase 2, the observed MJO evolution from initial phase 7 is characterized by the propagation of significantly suppressed convection from the Indian Ocean to the western Pacific and weakly enhanced convection over the western hemisphere and Africa.
Compared to the observation, the convection enhancement over the western hemisphere is slightly overestimated in S2S_HST, and the convection suppression over the Indian Ocean weakens quickly as the lead time increases. This feature is almost duplicated by both S2S_IEXP1 and S2S_IEXP2, again revealing the insensitivity of MJO prediction skill for initial phase 7 to the initial conditions in the model.
Fig. 13

Time-longitude composites of outgoing longwave radiation (shading; units W/m2) and 850-hPa zonal wind (contour; units m/s) anomalies for initial phase 2 (left column) and phase 7 (right column) in observations (a, b), S2S_HST (c, d), S2S_IEXP1 (e, f), and S2S_IEXP2 (g, h). Only MJO cases with larger-than-1 initial amplitude are used in the composite

In addition, in terms of MJO propagation in the phase space, composite phase diagrams for various initial phases in the observation and predictions are shown in Fig. 14. Overall, the predictions for most initial phases show weaker-than-observed amplitude and faster-than-observed decay of the MJO. The differences in amplitude and phase angle between predictions and observations are relatively small at short lead times, especially below 10 days, and gradually become remarkable as the lead time increases. Beyond target day 20, the predicted MJO often tends to disappear while the observed MJO continues to propagate eastward. From S2S_HST to S2S_IEXP2, the forecasts of MJO variation in the phase space are overall improved, but with apparent differences among initial phases. For example, for initial phase 1, the MJO trajectories in S2S_IEXP2 resemble the observed ones more closely than those in S2S_HST. Also, although similar trajectories from initial phase 2 are shown in both S2S_HST and S2S_IEXP2, the latter exhibits a more realistic propagation speed. In contrast, for initial phases 6 and 7, the MJO phase angle in S2S_IEXP2 is initially fairly close to the observation but then gradually deviates from the observed trajectory and even shows larger biases than those in S2S_HST.
Fig. 14

Composite phase diagrams for MJO with initial phases 1, 3, 5, and 7 (a), and phases 2, 4, 6, and 8 (b). Black solid, colour dashed and colour solid lines show MJO evolutions in the observations, S2S_HST and S2S_IEXP2, respectively. The dots denote every 5 days from the forecast starting date. Only MJO cases with larger-than-1 initial amplitude are used in the composite

Focusing on the MJO amplitude and phase angle errors averaged over the 20-day lead time, the forecast performance as a function of initial/target phase is given in Fig. 15. Possibly reflecting the leading role of model deficiency, the MJO amplitude is always underestimated and its propagation speed is mostly overestimated in the predictions. The former has also been shown in many previous studies (Vitart et al. 2007; Rashid et al. 2011; Kim et al. 2014b; Wang et al. 2014; Xiang et al. 2015), while the latter is broadly in agreement with the result of Xiang et al. (2015). Compared to S2S_HST, the forecasts of MJO amplitude in S2S_IEXP1 and S2S_IEXP2 are overall better, showing an especially apparent increase of skill for initial phases 1, 2 and 4 as well as a small improvement for initial phases 3 and 6. As for the phase angle, the considerable positive errors for initial phases 2–4 and 6 in S2S_HST are obviously reduced in S2S_IEXP2, but the tiny errors for the other initial phases in S2S_HST are amplified, with either sign, in S2S_IEXP2. For the target phases, from S2S_HST to S2S_IEXP2 the underestimation of MJO amplitude is alleviated, and the overestimation of phase angle is reduced for most phases except target phase 2, corresponding to the deterioration in forecasting the phase angle for initial phases 7, 8 and 1.
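The two error measures can be sketched as follows; the wrapping of the phase-angle difference to a signed interval, and the sign convention (positive error meaning the forecast phase leads, i.e., propagates too fast), are assumptions for illustration:

```python
import numpy as np

def amplitude_error(pc1_o, pc2_o, pc1_f, pc2_f):
    """Forecast-minus-observed MJO amplitude (negative values mean
    underestimation, the common bias noted in the text)."""
    return np.hypot(pc1_f, pc2_f) - np.hypot(pc1_o, pc2_o)

def phase_angle_error(pc1_o, pc2_o, pc1_f, pc2_f):
    """Signed phase-angle error in degrees, wrapped to [-180, 180).
    Positive values mean the forecast phase angle leads the observed
    one (assumed convention: too-fast propagation)."""
    ang_o = np.degrees(np.arctan2(pc2_o, pc1_o))
    ang_f = np.degrees(np.arctan2(pc2_f, pc1_f))
    err = ang_f - ang_o
    return (err + 180.0) % 360.0 - 180.0          # wrap to [-180, 180)
```

Averaging these errors over the first 20 lead days for all cases starting from (or verifying in) a given phase yields the quantities plotted in Fig. 15.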
Fig. 15

Left panel: MJO amplitude averaged over 20-day lead time in eight initial phases (a) and target phases (b) in observations (black), S2S_HST (green), S2S_IEXP1 (yellow), and S2S_IEXP2 (red). Right panel: MJO phase angle error averaged over 20-day lead time in eight initial phases (c) and target phases (d) in S2S_HST (green), S2S_IEXP1 (yellow), and S2S_IEXP2 (red). The value in parentheses shows the average over all eight phases. Only MJO cases with larger-than-1 initial/target amplitude are considered

7 Summary and discussion

This study explores MJO forecast skill and model deficiencies based on comprehensive hindcasts by the BCC_CSM, which participates in the S2S Prediction Project. We further attempt to improve the MJO forecast through limited-sampling experiments with upgraded initial conditions.

A relatively low overall skill of about 16 days is found in S2S_HST, compared to the beyond-20-day MJO forecast skills of some state-of-the-art forecast models (e.g., Rashid et al. 2011; Zhang et al. 2013; Kim et al. 2014b; Vitart 2014; Wang et al. 2014; Xiang et al. 2015). Also, a distinctive seasonal dependence of skill appears in the predictions, featured by an approximately 19-day maximum in the boreal autumn rather than winter. The results indicate that the seasonal-to-interannual variation of MJO forecast skill is more significantly correlated with the IOD index than with the Niño 3.4 index, implying different degrees of impact from the IOD and ENSO. Further, the comparison of MJO forecast skill and amplitude evolution during different IOD and ENSO phases shows that, in contrast to the little-changed connection between MJO amplitude and ENSO at shorter lead times, the MJO amplitude–IOD relationship quickly strengthens as the lead time increases to about 15 days. This may partially account for the excessive dependence of MJO forecast skill on the IOD and is thus related to the peak of prediction skill in the boreal autumn, when the IOD often reaches its maturity. Nevertheless, the relationship of MJO amplitude with the IOD or Niño 3.4 index weakens rapidly, in contrast to its earlier behavior, as the lead time exceeds 15–20 days, possibly because of the apparent overestimation of MJO amplitude at longer forecast lead times in the negative phases of the IOD and ENSO, especially the IOD. Unrealistic MJO activity in the prediction may play an important role in this relationship transition.
In particular, amplitude exaggeration during the positive IOD phase and a propagation barrier during the negative IOD phase when the MJO approaches Africa and the Indian Ocean, as well as the overestimation of amplitude or speed when the MJO propagates over the eastern Indian Ocean and Maritime Continent during the negative IOD phase, together suggest the influence of overestimated IOD variability in the model and partially explain the variations of MJO forecast skill and amplitude with increasing lead time. These results indicate that the actual MJO predictability is clearly modulated by the interannual variability of the large-scale external forcing background state.
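For reference, MJO skill scores such as those quoted above conventionally use the bivariate anomaly correlation between observed and forecast RMM indices (e.g., Lin et al. 2008; Rashid et al. 2011, both cited here), with the skill horizon commonly taken as the lead time at which the correlation first drops below 0.5. A minimal sketch with synthetic inputs (not the operational verification code):

```python
import numpy as np

def bivariate_acc(obs_rmm, fcst_rmm):
    """Bivariate anomaly correlation between observed and forecast
    (RMM1, RMM2) pairs at one fixed lead time.
    obs_rmm, fcst_rmm: arrays of shape (n_cases, 2)."""
    num = np.sum(obs_rmm[:, 0] * fcst_rmm[:, 0] +
                 obs_rmm[:, 1] * fcst_rmm[:, 1])
    den = (np.sqrt(np.sum(obs_rmm[:, 0]**2 + obs_rmm[:, 1]**2)) *
           np.sqrt(np.sum(fcst_rmm[:, 0]**2 + fcst_rmm[:, 1]**2)))
    return num / den

def skill_horizon(obs, fcst, threshold=0.5):
    """First lead day (1-based) at which the correlation drops below
    the threshold; obs, fcst have shape (n_cases, n_leads, 2)."""
    for lead in range(obs.shape[1]):
        if bivariate_acc(obs[:, lead], fcst[:, lead]) < threshold:
            return lead + 1
    return obs.shape[1]  # skill never drops below threshold in window
```

With verifications pooled over all start dates, a horizon of 16 days in this metric corresponds to the S2S_HST skill reported above.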

With upgraded ocean and atmosphere initial conditions, a 6-day increase of overall forecast skill is obtained, so that the MJO is predicted out to a lead time of 21–22 days in S2S_IEXP2. This 3-week skill is comparable to that of most statistical models (e.g., Seo et al. 2009; Kang and Kim 2010; Cavanaugh et al. 2015), except for the nonlinear stochastic model with the "past-noise forecasting" method (Kondrashov et al. 2013) and the spatial–temporal projection statistical model (Zhu et al. 2015). It is also comparable to the skill of dynamical models at several advanced operational centers (Rashid et al. 2011; Wang et al. 2014). Among multi-model MJO forecasts, the skill shown in this study falls in the upper-middle range of the skill spread among the individual models of the Intraseasonal Variability Hindcast Experiment (ISVHE), but still trails the skill of the best multi-model ensemble mean by about one week (Zhang et al. 2013). In S2S_IEXP2, the skill maximum of about 27 days also appears in boreal autumn, showing a seasonal dependence similar to that in S2S_HST. MJO prediction skill clearly depends on initial and target phases. Typically, in S2S_HST the MJO is less predictable when it propagates across the Indian Ocean and Maritime Continent than across the western Pacific, whereas in S2S_IEXP1 and S2S_IEXP2 the forecast of MJO propagation over the Indian Ocean is clearly improved, although predicting propagation across the eastern Indian Ocean and the Maritime Continent remains relatively difficult. Nevertheless, from S2S_HST to S2S_IEXP2, despite an overall increase of skill for most MJO initial phases, little change of forecast skill is found for MJO evolution starting from the western hemisphere.
The differences between potential predictability and actual prediction skill further show that model imperfection and initial errors significantly affect MJO forecasts for initial phases 3 and 6–7, corresponding to MJO propagation across the eastern Indian Ocean–Maritime Continent and the western hemisphere–western Indian Ocean areas. For other initial phases, the forecast skill is relatively close to the predictability, and the upper limit of potential predictability is quickly approached in S2S_IEXP2 as the lead time exceeds 20 days. In addition, from S2S_HST to S2S_IEXP2, the MJO evolution in phase space becomes closer to the observation, and the underestimation of MJO amplitude and overestimation of propagation speed are somewhat reduced, though with apparent differences among phases.
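The amplitude and phase-angle diagnostics used in the phase-space comparison can be computed directly from the (RMM1, RMM2) pair of Wheeler and Hendon (2004). A sketch; the sign convention (positive error meaning the forecast phase leads, i.e., propagates faster than, the observation) is our assumption, not a statement of the paper's exact convention:

```python
import numpy as np

def mjo_amplitude(rmm1, rmm2):
    """MJO amplitude = distance from the origin in RMM phase space."""
    return np.hypot(rmm1, rmm2)

def phase_angle_error_deg(obs1, obs2, fc1, fc2):
    """Angular error (degrees) of the forecast MJO phase relative to
    the observed phase, wrapped to (-180, 180]. Positive values are
    interpreted here as the forecast running ahead of the observation."""
    dtheta = np.arctan2(fc2, fc1) - np.arctan2(obs2, obs1)
    # wrap the difference back onto the principal branch
    return np.degrees(np.arctan2(np.sin(dtheta), np.cos(dtheta)))
```

Averaging these errors over lead days 1–20, stratified by initial or target phase, yields diagnostics of the kind summarized in the phase-space discussion above.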

The above results reveal an evident improvement of MJO forecast skill from the introduction of more accurate initial conditions. In particular, compared with the ocean initial fields produced within a 10-day assimilation window from BCC_GODAS, the contribution of the high-temporal-resolution OISST initial conditions is more vital, as shown in Fig. 10. Fu et al. (2008, 2013) indicated that the reliability of SST largely determines the predictability of the tropical intraseasonal oscillation, and that a two-tier forecast can perform as well as a one-tier forecast if the same SST boundary conditions are used. High-frequency SST variability has also been shown to clearly improve the simulation and prediction of the MJO (Kim et al. 2008). Thus, improving the SST forecast is certainly a primary target of the new initialization scheme in this study. Figure 16 shows the prediction skill of SST averaged over lead days 1–20 for the different forecast schemes. The persistence forecasts show significant correlation skill everywhere, but with particularly low centers over the central-eastern equatorial Indian Ocean and the western equatorial Pacific, where convective activity is considerably vigorous. S2S_HST shows a similar pattern of skill, but with magnitudes mostly smaller than those of the persistence forecasts. In contrast, SST forecasts are improved in S2S_IEXP2, with increased skill over most tropical and subtropical regions relative to the persistence forecasts. Therefore, from S2S_HST to S2S_IEXP2, SST skill transitions from below to above that of the persistence forecast, with especially significant improvement over the tropical oceans. This should largely contribute to the improvement of the MJO forecast. Typically, as suggested by Fig. 10b, the optimization of the ocean initial fields plays a more important role in enhancing overall MJO prediction skill than that of the atmosphere initial fields, and its impact becomes remarkable as the lead time increases beyond one week. In contrast to the apparent improvement in S2S_IEXP2, no significant increase of SST skill is found in S2S_IEXP1 (Fig. 16g), although in a coupled model a better SST forecast may also be partially attributed to the feedback of improved atmosphere initial conditions. Nevertheless, the influence of the atmosphere initial fields cannot be ignored. As seen in Figs. 11 and 15, for initially strong MJO events, improved atmosphere initial conditions lead to considerable increases of MJO prediction skill in both phase and amplitude. Thus, the role of upgraded atmosphere initial conditions, although inconspicuous overall, is sometimes particularly critical.
Fig. 16

Left panel: anomaly correlations of 20-day-averaged SST between observations and forecasts for a the persistence forecast, constructed by adding the observed anomaly of the 20 days before the forecast start date to the climatological mean of the 20-day forecast period, b S2S_HST, c S2S_IEXP1, and d S2S_IEXP2. Right panel: differences of SST forecast skill between S2S_HST and the persistence forecast (e), between S2S_IEXP2 and the persistence forecast (f), between S2S_IEXP1 and S2S_HST (g), and between S2S_IEXP2 and S2S_HST (h)
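The persistence benchmark in panel (a) amounts to persisting the most recent 20-day mean anomaly on top of the forecast-period climatology. A sketch under that reading, with `sst` and `clim` as hypothetical daily series rather than the actual OISST inputs:

```python
import numpy as np

def persistence_forecast(sst, clim, start, window=20):
    """Persisted-anomaly forecast: the SST anomaly averaged over the
    `window` days before the start date is added to the climatological
    mean over the `window`-day forecast period.

    sst, clim : 1-D daily series on the same calendar
    start     : index of the forecast start date (requires start >= window)
    """
    past_anom = np.mean(sst[start - window:start] -
                        clim[start - window:start])
    return clim[start:start + window] + past_anom
```

Correlating such persisted 20-day means with the observed ones across all start dates gives the benchmark skill map against which S2S_HST and S2S_IEXP2 are compared.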

It is noted that, although MJO prediction is improved by the better initialization schemes, several limitations remain, including the unexpected skill peak in boreal autumn, the apparently faster-than-observed MJO propagation speed, and the difficulty in predicting MJO propagation across the eastern Indian Ocean and the Maritime Continent. As speculated in Sects. 4–6, these characteristics may be dominated mainly by deficiencies in the model physics, and thus vary from model to model. Typically, the propagation of the MJO across the Maritime Continent is difficult to predict, as shown in many previous studies (Vitart et al. 2007; Seo et al. 2009; Fu et al. 2011; Weaver et al. 2011; Zhang and van den Dool 2012; Kim et al. 2014b; Wang et al. 2014). This Maritime Continent predictability barrier, however, is not found in some other dynamical models (Kang and Kim 2010; Rashid et al. 2011; Xiang et al. 2015), suggesting that it is highly model dependent. Also, in contrast to the excessively slow MJO propagation in the POAMA, CFS, and ECMWF forecast systems (Rashid et al. 2011; Kim et al. 2014b; Wang et al. 2014), overall faster-than-observed MJO propagation is predicted in Xiang et al. (2015) and in this study. In addition, previous studies revealed a common deficiency that MJO amplitude is often underestimated in predictions (Vitart et al. 2007; Rashid et al. 2011; Kim et al. 2014b; Wang et al. 2014; Xiang et al. 2015), which is also clearly present in our analysis (see Fig. 15). This problem may be partially related to the weak MJO variability in the model and to the ensemble averaging used for MJO signal extraction. Nevertheless, as shown in Fig. 8, intensive experiment sampling and a sufficient number of forecast cases can remove much of the model's systematic bias and yield close-to-observed MJO amplitude evolutions. In this context, sufficiently frequent reforecast experiments are necessary for an objective and fair verification of MJO prediction.

The above-mentioned deficiencies in BCC_CSM are closely interrelated and together contribute to the limited MJO prediction skill. As shown above, the maximum skill in boreal autumn is partially owing to the overestimated relationship of the MJO with the IOD, which is associated with significant biases in the climatology and variability of SST, precipitation, and circulation over the western and eastern Indian Ocean in the model. Also, in the model's free run, composite distributions of MJO activity in the various phases show that MJO convection initiated over Africa and the western Indian Ocean propagates eastward and culminates, but then quickly decays when crossing the eastern Indian Ocean and Maritime Continent (figure not shown). This should be partially related to the Maritime Continent barrier and the relatively short life cycle of the MJO in the model. Thus, the model's inability to describe features over the eastern Indian Ocean and Maritime Continent is a serious problem. To briefly explore this, a moist heat budget diagnosis for MJO phases 3 and 4 over the region (10°S–10°N, 100°–120°E) is shown in Fig. 17. It has been proposed that planetary boundary layer (PBL) moistening to the east of MJO convection destabilizes the local atmosphere and thereby induces the eastward propagation of the MJO (Hsu and Li 2012). In phase 3, when the observed MJO convection center is located over the eastern Indian Ocean, vertical advection of moisture dominates the low-level moistening and causes a positive tendency of PBL-integrated moisture over the western Maritime Continent, with a small additional contribution from positive horizontal moisture advection that is largely offset by condensational latent heating (Fig. 17a). This helps the MJO convection propagate eastward and reach the western Maritime Continent in phase 4. Afterward, a negative tendency of PBL-integrated moisture appears (Fig. 17b), because the PBL moistening moves to the eastern Maritime Continent and the MJO signals over the eastern Indian Ocean and western Maritime Continent gradually transition into the dry phase. In the model simulation, however, the vertical moisture advection and latent heating are both overestimated, and the horizontal moisture advection is negative, together resulting in a negative tendency of PBL moisture over the western Maritime Continent in MJO phase 3 (Fig. 17c). In phase 4, the vertical moisture advection and latent heating are both remarkably reduced, becoming apparently weaker than observed; meanwhile, the negative horizontal moisture advection strengthens further, contributing to the continued decrease of PBL moisture over the western Maritime Continent (Fig. 17d). In general, the unrealistic negative horizontal moisture advection in phase 3 and its further intensification in phase 4, as well as the significant decline of horizontal moisture convergence from phase 3 to phase 4, imply untenable PBL moistening and rapid weakening of convective activity over the eastern Maritime Continent in the model. This accelerates the wet-to-dry phase transition, and thus partially accounts for the short life and propagation barrier of MJO convection over that area. Therefore, overcoming this drying tendency over the eastern Indian Ocean and Maritime Continent is one of the key tasks for improving MJO simulation and prediction in future work.
Fig. 17

1000–700-hPa integrated intraseasonal moisture budget terms averaged during MJO phase 3 (top panels) and phase 4 (bottom panels) over the equatorial region (10°S–10°N, 100°–120°E). From left to right, the four terms are the specific humidity tendency, horizontal moisture advection, vertical moisture advection, and apparent moisture source. The two values inside each panel represent the relative contributions of horizontal moisture convergence and vertical flux to the vertical moisture advection. Results are from the ECMWF Interim Reanalysis during 2002–2011 (left panels) and the free run of BCC_CSM (right panels)
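The budget in Fig. 17 follows the standard intraseasonal moisture equation, ∂q/∂t = −V·∇q − ω ∂q/∂p − Q2/L, mass-weighted over 1000–700 hPa. A sketch of the column integration on a hypothetical (lev, lat, lon) grid; taking Q2 as the Yanai-type apparent moisture sink (so that −Q2/L is the source term) is our assumption about the sign convention:

```python
import numpy as np

G = 9.81                # gravity, m s^-2
L_V = 2.5e6             # latent heat of vaporization, J kg^-1
LEVELS = np.array([1000., 925., 850., 775., 700.]) * 100.0  # Pa, PBL layer

def column_integral(field, p=LEVELS):
    """Mass-weighted layer integral (1/g) * int(field dp) over the
    1000-700-hPa layer; the level dimension comes first."""
    dp = -np.diff(p)                                  # thicknesses (Pa), > 0
    mid = 0.5 * (np.asarray(field)[:-1] + np.asarray(field)[1:])
    return np.tensordot(dp, mid, axes=(0, 0)) / G

def moisture_budget(q, u, v, omega, q2, p=LEVELS, dx=1.1e5, dy=1.1e5):
    """PBL-integrated budget terms (kg m^-2 s^-1) on a (lev, lat, lon)
    grid: horizontal advection, vertical advection, and the apparent
    moisture source -Q2/L. Finite differences via np.gradient."""
    dqdp = np.gradient(q, p, axis=0)
    dqdy = np.gradient(q, dy, axis=1)
    dqdx = np.gradient(q, dx, axis=2)
    hadv = -(u * dqdx + v * dqdy)     # horizontal moisture advection
    vadv = -omega * dqdp              # vertical moisture advection
    source = -q2 / L_V                # apparent moisture source term
    return {name: column_integral(term, p)
            for name, term in (("hadv", hadv),
                               ("vadv", vadv),
                               ("source", source))}
```

The residual of tendency minus the sum of these terms, and the split of `vadv` into horizontal moisture convergence and a vertical-flux part, would then give the in-panel decomposition shown in the figure.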



Acknowledgements

This study was jointly supported by the National Key Research and Development Program of China (Grant 2016YFA0602102), the National Basic Research Program of China (Grant 2015CB453203), and the National Natural Science Foundation of China (Grants 41675090 and 41375081).


  1. Abhilash S, Sahai AK, Borah N et al (2014) Prediction and monitoring of monsoon intraseasonal oscillations over Indian monsoon region in an ensemble prediction system using CFSv2. Clim Dyn 42:2801–2815
  2. Adler RF, Huffman GJ, Chang A et al (2003) The version-2 global precipitation climatology project (GPCP) monthly precipitation analysis (1979–present). J Hydrometeorol 4(6):1147–1167
  3. Brunet G, Shapiro M, Hoskins B et al (2010) Collaboration of the weather and climate communities to advance subseasonal-to-seasonal prediction. Bull Am Meteorol Soc 91:1397–1406
  4. Cavanaugh NR, Allen T, Subramanian A, Mapes B, Seo H, Miller AJ (2015) The skill of atmospheric linear inverse models in hindcasting the Madden–Julian Oscillation. Clim Dyn 44:897–906
  5. Charney JG, Shukla J (1981) Predictability of monsoons. In: Lighthill J, Pearce RP (eds) Monsoon dynamics. Cambridge University Press, Cambridge, pp 99–109
  6. Fu X, Yang B, Bao Q, Wang B (2008) Sea surface temperature feedback extends the predictability of tropical intraseasonal oscillation. Mon Weather Rev 136:577–597
  7. Fu X, Wang B, Lee JY, Wang W, Gao L (2011) Sensitivity of dynamical intraseasonal prediction skills to different initial conditions. Mon Weather Rev 139:2572–2592
  8. Fu X, Lee JY, Hsu PC et al (2013) Multi-model MJO forecasting during DYNAMO/CINDY period. Clim Dyn 41:1067–1081
  9. Gottschalck J, Wheeler M, Weickmann K et al (2010) A framework for assessing operational Madden–Julian oscillation forecasts: a CLIVAR MJO working group project. Bull Am Meteorol Soc 91:1247–1258
  10. Griffies SM, Gnanadesikan A, Dixon KW et al (2005) Formulation of an ocean model for global climate simulations. Ocean Sci 1:45–79
  11. Hsu PC, Li T (2012) Role of the boundary layer moisture asymmetry in causing the eastward propagation of the Madden–Julian oscillation. J Clim 25(14):4914–4931
  12. Hsu PC, Li T, You L, Gao J, Ren H (2015) A spatial–temporal projection model for 10–30 day rainfall forecast in South China. Clim Dyn 44:1227–1244
  13. Hung MP, Lin JL, Wang W et al (2013) MJO and convectively coupled equatorial waves simulated by CMIP5 climate models. J Clim 26:6185–6214
  14. Hurrell J, Meehl G, Bader D et al (2009) A unified modelling approach to climate prediction. Bull Am Meteorol Soc 90:1819–1832
  15. Jiang X, Yang S, Li Y et al (2013) Seasonal-to-interannual prediction of the Asian summer monsoon in the NCEP climate forecast system Version 2. J Clim 26:3708–3727
  16. Jiang X, Waliser DE, Xavier PK et al (2015) Vertical structure and physical processes of the Madden–Julian oscillation: exploring key model physics in climate simulations. J Geophys Res Atmos 120:4718–4748
  17. Jie W, Wu T, Wang J, Li W, Liu X (2014) Improvement of 6–15 day precipitation forecasts using a time-lagged ensemble method. Adv Atmos Sci 31:293–304
  18. Kalnay E, Kanamitsu M, Kistler R et al (1996) The NCEP/NCAR 40-year reanalysis project. Bull Am Meteorol Soc 77:437–471
  19. Kanamitsu M, Ebisuzaki W, Woollen J et al (2002) NCEP–DOE AMIP-II reanalysis (R-2). Bull Am Meteorol Soc 83:1631–1643
  20. Kang IS, Kim HM (2010) Assessment of MJO predictability for boreal winter with various statistical and dynamical models. J Clim 23:2368–2378
  21. Kim HM, Hoyos CD, Webster PJ, Kang IS (2008) Sensitivity of MJO simulation and predictability to sea surface temperature variability. J Clim 21:5304–5317
  22. Kim D, Sperber K, Stern W et al (2009) Application of MJO simulation diagnostics to climate models. J Clim 22:6413–6436
  23. Kim HM, Webster PJ, Curry JA, Toma VE (2012) Asian summer monsoon prediction in ECMWF system 4 and NCEP CFSv2 retrospective seasonal forecasts. Clim Dyn 39:2975–2991
  24. Kim D, Kug JS, Sobel AH (2014a) Propagating versus nonpropagating Madden–Julian oscillation events. J Clim 27:111–125
  25. Kim HM, Webster PJ, Toma VE, Kim D (2014b) Predictability and prediction skill of the MJO in two operational forecasting systems. J Clim 27:5364–5378
  26. Kim HM, Kim D, Vitart F, Toma VE, Kug JS, Webster PJ (2016) MJO propagation across the maritime continent in the ECMWF ensemble prediction system. J Clim 29:3973–3988
  27. Kirtman B, Min D, Infanti J et al (2014) The North American multimodel ensemble: phase-1 seasonal-to-interannual prediction; phase-2 toward developing intraseasonal prediction. Bull Am Meteorol Soc 95:585–601
  28. Kondrashov D, Chekroun MD, Robertson AW, Ghil M (2013) Low-order stochastic model and "past-noise forecasting" of the Madden–Julian oscillation. Geophys Res Lett 40:5305–5310
  29. Kug JS, Kang IS, Choi DH (2008) Seasonal climate predictability with tier-one and tier-two prediction systems. Clim Dyn 31:403–416
  30. Lee JY, Wang B, Kang IS et al (2010) How are seasonal prediction skills related to models' performance on mean state and annual cycle? Clim Dyn 35:267–283
  31. Lee SS, Lee JY, Ha KJ, Wang B, Schemm JKE (2011) Deficiencies and possibilities for long-lead coupled climate prediction of the Western North Pacific-East Asian summer monsoon. Clim Dyn 36:1173–1188
  32. Liebmann B, Smith CA (1996) Description of a complete (interpolated) outgoing longwave radiation dataset. Bull Am Meteorol Soc 77:1275–1277
  33. Lin H, Brunet G, Derome J (2008) Forecast skill of the Madden–Julian oscillation in two Canadian atmospheric models. Mon Weather Rev 136:4130–4149
  34. Liu X, Yang S, Kumar A, Weaver S, Jiang X (2013) Diagnostics of subseasonal prediction biases of the Asian summer monsoon by the NCEP climate forecast system. Clim Dyn 41:1453–1474
  35. Liu X, Wu T, Yang S et al (2014a) Relationships between interannual and intraseasonal variations of the Asian-western Pacific summer monsoon hindcasted by BCC_CSM1.1(m). Adv Atmos Sci 31:1051–1064
  36. Liu X, Yang S, Li Q et al (2014b) Subseasonal forecast skills and biases of global summer monsoons in the NCEP Climate Forecast System version 2. Clim Dyn 42:1487–1508
  37. Liu X, Wu T, Yang S et al (2015a) Performance of the seasonal forecasting of the Asian summer monsoon by BCC_CSM1.1(m). Adv Atmos Sci 32:1156–1172
  38. Liu X, Yang S, Li J et al (2015b) Subseasonal predictions of regional summer monsoon rainfalls over tropical Asian oceans and land. J Clim 28:9583–9605
  39. Madden RA, Julian PR (1971) Detection of a 40–50 day oscillation in the zonal wind in the tropical Pacific. J Atmos Sci 28:702–708
  40. Neena JM, Lee JY, Waliser D, Wang B, Jiang X (2014) Predictability of the Madden–Julian oscillation in the intraseasonal variability hindcast experiment (ISVHE). J Clim 27:4531–4543
  41. Nie S, Luo Y, Wu T, Shi X, Wang Z (2015) A merging scheme for constructing daily precipitation analyses based on objective bias-correction and error estimation techniques. J Geophys Res Atmos 120:8671–8692
  42. Palmer TN, Doblas-Reyes FJ, Hagedorn R et al (2004) Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bull Am Meteorol Soc 85:853–872
  43. Pegion K, Sardeshmukh PD (2011) Prospects for improving subseasonal predictions. Mon Weather Rev 139:3648–3666
  44. Rashid HA, Hendon HH, Wheeler MC, Alves O (2011) Prediction of the Madden–Julian oscillation with the POAMA dynamical prediction system. Clim Dyn 36:649–661
  45. Reynolds RW (2009) What's new in version 2. OISST web page, NOAA/NCDC
  46. Reynolds RW, Rayner NA, Smith TM, Stokes DC, Wang W (2002) An improved in situ and satellite SST analysis for climate. J Clim 15:1609–1625
  47. Reynolds RW, Smith TM, Liu C et al (2007) Daily high-resolution blended analyses for sea surface temperature. J Clim 20:5473–5496
  48. Seo KH, Wang W, Gottschalck J et al (2009) Evaluation of MJO forecast skill from several statistical and dynamical forecast models. J Clim 22:2372–2388
  49. Vitart F (2014) Evolution of ECMWF sub-seasonal forecast skill scores. Q J R Meteorol Soc 140:1889–1899
  50. Vitart F, Molteni F (2010) Simulation of the Madden–Julian oscillation and its teleconnections in the ECMWF forecast system. Q J R Meteorol Soc 136:842–855
  51. Vitart F, Woolnough S, Balmaseda MA, Tompkins AM (2007) Monthly forecast of the Madden–Julian oscillation using a coupled GCM. Mon Weather Rev 135:2700–2715
  52. Vitart F, Ardilouze C, Bonet A et al (2016) The sub-seasonal to seasonal prediction (S2S) project database. Bull Am Meteorol Soc. doi:10.1175/BAMS-D-16-0017.1
  53. Waliser DE, Lau KM, Stern W, Jones C (2003) Potential predictability of the Madden–Julian oscillation. Bull Am Meteorol Soc 84:33–50
  54. Wang B, Ding Q, Fu X et al (2005) Fundamental challenge in simulation and prediction of summer monsoon rainfall. Geophys Res Lett 32:L15711. doi:10.1029/2005GL022734
  55. Wang B, Lee JY, Kang IS et al (2008) How accurately do coupled climate models predict the leading modes of Asian–Australian monsoon interannual variability? Clim Dyn 30:605–619
  56. Wang W, Hung MP, Weaver SJ, Kumar A, Fu X (2014) MJO prediction in the NCEP climate forecast system version 2. Clim Dyn 42:2509–2520
  57. Weaver SJ, Wang W, Chen M, Kumar A (2011) Representation of MJO variability in the NCEP climate forecast system. J Clim 24:4676–4694
  58. Weisheimer A, Doblas-Reyes FJ, Palmer TN et al (2009) ENSEMBLES: a new multi-model ensemble for seasonal-to-annual predictions-skill and progress beyond DEMETER in forecasting tropical Pacific SSTs. Geophys Res Lett 36:L21711
  59. Wheeler MC, Hendon HH (2004) An all-season real-time multivariate MJO index: development of an index for monitoring and prediction. Mon Weather Rev 132:1917–1932
  60. Winton M (2000) A reformulated three-layer sea ice model. J Atmos Ocean Technol 17:525–531
  61. Wu T (2012) A mass-flux cumulus parameterization scheme for large-scale models: description and test with observations. Clim Dyn 38:725–744
  62. Wu T, Yu R, Zhang F (2008) A modified dynamic framework for the atmospheric spectral model and its application. J Atmos Sci 65:2235–2253
  63. Wu T, Yu R, Zhang F et al (2010) The Beijing Climate Center atmospheric general circulation model: description and its performance for the present-day climate. Clim Dyn 34:123–147
  64. Wu T, Li W, Ji J et al (2013) Global carbon budgets simulated by the Beijing Climate Center climate system model for the last century. J Geophys Res Atmos 118:1–22
  65. Wu T, Song L, Li W et al (2014) An overview of BCC climate system model development and application for climate change studies. J Meteorol Res 28:34–56
  66. Xiang B, Zhao M, Jiang X et al (2015) The 3–4-week MJO prediction skill in a GFDL coupled model. J Clim 28:5351–5364
  67. Yang S, Zhang Z, Kousky VE et al (2008) Simulations and seasonal prediction of the Asian summer monsoon in the NCEP climate forecast system. J Clim 21:3755–3775
  68. Zhang Q, van den Dool H (2012) Relative merit of model improvement versus availability of retrospective forecasts: the case of climate forecast system MJO prediction. Weather Forecast 27:1045–1051
  69. Zhang C, Gottschalck J, Maloney ED et al (2013) Cracking the MJO nut. Geophys Res Lett 40:1223–1230
  70. Zhou W, Chen M, Zhuang W et al (2016) Evaluation of the tropical variability from the Beijing Climate Center's real-time operational global ocean data assimilation system. Adv Atmos Sci 33:208–220
  71. Zhu J, Shukla J (2013) The role of air-sea coupling in seasonal prediction of Asia-Pacific summer monsoon rainfall. J Clim 26:5689–5697
  72. Zhu Z, Li T, Hsu PC, He J (2015) A spatial–temporal projection model for extended-range forecast in the tropics. Clim Dyn 45:1085–1098

Copyright information

© The Author(s) 2016

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Xiangwen Liu (1)
  • Tongwen Wu (1)
  • Song Yang (2)
  • Tim Li (3)
  • Weihua Jie (1)
  • Li Zhang (1)
  • Zaizhi Wang (1)
  • Xiaoyun Liang (1)
  • Qiaoping Li (1)
  • Yanjie Cheng (1)
  • Hongli Ren (1)
  • Yongjie Fang (1)
  • Suping Nie (1)

  1. Climate Model Division, National Climate Center, China Meteorological Administration, Beijing, China
  2. Department of Atmospheric Sciences, Sun Yat-sen University, Guangzhou, China
  3. Department of Meteorology, International Pacific Research Center, University of Hawaii at Manoa, Honolulu, USA