1 Introduction

Current climate modeling practice places a high priority on the development and use of a few dozen state-of-the-art climate models, many of which are housed at national modeling centers. Recently, however, a number of scientists have proposed substantial changes to the practice of climate modeling (see, e.g., Held 2005; Hurrell et al. 2009; Palmer 2012; Curry 2013). In fact, some even say that a “revolution” in climate modeling is needed (Shukla et al. 2009).

The leading proposals for change can be described as the unified approach, the hierarchy approach and the pluralist approach. The unified approach would pool international resources to develop and deploy a small number of climate models that have spatial and temporal resolutions that are much higher than those of current models and that are constructed within a seamless prediction framework; these efforts are intended to meet “societal demand” for better information about future climate change (see, e.g., Shukla et al. 2009, 2010; Hurrell et al. 2009, and Palmer 2012). The hierarchy approach would have climate scientists devote more attention to the development and systematic study of hierarchies of models that relate to one another in known ways, in order to facilitate understanding of the climate system (Held 2005, 2014). The pluralist approach would have climate scientists increase the diversity of modeling efforts, by systematically increasing the structural variation of state-of-the-art, physics-based models and perhaps also by developing more data-driven and semi-empirical models.

Up to now, there has been no comparative discussion of these different proposals. We offer such a discussion. After briefly reviewing the current state of climate modeling and some of the limitations that motivate calls for changes in practice (Section 2), we outline the different proposals in turn, identifying some challenges and questions that remain for each. Section 3 focuses on the unified approach, calling attention to uncertainty about when – and even whether – it would deliver much more accurate climate change projections. Section 4 focuses on the hierarchy approach, highlighting the potential limitations of its reductive strategy as well as the challenge of identifying hierarchies of lasting value. Section 5 considers the pluralist approach, noting uncertainty about what systematic exploration of model structures might involve as well as worries about the predictive reliability of empirical models, but suggesting that differences between empirical models and other climate models should not be exaggerated. In discussing each proposal, we distinguish between aspects that are primarily practical, e.g., institutional and organizational aspects, and those that are primarily scientific.

In Section 6, we discuss the reasonably expected gains and costs of the three different approaches and offer some closing reflections. We argue that the policy-based argument for accelerating the development of very high resolution models is not entirely persuasive. We also suggest that piecemeal pursuit of the hierarchy approach and increased attention to empirical modeling approaches can be expected to benefit climate science without requiring much increase in resources. Finally, we note that substantial resources might be freed by bringing the number of complex climate models into line with the effective number of models of this kind.

2 Climate modeling today

Climate models are computer-implemented, numerical models used to simulate Earth’s climate system. They come in a range of types. Among the most familiar are energy balance models (EBMs), Earth system models of intermediate complexity (EMICs) and global climate models (GCMs).

The simplest EBMs represent the flux of energy in and out of the climate system as a whole but do not represent components of the climate system or Earth’s geography. EMICs do represent climate system components as well as Earth’s geography, but often in a relatively coarse and simplified way. GCMs are characterized by their higher resolution and by their explicit representation of a wide range of atmospheric and oceanic processes. Earth system models (ESMs), the latest generation of complex climate models, are akin to GCMs but also represent biogeochemical processes that are relevant to climate change. Another important kind of climate model is the regional climate model (RCM). RCMs have a higher resolution than the typical 100 to 300 km resolution of GCMs and ESMs, but the domains of RCMs cover only portions of the globe.
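
To give a concrete sense of the simplest end of this spectrum, the following is a minimal sketch of a zero-dimensional EBM, which balances absorbed solar radiation against outgoing longwave radiation for a single global-mean temperature. The parameter values and the forward-Euler integration are our own illustrative assumptions, not taken from any particular published model.

```python
# Minimal zero-dimensional energy balance model (EBM): the climate system is
# reduced to a single global-mean temperature T, with absorbed solar radiation
# balanced against outgoing longwave radiation. Parameter values are
# illustrative, not tuned to any particular study.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant (W m^-2 K^-4)
S0 = 1361.0            # solar constant (W m^-2)
ALBEDO = 0.30          # planetary albedo
EMISSIVITY = 0.61      # effective emissivity (a crude stand-in for the greenhouse effect)
HEAT_CAP = 4.0e8       # effective heat capacity (J m^-2 K^-1), roughly a 100 m ocean mixed layer

def ebm_step(T, dt=86400.0, forcing=0.0):
    """Advance global-mean temperature T (K) by one forward-Euler step of length dt (s).

    forcing: additional radiative forcing in W m^-2 (e.g., from added CO2).
    """
    absorbed = (1.0 - ALBEDO) * S0 / 4.0
    emitted = EMISSIVITY * SIGMA * T ** 4
    return T + dt * (absorbed - emitted + forcing) / HEAT_CAP

# Spin up to equilibrium, then apply a constant 3.7 W m^-2 forcing (~2xCO2).
T = 288.0
for _ in range(365 * 200):
    T = ebm_step(T)
T_eq = T
for _ in range(365 * 200):
    T = ebm_step(T, forcing=3.7)
print(f"unforced equilibrium: {T_eq:.1f} K, warming under 3.7 W m^-2: {T - T_eq:.2f} K")
```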

While all of these, and other, types of model continue to be important in climate research, in recent decades state-of-the-art GCMs and ESMs have come to occupy center stage. Rapid increases in computing power have made possible the use of GCMs and ESMs for a wide variety of purposes (Edwards 2011). They are used to project future changes in global climate and are employed in studies aimed at quantifying uncertainty about those changes. In addition, they play important roles in detection and attribution studies, and more broadly in attempts to identify the causes of a variety of climatic phenomena (Taylor et al. 2012). The justification for relying on GCMs and ESMs in these investigations has been the extensive physical knowledge that these models implement as well as their ability to simulate a wide variety of aspects of observed climate (IPCC 2013).

On a more practical level, developing and using GCMs and ESMs is resource intensive, requiring the collaboration of large numbers of scientists, software engineers and support staff as well as significant supercomputing time. In the U.S., for example, the National Center for Atmospheric Research and the Geophysical Fluid Dynamics Laboratory each carry out multi-million dollar modeling exercises that can involve over 100 workers (NRC 2012, p. 28). It is because GCMs and ESMs are so resource-intensive that there are only a few dozen of them being developed and maintained at any given time. Results from these models are compared periodically in model intercomparison projects (e.g., CMIP3 2007 and CMIP5 2011).

The GCM/ESM-centric approach in its current state, however, has a number of limitations. We mention three scientific ones that are particularly important in motivating the calls for change in the practice of climate modeling that are discussed below. First, current GCMs/ESMs still have spatial resolution that is too coarse to simulate explicitly a variety of important climate system phenomena, including cloud systems and ocean eddies. Parameterizations are used to represent these unresolved processes in a simplified way, but there is uncertainty about how these parameterizations should be constructed, which translates in turn into substantial uncertainty in projections of future climate change. Second, understanding the behavior of GCMs/ESMs can be quite difficult, both because these models are highly complex and because they differ from simpler climate models in numerous ways (e.g., they incorporate additional processes, represent some processes differently, etc.), making it harder to leverage understanding of simpler models to interpret the behavior of GCMs/ESMs. Third, available GCMs and ESMs constitute an ensemble of opportunity (Stainforth et al. 2007; Knutti et al. 2010; Hargreaves and Annan 2014), providing something closer to a set of best-guess projections than a range of independent projections that span our uncertainty. According to Pennell and Reichler’s (2011) analysis, for instance, the 24 CMIP3 (2007) GCMs behave like a set of only 7.5 to 9 independent models: while their actual number is 24, their effective number is only 7.5 to 9.
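
The idea of an effective ensemble size can be illustrated with a toy calculation. The sketch below applies one common eigenvalue-based heuristic to the inter-model error correlation matrix; it is not the specific procedure used by Pennell and Reichler (2011), and the "model errors" here are synthetic.

```python
import numpy as np

def effective_ensemble_size(errors):
    """Estimate the effective number of independent models in an ensemble.

    errors: array of shape (n_models, n_points) giving each model's error
    relative to observations. Correlated errors across models shrink the
    effective ensemble size. Here we use the eigenvalue-based measure
    N_eff = (sum lambda_i)^2 / sum(lambda_i^2) applied to the inter-model
    error correlation matrix (one common heuristic; other definitions exist,
    and this is not necessarily the procedure of Pennell and Reichler 2011).
    """
    corr = np.corrcoef(errors)              # (n_models, n_models) correlations
    eigvals = np.linalg.eigvalsh(corr)
    return eigvals.sum() ** 2 / (eigvals ** 2).sum()

# Synthetic illustration: 24 "models" whose errors share a common component,
# mimicking an ensemble of opportunity with non-independent members.
rng = np.random.default_rng(0)
shared = rng.standard_normal(500)
errors = np.array([0.55 * shared + 0.84 * rng.standard_normal(500) for _ in range(24)])
print(f"nominal ensemble size: 24, effective size: {effective_ensemble_size(errors):.1f}")
```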

3 A unified modeling approach

Shukla et al. (2009, 2010) call for a “revolution” in climate modeling that would involve changes in how climate models are constructed as well as where they are constructed and by whom. They call for the establishment of a few multinational climate modeling centers that pool human and computational resources in order to develop and deploy much higher resolution climate models:

“Because current computational infrastructures are funded through national resources, no single modeling center in the world has been able to acquire the required supercomputing resources and the critical mass of scientists to build and run climate models with cloud-system-resolving atmosphere, eddy-resolving oceans, and landscape-resolving land surfaces to investigate what is really needed scientifically to provide confident global and regional predictions over the next century” (Shukla et al. 2010, 1409).

Given societal demand for trustworthy information about future climate change, Shukla et al. see the development of climate models that do resolve these important smaller-scale processes as a high priority, meriting dedicated supercomputing facilities and the focused expertise of multinational teams of scientists. Their target here is the development of 1 km resolution ESMs, whereas current higher-end coupled models have resolutions of around 50 km. They emphasize (ibid., 1411) that the proposed multinational facilities should supplement, not replace, existing national modeling centers; the latter, however, would be free to focus on providing the specific predictions and services that their countries most urgently require (e.g., for local adaptation decisions), informed by the scientific and modeling developments emerging from the multinational facilities. It is also possible that perceived competition with the multinational facilities would stimulate national centers to try to ‘catch up’, encouraging further progress in the field.

Shukla et al.’s proposal can be described as a “unified” approach not just at the institutional level, where it calls for multinational cooperation in funding and expertise, but also in its philosophy of model development. In particular, it advocates a unified or “seamless” approach to model construction and evaluation (Palmer et al. 2008; Hurrell et al. 2009). The seamless prediction approach recognizes that the Earth’s climate system incorporates processes acting and interacting across a wide range of scales, from so-called “fast physics” that shapes the evolution of short-term weather conditions to ocean and other processes operating on much longer time scales. It envisions constructing ESMs such that they can be used to make not only predictions of climate change over decades and longer, but also short-term and seasonal weather forecasts. These forecasts can be compared with observations, providing out-of-sample tests of the models’ ability to simulate the processes that control the evolution of weather conditions on relatively short time scales. (Poor performance in simulating these processes can lead to significant errors in longer-term climate predictions as well.) Performance on these forecasting tests, it is hoped, can in turn guide model improvement.

An attempt at this doubly unified approach is already underway. The EC-Earth consortium includes scientists from ten European countries who have pooled expertise and (existing, distributed) computational resources to develop and run the EC-Earth model, an ESM that is built under a seamless prediction philosophy (see Hazeleger and Bintanja 2012). While further development of the EC-Earth model is ongoing, the consortium already reports significant “cross-fertilization” of ideas between the weather and climate modeling communities involved (see ibid.). The U.S. National Academy of Sciences has also recommended (NRC 2012) pursuing the seamless prediction approach alongside elements of the hierarchy approach.

Nevertheless, a number of challenges and questions remain for the unified approach. To begin with, there are questions about the extent to which the unified approach will lead to a reduction in the worldwide pool of GCMs/ESMs and whether this would hinder attempts to estimate uncertainty about future climate change. At the moment, there is no good substitute for using GCM/ESM ensembles to probe and quantify uncertainty (Yokohata et al. 2013). Palmer (2012) suggests that developing future ESMs in a probabilistic framework that includes stochastic parameterization of unresolved processes and allows for multiple parameterization schemes to be employed within a single simulation will provide sufficient model diversity, but the evidence for this is still only suggestive (see e.g., Weisheimer et al. 2011; Weisheimer and Palmer 2014; Weisheimer et al. 2014). Another response on behalf of the unified approach, however, is that the lack of independence of available GCMs/ESMs implies that substantial, careful reduction in the total number of available state-of-the-art climate models should be possible without further compromise in our ability to quantify uncertainty.
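
The stochastic parameterization idea mentioned above can be sketched schematically. In the toy example below, the tendency produced by a parameterization is multiplied by random noise, so that an ensemble of runs from a single model structure develops a spread; the toy model and the noise parameters are our own illustrative assumptions, not Palmer's proposal or any operational scheme.

```python
import numpy as np

def stochastic_tendency(parameterized_tendency, rng, sigma=0.3):
    """Return a stochastically perturbed parameterized tendency.

    In the spirit of stochastically perturbed tendency schemes: the
    deterministic tendency produced by a sub-grid parameterization is
    multiplied by (1 + r), where r is random noise, so that repeated runs
    sample uncertainty associated with the parameterization. The Gaussian
    noise and the value of sigma are illustrative choices only.
    """
    r = rng.normal(0.0, sigma)
    return (1.0 + r) * parameterized_tendency

def run(seed, n_steps=1000, dt=0.01):
    """Toy 'climate': a scalar state relaxing toward a parameterized forcing
    of 1.0 whose tendency is stochastically perturbed at every step."""
    rng = np.random.default_rng(seed)
    x = 0.0
    for _ in range(n_steps):
        forcing = stochastic_tendency(1.0, rng)   # perturbed parameterized forcing
        x += dt * (forcing - x)                   # resolved damping toward the forcing
    return x

# An ensemble of runs differing only in the noise seed yields a spread of
# outcomes from a single model structure.
ensemble = [run(seed) for seed in range(20)]
print(f"ensemble mean: {np.mean(ensemble):.3f}, ensemble spread (std): {np.std(ensemble):.3f}")
```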

A second scientific issue is that it is unclear to what extent more accurate predictions can be achieved. Expected improvements in predicting the statistics of weather events, due in part to improved simulation of processes that control the evolution of weather conditions on relatively short time scales, are in turn expected to lead to improved regional, seasonal predictions (Shukla et al. 2010). But resolving cloud systems and ocean eddies may not suffice to produce decadal and longer-term predictions of desired accuracy, for various reasons. It may be that missing representations of mechanisms that drive internal variability (and that will not be captured by increasing resolution) are an important source of current model limitations (see, e.g., Lovejoy 2014b). Likewise, if important climate feedbacks, such as land-surface feedbacks (see e.g., Knight and Harrison 2013; Aalto et al. 2014), are simply not included in the models at all, then predictions may again be undermined. It has also been suggested that, if climate prediction is to a significant extent an initial value problem, then long-term predictions of desired accuracy, including sufficiently accurate predictions of statistics, may be out of reach even for models that capture the relevant physics of the climate system (for competing perspectives see Stainforth et al. 2007; Hurrell et al. 2010; Pielke 2010; Pielke et al. 2012; Meehl et al. 2014).

Turning to more practical issues, there are various institutional challenges that will need to be addressed. Realizing the unified approach will cost hundreds of millions of dollars (Palmer 2014b). It may also require new forms of cooperation between climate scientists, software engineers and hardware specialists (Wehner et al. 2011). And the ESMs themselves will have to be developed, along with new parameterizations, software and hardware.

Finally, although pursuit of the unified approach would yield ongoing scientific benefits, it would take decades before its main decision-support goals might be realized. Palmer (2014a) indicates that the requisite computing power for running very high resolution climate models will probably not be available in climate institutes until the 2030s at the earliest, unless alternative computational approaches, such as imprecise computing, are pursued. But imprecise computing has the potential to shorten only one step in the process that culminates in better policy decisions. The new climate models will have to be developed and assessed for reliability, and development may take longer if a novel, imprecise computing approach is pursued. Moreover, assessing reliability of predictions will take anywhere from several years to many decades, depending on the lead times of those predictions. If demonstrable breakthroughs in prediction are forthcoming, additional time will be required in order to formulate, approve and manage policy based on these predictions.

4 Understanding through hierarchies of models

Held (2005) argues for a better balance between two kinds of modeling: the currently-dominant approach, in which today’s climate models are used to help us interpret observations and make predictions just until they are replaced by the next, “better” generation of complex models, and a second approach that involves the development and detailed study of hierarchies of models of lasting value. The base of a model hierarchy should consist of one or more simple models, grounded in physical theory. More complex models should relate in traceable ways to these simpler models, should be physically coherent, and should be only as elaborate as needed to capture an additional source of complexity in climate system behavior (Held 2005, 1613; 2014). Models at each new level in a hierarchy should be studied systematically, teasing out relations between emergent behavior and the underlying causal processes represented. Eventually, we should arrive at physically-coherent models that simulate important climate system phenomena in fairly rich detail and that – hopefully – behave in ways that can be made sense of using the understanding gleaned via the study of lower levels of the hierarchy. Held contends that there are “no alternatives” to the hierarchy approach if we want to understand the climate system (2005, 1610).
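
What "relating in traceable ways" might look like at the very bottom of a hierarchy can be sketched schematically. The toy model below adds a single ingredient, crude meridional heat transport, to a linearized version of the zero-dimensional energy balance sketched in Section 2; when the transport coefficient is set to zero, it collapses back to an independent zero-dimensional balance in each latitude band. It is our own illustrative construction, with made-up parameter values, not an example Held gives.

```python
import numpy as np

# One rung up a model hierarchy: a latitude-band (1-D) energy balance model.
# It adds a single ingredient (crude meridional heat transport) to a linearized
# zero-dimensional energy balance, and with D = 0 it collapses back to an
# independent zero-dimensional balance in each band. All parameter values are
# illustrative only.

N_BANDS = 18
lat = np.deg2rad(np.linspace(-85.0, 85.0, N_BANDS))
S0, ALBEDO, HEAT_CAP = 1361.0, 0.30, 4.0e8
A, B = 203.3, 2.09      # linearized outgoing longwave radiation: A + B*(T - 273.15), W m^-2
D = 0.6                 # transport coefficient (W m^-2 K^-1); D = 0 recovers the 0-D balance

# Crude latitudinal distribution of annual-mean insolation
insolation = S0 / 4.0 * (1.0 - 0.48 * (3.0 * np.sin(lat) ** 2 - 1.0) / 2.0)
weights = np.cos(lat) / np.cos(lat).sum()    # area weights for the global mean

def step(T, dt=86400.0):
    absorbed = (1.0 - ALBEDO) * insolation
    olr = A + B * (T - 273.15)
    # Relax each band toward the global-mean temperature: a very crude stand-in
    # for meridional heat transport that conserves energy globally.
    transport = D * ((T * weights).sum() - T)
    return T + dt * (absorbed - olr + transport) / HEAT_CAP

T = np.full(N_BANDS, 288.0)
for _ in range(365 * 100):
    T = step(T)
print(f"global mean: {(T * weights).sum():.1f} K, "
      f"equator-pole contrast: {T.max() - T.min():.1f} K")
```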

As Held notes, the importance of model hierarchies has long been recognized (see e.g., Schneider and Dickinson 1974; Hoskins 1983). Yet today’s ‘hierarchy’ of climate models – usually taken to consist of the full collection of models, from EBMs to ESMs and beyond – differs in significant ways from the hierarchies that Held envisions. The models in today’s ‘hierarchy’ generally do not relate in traceable ways to one another, nor have they been comprehensively and systematically analyzed as Held recommends. On the contrary, the tendency has been continually to add processes and detail to models at the complex end of the spectrum, even as the behavior of existing models remains relatively poorly understood (see also Jakob 2014). Held contends that, in the absence of systematic analysis of existing models, model development often resembles an “informed random walk”:

“Model builders put forward various ideas based on their wisdom and experience, as well as their idiosyncratic interests and prejudices. Model improvements are often the result of serendipity rather than systematic analysis” (Held 2005, 1611).

The hierarchy approach is meant to improve upon this, making model development more directed and perhaps more efficient. A final difference between Held’s envisaged hierarchies and available hierarchies is their stability. Current models at most levels of complexity are liable to change in substantial ways as parameterizations improve. Held, by contrast, seems to envisage hierarchies that are more stable; they will have “lasting value”, presumably because they manage to capture key sources of complexity in a physically-coherent way and thus will have less need for revision.

Held focuses primarily on describing the scientific aspects of the hierarchy approach. The approach may, however, have noteworthy organizational implications as well. For example, since the development of stable hierarchies is to be pursued alongside the development of ‘replaceable’, high-end models, it may be that substantially different specializations within the modeling community will evolve and, moreover, will create a need for individuals who can facilitate effective communication and cooperation between these specializations.

As with the proposal for a unified approach, a number of challenges, both practical and scientific, arise in connection with the hierarchy approach. On the practical side, the proposal seems to require a cultural change in climate modeling; at present, improving high-end models seems to be more highly valued than systematic analysis of simple and intermediate-complexity models. On the scientific side, Held himself notes the challenge of reaching agreement on which climate models should be included in hierarchies if they are to have lasting value (Held 2005, 1609).

But there are other scientific questions and challenges as well. First, the extent to which physically-coherent models and hierarchies can be constructed is unclear since, for sufficiently rich phenomena, the need for parameterization becomes inevitable at some point in the hierarchy development process. Second, insofar as parameterizations are employed, we can expect that some will turn out to have limited applicability outside of the circumstances in which they were developed. This will limit the extent to which these models, and the hierarchies in which they are embedded, capture key sources of complexity in the climate system and thus have lasting value.

Third, it is unclear how far the reductive approach to understanding that Held’s proposal seems to endorse will ultimately take us. In making a case for the development of hierarchies, Held draws an analogy with biology. Just as in biology a number of relatively simple “model organisms” – organisms which, like E. coli, are studied in great detail in order to learn about other organisms – have helped biologists to understand a variety of biological phenomena in more complex organisms, so too simple, albeit artificial, climate models should help climate scientists to understand the climate system (Held 2005, 1610; 2014). However, as Love and Hüttemann (2011) argue, biology also illustrates that explaining complex phenomena in terms of simpler components and systems can fail. For example, attempts to understand the three-dimensional structure of proteins in terms of RNA translation will fall short, because it turns out that protein folding is also mediated by so-called ‘chaperone proteins’ in the environment, among other things (ibid.). There is no guarantee that what is learned about the behavior of simpler models will continue to be applicable in the case of more complex models as additional, nonlinear feedbacks come into play. Rather, it is an empirical question to what extent understanding gained at lower levels in a hierarchy can be leveraged to understand more complex models and the climate system itself. And of course even when such understanding can be leveraged in this way, it may not be feasible to do so in the immediate term (see also Harrison and Stainforth 2009; Held 2014).

Finally, it is important to recognize the value of non-reductive approaches to advancing understanding of the climate system, which to some extent are already employed. To take a familiar example, conservation considerations allow us to understand some aspects of the climate system; rather than showing how these aspects of the climate system emerge from component parts or processes, we show that they are a necessary consequence of conservation constraints. Likewise, there are efforts underway to advance understanding of climate variability by appeal to a balance (or lack thereof) of slower and faster climate system processes, without enumerating what all of those processes are (Lovejoy 2014b). It is important that non-reductive approaches to advancing understanding not be overlooked as a resource to complement Held’s hierarchies approach.

5 Increased model diversity

A third approach aims to tackle the limitations of current modeling practice by increasing substantially the diversity of climate models that are employed in climate research.

The published literature provides no unified perspective on what this diversification should involve or how it should proceed, so we synthesize a proposal that includes optional elements and draws on a number of sources. This proposal calls for increased diversity in EMICs, ESMs and other physics-based models; for some, it also includes a call for the development of alternative types of model, especially data-driven and semi-empirical models.

Calls for increased diversity in climate modeling are often motivated by the goal of improved uncertainty quantification. Knutti et al. (2010), for example, argue that quantifying uncertainty about future climate requires an ensemble of climate models with a reasonable spread of model structures (see also McWilliams 2007). In the case of GCMs/ESMs, this suggests developing models that more comprehensively and more systematically sample uncertainty about climatic processes than available ensembles of opportunity do. For uncertainty exploration, simple models also can be attractive, since their low computational cost allows exploration of a wide range of hypotheses about the climate system (see e.g., Wigley and Raper 2001; Forest et al. 2002). By contrast, due to computational constraints, only a few runs of each available GCM/ESM may be possible for a given emission scenario (see IPCC 2013).
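
To indicate why low computational cost matters here, the sketch below runs a two-parameter linear energy balance model thousands of times, sampling uncertain feedback and ocean heat uptake parameters and recording the resulting spread in transient warming. The model, parameter ranges and sampling scheme are all illustrative assumptions of ours, not those of the cited studies.

```python
import numpy as np

def transient_warming(lam, kappa, forcing=3.7, years=70, dt=0.1):
    """Warming (K) after `years` of constant forcing from a two-parameter toy model:
    C dT/dt = forcing - (lam + kappa) * T, where lam is the climate feedback
    parameter and kappa crudely represents ocean heat uptake efficiency
    (both in W m^-2 K^-1). Purely illustrative.
    """
    C = 8.0   # effective upper-ocean heat capacity, W yr m^-2 K^-1
    T = 0.0
    for _ in range(int(years / dt)):
        T += dt * (forcing - (lam + kappa) * T) / C
    return T

# Sample broad, illustrative parameter ranges (not assessed values) and examine
# the induced spread in 70-year warming. Thousands of runs of a model like this
# cost essentially nothing, unlike runs of a GCM/ESM.
rng = np.random.default_rng(1)
lam = rng.uniform(0.8, 1.8, size=2000)      # feedback parameter
kappa = rng.uniform(0.4, 1.2, size=2000)    # ocean heat uptake efficiency
warming = np.array([transient_warming(l, k) for l, k in zip(lam, kappa)])
print(f"5-95% range of 70-year warming: "
      f"{np.percentile(warming, 5):.2f} to {np.percentile(warming, 95):.2f} K")
```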

Many authors, though by no means all those who advocate a pluralist approach, also recommend further supplementing the existing hierarchy of physics-based climate models with data-driven and semi-empirical models (Kravtsov et al. 2009; Steinhaeuser et al. 2011; Tsonis 2012; Lovejoy and Schertzer 2013; Curry 2013). The hope is that insights from Earth-system dynamics, techniques drawn from computer science and formal learning theory, and the availability of increasing quantities of climatic data will allow data-driven and semi-empirical models to contribute in substantial ways to predicting and understanding climate. For example, although Lovejoy (2014a, b) argues that current GCMs/ESMs cannot simulate the weather-like behavior of climate over periods longer than 30 years because they lack representations of mechanisms relating to internal variability, he suggests that effective use of paleo-data and more recent empirical data may allow prediction of this behavior. With respect to understanding climate, empirical and quasi-empirical modeling may reveal clues about the drivers and sensitivities of emergent climatic phenomena, including regional climate phenomena and phenomena that physics-based models do not yet adequately simulate; for example, they may do so by revealing correlations between largely internally driven modes of climate variability and temperature patterns (see, e.g., Tsonis et al. 2007; Steinhaeuser et al. 2011; Ebert-Uphoff and Deng 2012; Wyatt and Curry 2014).
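
As a deliberately simple illustration of the semi-empirical style, the sketch below regresses a global-mean temperature series on a radiative forcing series and an index of internal variability, then uses the fitted coefficients to extrapolate. All of the series are synthetic placeholders, and actual studies of this kind involve far more careful statistical treatment.

```python
import numpy as np

# Semi-empirical sketch: model the global-mean temperature anomaly as a linear
# combination of external radiative forcing and an index of internal
# variability, with coefficients estimated from data. All series below are
# synthetic placeholders for observed forcing, an ENSO-like index and observed
# temperatures.

rng = np.random.default_rng(2)
n_years = 150
forcing = np.linspace(0.0, 2.5, n_years)                  # W m^-2, ramping up
variability_index = rng.standard_normal(n_years)          # ENSO-like index
temperature = (0.5 * forcing + 0.1 * variability_index
               + 0.1 * rng.standard_normal(n_years))      # synthetic "observations"

# Fit T = a*forcing + b*index + c by ordinary least squares
X = np.column_stack([forcing, variability_index, np.ones(n_years)])
(a, b, c), *_ = np.linalg.lstsq(X, temperature, rcond=None)
print(f"sensitivity to forcing: {a:.2f} K per W m^-2, index coefficient: {b:.2f} K")

# The fitted relationship can then be extrapolated to a hypothesized future
# forcing, subject to the usual caveats about extrapolating empirical fits.
print(f"implied anomaly at 3.5 W m^-2: {a * 3.5 + c:.2f} K")
```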

A more modest suggestion is to use the performance of data-driven and semi-empirical models as a baseline for quantifying the ‘value added’ by the detailed physical treatments of more complex models (Suckling and Smith 2013); if the value added is currently small for a predictive task of interest, perhaps the extra cost of running the complex models is not justified.
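
The 'value added' comparison can be made concrete by scoring a complex model's hindcasts and a cheap empirical baseline against the same observations, as in the sketch below. The series are synthetic, and root-mean-square error is used for simplicity; Suckling and Smith (2013) work with more sophisticated probabilistic skill scores.

```python
import numpy as np

def rmse(forecast, obs):
    return float(np.sqrt(np.mean((forecast - obs) ** 2)))

# Synthetic stand-ins for a verification exercise: 'obs' is an observed climate
# quantity over a hindcast period, 'complex_model' holds hindcasts from an
# expensive simulation, and the empirical baseline is a simple persistence
# forecast (the previous observed value carried forward).
rng = np.random.default_rng(3)
obs = np.cumsum(rng.normal(0.02, 0.15, size=50))          # slowly drifting series
complex_model = obs + rng.normal(0.0, 0.12, size=50)      # imperfect hindcasts
persistence = np.concatenate([[obs[0]], obs[:-1]])        # cheap empirical baseline

skill_model = rmse(complex_model, obs)
skill_baseline = rmse(persistence, obs)
value_added = 1.0 - skill_model / skill_baseline          # > 0: model beats the baseline
print(f"RMSE complex model: {skill_model:.3f}, RMSE persistence: {skill_baseline:.3f}, "
      f"relative value added: {value_added:.2f}")
```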

On a practical level, the pluralist approach, like the hierarchy approach, faces the challenge of institutional inertia. Substantially diversifying the pool of GCMs/ESMs may take some attention and resources away from existing modeling projects and, while developing data-driven approaches might not be terribly demanding financially or organizationally, such approaches currently are unpopular.

A scientific challenge for the pluralist approach concerns the sampling of structural uncertainty. Current knowledge gives no clear picture of the space of model structures that should be sampled, nor of what it would mean to adequately or systematically sample that space (Smith 2002; Murphy et al. 2007; Parker 2010). Pluralist calls for diversification of GCM/ESM structures need to be accompanied by some suggestion of how this diversification should proceed. For instance, is it more important to more thoroughly sample uncertainty associated with already-included processes or to expand the range of processes and feedbacks included? How should such decisions be approached? (This challenge does not apply, however, for pluralists who call for exploration of hypotheses about the climate system that are suggested by empirical data or by physical reasoning about incompletely understood climatic mechanisms.)

Advocates of data-driven and semi-empirical models face a different worry: compared to GCMs/ESMs, data-driven and semi-empirical models have a much more limited grounding (if any) in physical theory, calling into question their trustworthiness for inferring future climate and the causes of climate change. Yet one should not exaggerate the differences between standard climate models and those that are considered semi-empirical or even data-driven. GCMs and ESMs are themselves substantially empirical because they incorporate a number of parameters whose values are set in part by tuning to empirical data. Moreover, studies that include a strong empirical modeling component sometimes are preferred to those that just rely on GCMs/ESMs. The use of optimal fingerprinting to quantify the causes of recent climate change is a salient example. It relies on GCMs/ESMs to determine the spatiotemporal patterns of change that are expected in response to individual forcing factors, but it does not simply take at face value the magnitudes of the simulated changes; it estimates those magnitudes – and hence the contributions of the different forcing factors – by fitting combinations of the simulated patterns to recent observations (Katzav 2013).
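
The fitting step just described can be illustrated schematically: an observed pattern of change is regressed onto model-simulated response patterns, and the estimated scaling factors indicate how much of each simulated response is needed to match observations. The sketch below uses synthetic patterns and ordinary least squares, omitting the weighting by internal-variability covariance and the errors-in-variables treatment that real 'optimal' fingerprinting analyses employ.

```python
import numpy as np

# Schematic fingerprinting regression: an observed pattern of change y is fit
# as y ~ beta_ghg * X_ghg + beta_nat * X_nat, where the X's are model-simulated
# response patterns to individual forcings (flattened over space and time).
# Everything here is synthetic, and the "optimal" weighting by internal
# variability covariance used in real detection and attribution studies is
# omitted for simplicity.

rng = np.random.default_rng(4)
n = 200                                    # number of space-time elements
X_ghg = rng.standard_normal(n)             # greenhouse-gas response pattern
X_nat = rng.standard_normal(n)             # natural-forcing response pattern
internal_variability = 0.5 * rng.standard_normal(n)

# Synthetic "observations": the simulated patterns scaled by (unknown) factors,
# plus internal variability.
y = 1.1 * X_ghg + 0.4 * X_nat + internal_variability

X = np.column_stack([X_ghg, X_nat])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated scaling factors: GHG {beta[0]:.2f}, natural {beta[1]:.2f}")
# The attributed contributions come from these scaling factors applied to the
# simulated patterns, not from the raw simulated magnitudes alone.
```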

In any case, the role of data-driven and semi-empirical climate models can be thought of as supplementary to the roles of other climate models. Data-driven and semi-empirical climate models might provide an independent check on results arrived at by other means, allowing increased confidence where there is agreement or, where there is not, providing a stimulus for further investigation. Semi-empirical models also are in some ways more flexible than GCMs/ESMs, allowing easier formulation and testing of some hypotheses about climatic phenomena. For instance, it may be relatively straightforward to modify a semi-empirical model so that it embodies a new hypothesis about processes contributing to a pattern of variability, whereas modifying the physics of a high-end model so that it does so may be quite challenging.

6 Putting it all together: gains, costs and other considerations

Building on the discussion above, Table 1 summarizes the gains that we suggest can be reasonably expected by pursuing each of the unified, hierarchy and pluralist approaches independently, when it comes to three important scientific goals: advancing understanding, increasing the reliability of predictions and improving the quality of uncertainty assessments. Costs are also estimated in a qualitative way. All are relative to a business-as-usual baseline, i.e., one in which climate modeling continues to follow the approach that it has followed in recent decades. (Admittedly, it is not easy to fill in this table; we welcome alternative analyses that prompt further discussion of the benefits and costs of the different strategies.)

Table 1 Plausible costs and scientific gains of the unified, hierarchy and pluralist approaches relative to a business-as-usual baseline, assuming each approach is pursued independently

Table 1 suggests that the different approaches are complementary. In an ideal world with unlimited funding and expertise, perhaps all three approaches could be pursued alongside current modeling practices. The unified approach’s emphasis on prediction would be complemented by the hierarchy approach’s emphasis on understanding; the pluralist approach would augment these with alternative uncertainty assessments and by exploring less reductive approaches to prediction and understanding. Synergies among the approaches could be expected: for example, increased understanding would likely facilitate more reliable predictions, at least for some quantities. And all of this would occur without sacrificing the relative security provided by current practice.

But the actual world is not an ideal one, leaving difficult questions about what the future of climate modeling should look like and how desired changes could be effected in practice. We cannot hope to answer these questions fully here, but we can offer a few remarks.

It is noteworthy that only one of the proposals – the unified approach – seems to require huge increases in funding. While this approach can be expected eventually to yield improvements in predictive accuracy for at least some lead times, we saw above that demonstrably-reliable predictions will not be available for some time: it will take time to develop very high-resolution models and the supercomputers on which they will run and, once these models are developed and implemented, it will take many years, and in some cases multiple decades, to collect meaningful statistics on their performance in seasonal-to-interannual, decadal and longer-term climate prediction. But for many climate-related decisions, we cannot afford to wait. This is not to deny that developing very high resolution climate models or pursuing a seamless prediction strategy has value; it is merely to cast doubt on the idea that accelerating efforts in this direction can make much difference to climate decision making in the near term. In fact, there is a worry that the pursuit of more accurate predictions that will guide ‘better’ decisions later may delay needed efforts to reduce vulnerability in the near term (Dessai et al. 2009; Lemos and Rood 2010).

We note also that some ways of improving the practice of climate modeling seem to be within easier reach. This includes the piecemeal pursuit of the hierarchy approach as well as increased attention to empirical modeling, especially empirical modeling undertaken with the aim of advancing understanding of climate phenomena. These activities are ones that can be undertaken locally, by individual researchers or modeling groups, and at relatively little additional cost (beyond business as usual). Moreover, they can be expected to yield gains even if other researchers maintain the business-as-usual modeling approach. The same is true of some increase in efforts to develop alternative GCM/ESM structures. Doing so is not yet the explicit target of major modeling efforts, despite its potential.

Another reasonable change to current practice would be to try to bring the number of state-of-the-art GCMs/ESMs into line with the effective number of models of this kind. This would free up resources that could be invested in the pursuit of any of the approaches identified above. Admittedly, this course of action would involve significant challenges, both scientific and institutional. On the scientific side, for instance, there are still questions about how model independence is best conceptualized and assessed. Institutionally, there would be difficult decisions about which GCMs/ESMs should be abandoned or replaced. Still, further attention to this course of action seems warranted.

Finally, while the present discussion has focused on proposals for changing the practice of climate modeling, it points to a larger question: that of how resources can best be directed to advance climate science. It may be that alternatives to climate modeling – such as theorizing that is not model-driven, efforts to expand or update observing systems, more careful empirical investigation of poorly represented (or omitted) feedbacks, or development of much more detailed and careful process models – are at least as important.