Global warming: a review of this mostly settled issue
- Cite this article as:
- Keller, C.F. Stoch Environ Res Risk Assess (2009) 23: 643. doi:10.1007/s00477-008-0253-3
Is the surface temperature record accurate or is it biased by heat from cities, etc.?
Is that record significantly different from past warmings such as the Medieval Warming Period?
Are human greenhouse gases changing the climate more than the sun?
Can we model climate and predict its future, or is it just too complex and chaotic?
Are there changes in climate other than warming, and can they be attributed to the warming?
Keywords: Climate · Climate change · Global warming · Climate modeling · Atmosphere · Ocean · Greenhouse gases · Carbon dioxide · Solar activity · Environment · Ecosystems
getting the best answer to the two questions above, and
communicating it to policy people in terms that they can understand and use.
Most of the ensuing controversy over the answers the IPCC has been publishing probably results from the urgency associated with the carbon increase in the atmosphere. Because CO2 has a residence time in the atmosphere of over a century, it is important to get the earliest possible estimate of how much and how soon any warming might come if we are to do anything to slow projected warming before it occurs. Thus, climate scientists find themselves trying to see an initially small signal in a noisy, chaotic climate. The IPCC's very guarded 1990 statement that we are beginning to identify this signal was an attempt to give the earliest possible warning commensurate with our understanding. Since then, in successive five- to six-year updates, the IPCC has made increasingly strong statements, culminating in the 2007 statement that it is very likely (>90% certain) that recent (last quarter century) warming is mostly due to human emissions from fossil fuel burning. The motivation for establishment of the IPCC was its ability to provide an authoritative scientific review and to reflect a scientific consensus without undue influence from extreme positions. This review finds that the IPCC has given us a fairly comprehensive assessment of the current understanding of these two questions. Summaries of the IPCC's Fourth Assessment Report can be found at: http://www.ipcc.ch/.
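The urgency follows directly from that residence time: a pulse of CO2 emitted today leaves the atmosphere only slowly, so delay in acting locks in future forcing. As a rough illustration (the 120-year e-folding timescale is an assumed, simplified value; the real carbon cycle removes CO2 on several distinct timescales), the airborne fraction of a pulse can be sketched in a few lines:

```python
import math

def fraction_remaining(t_years, e_folding_years=120.0):
    """Fraction of a CO2 pulse still airborne after t_years under a
    single-exponential (e-folding) decay model. Illustrative only:
    the real carbon cycle is multi-timescale, not one exponential."""
    return math.exp(-t_years / e_folding_years)

# Under this toy model, over 40% of a pulse emitted today
# would still be airborne a century later.
century_fraction = fraction_remaining(100)
```

Even this crude model makes the point: decisions made now determine the greenhouse forcing seen by people living a century hence.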
The observational record of warming at the Earth’s surface is flawed and not reliable. There has been little warming since 1945.
Climate has always varied naturally, sometimes much more than during the 20th century, and so what warming we are experiencing is probably from natural causes like the sun.
Observed retreat of glaciers can be attributed to lack of snowfall or increasing temperatures as we come back to normal after the Little Ice Age (LIA).
The climate is far too complex and chaotic to be modeled even by present supercomputers; thus these huge General Circulation Models (GCMs) cannot be relied on either to attribute observed warming to AGHGs or to predict the timing or extent of future warming.
Because these are such common objections among those who do not know much about the subject, I will be careful to address them in this review, especially in Sect. 5 on computer simulation of climate, without which we could still see a human fingerprint but could not quantify it adequately.
A word about the critics—many of them are indeed associated with businesses or political entities that have a vested interest in denying the hypothesis, but many more are good scientists who are just hypercritical of everything. Every scientific group has these people, who are usually of value since they call for a certain rigor in research. But, in excess, they eventually hurt the process as seems to be the case here. Looking at their stance on other scientific issues is a clue to their underlying philosophies (many of them deny that DDT ever caused near extinction of our national bird and other fish-eating birds or that cigarettes cause lung cancer). They claim these were hoaxes despite a huge body of evidence to the contrary.
It will be the underlying position of this review that, while each observational or theoretical finding has accompanying uncertainties, they all seem to point in the same general direction so that taken together (rather than separately as the critics require), they reduce our uncertainty considerably.
2 Taking the Earth’s temperature (Is the late twentieth century warming real?)
2.1 Non-instrumental indications of warming
What matters in the short term, however, is how long the Arctic takes each year to form 1–2 m of sea ice, so that the atmosphere is insulated from the relatively warmer ocean. Once that happens, the surface can radiate away as a black body through air with low absolute humidity, and this allows formation of air at –40°C that can then spread south and create the winter weather we are used to in North America and Greenland. Basically, until there is 1–2 m of ice, the cold Siberian air blowing across the Arctic to chill North America is warmed by the open ocean. So we might expect later onset of winter in North America—something to watch for.
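The radiative argument here is just the Stefan–Boltzmann law: once ice cuts the surface off from the ocean's heat, the cold surface radiates in proportion to the fourth power of its temperature. A minimal sketch (the two temperatures are chosen for illustration, roughly an ice surface near –40°C versus open polar ocean near –2°C):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(temp_kelvin):
    """Radiative flux from an ideal black body (Stefan-Boltzmann law)."""
    return SIGMA * temp_kelvin ** 4

flux_ice = blackbody_flux(233.0)    # ice surface near -40 C
flux_ocean = blackbody_flux(271.0)  # open ocean near -2 C
# The open ocean loses heat nearly twice as fast (~306 vs ~167 W/m^2),
# which is why air over open water cannot chill to -40 C.
```

The fourth-power dependence is why the insulating ice layer matters so much: a modest temperature difference translates into a large difference in radiative loss.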
The situation in the Antarctic is more complicated. Sea ice is being lost on the west side of the continent and slightly increasing on the east. What's happening? The answer has to do with topography. While in the Arctic there is no elevation change—only sea level, where storms from more southerly latitudes can sweep across the entire area, warming it—Antarctica (especially eastern Antarctica) is a continent, high in the center and falling to sea level at the coast. Cold central air “drains” down to the sea as water would and is swept into a circular vortex by the Coriolis force (due to the Earth's rotation). This vortex essentially isolates central Antarctica from warm, lower latitude storms, allowing for dramatic cooling. This hyper-cold air is transported to the coast, where it causes sea ice to form. Note that the western side is more of a peninsula without much high ground. It thus acts more like the Arctic, and indeed this is where the ice sheets are breaking up, rapid glacial flow is occurring (Vaughn 2007), and the tip of the peninsula is one of the most rapidly warming places on earth! Another consequent difference is that Arctic summer sees temperatures well above freezing, while this hardly ever occurs during Antarctic summer. Thus small increases in Arctic temperature increase melting, while similar increases in the western Antarctic still cause little or no melting. Rates of surface elevation change, derived from ERS radar-altimeter measurements between 1992 and 2003 over the Antarctic Ice Sheet, are discussed by Davis et al. (2005), and locations of ice shelves estimated to be thickening or thinning by more than 30 cm year−1 by Zwally et al. (2006). There is growing concern that, at least in the west, these structures are becoming unstable to melting.
2.2 National Snow and Ice Data Center
What about Greenland? Greenland is usually out of step with the rest of the world temperature-wise, lagging in warming at the end of the last ice age and, until just recently, lagging in recent warming, largely due to persistent weather patterns that allow Arctic air to move south over it. However, the most recent satellite and surface measurements are starting to show accelerated melting and more rapid movement. It is thought that meltwater is penetrating to the base of the glaciers, lubricating them and causing them to move faster (Schoof 2007).
Another way to look at past climate change is to study mountain glaciers from the highest peaks in the tropics. Lonnie Thompson and his team have been doing this for several decades. It is quite a story, since they must not only climb the peaks carrying tons of ice coring equipment but must remain there for weeks drilling the cores. (For a readable telling of this story and what they have found, see Bowen 2005.) They found to their surprise that these glaciers held a climate record not just a few hundred years old, but mostly over 10,000 years old. This work is particularly important since most ice core studies have been limited to polar regions, which might not be representative of climate change over most of the earth. These were records from the tropics, which many think is where much of the Earth's climate originates. One of the most interesting discoveries was that these glaciers help solve what has been called the “Mercer Problem,” named for its discoverer: Antarctic ice cores show the last ice age gradually ending some 2,000–3,000 years before it did in Greenland. Thus it would seem that the end of the ice age began in the southern hemisphere. Thompson's mountain glaciers generally agree with the Antarctic results (Thompson et al. 2006).
This caused another team (Schaefer et al. 2006) to look more closely at glacial activity in the northern subtropics such as the USA. Working with geologists who can date the retreat of North American ice sheets, they were able to find that these also began retreating about 17,500 years ago. The picture now emerging is one that seems to isolate Greenland as one of the only places where warming did not start until 15,000 years ago. As a side note, this and other findings are showing increasingly that Greenland is often not representative of climate elsewhere around the globe. Thompson’s team recently published a summary of their work on the Andes, Africa, and the Himalayas (Thompson et al. 2006). While temperature records from different locations often differ substantially from each other, their composite average is in remarkable agreement with other attempts to use proxies to determine temperatures in the past few thousand years (Sect. 3) and adds yet another datum to show that the Medieval Warming Period was not as warm as today.
In addition to and largely agreeing with instrumental measurements (Sect. 2.2) observations of a number of non-instrumental temperature indicators around the world have been made. In nearly all cases these show the effects of warming, lending credibility to instrumental records. Of particular near-term concern is the rapidly melting Arctic sea ice.
2.4 Surface and satellite temperature records
2.5 Satellite observations of middle troposphere
Satellite temperature observations are made with instruments that detect microwave radiation from oxygen in the atmospheric column (thus the name Microwave Sounding Units—MSUs). These are tuned to regions of the electromagnetic spectrum that change measurably with temperature and that sample specific vertical regions of the atmosphere.
For perhaps two decades a conundrum persisted because reduction of satellite temperatures showed little warming trend in the past 25 years. That there was something wrong with both satellite and balloon data reductions has turned out to be true. One problem was that the satellite record had really only been studied by a single team, that of Christy et al. (1998, 2003) at the University of Alabama in Huntsville (referred to hereafter as UAH). Since that time several other groups began looking at these data and found much to be concerned about, from changes in the upper atmosphere with the solar cycle to satellite orbit drifts. One group (Mears et al. 2003) at Remote Sensing Systems in California (hereafter referred to as RSS) took exception to the manner in which the UAH team calibrated the data from successive satellites. Since the 25-year record came successively from nine such satellites, they all had to be put on the same absolute scale. Given the assumptions UAH made, most of these linkages between overlapping satellites seemed acceptable, but one in the 1991–1992 time frame looked suspicious, as it required a tenfold larger calibration factor than all the others.
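The disputed cross-calibration step can be pictured simply: each successor satellite's record is shifted onto its predecessor's scale by matching the two during their overlap period. The sketch below uses invented numbers and mean-matching only; operational merges also model diurnal drift and instrument-body temperature effects.

```python
def merge_records(rec_a, rec_b, overlap):
    """Put successor record rec_b on rec_a's scale by matching their
    means over a shared overlap period (list of indices `overlap`).
    A minimal sketch of inter-satellite cross-calibration."""
    offset = (sum(rec_a[i] for i in overlap) / len(overlap)
              - sum(rec_b[i] for i in overlap) / len(overlap))
    return [t + offset for t in rec_b]

# Toy monthly anomalies: satellite B reads ~0.27 C low during overlap.
a = [0.10, 0.12, 0.15, 0.18]
b = [-0.18, -0.15, -0.12, -0.02]
b_adjusted = merge_records(a, b, overlap=[0, 1, 2])
```

The point of contention was exactly this offset: a small error in one linkage propagates through every later satellite in the chain, biasing the whole multi-decade trend.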
Note here that this is just the time frame in which I noted in OR that something strange had happened. I observed that the surface and satellite records agreed rather well from 1979 to 1991 and from 1992 to 2000, but that during 1991 the satellite record had suddenly dropped in a step of about 0.2°C below the surface record. Since that was just when the atmosphere was reacting to the large injection of dust into the stratosphere by the eruption of Mt. Pinatubo, I wondered whether this had been the cause. Now there seemed to be evidence for an error in the cross calibration of two satellites which was masked by the volcano-induced cooling. The RSS group revised this calibration and made other improvements. The result was that the satellite record showed warming similar to that at the surface! UAH countered that these corrections were a matter of judgment and that, since their results were corroborated by the radiosondes, which made in situ measurements of temperature in the middle troposphere, theirs was the more correct reduction and RSS had made some errors. However, evidence for a larger warming trend continued to build. Another group (Vinnikov and Grody 2003) reduced the satellite data by an independent method which corrected for both instrumental calibration errors and satellite orbital drift. This work got even more warming at altitude than that of the RSS group!
At this point there appeared an extremely important series of papers (Fu and Johanson 2004, 2005; Fu et al. 2004) which questioned another reduction procedure of the UAH group. Satellites measure temperature with several channels of the Microwave Sounding Unit (MSU). Two (MSU2 and MSU4) are most commonly used to determine trends. Channel 2 measures temperature mostly in the troposphere but unfortunately also includes a 17% contribution from the lower stratosphere, which must be removed because the stratosphere is cooling. To do this both UAH and RSS use a method that has the satellite look sideways towards the horizon in successive steps and then, with a theoretical model of the standard atmosphere, removes the stratospheric component. The resulting temperature record is referred to as T2LT (temperature from channel 2 in the lower troposphere). Fu's group was critical of that approach and proposed a more accurate method, taking advantage of channel 4, which measures temperature in the stratosphere. The two channels overlap in the lower stratosphere, and by their method Fu and Johanson determined another version of TLT (effectively slightly higher in the troposphere than the others). They then applied this method to both the UAH and RSS determinations. The result was that both now showed even greater warming [this method is referred to as the University of Washington adjustment (Fig. 2.9)].
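The Fu and Johanson correction amounts to a linear combination of the two channels: the channel-4 (stratospheric) trend is used to subtract the stratospheric leakage from channel 2. The coefficients and trend values below are illustrative of the published magnitudes, not the exact operational numbers:

```python
def remove_stratospheric_leakage(t2_trend, t4_trend,
                                 a2=1.156, a4=-0.153):
    """Correct the MSU channel-2 trend for its lower-stratospheric
    contribution using the channel-4 trend, in the spirit of the
    Fu et al. two-channel method. Coefficients are illustrative."""
    return a2 * t2_trend + a4 * t4_trend

# Illustrative decadal trends (degC/decade): modest raw channel-2
# warming diluted by a cooling stratosphere seen by channel 4.
corrected = remove_stratospheric_leakage(0.12, -0.45)
# Because the stratospheric trend is negative, subtracting its
# contribution raises the inferred tropospheric warming trend.
```

This makes clear why the correction matters: since the stratosphere is cooling, any stratospheric contamination left in channel 2 biases the tropospheric trend low.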
It has been argued that this method is too inaccurate and unconstrained to be used, but an elegant study (Gillett et al. 2004) used a climate model as a proxy for the actual atmosphere and applied the method to it. They found it gave extremely accurate results. To test this numerical method the model doesn't have to be perfect; it just has to produce the general characteristics of a troposphere with a cool stratosphere above it. We take from this that the method of combining channels 2 and 4 is robust in separating out the stratospheric component in channel 2 and should be the preferred one to use.
This all changed in the fall of 2005 when three papers (Mears and Wentz 2005; Santer et al. 2005; Sherwood et al. 2005) appeared in a single issue of Science magazine. These three articles, plus a fourth described below, taken together essentially ended the problem of middle troposphere temperature trends not agreeing with those measured at the Earth's surface. In summary, the first shows the effect of correcting for satellite drifts, which builds on earlier work and further renders MSU trends commensurate with surface ones. The second paper shows that radiosonde-determined trends and lapse rates disagree with theory and models, which predict amplification of warming with altitude (they do agree for short time periods, just not decadal and longer). The third gives a strong argument for why the radiosondes are inaccurate.
These three papers are discussed in detail in my update to OR (Keller 2007). Suffice it to say that following them the UAH determination was revised as in Fig. 2.8. Figure 2.9 compares the current determinations of global temperature and their warming trends (cf. Fig. 2.6, surface temperatures). A global image of the RSS mid-tropospheric temperatures is shown in Fig. 2.10. Note that, as predicted by the models with GHG forcing, warming increases poleward in the Northern Hemisphere. In the Southern Hemisphere oceans dominate and this trend is less obvious. Antarctica has not been sampled because of detector problems with snow and ice (similarly for the Himalayas, the Andes, and the Arctic ice cap).
In paper three (Sherwood et al. 2005) the authors note that “the temperature difference between adjacent 0000 and 1200 UTC weather balloon (radiosonde) reports shows a pervasive tendency toward cooler daytime trends compared with nighttime since the 1970s,” especially at tropical stations. They define a temperature difference which, averaged in longitude around the globe, should be near zero. A plot of this quantity over the tropics shows a large difference: ~0.45 K in 1970, reducing gradually to zero after 1997. They point out that the decadal trend with respect to altitude in this quantity is “almost two orders of magnitude larger than can be justified physically based on known forcings.” Other similar inconsistencies are discussed. They then “drop the bomb,” as it were, by suggesting that the trend difference was caused by changes in detector insulation against direct sunlight. Indeed radiosondes had been reporting a disquieting warming trend, but only at night when the sun was not shining directly on the detector. Note here that this explanation also explains rather well the problem noted by Santer et al. (2005), where radiosondes agreed with RSS observations, theory, and models on daily and monthly time frames but not over decadal periods!
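The diagnostic reduces to fitting a trend to the daytime-minus-nighttime difference series: absent an instrumental artifact, its slope should be near zero. A sketch with invented numbers shaped like the published result (~0.45 K shrinking to zero):

```python
def linear_trend(series):
    """Ordinary least-squares slope of a series against its index
    (trend per time step)."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(series))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

# Toy annual (daytime - nighttime) differences drifting toward zero,
# mimicking the Sherwood et al. diagnostic (values are invented):
diff = [0.45, 0.40, 0.33, 0.28, 0.20, 0.14, 0.08, 0.02]
slope = linear_trend(diff)
# Negative slope: the day-night gap shrinks over time, i.e. daytime
# readings trend cooler relative to nighttime readings.
```

A strongly non-zero slope in this quantity points to the instrument, not the atmosphere, since real forcings cannot make day and night trends diverge this much.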
If this were not sufficient to show that the UAH and radiosonde results are incorrect, a fourth paper by Randel and Wu (2006) followed, showing similar problems with determining long-term temperature trends using radiosondes and coming to essentially the same conclusions. The resulting corrections are compared with the surface record in Fig. 2.11. Note that the radiosonde warming trend starting in 1960 is actually higher than at the surface, as theory and models predict, but trends are essentially equal from 1980 onwards. These papers added so much to the discussion that the U.S. Climate Change Science Program published a rather large and comprehensive report (Karl et al. 2006) on what was known so far about the Earth's temperature, especially in the middle troposphere. It concluded that we now have general agreement between all data sets. (A lingering problem is that theory predicts more warming aloft than at the surface, which is not seen in all data sets.)
This has been a long and sometimes detailed treatment of how we know from instrumental data that the Earth is warming and by how much. Since this continues to be one of the aspects of global warming attacked by critics and skeptics alike, the treatment was warranted. Suffice it to say that today there is no longer any real concern that surface and mid-tropospheric temperatures have large errors, and the warming trends, somewhat less than 0.20°C per decade, are credible. The fact that models predict more warming aloft than is observed is thought to be no real problem because some additional warming is indeed seen and the model results are within the data error bars.
3 Climate variations (Is the twentieth century warming special?)
If the twentieth century warming is real, it may still be argued that it is not “special”; that is, there have been other warming periods as great or perhaps greater. To answer this question we consider what is known about the Earth's temperature before the instrumental record began in the middle of the nineteenth century. To be sure, climate has varied considerably over the age of the Earth; however, for most of that time conditions were so different as to make those climates poor analogues for today's situation. [A recent review of model simulations of paleoclimate in the distant past concludes that models are doing rather well (Cane et al. 2006).]
Over the past million years or so the climate has alternated between long glacial and short interglacial (warm) periods. These alternations are thought to result from three cyclic changes in the Earth's orbit and orientation towards the sun, called Milankovitch cycles after the man who discovered them. The record from the Antarctic Vostok ice core shows at least six ~135,000-year-long glacial/interglacial cycles (discussed in detail in OR). The concentration of carbon dioxide in the atmosphere follows this temperature record closely but lags it in time by a few hundred years. This is to be expected, since it takes an initial temperature change to change the CO2, which in turn changes the temperature further. Initial solar warmings are amplified by rising carbon dioxide and melting snow, etc., and vice versa during cooling. This is an example of a positive feedback to an initial forcing. At present the Earth is coming to the end of an interglacial phase. These usually last about 10,000 years, after which the climate cools into the next glacial phase. The Milankovitch cycles operate such that interglacials are usually warmer at the start, cooling thereafter (discussed in detail in OR). For us to understand recent climate variations, it is instructive to study our interglacial, called the Holocene.
3.1 The last 10,000 years (the Holocene)
ice melt in core from Ellesmere Island,
elevation of tree line (Sweden), and
oxygen isotope temperatures derived from stalagmites (Norway).
These agree on a secular, somewhat sinusoidal (due to precession), decline in atmospheric temperatures, which have leveled out in the last few thousand years, and it is against this backdrop that we look for additional, shorter-scale climate variation.
What we find in the last 1,500 years is an alternation between warming and cooling characterized by the two most significant events, the Medieval Warming Period (MWP) and the Little Ice Age (LIA).
3.2 The last 1,000 years
There is a plethora of individual determinations of northern hemisphere temperature (and a few for the southern hemisphere) over the past 1–2,000 years (de Menocal et al. 2000; Dahl-Jensen et al. 1998; Rietti-Shati et al. 1998; Thompson et al. 1998; Huang et al. 2000; Keigwin 1996; D’Arrigo et al. 2001). The MWP occurred between 1100 and 1300, when Europe and the North Atlantic were relatively warm. The LIA followed between 1500 and 1900, when they were abnormally cold. Indeed, historic records from Europe, if they can be believed, tell of warm-weather crops being grown far north of where they could be grown again until recently. The MWP in Europe might have been as warm as the present. But was this warming and subsequent cooling global, or even hemispheric, at such large amplitude? Figure 3.2 shows a representative sample of proxy reconstructions of Northern Hemisphere temperature (there are many others) as well as temperature inversions from boreholes (see OR). Although they differ in some details, they all agree that the MWP was not as warm as today.
While the LIA seems to have been global in extent, the MWP appears to have been less so. Also, the MWP seems spread out over two or three centuries, occurring at different times in different places, but not generally in the SH (Hughes and Diaz 1994). One multi-proxy reconstruction (Crowley and Lowery 2000) shows that the peak warming seems to have occurred in the twelfth century. Other records suggest it occurred earlier. The LIA also seems to have been a multi-episodic event, its episodes falling generally in the late seventeenth and mid nineteenth centuries.
One of the curves in Fig. 3.2 is perhaps the best known paleo-temperature determination, that due to Mann et al. (Mann and Jones 2003), popularly known as the “hockey stick” graph. While critics have attacked this work widely, they usually fail to note that all the other, similar determinations agree fairly well with it. However, the hockey stick does seem to have one significant deficiency: it misses the low frequency variations, in particular the Little Ice Age (LIA). There are two of these episodes: around the mid 1600s and the mid 1800s. Apparently the LIA was cooler than most proxy determinations indicate, but not all. The argument about Mann's hockey stick graph centers on two quantities: how cool it got during the LIA and how warm it got (compared with recent temperatures) during the MWP. The first of these has little direct relevance to present warmth except that it helps determine climate sensitivity to changes in solar activity (see Sect. 6 on modeling and climate sensitivity estimates). A deeper LIA means that the climate responds more sensitively to changes in solar forcing. The question of how warm the Earth got on average during the MWP is, however, quite important to our consideration since, if it were as warm as the present-day northern hemisphere, one might argue our current climate is within the bounds of natural variability.
The heterogeneous nature of climate during the ‘Medieval Warm Period’ (Bradley et al. 2003) is illustrated by the wide spread of values exhibited by the individual records that have been used to reconstruct NH mean temperature (Fig. 3.3). These consist of individual, or small regional averages of, proxy records collated from those used by Mann and Jones (2003), Esper et al. (2002) and Luckman and Wilson (2005).
“The 20th century warming, and 19th century cooling are the most extreme and coherent trend common to the northern sites we have studied. These recent, more coherent trends suggest the possibility of stronger common forcings, becoming more dominant over regional variations, in particular the increasing trace gases in the 20th century. Less coherent fluctuations among the records back in time may signify more regional effects, and also the need for additional coverage for a more accurate perspective of climatic change.” (D’Arrigo et al. 2001).
Two recent studies (von Storch et al. 2004, 2006; Moberg et al. 2005) have been cited by critics as showing that the hockey stick is in error, thereby questioning the source of current warmth. In reality, if read carefully, both of these efforts actually give strong support to the AGHG hypothesis. The first is a modeling study. In those papers von Storch et al. use a model from the Max Planck Institute, ECHO-G, to reconstruct temperatures over the past 1,000 years. The model result is subjected to differing amounts of numerical noise, and a synthetic proxy temperature reconstruction is made for each amount of noise. The results show that, if subjected to the type and amount of noise involved in the hockey stick proxies, the depth of the LIA is greatly reduced and agrees well with the hockey stick result, even when the actual answer is that the LIA was significantly cooler than Mann et al. said it was. The modeled coolness of the LIA agrees more closely with that of boreholes (see discussion below).
In the second study Moberg addresses another problem besetting hockey stick-type reconstructions. These seem to capture the high frequency (annual to decadal) variability of climate rather well, but may miss the low frequency (multi-decadal) swings. This paper adds low frequency proxy data from oceans as well as land records. To combine low and high frequency data the authors use wavelet theory, a potentially powerful way to treat such data. The result shows a much cooler LIA than the hockey stick, more in agreement with borehole results. It also compares very well with von Storch's ECHO-G modeling result. Here we note that, in order to get models to agree with the hockey stick and similar low amplitude results, the climate sensitivity introduced must be lower than that determined from modeling 20th century temperatures. This was unsettling. Using a higher sensitivity (or a first-principles model such as ECHO-G) makes the climate respond more strongly to variations in solar forcing and therefore gives a much cooler LIA. The good news is that the ECHO-G result agrees roughly with the Moberg proxy reconstruction, yet has a climate sensitivity similar to that needed to reproduce 20th century temperatures. It actually has a cooler-than-observed LIA, which may mean the proxy-determined solar forcing is too large. Indeed the latest proxy calibration of solar forcing is smaller than previous ones (see Sect. 4).
So far, so good, but what do these two efforts say about the MWP? Moberg's reconstruction essentially agrees with the many other higher frequency studies (including Mann's hockey stick) in showing an NH average temperature about as warm as in the 1940s and significantly below current temperatures (Fig. 3.2). With higher climate sensitivity one might expect von Storch's ECHO-G model to respond to higher solar forcing, resulting in a warmer MWP. In fact it does not. It continues to agree with all these proxy reconstructions. (In a private communication von Storch cautions that, since the ECHO-G runs were started just before the MWP, the model might not have “settled down”; however, there is no evidence that this is in fact a problem.)
Another paper (Osborn and Briffa 2006), looking at all this and other data, concluded: “Positive (temperature) anomalies during 890–1170 and negative anomalies during 1580–1850 are consistent with the concepts of a Medieval Warm Period and a Little Ice Age, but comparison with instrumental temperatures shows the spatial extent of recent warmth to be of greater significance than that during the medieval period.”
Thus, despite considerable attacks on reconstructions of past climates, it appears that the most recent work, while showing the LIA to be cooler, as borehole data indicate, says the MWP was not as warm as at present, corroborating current estimates that the climate's response to high solar activity cannot explain current warmth.
While all this sounds fairly clear, the NRC's recent report responding to Congressional queries (North 2006) took a more cautious stand, saying that we cannot yet be highly certain just how warm the MWP was; indeed, given the error bars on the reconstructions, it might have been as warm as the last decade. The report did, however, say there is no evidence for a warmer MWP, just not enough proxies to give high confidence in any result. See also Richard Kerr's piece on this report (Kerr 2006).
However, I think there is an additional observation that increases our certainty. This is Thompson’s experience with coring high mountain glaciers in the tropics (Thompson et al. 2006). Many of his cores go back tens of thousands of years or more. In some cases, when he attempted to re-core several of these some 20 years after he made the first cores, he had a hard time because the mountain glaciers had melted so extensively that it was hard to find a place to drill. Given the rapid pace of this melting, it is likely that many if not all of these mountain glaciers may be totally melted in the next decades. If this is so, then one might argue that the MWP could not have been as warm as at present else those glaciers would have melted to bedrock then, and Lonnie’s cores would have found only ice that formed after the MWP. Thus the existence of mountain glacier cores going back tens of thousands of years would seem to imply that they did not melt much during the MWP and thus it was not as warm as today.
Curiously, the global borehole record indicates a larger total temperature variation amplitude of about 1.1°C between 1600 and 1945. There is some evidence that changes in land use could alter the temperature record, complicating the borehole technique. But there are two borehole thermometry records from the Greenland icecap (Dahl-Jensen et al. 1998) which show a long and large amplitude MWP and a relatively deep LIA (which correlates rather well in timing and amplitude with a Sargasso Sea sediment δ18O record (Keigwin 1996)). Also, on the same ice sheet, proxy and thermometry methods agree (Cuffey et al. 1994). This points to an increasing realization that the North Atlantic was more sensitive to climate change than the NH on average.
Paleo-climate prior to the last 1,000 years or so has been influenced by well-known factors not relevant since. Attempts to determine recent paleo-temperatures are complicated by sparse data and large uncertainties. With this caution in mind, we can still say that temperatures in the past 1,000 years or so are becoming determined well enough to be compared with the recent warming. The picture is that of a slow secular cooling throughout the Holocene related to precession. Superposed on this are solar activity and large volcanic eruptions, which produce significant variation in the signal. More recently there was a slow warming, the MWP (mainly in the northern hemisphere), beginning about 1,500 years ago and continuing sporadically, varying from place to place and time to time, until about 1200. The so-called LIA was really two or more episodes (sixteenth and nineteenth centuries) with warmings in between. Hemispheric and global warming in the twentieth century appears to have been larger, more rapid and more uniform than at any time in nearly 2,000 years (although certain years or even several-year periods were quite warm, and in some regions that warming probably equaled that of the twentieth century). However, the lack of melting of mountain glaciers then, which are melting now, adds credibility to the proxy reconstructions and to the modeling, which, using solar and volcanic forcings, appears to simulate the observations well.
4 Solar forcing of climate (Are indirect solar effects affecting climate?)
This review will not attempt an exhaustive recounting of earlier attempts to determine the sun’s effect on climate. These are dealt with in my earlier reviews (OR and Keller 2007). Here I will give the briefest background information necessary to understand where we are today.
Perhaps the most common objection to the AGHG hypothesis is that the sun has been dominating climate change and is doing so now, nothing more nor less—no need to invoke CO2 to explain recent warming. It is little wonder that laymen and some scientists think this, for indeed the sun has had a major role in climate change. However, in the earliest times some 4 billion years ago, the sun was some 30% fainter than it is now (as sun-like stars grow older they brighten), and thus for over half of the Earth's history it had to rely on a combination of water vapor and CO2 to remain habitable. It is perhaps fortuitous that CO2 participates in a type of thermostat that releases it from minerals such as calcium carbonate when it is cold and ties it up in the same minerals when it is warm (processes taking millions of years). And so, while the sun is a dominant force in warming the Earth, it is aided by GHGs and always has been (Schrag and Alley 2004). The question today is: how much of the warming in the past 50 years can be attributed to increases in solar activity?
Satellite calibration—interpretation of satellite-observed Total Solar Irradiance (TSI).
Paleo-calibration—magnitude of solar forcing necessary to reproduce past climate change, especially changes in the past 1,000 years or so.
The two are not totally independent since those working on #2 use the calibrations from #1. But method #2 has the advantage (among many disadvantages) of trying out the calibrations from #1 on past climate, thereby giving some insight into both climate sensitivity to forcings and whether or not direct TSI is the dominant solar effect or if some indirect effects are important (if past temperature variations cannot be simulated with direct TSI changes only, then one might infer that indirect solar forcings are at work also).
4.1 Satellite calibration
Satellites have been observing TSI since 1979. This 28-year record spans three solar maxima and nearly three minima. The maxima were roughly at 1980, 1990, and 2001. The minima were about 6–7 years after these, and so we are approaching the third minimum, which should occur in early 2008. These observations allow us to determine the sun's role in warming over this time period and, by inference, over earlier times at least since the 1940s. For times before that, these observations provide a basis for calibrating proxies for solar activity (Frohlich and Lean 2004).
These proxies are essentially three: number of sunspots, which indicate the amount of solar activity (Fig. 4.1 and OR), and levels of two isotopes: 10Be (Bard et al. 1997, 2000; Beer et al. 2006 and OR), and atmospheric 14C, which change with solar activity/solar wind strength (Stuiver and Braziunas 1987, 1993; Wigley and Kelly 1990) and indicate how active the sun was. Sunspots take us back to the time of Galileo, 1600s, while the isotopes can take us back a considerably longer time—tens of thousands of years.
In addition, variations in the activity of sun-like stars have been used to calibrate sunspot activity, and this led to the idea that solar activity varied more than what has been observed from the satellites over three solar cycles. Based on this paleo-calibration, computer models have been able to reproduce (postdict) climate for the past thousand years (as examples see Moberg et al. 2005; von Storch et al. 2004, 2006). Using these calibrations results in estimates that changes in solar activity in the past century caused 1/4–1/2 of the observed warming up to 1980; but since the solar maxima of 1990 and 2001 were no higher than that of 1980, the sun cannot account for the rapid warming in the past 28 years or so.
However, recent studies (Wang et al. 2005) have called this method of paleo-calibration into question, and we now think that TSI has not varied as much over the past 1,000 years as was thought. (Before Wang et al. the consensus for the long-term TSI change was 2–4 W/m2. Lean's (1995) estimate was about 2; Hoyt and Schatten's (1993) was 3.7. Wang et al.'s is 0.5!) This makes it difficult to reproduce past climate change (although not in the past 30 years or so, because of satellite data). But how to simulate the past 1,000 years with this reduced solar variability? There are several possibilities: (1) climate has not changed quite as much as some of the studies suggest, at least globally, or (2) changes in solar activity cause indirect forcings that also vary.
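To see why the Wang et al. revision matters, one can convert a long-term TSI change into a global-mean forcing and temperature response. A minimal sketch, not from the paper: it reads the quoted numbers as changes in TSI itself and assumes a planetary albedo of 0.3 and a mid-range climate sensitivity parameter.

```python
# Sketch: converting a long-term change in total solar irradiance (TSI)
# into a global-mean radiative forcing and temperature response.
# Assumptions (not from the paper): albedo = 0.3, and a mid-range
# climate sensitivity parameter lam in K per (W/m^2).
def tsi_to_forcing(delta_tsi, albedo=0.3):
    """Geometric dilution (factor of 4) plus the reflected fraction."""
    return delta_tsi * (1.0 - albedo) / 4.0

lam = 0.8  # K per (W/m^2), assumed

f_wang = tsi_to_forcing(0.5)   # Wang et al.'s 0.5 W/m^2 TSI change
dT_wang = lam * f_wang         # well under 0.1 K of warming
f_old = tsi_to_forcing(3.0)    # mid-range of the older 2-4 W/m^2 estimates
dT_old = lam * f_old           # several times larger solar role
```

Under these assumptions the revised TSI history leaves the sun able to explain only a small fraction of the roughly 0.8°C twentieth-century warming, which is exactly the difficulty the text describes.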
Regardless, the answer to our original question is that, yes, some of the warming in the twentieth century was caused by increases in solar activity-related irradiance, but not all, apparently not even half.
This conclusion concerns the critics considerably; they think there is substantial evidence that the sun is controlling climate through indirect forcings in addition to TSI variations. Instead of using activity amplitude, Friis-Christensen and Lassen (1991) and Lassen and Friis-Christensen (1995) correlated NH temperature change with the length of the solar cycles. Surprisingly, this appeared to give a much better correlation, particularly because it seemed more closely synchronized with temperatures in the early twentieth century. Extension of this proxy back to 1550 seemed to give equally good results, but other attempts to repeat this work arrived at weaker correlations (Laut 2003; Damon and Laut 2004), and there is no adequate physical mechanism for the correlation. In addition, these types of correlations do not hold for the past 30 years or so. Regardless, critics have proposed additional solar forcing due to indirect effects of its activity swings.
4.3 Indirect solar forcing
Possible indirect forcings are being found in variations in tropical ocean cycles such as El Niños (Mann et al. 2005; Shindell et al. 2006), in stratosphere coupling with the upper troposphere (Haigh and Blackburn 2007; Van Loon and Labitzke 1999), and, more popularly, in changes in low cloudiness caused by solar-wind-modulated galactic cosmic rays, which provide cloud condensation nuclei allowing more clouds to form (Marsh and Svensmark 2003). Several observational studies have attempted to find the admittedly small temperature variation over the solar cycle by subtracting from recent surface and satellite temperatures the effects of other forcings such as El Niño variations, large volcanic eruptions (which have a very large effect on decadal temperatures), and AGHG warming, which though small is steadily increasing.
Largely because observed direct solar forcing (TSI) is unable to explain the late twentieth century NH temperature surface record, and, because many think temperature amplitude in the last 1,000 years may have been fairly high, it has been suggested that there is additional forcing from solar activity variations due to some indirect, solar-related amplification mechanism. Several have been proposed (Soon et al. 2000; Tinsley 2000), but most center around one mechanism—changes in the strength of the solar wind affecting cloudiness.
“I find it quite troubling that there is no correlation between the cloudiness and cosmic rays on shorter time scales. Since the hypothesized cosmic ray influence on cloud nucleation through ionization would have a very short time scale, why don’t we see a correlation between monthly anomalies in cloudiness and cosmic rays?”
Perhaps the main point against the Svensmark idea is that cloudiness isn't climate change; there must be a concomitant global temperature change. Figure 4.2 compares cosmic ray flux with surface temperature, showing little if any influence of the former on the latter. While there might be some such effect, it certainly is not large compared with forcing by GHGs.
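The monthly-anomaly correlation test suggested in the quote above is straightforward to sketch. The series below are synthetic stand-ins (an ~11-year cosmic-ray-like cycle plus noise, and cloudiness that is independent of it by construction), so the near-zero result is built in; the real test would substitute observed monthly anomalies.

```python
# Sketch: the monthly-anomaly correlation test suggested in the quote.
# Synthetic stand-ins: an ~11-year "cosmic ray" cycle plus noise, and
# cloudiness that is independent of it by construction. Stdlib only.
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

random.seed(0)
months = 360  # 30 years of monthly anomalies
cosmic = [math.sin(2 * math.pi * m / 132) + random.gauss(0, 0.3)
          for m in range(months)]
clouds = [random.gauss(0, 1) for _ in range(months)]  # independent by design

r = pearson(cosmic, clouds)  # near zero for independent series
```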
The magnitude of solar forcing varies with the amount of solar activity. In addition to the 10–12 year variations, solar maxima vary over multi-decadal periods. Of particular interest were the Maunder Minimum, when solar activity was minimal for several decades, and the sharp multi-decadal increase in solar maxima during the first half of the twentieth century. Satellite observations of solar irradiance over nearly three decades have allowed calibration of variations in solar activity. A number of attempts at reconstructing past climate variations via solar forcing have shown that, prior to 1940, direct forcing by changes in solar irradiance can explain most of these changes, but it has been unable to account for warming since 1975. Indirect solar forcing effects seem to be small, but merit continued study.
5 Computer simulations (How good are these anyway? Good enough to make predictions?)
One of the largest concerns about understanding climate processes is whether large computer simulations can represent them. In particular there is widespread opinion that these models have little predictive capability because weather and the resulting climate are so complex. Thus much of what the IPCC documents say is dismissed by critics and questioned by the public because it is based in part on model results. On the other hand, in the research community models have become an everyday tool for parsing different physical aspects of climate dynamics, for understanding the meaning of data sets, and for helping design future observing programs. This mismatch between the climate research community and the public may be due to what each considers predictive capability to mean. For some it means the ability to predict all possible events. For the climate community, predictability is limited to what I will call linear changes in climate based on known forcings. Thus, unforeseen events such as larger-than-expected melting or massive release of methane from thawing permafrost will not be predicted, while the steady warming as AGHGs increase and other known forcings are included is well within the models' capability. In OR and Keller (2004) there are several examples of dramatic predictions—of cooling and recovery from volcanic eruptions, of 40-year warming in the deep oceans, etc.
5.1 Simulation of present climate
In OR there is a detailed consideration of tests that GCMs must pass to be said to simulate present climate adequately. It also mentions formal programs set up for intercomparison of model results with data (PCMDI, etc.). It is shown that they largely pass these tests with noted exceptions such as stratiform clouds near Peru and in the Arctic. Since that time models have improved significantly. This section will discuss some of these improvements.
Modeled and observed global temperatures are compared in Fig. 5.1. The models have been run many times (see figure caption) to produce the yellow band of multiple realizations. Because the Earth passed through only one of these, its observed temperature does not have to match exactly the average of the realizations (red line); it is sufficient for it to fall inside the yellow band. Note that while volcanic eruptions can be simulated by the models, El Niños are chaotic and cannot be synchronized with those the Earth produced. Thus we see excellent agreement between observations and models (note that the models used both natural and human-caused forcings; see Sect. 6, Fig. 6.2).
Models are compared with many other observables such as precipitation, cloud cover, etc. These show relatively good agreement except in certain areas. For precipitation, regions of poor agreement are mostly near the equator and at high latitudes in the southern hemisphere. The problems of high-latitude clouds remain only partially resolved, but, since the IPCC report, new work is showing that this problem might be solved. Models had the problem of simulating not only the normal Intertropical Convergence Zone (where moisture blown toward the equator (converging) upwells in high clouds), but of adding a similar spurious feature in the western Pacific Ocean south of the equator. With the new sub-model (Zhang and Wang 2006) this feature has gone away, and agreement with observations is good.
Models must simulate natural time variations of a number of climate cycles such as ENSO, Madden-Julian Oscillation, Pacific Decadal Oscillation, North Atlantic Oscillation, and many others. These are only partially cyclical since weather patterns that generate them are chaotic. The models cannot be expected to match the observations exactly. But models should simulate all these at approximately the same frequency. The general way to check on this is to compute the power spectra for both observations and models. The agreement is quite satisfactory. However, models at present don’t simulate a strong Madden-Julian Oscillation (MJO). But the above-mentioned convective cloud sub-model may have solved this problem also (Fig. 5.2, Zhang and Mu 2005).
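The spectral comparison described above can be sketched in a few lines. The monthly series below are synthetic stand-ins (a shared multi-year cycle plus independent noise), invented for illustration; the point is that two unsynchronized realizations of the same variability still agree in their power spectra.

```python
# Sketch: comparing observed and modeled variability through power
# spectra rather than point-by-point matching. Synthetic monthly series
# stand in for data: a shared ~4-year cycle plus independent noise.
import numpy as np

def power_spectrum(x, dt=1.0):
    """One-sided periodogram of a demeaned series: (frequencies, power)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return np.fft.rfftfreq(len(x), d=dt), power

rng = np.random.default_rng(1)
t = np.arange(600)                        # 50 years of monthly values
cycle = np.sin(2 * np.pi * t / 50)        # a ~4-year "ENSO-like" cycle
obs = cycle + rng.normal(0, 0.5, t.size)
mod = cycle + rng.normal(0, 0.5, t.size)  # same physics, different noise

f_obs, p_obs = power_spectrum(obs)
f_mod, p_mod = power_spectrum(mod)
# The two series do not match point by point, but both spectra peak at
# the same frequency (1/50 per month):
peak_obs = f_obs[np.argmax(p_obs[1:]) + 1]
peak_mod = f_mod[np.argmax(p_mod[1:]) + 1]
```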
Because water vapor is such a strong GHG and because it provides such a strong positive feedback to the warming caused by carbon dioxide (see discussion of feedbacks in Sect. 6), it is important that models simulate its variation with temperature correctly. Figure 5.3 shows excellent agreement between modeled and measured water vapor. Note that the large rises in water vapor in 1988 and 1998 are due to warming from strong El Niños. Comparison with Fig. 5.1 shows that, as predicted, water vapor generally follows observed temperature change.
5.2 Vertical temperature agreement between models and satellite measurements
Until very recently all models were thought to be flawed because they predicted that warming aloft would be as much as or slightly greater than at the surface, while the satellite and radiosonde observations were not seeing it. However, in an important example of the predictive power of models, they have now been found to be correct: the reductions of the satellite and balloon data were wrong (see Sect. 2.5).
5.3 Coupling between ocean and atmosphere
A particularly demanding test asks how well the coupling between ocean and atmosphere is modeled, that is, does the code accurately handle fluxes of temperature and water vapor between ocean and atmosphere? At least three teams have simulated the observed deep ocean temperatures, but with an important twist. Attempts by modelers to reproduce twentieth century temperature records have always been done with the answer known beforehand. Thus, critics could charge that the modelers simply “fudged” parameterization constants to get the right answer. This is true to some extent. For example, codes with low climate sensitivity need to add less aerosol cooling to model the climate cooling of 1945–1975 than do those with high sensitivity (see discussion of sensitivity in Sect. 6). But this is not how at least one of the ocean heating studies was done. It found good agreement with ocean observations in a set of climate runs done before the ocean data were published (Barnett et al. 2001). Richard Kerr wrote in an editorial accompanying these papers, “another test passed” (Kerr 2001). At least two other modeling groups have reproduced this data (Hansen 2001; Levitus et al. 2001).
But can models be relied upon to make predictions of future climate? The answer here is a guarded, yes, if climate change is not too extreme. Of course, the greater the departure from present climate, the more cautious the projection. A good example is seen in Fig. 5.4 (Rahmstorf et al. 2007). This shows observations and projections/simulations (made in 1990) of three quantities: CO2 concentration, global temperature, and sea level. CO2 projections essentially turned out to be right. Global temperature from two different observations (red and blue) seems to be rising slightly faster than the model predictions (dashed blue and gray error bars). Sea Level from tide gauges (red) and satellites (blue) is significantly higher than predictions (blue and gray). This is a remarkable result because in 1990 the models were not nearly as good as at present, yet they were able to predict global warming very well.
Current climate models (CGCMs) are actually a coupled combination of four models: atmospheric, ocean, land, and sea ice. Comparisons with observations show them to reproduce reasonably well the large-scale elements of climate and both their seasonal and latitudinal variations. They also reproduce natural variability and responses to anthropogenic forcings, thus simulating the twentieth century temperature record. They have shown skill in predicting such features as climate response to large volcanic eruptions and deep-sea response to atmospheric warming, and in forecasting future climate warming. However, necessarily coarse spatial resolution renders them unable to reliably simulate regional climate, and large uncertainties in aerosols, clouds, and water vapor feedbacks translate to similar uncertainties in their ability to simulate departures from present climate.
Perhaps the best way to characterize modeling improvements in the last decade is to quote from a report by the US Climate Change Science Program (CCSP) (http://www.climatescience.gov/ AND http://www.usgcrp.gov/usgcrp/Library/ocp2004-5/ocp2004-5-hi-clivar.htm):
“Significant improvements in modeling have been made in understanding the Earth’s climate system components, their interactions, their variability, and the mechanisms driving current changes. Improved understanding has led to state-of-the-art climate models that now reproduce many aspects of the climate of the past century, and simulations of the evolution of global surface temperature over the past millennium are consistent with paleo-climate reconstructions, thus improving confidence in future projections.
New simulations of climate change during the twentieth and twenty-first centuries have been carried out using these (improved) models, and this output is the centerpiece of the fourth assessment of the IPCC. These simulations have increased the credibility of scientific conclusions on the causes of the global surface warming witnessed over the past several decades. (Recent models) show significant improvements in the simulation of the physical climate system compared to their predecessors a decade ago, although there is still a need to reduce systematic biases that plague coupled models, such as the biases associated with the double ITCZ (now apparently fixed—see discussion above), errors in the simulated intra-seasonal and inter-annual variability of the tropics, and various regional biases in simulated rainfall and surface temperature.
Despite recent model improvements, however, significant uncertainties associated with various aspects of climate models remain. One of these is the representation of clouds, which continues to be one of the weakest links in modeling the physical climate system (IPCC 2007). A climate process team (CPT) on cloud feedbacks has been formed to address this challenge by incorporating high-resolution satellite data, field observations, and small-scale cloud models. In addition, the Climate Change Prediction Program-Atmospheric Radiation Program Parameterization Testbed project, is addressing the cloud modeling problem by first analyzing the ability of a climate model to accurately simulate weather events, diagnosing the errors, and subsequently improving the model. Other improvements are being made in understanding and modeling different components of the Earth system, including atmospheric chemistry, ecosystems, and carbon cycling, although many challenges remain, including integrating these capabilities into increasingly comprehensive Earth system models.”
6 Attribution and what might we expect in future (You cannot predict the future?)
Most people skeptical of global warming say something like, “I now agree it’s warming, but I don’t think humans are doing it.” This may be the last remaining objection to be answered. This section will discuss just a small part of the large amount of work that has led the IPCC to state with such certainty that the human influence is indeed dominant in the last 25 years. The first step in attribution of causes in climate change is to determine and quantify the major contributors (forcings). Figures 6.1 and 6.2 show the magnitude and variety of forcings currently employed in climate models. Note the large negative forcing of aerosols and their indirect effect on cloudiness.
In addition to the forcings in Fig. 6.1 there is the complication that positive and negative feedbacks are operative. Doubling AGHGs alone produces a temperature increase of about 1.3°C. But feedback amplification increases this sensitivity to doubling CO2 to between 2.0° and 4.0°C depending on the model used. This is largely due to differences in how the models simulate feedbacks. It is important to find ways to determine which sensitivity is correct. This involves validating the feedbacks in the model. Studies of paleo-climates both considerably warmer and cooler than present give a similar range in feedback uncertainty, lending credence to the models but not reducing the uncertainty (Covey et al. 1996). Hegerl et al. (2006) used paleo-temperature records to develop estimates of sensitivity that narrow the gap somewhat and agree with model and observational determinations from the twentieth century (Fig. 6.3). Note that their most likely sensitivity value matches well with that derived from models, giving credibility to climate attributions and projections. Bony et al. (2006) and Roe and Baker (2007) make some insightful comments on all attempts to determine climate sensitivity. The main positive feedbacks are water vapor (WV), albedo (A) from sea-ice melt and decreased snow cover, and changes in cloudiness (C). Negative feedbacks are also present, such as increases, if any, in low cloudiness, changes in the lapse rate (LR) of water vapor, etc. Major feedbacks as determined from different climate models in various studies are shown in Fig. 6.4.
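The amplification described above follows the classic feedback-gain relation dT = dT0/(1 − f). A short sketch: the 1.3°C no-feedback value is from the text, while the feedback fractions f are illustrative values back-solved from the quoted 2.0–4.0°C range.

```python
# Sketch: feedback amplification of the no-feedback response via the
# classic gain relation dT = dT0 / (1 - f). dT0 = 1.3 C is the
# no-feedback doubled-CO2 value quoted in the text; the feedback
# fractions f are illustrative values back-solved from the 2-4 C range.
def amplified_response(dT0, f):
    """Equilibrium response with net feedback fraction f (must be < 1)."""
    if f >= 1.0:
        raise ValueError("f >= 1 would be a runaway feedback")
    return dT0 / (1.0 - f)

dT0 = 1.3                              # C, no-feedback doubling response
low = amplified_response(dT0, 0.35)    # 2.0 C
high = amplified_response(dT0, 0.68)   # ~4.1 C
```

Note that a fairly narrow range in f (roughly 0.35–0.68 here) maps onto a wide range in warming, which is one way to see why feedbacks dominate the sensitivity uncertainty.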
6.2 Water vapor feedback
Perhaps the most controversial of these is that due to water vapor changes in the upper troposphere. This is because such changes have a very strong effect on warming and thus climate sensitivity (changes in the lower troposphere have less effect since there is already a huge amount of water vapor there). Models have consistently predicted that warming would cause water vapor to increase by enough to keep the relative humidity (RH) essentially constant. In the last six years considerable effort has gone into determining whether this is in fact accurate (Santer et al. 2007). Both more detailed observations (especially from satellites) and various tests of the models have shown that this is indeed the case (see Fig. 5.5), although there is some evidence that RH declines slightly with increasing temperature. Thus we have increased confidence that this large feedback with GHG warming is accurate.
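The constant-RH prediction can be illustrated with the Clausius–Clapeyron scaling of saturation vapor pressure. The Magnus formula below is a standard empirical approximation, assumed here rather than taken from the paper, and valid near surface conditions.

```python
# Sketch: why constant relative humidity means more water vapor in a
# warmer atmosphere. Saturation vapor pressure from the Magnus
# approximation (a standard empirical fit, assumed here).
import math

def e_sat(t_celsius):
    """Saturation vapor pressure in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# At fixed RH the vapor pressure scales with e_sat, so 1 C of warming
# near 15 C raises the water-vapor burden by roughly 6-7%:
increase = e_sat(16.0) / e_sat(15.0) - 1.0
```

This ~6–7% per °C increase is what makes water vapor such a powerful amplifying feedback.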
6.3 Cloud feedbacks
Cloud feedback is closely related to water vapor, which compounds uncertainties in formation, type, altitude, duration, extent and, perhaps most of all, interaction with radiation. Clouds mostly scatter visible short-wave radiation (SWR) and absorb and re-emit infrared, long-wave radiation (LWR). However, recent observations suggest that they absorb about 20% more SWR than previously thought, which may be due to soot particles in the droplets. Radiation in clouds depends on many factors: thickness, droplet size, total moisture content, and aerosol content and effects. High clouds warm by re-radiating LWR downwards, and low clouds cool by upward reflection of SWR in addition to radiating LWR. Satellite observations show that net radiative cloud forcing is negative (Ramanathan et al. 1989), causing cooling of the climate system. However, as seen in Fig. 6.4, the feedbacks are slightly positive. In addition clouds produce precipitation. Not only is this a key quantity important to humans, the attendant phase changes during the precipitation and re-evaporation process are important to the total heat transfer within the model as well as the vertical distribution of moisture. Thus, studies of the microphysical processes involved are an important part of climate research, but they are necessarily sub-grid scale in size and so, even if known, must be parameterized.
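The sign conventions here are worth making explicit: cloud *forcing* (the present-day radiative effect of clouds) is negative, while cloud *feedback* (the change in that effect with warming) can still be slightly positive. A sketch with approximate, ERBE-era global-mean numbers of the kind Ramanathan et al. (1989) report; treat the values as illustrative.

```python
# Sketch: cloud *forcing* versus cloud *feedback*. The longwave and
# shortwave terms below are approximate ERBE-era global means of the
# kind Ramanathan et al. (1989) report; treat them as illustrative.
lw_crf = +31.0             # W/m^2: clouds trap outgoing longwave (warming)
sw_crf = -48.0             # W/m^2: clouds reflect sunlight (cooling)
net_crf = lw_crf + sw_crf  # negative: clouds cool the present climate

# Feedback is the *change* in net cloud forcing per degree of warming;
# it can be slightly positive (as in Fig. 6.4) even though net_crf is
# negative. The value below is an assumed illustration, not a result.
d_crf_per_K = +0.3         # W/m^2 per K of warming
```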
6.4 Other examples of CO2 forcing
People still wonder if CO2 and other GHGs can actually push the climate around so effectively. They point to observations showing that warming out of glacial periods leads (in time) the increases in CO2 and methane as proof that its effect is small. On the contrary, this is to be expected, since to start increases in these gases there has to be an initial warming due to some other cause, such as the Milankovitch cycles (see Sect. 3).
In fact CO2 has been at the center of most climate events throughout the Earth's history. A little-known fact is that, when the sun was young some 4 billion years ago, it was 30% fainter than it is now, which was not nearly enough to warm the Earth above freezing. It is thought that only CO2 in large amounts could have brought the temperature up to that necessary for life to begin. Later, less than a billion years ago, it appears the Earth froze over completely. Without CO2 from outgassing volcanoes it might never have been able to recover.
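The faint-young-sun point can be made quantitative with a one-line energy balance. This is an illustrative sketch, not from the paper; it assumes a modern solar constant of 1361 W/m2, a planetary albedo of 0.3, and no greenhouse effect.

```python
# Sketch: effective (no-greenhouse) temperature of the early Earth under
# a sun ~30% fainter than today. Assumed values: modern solar constant
# S0 = 1361 W/m^2 and planetary albedo 0.3.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temp(solar_constant, albedo=0.3):
    """Equilibrium temperature (K) with no greenhouse effect."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

t_modern = effective_temp(1361.0)        # ~255 K: below freezing even now
t_early = effective_temp(0.7 * 1361.0)   # ~233 K under the faint young sun
```

Both values sit below 273 K, so greenhouse gases are needed in either era to keep surface water liquid; the early Earth simply needed far more of them.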
A dramatic example of the power of carbon dioxide to dominate climate comes from a curious event millions of years ago, called the Paleocene–Eocene thermal maximum. This event was a rapid increase of CO2 in the atmosphere. The processes at play here are complicated, but the resulting change in temperature was nearly 5°C. Modeling of this event (with the feedbacks in Fig. 6.4) is consistent with the observations, increasing our confidence that models are capable of simulating even such large departures from normal temperatures. Another such example comes from predictions that AGHG warming should cause the boundary between the troposphere and stratosphere, called the tropopause, to move to higher elevation. Both warming of the troposphere and cooling in the stratosphere contribute to this. Figure 6.5 shows attempts to model the observations with and without AGHGs. Again the power of CO2 to effect changes in the atmosphere is seen.
Another way to show the strength of GHGs is to separate the global temperature record into its major contributing components (Fig. 6.6): El Niño Southern Oscillation (ENSO) is seen to have a dramatic temperature variation but little effect on long-term rise. Similarly with volcanic aerosols which cause only temporary cooling. The 10–11 year solar activity cycle is clear with some slight warming in the first half of the last century but no warming since 1980. Greenhouse gases show a rise over the entire century with a short dip after 1940 due to cooling from air pollution. The contribution to temperature rise from these gases is essentially what the models predict—a further indication that models are getting the climate change about right.
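The decomposition described above amounts to a multiple regression of the temperature record on the known forcing indices. A minimal sketch on synthetic data; all series and coefficients below are invented for illustration, not taken from Fig. 6.6.

```python
# Sketch: separating a temperature record into contributions by
# least-squares regression on known indices, in the spirit of the
# decomposition described above. All series and coefficients below are
# synthetic, invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 120                                              # 10 years, monthly
enso = np.sin(2 * np.pi * np.arange(n) / 45)         # quasi-cyclic index
volc = np.where(np.arange(n) % 60 < 6, -1.0, 0.0)    # episodic cooling
ghg = np.linspace(0.0, 0.6, n)                       # steady rise
design = np.column_stack([enso, volc, ghg, np.ones(n)])

true_coef = np.array([0.1, 0.3, 1.0, 0.0])           # "truth" to recover
temp = design @ true_coef + rng.normal(0, 0.05, n)   # synthetic record

coef, *_ = np.linalg.lstsq(design, temp, rcond=None)
# coef now estimates each component's contribution; the GHG regressor
# carries essentially all of the long-term trend, as in Fig. 6.6.
```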
6.5 More sophisticated attribution methods
Merely reproducing such gross observables as large scale temperature averages is not sufficient for selecting anthropogenic global warming over other possible causes of climate change. It is recognized that different sources of warming might each have unique effects on aspects of the climate. For example, GHG forcing is expected to cause warming preferentially at night and during the winter, and at higher latitudes, as well as causing cooling in the stratosphere at the same time as warming in the troposphere (increased solar activity is expected to cause warming when the sun is shining and simultaneously in both the troposphere and stratosphere). Aerosol cooling might be expected to cool the NH more than the SH since most industrial and transportation pollution is concentrated in the north. There have been several attempts to find these so-called “fingerprints” of GHG forcing on climate (Barnett et al. 1999). However, because of a variety of factors, such fingerprints are less obvious than originally expected. But important fingerprints are being found. GHG forcing should have an increasing effect going from the equator (where water vapor can swamp the small additional GHG forcing) to the poles (where, in the relative absence of water vapor, anthropogenic GHGs should dominate). Thus, one might expect (and most models predict) that warming at high latitudes will be larger than at low latitudes. This is observed in both hemispheres over land up to quite high latitudes, but in the polar regions themselves things are more complicated. The Arctic is warming considerably, but some of the extra heat seems to be going into the energy necessary to change solid ice to liquid water. The West Antarctic is warming considerably near its north-trending peninsula, but not over the larger, topographically dominated eastern half of the continent, which is cooled partly by the famous ozone hole's effect on weather.
Attempts to quantify attribution have been published (Karoly and Braganza 2001a, b; Stott et al. 2001). These methods are fairly complicated and, due to space limitations, will only be referenced here. Suffice it to say that they too show the AGHG theory to account for the observations (although other theories have not been subjected to such a test). Attribution has gone beyond just matching temperature change. In a pivotal paper, Santer et al. (2007) looked for human-induced changes in atmospheric moisture content. In a formal detection and attribution analysis using the pooled results from 22 different climate models, the simulated “fingerprint” pattern of anthropogenically caused changes in water vapor is identifiable with high statistical confidence in the satellite SSM/I data. Their conclusions bear quoting: “Models suggest that the large increase in water vapor is primarily due to human-caused increases in GHGs and not solar forcing… These findings, together with related work on continental-scale river runoff, zonal mean rainfall, and surface specific humidity, suggest that there is an emerging anthropogenic signal in both the moisture content of the earth’s atmosphere and in the cycling of moisture between atmosphere, land, and ocean. Detection and attribution studies have now moved beyond “temperature only” analyses and show physical consistency between observed and simulated temperature, moisture, and circulation changes.”
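The fingerprint approach referenced above can be caricatured as projecting each year's observed field onto a model-derived spatial pattern and examining the trend of that projection. A toy, one-dimensional sketch with synthetic data; the pattern and amplitudes are invented, and real studies compare the trend against unforced control-run variability.

```python
# Sketch of the fingerprint idea: project each year's observed field
# onto a model-derived spatial pattern and examine the trend of that
# projection. Toy one-dimensional "space"; pattern and amplitudes are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_space, n_time = 50, 40
pattern = np.linspace(0.2, 1.0, n_space)   # e.g. poleward-amplified warming
pattern /= np.linalg.norm(pattern)         # unit-norm fingerprint

signal_amp = 0.02 * np.arange(n_time)      # pattern expression grows in time
obs = np.outer(signal_amp, pattern) + rng.normal(0, 0.01, (n_time, n_space))

proj = obs @ pattern                       # detection variable, one per year
trend = np.polyfit(np.arange(n_time), proj, 1)[0]
# trend recovers the imposed growth; comparing it with the spread of
# trends in unforced control runs would give the significance test.
```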
6.6 The global dimming and brightening issue
Aerosols scatter, absorb, and reradiate radiation in the atmosphere (Penner 2000). Additionally they affect cloudiness and cloud properties. Figure 6.1 shows estimates of these forcings. While some of these are very uncertain, one can see that aerosols might be sufficiently cooling as to counter AGHG warming. Aerosols have residence times of only a few weeks, but human activities are constantly replenishing them. Major components are solar-reflecting sulfates and solar-absorbing black carbon soot. It has long been understood that such air pollution has a net cooling effect on recent climate. Indeed, in the 1950s and 1960s it was large enough to obscure the AGHG warming. This is because, in addition to their direct forcing, particulates affect cloud water content and indeed cloud amounts. They do this mainly by adding cloud condensation nuclei to the atmosphere, causing clouds to have a preponderance of small droplets. This increases the clouds' ability to scatter solar radiation, rendering them “brighter” and thus increasing local albedo (more reflection of sunlight back into space) and the net cooling effect. This indirect effect has been difficult to quantify and has been treated as a limited but bounded free parameter in climate simulation codes. Satellite data and laboratory study are beginning to quantify this effect, and it is seen that in some cases the net cooling caused by heavy concentrations can still nearly cancel the warming effect of AGHGs (Breon 2006; Kaufman and Koren 2006). At the other end of the spectrum, black carbon soot absorbs incoming solar radiation and warms the atmosphere at the altitude of the soot layers. This would seem to make it a positive forcing, but Ramanathan et al. (2001) show it simply changes where in the troposphere sunlight is absorbed.
However, this can have dramatic effects on clouds and rainfall because it changes the vertical heating in the atmosphere, cooling the surface and warming the higher regions. Nowhere has this effect been better observed than in the Indian Ocean (Ramanathan et al. 2005; Chung and Ramanathan 2006). In a large multi-year observational exercise called INDOEX, they documented large “brown cloud” pollution moving south from the Indian subcontinent. This caused regional dimming, and the northern Indian Ocean cooled, thus reducing evaporation and the consequent monsoon rains by some 10% since the 1950s. Meanwhile the southern Indian Ocean has continued to warm with the rest of the planet, resulting in increased rainfall in parts of the Sahel in Africa. Such absorbing aerosols occur elsewhere in the world and most likely are having similar effects on regional climate. The fact that aerosols can mask GHG warming brings up the question of what warming would look like if air pollution (aerosols) were largely cleaned up. Clearly, warming would be greater, because the earth would return to receiving the sunlight now being scattered back into space. Indeed this reversal of aerosol cooling has been observed. Measurements of total incoming solar radiation carried out over the past 40–50 years show what has been termed “global dimming,” in that there was less sunlight reaching the earth's surface after 1950 than in the past (Stanhill and Cohen 2001). However, more recent surface measurements (Wild et al. 2005) show the dimming going away since about 1990. More recently, Mishchenko et al. (2007) obtained the same result globally from satellites (commented on by Kerr 2007). Some have taken this to mean that a significant part of the warming attributed to GHGs was really due to this effect. A recent discussion (Schmidt 2007) quantified the effect and showed it to be small. In fact, citing two other papers (Romanou et al. 2007; Zhang et al. 2004), it showed that while surface solar radiation was going down, global temperatures were rising. Models in fact take these changes in aerosol loading into account. It will be interesting to see whether future aerosols will decline, adding to warming, or increase again as developing countries industrialize.
6.7 Model tuning
It is sometimes objected that there are so many free parameters in the CGCMs that the clever modeler can, epicycle-like, simply tune them to get most any desired behavior. This reflects a lack of appreciation of how little freedom one actually has to “tune” the models. While it is true that some parameterizations are tuned to match observations (and it would be best to move to models that do not require such tuning), the climate itself is so complex and chaotic that it would take an extremely large number of trial tuning runs to effect any desired outcome, and the resultant tuning would then be unlikely to be correct for the next set of studies. In addition, I have cited several instances above where either outright predictions were made that were subsequently borne out, or where analysis of runs already made for other reasons showed agreement with observations. In these cases tuning was not a possibility.
6.8 Regional attribution
6.9 More precise simulations
A recent study shows how accurately models may be able to predict details of future climate change. Overland and Wang (2007) cautioned against expectations that temperatures would rise continuously in the near future. Taking into account a number of natural cycles (solar activity, ocean cycles, etc.), they predicted that temperatures would remain relatively constant over 2004–2008, and again in the middle of the next decade. Modelers from the UK Met Office (Smith et al. 2007) initialized a simulation in 1985, stipulating the state of the atmosphere and oceans at that time as precisely as possible, and ran it forward to 2020. Both of these “stillstands” in temperature were predicted. This adds to our confidence that, with better starting conditions and more accurate forcings, even such details can be simulated.
6.10 Alternative ways to reproduce twentieth century temperature record
Are there any other forcings that could account equally well for the temperature records, both past and present? Basically there are none, but people continue to believe that the sun must be responsible for a much larger fraction of the warming than currently estimated from direct forcing due to changes in TSI (see Sect. 4). Indeed it is becoming clear that the sun does influence climate by indirect means, but it also seems clear that what influences there are, are small compared with anthropogenic forcings. Solar indirect effects, especially cosmic ray-driven cloudiness, should vary with the solar cycle strongly enough to produce global temperature variations over that cycle, and such variations are not observed.
Attribution of observed global warming has received much attention, since it is at the heart of the problem. There are two aspects to this. First is climate sensitivity to increasing CO2 in the atmosphere, which strongly involves the positive feedbacks of water vapor, ice albedo, etc. While the models need continued improvement in these areas, comparison with observations of both present and paleo-climates suggests that the sensitivity to doubling CO2 is likely between 2 and 3°C. Second is showing that no other forcing is able to cause the observed warming. This is harder to do because, as the adage says, “absence of evidence is not necessarily evidence of absence.” However, the role of the sun, so important in earlier warmings and coolings, cannot explain the past 25 years, during which the largest warming occurred with no increase in solar activity. Thus, as IPCC (2007) makes clear, we are now very certain that the observed warming, especially in the last 25 years, is due mostly to human emissions of GHGs.
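The quoted sensitivity range can be tied to radiative forcing with the widely used logarithmic approximation ΔF ≈ 5.35 ln(C/C0) W/m² (the standard Myhre et al. 1998 fit, which is not derived in this review). A minimal sketch, with the 2–3°C-per-doubling range taken from the text and everything else illustrative:

```python
import math

# Back-of-envelope link between CO2 concentration and equilibrium warming,
# using the standard logarithmic forcing approximation
#     dF = 5.35 * ln(C / C0)   [W/m^2]
# and a linear response dT = lam * dF. The 2-3 C per doubling range quoted
# in the text fixes lam; the concentrations below are illustrative.

F_2X = 5.35 * math.log(2.0)  # forcing for doubled CO2, about 3.7 W/m^2

def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity_2x=2.5):
    """Equilibrium warming (C) for CO2 at c_ppm relative to preindustrial c0_ppm."""
    lam = sensitivity_2x / F_2X                 # K per (W/m^2)
    forcing = 5.35 * math.log(c_ppm / c0_ppm)   # W/m^2
    return lam * forcing

# A doubling (280 -> 560 ppm) recovers the sensitivity itself by construction
for s in (2.0, 3.0):
    print(f"sensitivity {s} C/doubling -> "
          f"{equilibrium_warming(560.0, sensitivity_2x=s):.2f} C at 560 ppm")
```

Note the logarithmic form: each additional increment of CO2 contributes less forcing than the last, which is why sensitivity is conventionally stated per doubling rather than per ppm.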
7 Future projections
Sections 5 and 6 have been rather long and detailed, but this was very necessary because, unless we can be confident of the computer models’ ability to characterize climate and attribute recent warming to humans, we can have little confidence in their ability to adequately predict future climate change.
The following discussion is a distillation of the IPCC’s 4AR projections.
Plotting these projections on a 2D map of the world gives a global perspective on the projected temperature rise under each scenario of energy use/AGHG emissions. The expected variation of warming with latitude is clear: higher latitudes will warm more. The expected greater warming over land compared with the oceans is also obvious. It is apparent that significant warming awaits in the future, pointing to the need for societal action of some kind.
7.1 Sea level rise
The IPCC (2007) gives a very conservative projection of sea level rise: less than a meter in this century. It essentially assumes a negligible contribution from melting ice. However, there is a community of concern over this admittedly conservative estimate. Looking at past melting of ice and the subsequent sea level rise, the rate per degree C can be estimated, and paleo-records suggest a much larger rise. Consider the equilibrium sensitivity of sea level to global average temperature change: at the Last Glacial Maximum, temperatures were 6°C lower and sea level was down 120 m; in the Eemian (the brief last interglacial), temperatures were about 1°C higher and sea level was up 4–6 m; and in the distant past, with temperatures up 4–5°C, there was no Antarctic ice sheet and sea level was up 70 m. These give an equilibrium sea-level sensitivity of something like 5–20 m per degree, a dramatic difference from the IPCC estimate of less than a meter per degree! This is of course the equilibrium sensitivity, and it might take hundreds of years to be realized; thus the two estimates might not be too far from agreement. It might be expected that this issue will produce considerable discussion in the literature in the near future, since estimating this quantity is so extremely important.
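The paleo arithmetic behind that 5–20 m-per-degree figure can be checked directly; this sketch uses only the epoch numbers quoted above (taking the midpoint of the 4–5°C range for the distant past):

```python
# Equilibrium sea-level sensitivity (meters per degree C) implied by each
# paleo epoch cited in the text: the ratio of sea-level change to global
# temperature change. Numbers are those quoted in the review.

epochs = {
    # name: (temperature change in C, sea-level change in m)
    "Last Glacial Maximum":   (-6.0, -120.0),
    "Eemian (low estimate)":  (+1.0, +4.0),
    "Eemian (high estimate)": (+1.0, +6.0),
    "No-Antarctic-ice past":  (+4.5, +70.0),  # midpoint of the 4-5 C range
}

sensitivities = {name: sl / dt for name, (dt, sl) in epochs.items()}
for name, s in sensitivities.items():
    print(f"{name}: {s:.1f} m per C")

lo, hi = min(sensitivities.values()), max(sensitivities.values())
print(f"range: {lo:.0f}-{hi:.0f} m per C")  # roughly the 5-20 m/C quoted
```

The spread across epochs is itself informative: it reflects which ice sheets existed and could respond, which is one reason the equilibrium figure cannot be applied directly to a single century.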
7.2 Fauna changes
There is a growing literature documenting migrations of insects, northward extensions of birds and plants, etc. such that there is no longer any doubt of the response of living things to warming thus far. Continued warming will only increase these migrations. One problem here is with flora that cannot migrate fast enough. There are several areas around the world (South Africa is a clear example) where indigenous plants have nowhere to migrate to and may simply cease to exist.
Attribution of observed warming in the last 100 years is largely in the realm of complex climate-simulating computer codes; however, there are many non-modeling indications that AGHGs are causing the trend, such as cooling in the stratosphere accompanying warming in the troposphere. While the models continue to need improvements, their ability to reproduce the large-scale and average features of observed climate gives us confidence that they can be relied upon to simulate climate change over the last century. However, using natural forcings alone, they are unable to account for warming in the second half of the 20th century. Addition of anthropogenic forcings results in quite close agreement between models and observations. The possibility of other forcings, such as solar-modulated cloudiness, merits continued research, but these are estimated not to be large enough to alter the IPCC's basic conclusion that human emissions are a major cause of recent warming.
Finally, adequate postdiction of the last century’s temperature increases our confidence in projections of warming in the next century with a range in temperatures depending on the climate sensitivity of the codes used.
8 Policy (Can science assist planning?)
As mentioned earlier, with the issue of global warming, the science of climate change has taken on the unaccustomed dimension of forming the basis for nothing less than global policy and planning regarding use of fossil fuels and other activities involving AGHGs. This creates the need for scientists to communicate to society definitive information that is by its very nature uncertain. Given human reliance on fossil fuels to drive its economies and the acknowledged difficulty of alternative energy forms to supplant this source, society demands a level of certainty of risk that scientists are unaccustomed to providing. Thus, whether global change, associated with the putative warming to come, constitutes sufficient risk to warrant the necessary response (i.e., drastically reducing reliance on fossil fuels) is the nexus of the debate. And the crucial element here is the level of certainty demanded before actions are taken. An added dimension, which perhaps makes this issue unique, is that, because of the long lifetime of carbon dioxide in the atmosphere, a lead time for action is necessary. In other words, by the time we are actually experiencing unwanted effects of AGHGs, their atmospheric concentration will already be too high for any mitigation to reverse the situation in a timely manner. Thus, policy makers are being asked to formulate rather drastic plans before we know for certain the magnitude of the problem.
The challenges for policy makers are great, and a correct reading of the science is daunting. A short introduction to the issues can be found in Zillman (1997). For many, the risks, not only to humans, but also to the rest of the biosphere are simply too great to be ignored, and a prudent society will initiate steps to ameliorate that risk commensurate with its understanding of the problem in a dynamic way that allows for periodic course corrections as better information becomes available. There are some, of course, who think we know enough and should take immediate and rather drastic action to reduce GHG emissions. For others, the level of certainty at present is far from confirming that risks are large, and, given the benefits of fossil energy and the strain on growing economies that even small actions would cause, they suggest that no significant actions are called for at this time.
It is to be expected that those who have the largest economic stake in production of energy from fossil fuels (including transportation and land use) take the latter position while those concerned with the environment and rapidly increasing problems associated with population growth take the former. And, since advancing scientific knowledge is strongly adversarial—healthy skepticism drives all but the most soundly based hypotheses out—the resulting scientific debate is seized upon as evidence of high uncertainty. To further confuse policy makers, the debate has been amplified and polarized by the non-scientific societal antagonists, with each side presenting its scientists and scientific arguments. From a philosophical standpoint non-scientific policy makers are faced with such strong arguments both pro and con as to conclude that scientific knowledge is relative and in the end, values-based.
The idea was that by bringing together the world's scientific and economic communities, a considered opinion could be arrived at which was not driven by extremes on either side of the issue. The IPCC's First Assessment Report (FAR) came out in 1990 and was immediately updated to clarify major points in 1992. Working Group I stated explicitly that, given the information at hand, it could not attribute all or even part of the observed warming to human activities. This notwithstanding, Working Groups II and III presented a bleak picture of what could happen in a warmer climate. To discuss possible action, the UN held a world conference at Rio de Janeiro at which a treaty, the so-called Rio Accords, was considered and ratified by most nations. These accords called for voluntary measures to begin the process of reducing reliance on fossil fuels. Their basic points were that developed countries would reduce their AGHG emissions to 1990 levels by the year 2000 and provide significant financial assistance to developing countries to develop alternative energy sources. The participating developing countries agreed to take significant actions to reduce the rate of increase of their emissions. The treaty was generally accepted. For example, the United States' President Bush signed the treaty and the Senate ratified it unanimously. (We note here that the nearly total violation of that treaty by the United States shows how little reliance can be put on voluntary agreements.)
However, by its Second Assessment Report (SAR) in 1996, enough new information had accrued that the IPCC made the historic statement: “Despite lingering uncertainties, the balance of evidence suggests a human influence on the climate.” While this was a remarkably weak statement, it caused a firestorm of reaction, which went so far as to attack the IPCC as having been taken over by global warming proponents and having acquired a political agenda. In addition, its Policy Makers Summary (which is all many people read) was criticized as not accurately reflecting the uncertainties dealt with in the main text. The follow-on to the Rio Conference, which attempted to move nations from voluntary to mandatory commitments to reduce AGHGs, was held in Kyoto, Japan. The resultant Kyoto treaty was far more controversial. It was no longer voluntary, but mandated that developed countries reduce emissions to slightly below 1990 levels by 2015, while placing no restraints on developing countries. This was in part because developing countries correctly pointed out that developed countries (including the USA) were largely in non-compliance with the Rio Accords, while they (the developing countries) had made considerable strides to live up to their part. In essence they were saying, “when developed countries, who have caused the problem thus far (and profited economically from it), have shown that they can meet their obligations, only then will we obligate ourselves to emissions reductions.” At this point the United States, by far the largest contributor to AGHG emissions, became a holdout, for, although the administration signed the treaty, the Senate, many of whose members had voted for the Rio Accords, voted unanimously that it would reject the treaty if asked to consider it. Their stated reason was the lack of participation of developing countries. The Bush administration went even further and simply refused to consider Kyoto.
This nearly total loss of participation and leadership by the most polluting country has caused increased turmoil within the policy community, and it remains to be seen how the United States can long remain outside the mainstream of world activity on this issue in the face of the fourth IPCC’s report that says: “We are 90% confident that most of the warming in the past 50 years is due to humans.”
8.1 Critics' response
Given the general level of agreement that global warming is real and will likely increase significantly in the coming years, it is strange to see how the critics of the anthropogenic global warming hypothesis could have succeeded in convincing legislators that the problem is nearly non-existent. Partly this has to do with the fact that reducing carbon emissions affects every individual's way of life rather than a few companies, as did the Montreal Protocol, which mandated reductions in chlorofluorocarbons. The prospect of having to change personal habits makes the populace and its representatives only too willing to hear that, after all, the AGW problem is not all that bad and is only made to look so by value-driven extreme environmentalists.
In its Fourth Assessment Report (4AR), the IPCC states that “We are 90% confident that most of the warming in the past 50 years is due to humans.” For those who choose to call IPCC a political entity and thus reject it, they might wish to consider the fact that the National Academies of Science of some 13 nations as well as many national and international scientific societies (such as the United States’ American Geophysical Union) support the IPCC’s findings. There is simply too much global assent to the work of the IPCC for it to be rejected out of hand.
8.2 Societal response
Countries the world over are planning ways to respond to the science. Societal response, however, is an extremely complex process involving philosophy, economics, and political points of view. Pielke and Sarewitz (2003) discuss this rather clearly in a recent article, where they point out how decisions are really made: “Decisions about people and ecosystems in the context of climate depend very little—if at all—on reduced uncertainty or plausible projections about climate change. In the face of fundamental uncertainties, decisions are made routinely on equally complex and far-reaching issues, such as economic policy, foreign policy, and national security. In such arenas, policymakers accept lack of certainty as a condition of life.” To make this point clearer, Pielke and Sarewitz recall when a “member of Congress asked more than a decade ago: ‘How much longer do you think it will take before [the USGCRP] is able to hone [its] conclusions down to some very simple recommendations, on tangible, specific action programs that are rational and sensible and cost effective for us to take…justified by what we already know?’ Some offer the following answer: Forever.” They believe that progress on developing cost-effective carbon-free energy sources will be more quickly stimulated through direct investments in energy research and technology justified for their own sake. Others point out that expecting industries to voluntarily change to less carbon-intensive technologies has proven not to work, or at least not to work with the urgency the science seems to demand. On the other hand, a recent report (Stern 2007; http://www.hmtreasury.gov.uk/independent_reviews/stern_review_economics_climate_change/stern_review_report.cfm or http://www.metoffice.gov.uk/corporate/pressoffice/stern/) takes a quite different view. This carefully done economic report concludes:
“There is still time to avoid the worst impacts of climate change, if we take strong action now. The scientific evidence is now overwhelming: climate change is a serious global threat, and it demands an urgent global response. This Review has assessed a wide range of evidence on the impacts of climate change and on the economic costs, and has used a number of different techniques to assess costs and risks. From all of these perspectives, the evidence gathered by the Review leads to a simple conclusion: the benefits of strong and early action far outweigh the economic costs of not acting. Climate change will affect the basic elements of life for people around the world – access to water, food production, health, and the environment. Hundreds of millions of people could suffer hunger, water shortages and coastal flooding as the world warms. Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.”
Over the past 20 years an enormous amount of study has gone into understanding climate and, in particular, how it is responding to increasing concentrations of AGHGs. While some uncertainties and conundrums persist, it is felt that their resolution will not alter the basic conclusions put forth in this review. It is hoped that the information and references given here will be sufficient to allow the reader to make an informed decision on how well we know the elements of the anthropogenic greenhouse warming hypothesis. It is also hoped that this information makes credible the IPCC's conclusion that we are now very certain that humans are causing a significant portion of the observed warming. Finally, it is hoped that this review will allow the reader to put into perspective arguments to the contrary. And so the period of debate over human effect on climate is over. There will now ensue a perhaps larger, more controversial debate about how society should respond. Here, science can assist, but these decisions also involve a more human side: our view of ourselves and of the rest of living things on the planet. The great challenge of the next decade, then, is to come to robust agreements on how to reduce the human effect on the planet.
The author is indebted to The University of California’s Institute of Geophysics and Planetary Physics—branches at Los Alamos National Laboratory and at Scripps Institute of Oceanography, UCSD for supporting the author as a Cecil Greene Scholar during which time much of this information was brought together. He is also indebted to the following for helpful discussions and references suggested or supplied: Richard Alley, Tim Barnett, Jim Hansen, Phil Jones, David Keeling, Judith Lean, Mike MacCracken, Joel Norris, Michael Mann, Roger Pielke, V. Ramanathan, Ben Santer, Drew Schindell, Gavin Schmidt, Jeff Severinghaus, Tom Shankland, Richard Somerville, Brian Tinsley, Kevin Trenberth, Warren White, Tom Wigley, Guang Zhang, and my long suffering but thoughtful critic, Yvonne Keller. In addition there have been a rather larger number of people who have both helped and encouraged me to take on this project. To them I am also thankful.