1 Introduction

The 21st Century global climate is expected to experience long-term changes in response to human-induced increases in atmospheric greenhouse-gas concentrations. Several decades of warming and a variety of hydrologic and landscape responses have already occurred, and these changes are expected to accelerate during the 21st Century unless greenhouse-gas emissions are brought under control and even reversed (Intergovernmental Panel on Climate Change (IPCC) 2007). How these global climate and landscape changes will play out in critical areas of the southwestern US—e.g., as a focal example often cited in this paper, over the Lake Tahoe basin—is highly uncertain, but current numerical models of the global climate provide plausible scenarios that can be evaluated to determine likely climate-change vulnerabilities in hydrologic, nutrient, and sediment systems (e.g., Coats et al. 2012; Costa-Cabral et al. 2012). Given widespread concerns about the approaching climate changes, such assessments are being performed in local to regional resource systems worldwide, and assessment strategies and scenarios have emerged that are being used for initial assessments. For example, the State of California has recently completed the second in a biennial series of State-scale climate-change assessments using scenarios of the sort analyzed here (Cayan et al. 2008; Cayan et al. 2009), a new US national assessment of potential climate-change impacts is in planning stages and will be largely scenario based (NCA 2011), and the next IPCC Assessment is expected to focus even more than in the past on regional scenarios (Doherty et al. 2009).

These various assessments typically begin by identifying some workable number of climate-change projections generated as simulations by a variety of global climate models (GCMs) forced by selected scenarios of future economic development and resulting greenhouse-gas emissions. Simulations from current global models typically are made on very coarse spatial grids, with model grid points separated geographically by anywhere from 1° to 3° latitude and longitude. At this scale, the climate of the entire State of California may be represented by as few as 10 grid cells, and the Tahoe basin covers much less than any one grid cell. As a consequence, the second step in most local to regional assessments is to “downscale” global-model results to finer grids or individual stations so as to preserve local climatic differences within the study area while representing the projected climate changes. The downscaled versions of the climate-change scenarios are then presented to various models or experts in order to identify local to regional vulnerabilities to climate change. Having identified key vulnerabilities to the climate scenarios, options for adaptation of existing management systems or structures can be identified and weighed, as can options for new management approaches.

Lake Tahoe spans the California-Nevada boundary at 39°N latitude. The lake covers 496 km2 at 1,897 m above sea level in the central Sierra Nevada, and contains dilute waters from its high-mountain catchment of 1,310 km2. Lake Tahoe is world renowned for its natural beauty, crystal clear waters, and cobalt-blue color. However, long-term monitoring shows that the clarity of the lake as measured by Secchi disks has declined by 10 m since 1968, and the lake's rate of primary productivity is increasing by about 5 % per year (Coats 2010; Sahoo et al. 2010).

Climate change and its attendant impacts may well exacerbate these problems. A program of studies has been undertaken to evaluate potential influences of climate change on the hydrology, lake mixing, pollutant loading and biological responses of Lake Tahoe. While additional data evaluation and technical analysis will be needed to tie climate-change impacts directly to policy decisions, the goal of the program was to provide a preliminary, scientifically justified assessment of the extent to which climate change needs to be considered in efforts to protect Lake Tahoe. Scientific questions raised include:

  • How will magnitudes and frequencies of runoff, both from the Lake Tahoe watershed as a whole and from small urbanized catchments, respond to climate change in the 21st Century?

  • How will discharges of sediments and nutrients to Lake Tahoe respond to climate change?

  • How will deep-water mixing in the lake respond to climate changes and how will changes in mixing affect deep-water dissolved oxygen contents and nutrient release from bottom sediments?

A first step in addressing such questions must be to determine what expected or plausible changes in climatic parameters might be faced during the next 100 years, based on output from the general circulation models (GCMs) that have been developed to evaluate climate change. This paper presents climate-change scenarios across the southwestern US from two GCMs under two 21st Century emissions scenarios and describes a downscaling process used to provide inputs to climate-change assessments in the Lake Tahoe basin and elsewhere in the region.

2 Projections

In this study, most focus was placed on climate projections by NOAA's Geophysical Fluid Dynamics Laboratory (GFDL) coupled ocean–atmosphere GCM (CM2.1) in response to two greenhouse-gas emissions scenarios from the IPCC (Nakicenovic et al. 2000). Climatic responses to an emissions scenario labeled A2 were analyzed as representative of middle-of-the-road climate-change projections (Fig. 1); the A2 scenario is based on assumptions of a very heterogeneous world economy with high population growth, moderate overall economic growth, and resulting emissions that accelerate throughout the 21st Century. Notably, until recently, the A2 scenario was viewed as a reasonable "worst case" scenario, but recent evaluations have shown that, in the past decade, emissions have actually exceeded the A2 trajectory (Raupach et al. 2007). Consequently, A2 is now viewed as more of a middle-of-the-road scenario (compared, e.g., to A1Fi in Fig. 1). Climatic responses to a second emissions scenario labeled B1 were also evaluated here. The B1 scenario is based on assumptions of a more resource-efficient and ecologically friendly future with lower population growth and technological moves towards service and information economies, with emissions that level off by the end of the century (Fig. 1). The B1 scenario is considered to be an optimistic scenario that results in much less climatic change and challenge than does the A2 scenario.

Fig. 1
figure 1

Changes in global anthropogenic greenhouse-gas radiative forcings (Nakicenovic et al. 2000) of climate in the 20th Century and under scenarios of 21st Century emissions discussed here

Ideally, given the wide range of uncertainties regarding climate-change projections, a wide range of projections should be evaluated in assessments of potential impacts. The limited time and resources of the present assessment dictated, as with so many current assessments, that only a limited number of scenarios be addressed. In this study, we followed the lead of recent statewide assessment activities in California (Cayan et al. 2008), both for the sake of consistency and to bracket the possible futures. As noted, much of the focus in this paper is on the GFDL climate model, which warms more in response to each unit of greenhouse gas added to the atmosphere than do most of the two dozen climate models that were evaluated in the most recent IPCC Assessment (2007). Simulations of temperature and precipitation from another climate model, the National Center for Atmospheric Research's Parallel Climate Model (PCM1), were also obtained and downscaled for the present study. The PCM1 model warms less in response to greenhouse-gas increases than do most IPCC climate models.

By considering climatic responses simulated under high-emissions A2 and low-emissions B1 scenarios by highly responsive GFDL and the less responsive PCM1 model, this study evaluates potential impacts spanning much of the broad range of possibilities presented in the most recent IPCC global assessment (2007). These scenarios were also key parts of the recent State of California climate-change assessments (Cayan et al. 2008, 2009).

As indicated, much of the focus here is placed on the GFDL scenarios. Focusing on a single GCM is far from optimal, but was mandated in this study by the availability of detailed projections of future variations and changes in several climate variables of particular importance to Lake Tahoe. In the archives of GCM projections assembled for the recent IPCC Assessment (2007), daily outputs of temperature and precipitation are much more common than daily outputs of other surface variables like surface winds, humidity, and surface radiative fluxes. For these other variables, the archives often are limited to simulated monthly summary statistics or discontinuous periods of daily outputs; we are optimistic that the archived projections for the next IPCC Assessment will be complete enough to allow more extensive ensemble evaluations. These other variables, however, are of considerable concern in the Tahoe basin because they bear directly on overturning and mixing of deep waters and on biological activity in the lake (Abbott et al. 1984; Jassby et al. 2003), which in turn will largely determine the clarity and ecological responses of the lake to future climates. Because we were able to obtain more complete simulation outputs from GFDL, we were able to explore projected changes in these variables in ways that facilitated the Lake Tahoe assessment, but were limited for now to considering only the GFDL projections. The GFDL CM2.1 model that is the source of the radiative-flux projections downscaled here uses a simplified H2O-continuum parameterization of longwave radiation and vapor/liquid-cloud/ice-cloud parameterizations, with cloud-liquid, cloud-ice, and cloud-fraction microphysics and macrophysics parameterizations (Anderson et al. 2004).

3 Downscaling

Downscaling is the process of transforming simulated climate variables from coarse-grid GCMs into realistic estimates of climate sequences, variations and changes at higher resolutions. Many different approaches to downscaling have been developed and used in assessment studies. Two broad categories of downscaling methods are statistical methods (which use statistical models or relations between historical observations at GCM scales and their higher-resolution counterparts as a basis for inferring the high-resolution implications of climate-model outputs) and dynamical methods (which apply climate models that have much finer grid spacings but limited geographical extents to fill in detail over a desired area) (e.g., Wood et al. 2004). Dynamical methods will ultimately provide more physically consistent and flexible visions of the future than statistical methods, but at present suffer from very high computational costs. The computational burdens of dynamical downscaling are so large that it is still rare to see dynamically downscaled products that span more than 20–30 years. Furthermore, dynamically downscaled products still retain, or even worsen, biases suffered by the global models, so that it is generally necessary to statistically correct even the dynamically downscaled products before they are suitable for use. Statistical downscaling is much less computationally burdensome and typically includes some form of bias correction as an integral part. The statistical methods, however, make explicit or implicit assumptions that historical (statistical) relations between coarse-grained climatic variables and their high-resolution counterparts will not change in the future as the global climate changes.

A statistical method, the constructed-analogue method (Hidalgo et al. 2008; Maurer et al. 2010), was used in the present assessment to downscale daily GCM-simulated precipitation and maximum and minimum temperatures from their original roughly 2° latitude-longitude grid spacings onto a 1/8° (roughly 12 km) grid. Figure 2 is a schematic of the method: given a coarse-gridded depiction of some day's climate (weather) from a GCM, the first step is to identify a set of days from the historically observed record, in the same season of the year, with coarse-gridded climate patterns that are most similar to the model pattern. The linear combination of the coarse-grained historical analogue maps that best fits the model pattern is then determined by simple linear regression. Finally, the constructed-analogue method applies the same regression coefficients to the high-resolution maps of the same variables on those historical analogue days to obtain a high-resolution version of the original model weather.
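
The two regression steps just described can be sketched compactly with standard linear algebra. This is a minimal illustration, not the production code of Hidalgo et al. (2008): the array names, the RMSE similarity measure, and the choice of 30 analogues are assumptions made here for clarity (in practice, candidate analogues would also be restricted to the same season as the target day).

```python
import numpy as np

def construct_analogue(model_day, coarse_library, fine_library, n_analogues=30):
    """Downscale one coarse-grid model day via constructed analogues.

    model_day      : (n_coarse,) coarse-grid pattern from the GCM
    coarse_library : (n_days, n_coarse) historical days on the coarse grid
    fine_library   : (n_days, n_fine) the same historical days, fine grid
    """
    # 1. Rank historical days by similarity (RMSE) to the model pattern.
    rmse = np.sqrt(((coarse_library - model_day) ** 2).mean(axis=1))
    best = np.argsort(rmse)[:n_analogues]

    # 2. Least-squares coefficients that combine the coarse analogue maps
    #    to best fit the model pattern.
    A = coarse_library[best].T                      # (n_coarse, n_analogues)
    coeffs, *_ = np.linalg.lstsq(A, model_day, rcond=None)

    # 3. Apply the same coefficients to the fine-grid analogue maps.
    return fine_library[best].T @ coeffs            # (n_fine,)
```

The key property of the method is visible in step 3: the fine-grid result inherits realistic local structure from observed days, while the coefficients carry the large-scale information from the model.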

Fig. 2
figure 2

Schematic depiction of the two primary steps in downscaling climate model outputs by constructed-analogs method. More detailed description available at http://www.energy.ca.gov/2007publications/CEC-500-2007-123/CEC-500-2007-123.PDF

This downscaling method was applied to gridded historical temperature and precipitation observations (as a test of the method) and to historical and future climates simulated by the GFDL CM2.1 and NCAR PCM1 GCMs over the entire conterminous US. The gridded observations used here are those provided by Maurer et al. (2002) as inputs for macroscale hydrologic models, and spanned the period from 1950 to 1999. This gridded meteorology was used both as the source of potential analogues for the downscaling method and as a means of testing its efficacy (Hidalgo et al. 2008). Daily maximum and minimum temperatures were downscaled together, and precipitation separately. Because each day being downscaled came from observations (in tests) or from GCM outputs in which the two are meteorologically linked in realistic ways, necessary linkages between temperature and precipitation were preserved in the downscaled scenarios. Downscaled daily temperatures and precipitation over the Lake Tahoe basin were then extracted and used in other parts of the climate-change assessment described in this issue.

In order to test the method, daily historical climate datasets were aggregated onto a 2.5° GCM-grid spacing and then downscaled by the constructed-analogue method onto the original 1/8° high-resolution grid, with downscaling results compared to the original, unmodified high-resolution fields. Figure 3 shows the correlations between “anomalized” temperatures and precipitation totals from the unmodified datasets and from the coarsened-and-then-downscaled results, where “anomalized” means “with the long-term-mean seasonal cycle at each grid cell removed” so that no credit is given for knowing that winters are colder than summers or that high places are cooler than low, and so on. The correlations were calculated on the basis of results from odd-numbered years where the downscaling (and anomalization) only had access to data from even-numbered years, and vice versa for downscaling of even-numbered years, as a means of cross validation.
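
As a concrete illustration of the skill measure used in this test, the following sketch computes an anomaly correlation for a single grid cell. The simple day-of-year climatology used to remove the seasonal cycle is an assumption standing in for whatever smoothed climatology the original analysis employed:

```python
import numpy as np

def anomaly_correlation(obs, recon, day_of_year):
    """Correlation between two daily series at one grid cell after each
    has had its long-term mean seasonal cycle removed, so no credit is
    given for simply knowing that winters are colder than summers."""
    def anomalize(x):
        clim = np.empty_like(x)
        for d in np.unique(day_of_year):
            sel = day_of_year == d
            clim[sel] = x[sel].mean()   # long-term mean for this calendar day
        return x - clim
    a, b = anomalize(obs), anomalize(recon)
    return float(np.corrcoef(a, b)[0, 1])
```

In the cross-validation described above, the climatology and downscaling for odd-numbered years drew only on data from even-numbered years, and vice versa; that split is omitted here for brevity.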

Fig. 3
figure 3

Anomaly correlations between gridded, observed daily maximum temperatures (a) and precipitation (b) and versions of same obtained by aggregating high-resolution observations to global-climate model gridding and then downscaling back to original, 1/8° gridding by constructed-analogs method of Hidalgo et al. (2008); inset map (c) shows anomaly correlations for monthly precipitation totals of the daily fields. Anomaly correlations are correlations between downscaled and observed variables that have both had long-term mean seasonal cycles removed at each grid cell

High-resolution temperature variations are very well recovered by the constructed-analogue method, with anomaly correlations dipping no lower than 0.8 over most of the US and remaining well above 0.9 over the Lake Tahoe basin (Fig. 3a). Precipitation is more difficult to downscale and, at best, daily-scale anomaly correlations between the original and downscaled daily precipitation totals rise to about 0.7 within the Lake Tahoe basin (Fig. 3b). However, when the downscaled daily precipitation values are summed to form monthly totals and those monthly totals are compared to monthly totals of the original precipitation data, the anomaly correlations around Lake Tahoe are quite high (>0.95; Fig. 3c). A comparison of anomalized versions of the downscaled temperature and precipitation time series, 1950–1999, from the Tahoe basin with those from a point near the center of the Sacramento-San Joaquin Rivers Delta, 160 km to the southwest (well within the distance between GCM grid cells) and 1,900 m lower (at about sea level), yields daily anomaly correlations of 0.7 for both temperatures and precipitation. The original gridded, observed temperatures and precipitation at these two points yield anomaly correlations of 0.6 for both, suggesting that the downscaling procedure yields geographic weather patterns that are modestly smoother than those in the original observations. Still, the constructed-analogue method recaptures realistic, high-resolution historical daily temperature and monthly precipitation variations from a version of the historical record that has been regridded onto the coarse global-climate-model grids, and does so with impressive skill. Several additional tests of the skill of this downscaling method are presented in Hidalgo et al. (2008) and Maurer et al. (2002).

Most climate-change assessments have focused entirely on projections of temperature and precipitation change. In this study, several additional climate variables were also downscaled and assessed in order to better address the central role of radiation inputs and winds in the heat balance of Lake Tahoe, its overturning, its future health and clarity, and the microclimates of the basin. The additional variables considered were surface-wind speeds, downward shortwave (solar) radiation fluxes, and downward longwave (infrared) radiation fluxes at the surface. The present study is unusual in that these additional variables have only rarely been statistically downscaled across whole regions in previous studies; a recent exception is Abatzoglou and Brown (2011).

Because historical observations of these additional variables are much less common than those of temperature and precipitation, no entirely observationally based historical grids of these variables were available. Instead, the downscaling strategy used here was to draw upon a high-resolution regional-climate reanalysis (called CARD10) of the meteorology over California, Nevada and surrounding areas during 1948–2005 on a 10-km grid (Kanamitsu and Kanamaru 2007; Kanamaru and Kanamitsu 2007) as a source of high-resolution gridded historical “observations”. The CARD10 reanalysis is actually a regional simulation without regional-scale data assimilation, and is based on making very short-term, high-resolution retrospective forecasts of regional weather outcomes from the large-scale atmospheric and surface conditions embodied in global-scale data-assimilated reanalyses. In particular, CARD10 was developed by embedding the Regional Spectral Model (RSM; Juang and Kanamitsu 1994) within global NCAR/NCEP Reanalysis (NNR; Kalnay et al. 1996) atmospheric fields every 6 hours, in a domain from 29.466° to 45.719°N and 128.203° to 111.563°W. The 6-hourly “forecasts” from RSM, running in this mode, are the CARD10 outputs and, when aggregated to daily totals, are treated here as the best available approximation of how climate variables like surface winds and radiative fluxes varied over the landscape at high geographic resolutions on a daily basis.

The accuracy of CARD10 fields has been tested by Kanamitsu and Kanamaru (2007) through comparisons with observations of surface-air temperatures, winds, and precipitation, with the lower-resolution North American Reanalysis (NARR; Mesinger et al. 2006) fields, and with PRISM precipitation grids (Daly et al. 2002). A comparison of surface winds at 12 airport stations and 12 offshore buoys indicated (largely as expected) that CARD10 winds reflect observations much more accurately over the ocean than over topographically complex inland areas; e.g., the average correlations between January 2000 (August 2000) winds from CARD10 and station winds were 0.82 (0.69) at buoys and 0.56 (0.46) at inland stations. Nonetheless, both on- and offshore, average wind observations were more accurately represented in the CARD10 fields than in the global NNR or NARR fields (Kanamaru and Kanamitsu 2007), indicating that the higher-resolution CARD10 fields provide downscaled “value added” relative to the GCM-resolution NNR. Kanamitsu and Kanamaru (2007) also validated CARD10 against land-based temperature and precipitation observations at daily scales; average daily correlations between observed and CARD10 mean temperatures for January 2000 and August 2000 were 0.77 and 0.71, respectively, and mean correlations of monthly mean temperatures for Januaries 1948–1996 and Augusts 1948–1996 were 0.84 and 0.68, respectively. Correlations between monthly CARD10 precipitation and station observations were 0.69 for Januaries 1948–1996. Comparisons of CARD10 precipitation to gridded PRISM precipitation estimates indicated tendencies for CARD10 to perform worse (r = 0.4) and underestimate precipitation by several mm/day on the lee (eastern) side of the Sierra Nevada, and to perform better (r = 0.9) but overestimate by the same amount on the windward (western) side.
Overall then, CARD10 provides useful and realistic, but far from perfect, high-resolution depictions of weather relations and variations in the California and Nevada region. Unfortunately, observations necessary to reliably validate downward solar and longwave radiative fluxes are not available, but the CARD10 estimates of these fluxes have the advantage that they are internally and dynamically consistent with the validated CARD10 reanalysis variables.

In the constructed-analogue downscaling of surface winds and radiation onto the CARD10 10-km grid, the CARD10 variables were treated in the same way as the observationally based historical temperature and precipitation fields discussed earlier, in order to test the applicability of the constructed-analogue method for downscaling these variables from global-model outputs and to downscale future variations of these variables. Notably, available GFDL outputs included these additional variables, but available PCM1 outputs did not, so that only the GFDL trends in these variables can be considered here. Also, difficulties with the archived GFDL projections of humidity prevented us from downscaling humidity for the present assessment. Finally, southerly and westerly wind components were downscaled together and then combined to calculate the wind speeds.
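
The final step, forming speeds from the separately downscaled components, is worth a brief note: speed is a nonlinear function of the components, so it must be computed day by day from the downscaled daily u and v fields rather than from time-averaged components. A minimal sketch (the function name and argument names are ours):

```python
import numpy as np

def wind_speed(u_westerly, v_southerly):
    """Daily wind speed from downscaled westerly (u) and southerly (v)
    wind components, computed element-wise on the downscaled daily grids."""
    return np.hypot(u_westerly, v_southerly)   # sqrt(u**2 + v**2), overflow-safe
```

Averaging the components first and then taking a speed can badly underestimate mean speeds wherever wind direction reverses; e.g., two days with (u, v) = (3, 4) and (−3, 4) m/s have a mean speed of 5 m/s, while the speed of the mean components is only 4 m/s.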

Figure 4 shows anomaly correlations between daily and monthly means of historical CARD10 values of downward longwave radiation, downward shortwave (solar) radiation, and surface-wind speeds and coarsened-and-then-downscaled versions of the same, validated against the un-coarsened original CARD10 values in parallel with the temperature and precipitation validations illustrated in Fig. 3. Downward longwave fluxes are very well downscaled over the Tahoe basin when aggregated to monthly time scales (correlation >0.95), surface-wind speeds also are reasonably well recovered at that time scale (>0.9), and downward solar radiation somewhat less so (>0.75 or 0.8), giving some confidence in the downscaled projections for uses over such time scales. These differences in downscaling accuracy can broadly be understood by considering that downward longwave radiation is largely a function of the temperature and humidity of the atmosphere, both of which vary smoothly over large scales (relative to processes like precipitation and cloudiness). Shortwave fluxes are very much a function of cloudiness, which, although a function of temperature and humidity, also reflects processes closely linked to precipitation. Thus the downscaling accuracy for longwave radiation approximates that for temperatures, whereas the accuracy for shortwave radiation approximates that for precipitation. Downscaling accuracies for surface winds are presumably limited by the local influences of the complex topography of the southwestern US.

Fig. 4
figure 4

Anomaly correlations (as in Fig. 3) and anomalous standard deviations, except for downward longwave radiation, downward shortwave (solar) insolation, and surface-wind speeds. Lefthand frames are anomalous correlations of daily values of original CARD10 and coarsened-and-then-downscaled versions of same; middle frames are anomalous correlations of monthly totals of these dailies; righthand frames are anomalous standard deviations of daily values. All calculations span 1961–1999

Downscaling was applied and tested at the daily time scale (as with precipitation and temperatures), and results were then aggregated to monthly scales for these comparisons. Downscaled longwave fluxes over the Tahoe basin were captured (in terms of anomaly correlations) at the daily scale roughly as well as at monthly scales (Fig. 4). Anomaly correlations for the daily downscaled shortwave fluxes were only about 0.7–0.75 over the Lake Tahoe basin, so that aggregation to monthly levels might be required for most confident use. However, although day-to-day mismatches between the original and downscaled shortwave fluxes are indicated by these lower daily-scale correlations, the daily variations remain broadly realistic, as indicated by relatively close agreement between the anomalous standard deviations (standard deviations once seasonal cycles are removed) of the original and downscaled daily fluxes. Overall, the downscaled shortwave fluxes are about 10–15 % more variable (as measured by anomalous standard deviations) than were the CARD10 fluxes (Fig. 4, middle-right panel); that is, the downscaled shortwave variations, which derive only from large-scale conditions, are somewhat more variable than the sum of variations deriving from both large- and local-scale influences represented in the CARD10 data set. In practice, this relatively close agreement suggests that, although the exact day-to-day sequences are not recovered by downscaling (as indicated by the relatively low daily-scale correlations in Fig. 4), the downscaled dailies nonetheless vary realistically. Anomaly correlations of downscaled daily wind speeds were only about 0.65–0.7 over the Tahoe basin so that, as with the downward shortwave fluxes, aggregation to monthly averages would be needed for most confident use.
Although not shown or used here, when the downscaled westerly wind components are tested against the original CARD10 values, anomaly correlations over the Tahoe basin are quite high (>0.9); anomaly correlations for downscaled daily southerly wind components over the Tahoe basin are only about 0.7, by contrast. Anomalous standard deviations of the daily longwave fluxes and wind speeds (Fig. 4, right-hand panels) indicate excellent agreement between the variability of the original and downscaled CARD10 fields. Notably, boundary effects (like those present in all regional climate models) make the first 20–25 grid cells from the lateral boundaries unreliable (Kanamitsu and Kanamaru 2007), yielding unnatural rectilinear features paralleling the western and southern edges of most panels of Fig. 4. However, these unreliable zones are mostly located offshore and thus away from the southwestern states that are the focus of the present study.

When applied to other (future, in this study) conditions, this statistical downscaling of dynamically simulated weather conditions amounts to developing a statistical stand-in for the computationally demanding (CARD10) regional-climate model. That is, to the extent that the statistical downscaling method continues to reproduce the regional-climate model variables as accurately as indicated in some frames of Fig. 4 when applied to other time periods, near-equivalents of the regional-climate model outputs are being obtained with much less computational burden. By this approach, the present study was able to develop what we hope to be near equivalents of the kinds of results that would have been obtained from applications of the CARD10 model to the entire 21st Century.

Considerable place-to-place differences in the anomaly correlations are apparent in Fig. 4. These differences are not random but rather are, in general, closely allied with topography. Thus, for example, some of the lowest anomaly correlations for monthly wind speeds over California are situated over the southern ramparts of the Central Valley (at foothill elevations) and over nearby parts of the southern Sierra Nevada. These low correlations derive from errors in the southerly wind components (not shown), and it seems likely that the errors in this area reflect sheltering of those areas from the direct onslaught of large-scale southerly wind patterns by the mountainous terrain upwind (to the south). The patterns of downscaling skill in Fig. 4 thus are indications of where regional-scale atmospheric circulations and climatic conditions most strongly dictate the higher resolution variations versus where the higher resolution variations are more random, or at least not so simply tied to the overall regional conditions. More research will be required to take full advantage of these suggestions about large-scale versus small-scale causes of climate variations over the western US, but such research may help to formulate more realistic expectations as to where, when, and why regional models work well in one locale but poorly in another.

4 Results

Trends in daily minimum temperatures over the US in downscaled versions of the GFDL 21st Century climate-change projections under the A2 and B1 emissions scenarios are shown in Fig. 5. Projected warming ranges from about 3.5 to 7°C over the conterminous US under A2 emissions and from about 1 to 3°C under the B1 scenario. Over Lake Tahoe, minimum temperatures rise by 4.3°C by 2100 under the A2 emissions scenario and by about 1.9°C under B1 emissions. Maximum temperatures (not shown) rise by 4.7 and 2.0°C, respectively. Under both scenarios, the GFDL model responds to greenhouse-gas emissions with drying trends of over 20 cm/yr/century over the Sierra Nevada and Lake Tahoe basin (Fig. 5); farther east, the A2 scenario yields drying across much of the western, central and southeastern US, whereas under the B1 scenario there is a broad pattern of enhanced precipitation across the conterminous US beyond the Sierra Nevada.

Fig. 5
figure 5

Downscaled temperature (left panels) and precipitation (right) trends under A2 (top panels) and B1 (middle panels) emission scenarios from the GFDL global climate model, with time series of downscaled changes in temperature and downscaled precipitation from a grid cell over southwestern corner of Lake Tahoe in bottom panels

Figure 6 shows projected temperatures and precipitation from the less-sensitive PCM1 model under the A2 emissions scenario. Under A2 emissions, the GFDL model projects warming of over 4°C in the vicinity of the Lake Tahoe basin (Fig. 5), whereas the PCM1 model projects only about 2°C of warming (Fig. 6). Both models (PCM1 not shown) warm considerably less under B1 emissions, so that although there is considerable uncertainty about the actual magnitudes of warming to be expected from a given emissions scenario, as indicated by the model-to-model differences in Figs. 5 and 6, lower emissions (e.g., B1) result in less change in whichever model turns out to be closer to the real future. The PCM1 projections of future A2 precipitation (Fig. 6) yield less precipitation change than does the GFDL model (Fig. 5), and indeed yield very little change at all over the Lake Tahoe basin.

Fig. 6

Same as Fig. 5, except for projections by PCM1 climate model under A2 emissions scenario; same color bars as Fig. 5

Downward longwave radiation is projected to increase everywhere in the study domain under both the A2 and B1 scenarios (Fig. 7); at Lake Tahoe, the 21st Century downward-longwave radiation trend amounts to +34 W/m2/century under A2 emissions and +12 W/m2/century under B1 (both trends significant at greater than the 95 % confidence level). This is the essence of the greenhouse effect: more greenhouse gas in the atmosphere traps more of the longwave radiation emitted by the surface, warming the atmosphere and increasing the longwave radiation it re-emits downward toward the surface. Under the A2 scenario, more greenhouse gases are emitted, and so downward longwave fluxes increase more than under B1 emissions. In the downscaled fields, downward longwave radiation increases about three times as rapidly under A2 as under B1 emissions over most of the interior western US, including over the Tahoe basin.
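The per-century trends and confidence levels quoted here are, in essence, least-squares linear fits to the downscaled time series. A minimal sketch of that calculation, using a synthetic annual-mean longwave series (illustrative values only, not the actual GFDL-projection data):

```python
import numpy as np
from scipy import stats

# Synthetic annual-mean downward longwave series, 2001-2100 (W/m^2);
# the baseline, trend, and noise levels are illustrative assumptions.
rng = np.random.default_rng(0)
years = np.arange(2001, 2101)
lw = 310.0 + 0.34 * (years - years[0]) + rng.normal(0.0, 2.0, years.size)

# Least-squares linear fit; the p-value tests the null of zero slope.
res = stats.linregress(years, lw)
trend_per_century = res.slope * 100.0      # W/m^2 per century
significant_95 = res.pvalue < 0.05         # exceeds 95 % confidence level

print(f"trend: {trend_per_century:+.1f} W/m2/century, p = {res.pvalue:.3g}")
```

With a trend of this size relative to the year-to-year noise, the fitted slope recovers roughly the imposed +34 W/m2/century and is highly significant, mirroring the A2 result quoted above.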

Fig. 7

Trends in downscaled projections of downward longwave radiation from the GFDL global climate model under the A2 (left) and B1 (right) emissions scenarios, with a downscaled time series from a grid cell over the southwestern corner of Lake Tahoe in the bottom panel

Downward solar insolation changes much less (in W/m2) than do the longwave radiative fluxes in the GFDL projections (Fig. 8). In the A2 downscaling experiment here, solar insolation declines (owing to increased cloudiness) beginning just east of the Central Valley at the latitude of Lake Tahoe; the decline is well established and statistically significant (−6 W/m2/century) but still relatively small (−2 % of the historical normal) at the Sierra Nevada ridgeline and in the Tahoe basin. Under B1 emissions (Fig. 8), solar insolation increases modestly over California and most of Nevada, including the Tahoe basin (+4 W/m2/century, or 1.6 % of the historical normal, and only marginally statistically significant). The broad patterns of insolation change probably reflect mostly differences in natural variations of insolation between the two projections.
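The percent-of-normal figures used throughout are simply the trend expressed as a fraction of the historical climatological mean. A trivial illustration, using round numbers consistent with the text rather than the actual gridded values:

```python
def trend_percent_of_normal(trend, normal):
    """Express a trend as a percentage of the historical normal.

    trend  -- projected change per century, in the variable's units
    normal -- historical climatological mean, in the same units
    """
    return 100.0 * trend / normal

# A -6 W/m^2/century insolation decline against an assumed ~300 W/m^2
# historical normal is about -2 %/century, as quoted for the A2 case.
print(trend_percent_of_normal(-6.0, 300.0))  # prints -2.0
```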

Fig. 8

Same as Fig. 7, except for downward shortwave (solar) insolation

Mean daily surface-wind speeds are projected to increase modestly but widely across most of the southwestern US and to decline across much of the northwest under both scenarios, with larger changes under the A2 scenario (Fig. 9). More broadly, westerly wind components slacken offshore, over the northwestern areas, and along the Sierra Nevada under both scenarios, with increased westerlies in the southwestern areas under the A2 scenario. The southerly wind components slacken even more (in relative terms) along much of the west coast (implying potential enhancements to coastal upwelling, much as projected by Snyder et al. 2003) and across much of the interior northwest under the A2 scenario; the southerly winds also slacken in these areas under the B1 scenario, but the projected changes are smaller. Under the A2 scenario, southerly winds increase in the southwestern quadrant. Locally, overall wind speeds are projected to change by only a few percent of historical normals, on average, over the Lake Tahoe basin under the two emissions scenarios (−5 %/century and −4 %/century under A2 and B1, respectively).

Fig. 9

Same as Fig. 7, except for surface-wind components and speeds; percentages mapped are of overall wind speeds

5 Conclusions

Climate-change projections from two GCMs were downscaled here to provide inputs for assessments of potential vulnerabilities and impacts in the southwestern US, including, for the purposes of this issue, the Lake Tahoe basin. A statistical method, the constructed-analogues method, was used to downscale daily temperatures and precipitation in the climate-change projections to about 12-km resolution, based on gridded historical observations. The same method was used to downscale projections of downward longwave and shortwave fluxes and wind speeds, based on the results of a well-constrained historical simulation by a regional climate model. Once aggregated to monthly averages, the downscaled versions of all the variables reproduced historical variations well. At daily scales, some of the downscaled variables (especially temperatures and longwave fluxes) reproduced historical variations very faithfully, whereas others (especially precipitation and winds) did not follow day-to-day observed fluctuations but did capture realistic levels of variability. For some variables, large place-to-place differences exist in the ability of this statistical downscaling method to recapture observed, or regionally simulated, variations; these differences may usefully indicate where local climatic variations are more, or less, strongly dictated by larger regional-scale circulation and climate patterns on daily and monthly time scales. Such indications might provide useful avenues for evaluating and improving the physically based models used for high-resolution dynamical weather and climate simulations and downscaling.
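The core idea of constructed-analogues downscaling can be sketched as follows. This is a simplified illustration with synthetic fields and a plain least-squares fit over a small analog library; the operational method involves additional selection, regularization, and bias-handling steps not shown here, and all array sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical library of historical days: each day has a coarse-grid
# field (n_coarse points) and a matching fine-grid field (n_fine points).
# All values here are synthetic placeholders, not real observations.
n_days, n_coarse, n_fine = 200, 16, 400
coarse_lib = rng.normal(size=(n_days, n_coarse))
fine_lib = rng.normal(size=(n_days, n_fine))

def construct_analogue(coarse_target, n_analogs=30):
    """Downscale one coarse-grid day by a weighted sum of historical days.

    1. Pick the n_analogs library days closest to the target (RMS distance).
    2. Fit weights so their coarse fields best reproduce the target field.
    3. Apply the same weights to the analogs' fine-grid fields.
    """
    dist = np.sqrt(((coarse_lib - coarse_target) ** 2).mean(axis=1))
    idx = np.argsort(dist)[:n_analogs]
    weights, *_ = np.linalg.lstsq(coarse_lib[idx].T, coarse_target, rcond=None)
    return weights @ fine_lib[idx]

# Downscale one synthetic "projected" coarse-grid day.
target = rng.normal(size=n_coarse)
fine_estimate = construct_analogue(target)
print(fine_estimate.shape)  # prints (400,)
```

The key property this sketch captures is that the fine-grid result inherits spatial detail from observed (or regionally simulated) historical patterns, while the fitted weights carry the projected large-scale signal.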

Specifically, for the Lake Tahoe area, projected changes in winds and solar insolation are relatively small, and close ties exist between temperatures and longwave heating of the surface, so that a focus mainly on temperature and precipitation changes is reasonable for many parts of the Lake Tahoe assessment. The implications of these projected climatic changes (as well as of historically observed changes) for the hydrology and for nutrient and sediment fluxes and reserves in the Lake Tahoe basin are discussed in other articles in this issue.