The Arctic surface energy budget as simulated with the IPCC AR4 AOGCMs
Cite this article as: Sorteberg, A., Kattsov, V., Walsh, J.E. et al. Clim Dyn (2007) 29: 131. doi:10.1007/s00382-006-0222-9
Ensembles of simulations of the twentieth- and twenty-first-century climate, performed with 20 coupled models for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment, provide the basis for an evaluation of the Arctic (70°–90°N) surface energy budget. While the various observational sources used for validation differ among themselves, some model biases and across-model differences emerge. For all energy budget components in the twentieth-century simulations (the 20C3M simulation), the across-model variance and the differences from observational estimates are largest in the marginal ice zone (Barents, Kara, and Chukchi Seas). Both downward and upward longwave radiation at the surface are underestimated in winter by many models, and the ensemble mean annual net surface energy loss by longwave radiation is 35 W/m2, which is less than in the NCEP and ERA40 reanalyses but in line with some of the satellite estimates. Incoming solar radiation is overestimated by the models in spring and underestimated in summer and autumn. The ensemble mean annual net surface energy gain by shortwave radiation is 39 W/m2, slightly less than the observation-based estimates.

In the twenty-first-century simulations driven by the SRES A2 scenario, increased concentrations of greenhouse gases increase the annual average ensemble mean downward longwave radiation by 30.1 W/m2 (2080–2100 average minus 1980–2000 average). This is partly counteracted by a 10.7 W/m2 reduction in downward shortwave radiation. Enhanced sea ice melt and increased surface temperatures increase the annual surface upward longwave radiation by 27.1 W/m2 and reduce the upward shortwave radiation by 13.2 W/m2, giving an annual net (shortwave plus longwave) surface radiation increase of 5.8 W/m2, with the maximum changes in summer. The increase in net surface radiation is largely offset by an increased energy loss of 4.4 W/m2 through the turbulent fluxes.
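The sign conventions in the abstract can be checked with a short budget calculation. The sketch below uses only the rounded ensemble-mean changes quoted above; the small gap between the sum of the rounded components (5.5 W/m2) and the reported 5.8 W/m2 net change is consistent with rounding of the individual terms, which is an assumption here, not a statement from the paper.

```python
# Illustrative check of the 21st-century Arctic surface radiation budget,
# using the ensemble-mean changes (W/m2, 2080-2100 minus 1980-2000)
# quoted in the abstract. Sign convention: positive = more energy
# delivered to the surface.

d_lw_down = 30.1   # increased downward longwave radiation
d_lw_up   = 27.1   # increased upward longwave radiation (a surface loss)
d_sw_down = -10.7  # reduced downward shortwave radiation
d_sw_up   = -13.2  # reduced upward shortwave radiation (less reflected)

d_lw_net  = d_lw_down - d_lw_up   # net longwave change:  +3.0 W/m2
d_sw_net  = d_sw_down - d_sw_up   # net shortwave change: +2.5 W/m2
d_rad_net = d_lw_net + d_sw_net   # +5.5 W/m2 from the rounded components
                                  # (the abstract reports 5.8 W/m2)

d_turbulent = -4.4                # increased turbulent heat loss

print(round(d_rad_net, 1))                 # net radiation change
print(round(d_rad_net + d_turbulent, 1))   # residual after turbulent fluxes
```

Written this way, the numbers make the offsetting structure of the budget explicit: the longwave and shortwave gains nearly cancel against the increased turbulent heat loss, leaving a small positive residual at the surface.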