Klaus Hasselmann began, as many theoretical physicists do, hoping to solve the “turbulence problem” (whatever that is), but he soon recognised that this was a formidable challenge and that it might be wiser to tackle more tractable problems first. That is what he did, before returning to the old dream after his retirement, although there he was not particularly successful in attracting praise and recognition. His thinking was shaped by the idea of a dominant low-dimensional subspace, within which the dynamics live and act, while influencing, and being statistically influenced by, a myriad of factors in a high-dimensional space. This approach is sometimes explicit and sometimes beneath the surface, but it is ubiquitous.

The different strands of Hasselmann’s interest and effort relate to:

  1. ocean wave theory and prediction

  2. remote sensing

  3. stochastic climate model

  4. reducing phase spaces

  5. climate and society

  6. building the modelling strategy of MPI

  7. METRONs—particle theory.

For each of the above areas we shall first provide a brief overview of the topic, try to determine the significance of Hasselmann’s work for the field, and provide a facsimile of a key publication, often his first on the subject. These first papers were often rather complicated and sometimes difficult to comprehend. In many cases, they were later followed by other versions characterised by remarkable clarity. The reason why we present, wherever possible, the first papers is to provide the reader with a glimpse into Hasselmann’s thought processes. As a rule, if something is truly great, the original ideas eventually become simple. And indeed, Klaus was unwilling to update his original stochastic climate model paper, “because it is too simple”.

The numbered references below relate to the publication list in Sect. 5.

3.1 Ocean Wave Theory and Prediction: From Basic Physics to an Integrated Wind and Wave Data Assimilation System

By Gerbrand Komen after some discussions with Luigi Cavaleri.

The challenge

There are two main challenges involved in ocean wave research:

  1. One would like to better understand the basic physics, which is really quite complicated: even today many aspects are not fully understood.

  2. There is a great need for practical applications: reliable forecasts and climatologies.

What was known in the 1950s?

Ocean wave research was booming in the 1950s,Footnote 1 with exciting progress being made along several lines. The semi-empirical forecasting methods of Sverdrup and Munk, based on wave height observations, came into wider use. Visual observations were complemented by instrumental observations, both in the laboratory and in the field. Bill Pierson introduced ocean wave spectra, applying results from studies on random noise, and he developed practical methods for ocean wave forecasting using wave spectra and statistics. Owen Phillips and John Miles made significant contributions to the understanding of basic processes.

Basic equations

Then, in 1960, Klaus Hasselmann published Grundgleichungen der Seegangsvorhersage (Basic equations for sea state predictions) in German in the journal Schiffstechnik (Maritime Engineering) ([3], see facsimile f1 below). The paper opens by noting that knowledge of the forces acting upon developing ocean waves (“wind sea”) is insufficient, but also—more optimistically—that recent advances are encouraging for the development of a reliable, general method of sea state prediction, and that such a method should be based on an equation representing the energy balance that shapes the ocean wave spectrum. This is then followed by what is now known as the energy balance equation, aka the radiative transfer equation, which expresses the rate of change in energy of a spectral component as a result of advection, wind input, dissipation, and the exchange of energy between different wave components due to nonlinear resonant interactions.
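In modern notation (the source-term symbols below are the now-conventional ones, not Hasselmann’s original 1960 notation), this energy balance equation is usually written as

```latex
\frac{\partial F(\mathbf{k};\mathbf{x},t)}{\partial t}
  + \mathbf{c}_g \cdot \nabla_{\mathbf{x}} F
  = S_{\mathrm{in}} + S_{\mathrm{nl}} + S_{\mathrm{ds}},
```

where \(F\) is the wave spectrum, the \(\mathbf{c}_g\)-term describes advection with the group velocity, and the source terms on the right represent wind input, nonlinear resonant transfer, and dissipation, respectively.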

In the paper, Hasselmann expresses his surprise that this equation had not been included in previous approaches. This is not quite correct: Gelci and his colleagues had, in fact, formulated and used a similar equation in 1957 in a paper entitled Prévision de la houle. La méthode des densités spectroangulaires, published in the Bulletin d'information du Comité central d'océanographie et d'études des côtes. Evidently, this was not known to Klaus at the time. In any case, his treatment contained an important new element, namely the inclusion of the wave–wave interaction term.

Some readers of Grundgleichungen (for example, Richard Dorrestein, director of Oceanography and Maritime Meteorology at the Royal Netherlands Meteorological Institute) were surprised that the paper neither derived nor justified the energy balance equation. In fact, a proper derivation was not provided until 1975, when Jürgen Willebrand published his ‘Energy transport in a nonlinear and inhomogeneous random gravity wave field’.

Grundgleichungen not only includes the basic equations, but also discusses several applications in special situations, namely fully developed wind sea and the “development phase, in which the non-linear effects are still negligible”. It would later become clear that this second application was rather academic, as nonlinear interactions were found to be strong for young wind sea. Finally, the paper includes a section on finite-depth effects, with an application to wave generation in the Neusiedler See, a lake in Austria south of Vienna.

Grundgleichungen has a modest citation record. Nevertheless, its impact has been enormous, as it not only provided a basis for further work, but also set out an agenda for ocean wave research by stating that:

  • (...) more precise observations would be required (a theoretical calculation might fail for the time being due to the turbulence problem) to determine with greater precision those terms in the energy equation that are still uncertain.

  • the method could be expanded with the aid of a suitable computer programme for an electronic digital system, to calculate fast and accurate sea state and swell forecasts for any wind fields identified on the weather chart.

Klaus himself would actively pursue these objectives over the next few decades, with the help of the global wave research community, which he successfully mobilised. This is now history, with several well-written and well-documented accounts.Footnote 2 Here a short overview will be given.

Nonlinear interactions

Grundgleichungen contained an explicit expression for the exchange of energy between different wave components due to nonlinear resonant interaction, the so-called Boltzmann integral, containing products of wave spectra and a number of exchange functions. The exchange functions were not included in the 1960 paper, but appeared in follow-up papers [6, 8, 9] in 1962 and 1963, and in a comprehensive and more general account which appeared in 1968 as “Weak-interaction theory of ocean waves” [21].

The Boltzmann integral is a six-dimensional integral in wavenumber space, constrained by the resonance condition that the frequency of the ‘forced’ component equals the sum of the frequencies of the ‘forcing’ components. Its numerical integration is challenging because wave spectra are typically sharply peaked, and these peaks have to be resolved with high accuracy. Initial results were already available in 1961, indicating that energy from waves near the peak of the spectrum was transferred to still longer waves, but integrating the Boltzmann integral with sufficient accuracy at affordable computing cost remained a challenge for the next 25 years or so [77, 78, 114, 198]. A first successful application came in 1972, when it was found that nonlinear resonant interactions were essential for understanding the spectral evolution observed during JONSWAP. From 1980 onwards, such computations were also used in numerical wave prediction models.
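For deep-water gravity waves these resonant interactions involve quartets of wave components; in the standard symmetric notation the resonance conditions read

```latex
\mathbf{k}_1 + \mathbf{k}_2 = \mathbf{k}_3 + \mathbf{k}_4, \qquad
\omega_1 + \omega_2 = \omega_3 + \omega_4, \qquad
\omega_i = \sqrt{g\,|\mathbf{k}_i|},
```

so that the integration over the free wavenumber vectors is confined to the resonant manifold, along which the sharply peaked spectra must be resolved accurately.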

JONSWAP

Klaus Hasselmann was involved in several large-scale field experiments. The first was the Pacific swell propagation programme [18] with Walter Munk and others. Another major campaign was MARSEN in 1979 in the North Sea. Perhaps best known is JONSWAP, which Hasselmann coordinated and which took place in the German Bight in 1969, following a pilot experiment in 1968. There were several objectives, such as measuring wave growth, wind stress, atmospheric turbulence, and swell attenuation. The development of sea states was studied by continuously measuring wave spectra along a line extending 160 km into the North Sea westward from Sylt under (fairly) stationary offshore wind conditions.

One important result was the parametrisation of the observed spectra. The starting point was an earlier parametrisation by Pierson and Moskowitz for fully developed seas. The most remarkable difference was the strong enhancement of the energy level at the spectral peak during growth. Mitsuyasu, who had performed similar measurements at about the same time in Hakata Bay, proposed a somewhat different parametrisation; however, the JONSWAP spectrum would be used more widely in later studies and applications.
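The resulting JONSWAP spectral form is compact enough to sketch in a few lines. The following is a minimal Python version; the parameter values are typical textbook defaults, not the fitted values of any particular JONSWAP run:

```python
import math

def jonswap(f, fp, alpha=0.0081, gamma=3.3, g=9.81):
    """JONSWAP frequency spectrum S(f) in m^2 s.

    f     : wave frequency (Hz)
    fp    : peak frequency (Hz)
    alpha : energy-scale (Phillips) parameter
    gamma : peak-enhancement factor (the JONSWAP mean was about 3.3)
    """
    sigma = 0.07 if f <= fp else 0.09                # peak-width parameter
    r = math.exp(-(f - fp) ** 2 / (2 * sigma ** 2 * fp ** 2))
    pm = (alpha * g ** 2 * (2 * math.pi) ** -4 * f ** -5
          * math.exp(-1.25 * (f / fp) ** -4))        # Pierson-Moskowitz shape
    return pm * gamma ** r                           # peak enhancement

# gamma**r enhances the peak relative to Pierson-Moskowitz (gamma = 1),
# which is exactly the "remarkable difference" observed during growth:
ratio = jonswap(0.1, 0.1, gamma=3.3) / jonswap(0.1, 0.1, gamma=1.0)
print(ratio)  # 3.3 at the peak, where r = 1
```

Away from the peak the enhancement exponent r decays rapidly, so the spectrum relaxes to the Pierson–Moskowitz shape in the tail.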

A second important result of JONSWAP was the determination of the fetch dependence of the spectral parameters, where fetch is defined as the distance to shore. Ideally, one would like to perform these studies for a constant wind blowing perpendicularly offshore from a straight coastline. In reality this never occurs, which results in a lot of scatter in plots of measured wave parameters against fetch. This is usually somewhat hidden in log–log plots. Another problem relates to the choice of scaling variable. Quantities such as wave height and wavelength are usually presented in nondimensional form with the aid of either the wind speed at a given height or the friction velocity. The choice matters when one extrapolates the JONSWAP results—which were obtained for fairly moderate wind speeds—to higher wind speeds, as the ratio of wind speed to friction velocity is itself a function of wind speed.

Perhaps the most rewarding outcome was a better understanding of the mechanism of wave evolution. Using computations of the Boltzmann integral and simple parametrisations for wind input and dissipation, it could be shown that wind input mainly occurs at medium and high frequencies, and that the generation of low-frequency waves—and the associated increase of mean wavelength with fetch (and wave age)—is due to nonlinear interactions.

Models

The JONSWAP results provided essential ingredients for the realisation of the second objective set out in Grundgleichungen: “to calculate fast and accurate wind sea and swell forecasts”.

Numerical wave models represent the wave spectrum on grid points and simulate its evolution in small time steps. As numerical integration of the full Boltzmann integral was prohibitively expensive, several ocean wave models were developed in Hamburg and elsewhere in which the effect of the nonlinear transfer was modelled by prescribing the spectral shape and imposing the observed dependence of the spectral parameters on the wave age. These models had skill and were used for many applications, but an international model intercomparison (SWAMP) then found that different models produced very different results in particular situations. An important step forward was the development of EXACT-NL [76], a model that used an approach developed by Klaus and Susanne Hasselmann. Results were presented in Miami in 1981 but not published until 1985.
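The time-stepping idea can be illustrated with a deliberately minimal sketch: a single spectral component whose energy balance is stepped forward with forward Euler. Both source terms here are invented for illustration (a Miles-type exponential wind input and a quadratic dissipation), not the parametrisations of any actual wave model:

```python
# Hypothetical single-component energy balance dE/dt = S_in + S_ds,
# stepped forward in time as a wave model does for each spectral bin.
# S_in = beta*E (Miles-type exponential growth) and S_ds = -c*E**2
# (quadratic dissipation) give the equilibrium E = beta/c.
beta, c = 1.0e-4, 2.0e-3   # made-up rates: 1/s and 1/(m^2 s)
E, dt = 1.0e-3, 60.0       # small seed energy (m^2), one-minute step

for step in range(200_000):            # ~140 days of model time
    E += dt * (beta * E - c * E ** 2)  # forward-Euler step

print(E)  # converges to the equilibrium beta/c = 0.05
```

Real models step thousands of such components on a geographic grid, with advection coupling neighbouring grid points and the nonlinear transfer coupling the components.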

Hasselmann launched a new initiative in 1984, known as the WAM (Wave Modelling) group, in which an international team of researchers collaborated on the further development of a model based on the Grundgleichungen. This involved the further improvement of the source terms, a new, more rapid approximation to the Boltzmann integral, and implementation at many different centres. At ECMWF much work was done by Susanne Hasselmann and others, in particular by Liana Zambresky, Peter Janssen and Heinz Günther, each of whom spent several years in Reading installing the model on the CRAY-1, coupling it to wind fields, performing test and validation runs, introducing the model into the operational forecast cycle, and setting up routine validation against observations. Visitors from the WAM group (such as Anne Guillaume, Vince Cardone and Luigi Cavaleri and their colleagues) also made significant contributions. The model became known as the WAM model. Results were published in 1988 and later, in 1994, in the monograph “Dynamics and Modelling of Ocean Waves” [244]. All of this was done under the continuous guidance and inspiration of Klaus Hasselmann.

There was a certain amount of consensus that models constructed on the basis of fundamental physics, such as that described in Grundgleichungen, and the WAM-model in particular, would be superior to more empirical models. However, reality is complex: the WAM-model had some shortcomings, in particular in the numerics, whilst some models that did not integrate the Boltzmann equation were very well tuned and performed quite well. In practice, the quality of wind forcing was often a limiting factor. In fact, WAM was so reliable that it could detect errors in the atmospheric model used to generate surface winds.

Towards an integrated wind and wave data assimilation system

In 1985, when work on the WAM model was under way and remote sensing from Earth-observing satellites became feasible, Hasselmann came up with a new and ambitious vision [74, 79, 95]: to run a coupled atmosphere/surface wave/ocean model, which could provide first-guess information for the retrieval of useful information from satellites, and which would assimilate all available observations in real time. This would then provide the best possible forecasts as well as an archive for climate and other research. This seemed like a pipedream in 1985, and some people were critical of its seemingly circular structure, as it would use model results to interpret measurements which were then used to validate the model. Nevertheless, it became a reality in the nineties and was highly successful, helping to improve forecasting expertise and providing huge and useful datasets (ERA) for earth system research.

Heritage

After 1994, Hasselmann put his energy into other endeavours, while ocean wave research continued to build upon what he had started. “Dynamics and Modelling of Ocean Waves” [244] became, and remained, a standard reference as many groups attempted to improve the representation of the various source terms. The WAM model is still in use for both forecasting and wave climate studies.

Klaus’ dream of an integrated wind and wave data assimilation system became reality in 1998 when ECMWF started running a coupled forecasting system, where the atmospheric component of the Integrated Forecasting System (IFS) communicated with the wave model through the exchange of the Charnock parameter, which determines the roughness of the sea surface.
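The Charnock relation that couples the two models is itself a one-liner. The sketch below uses a commonly quoted open-ocean default for the Charnock parameter; in the coupled system the parameter is instead diagnosed from the sea state, so that young, steep seas appear rougher to the atmosphere:

```python
def charnock_roughness(ustar, alpha_ch=0.018, g=9.81):
    """Sea-surface roughness length z0 = alpha_ch * ustar**2 / g (m).

    ustar    : friction velocity (m/s)
    alpha_ch : Charnock parameter; 0.018 is a commonly quoted
               open-ocean default, used here purely for illustration.
    """
    return alpha_ch * ustar ** 2 / g

# A sea state with a larger Charnock parameter is aerodynamically
# rougher, which feeds back on the atmospheric surface stress:
print(charnock_roughness(0.3, alpha_ch=0.030) > charnock_roughness(0.3))  # True
```

This tiny feedback loop (sea state sets roughness, roughness modifies the wind, the wind drives the waves) is what makes the coupled IFS–WAM system more than the sum of its parts.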

New ocean wave models, such as SWAN and WAVEWATCH, were developed. They are still essentially based on the Grundgleichungen as described by Hasselmann in his 1960 paper.

figures c–g: facsimile f1 of Grundgleichungen der Seegangsvorhersage [3]

3.2 Remote Sensing

Prepared by Ola M. Johannessen, Guy Duchossois and Evert Attema.

From the beginning of his career, Hasselmann worked on ocean waves. His Ph.D. thesis from 1957 dealt with the “Propagation of the von Schmidt head waves”. He later published three papers in the Journal of Fluid Mechanics, “On the nonlinear energy transfer in a gravity-wave spectrum, Parts 1–3”, in 1962–1963 [6, 8, 9]. While working with Walter Munk at IGPP, Scripps, he co-authored a paper with Munk and others on the “Propagation of ocean swell across the Pacific”, which was published in 1966 [18]. By 1970, Hasselmann and M. Schieler had already published a “remote sensing paper” discussing “Radar backscatter from the sea surface” [26]. In 1978, Hasselmann and his colleagues published several important papers on radar measurements of wind and waves [45–49], followed by several more on the same topic over the following years. This culminated in the international Marine Remote Sensing Experiment, MARSEN, in the North Sea from 16 July to 15 October 1979, whose objectives were: “(1) to investigate the use of remote sensing techniques for oceanographic applications and (2) to utilise remote sensing techniques in combination with in-situ oceanographic measurements to investigate oceanic processes in finite-depth water in the near-shore zone”. MARSEN was a well-integrated experiment in which six remote sensing aircraft took part, including the NASA CV-990 carrying the JPL SAR with Omar Shemdin. Sixty scientists from six countries took part in this very important experiment, which was headed by Hasselmann [67]. The results of the experiment were set out in several papers, 14 of which were published in a Special Issue of the JGR in 1983. One very important paper from the MARSEN experiment, entitled “Theory of SAR ocean wave imaging: A MARSEN view”, appeared in the JGR in 1985, authored by Hasselmann’s international team. The paper included a proposal for a new SAR imaging model, which would be fundamental for SAR imaging of the ocean surface from satellite SARs in the future [75].

Hasselmann had already become a key member of ESA’s high-level Earth Observation Advisory Committee (EOAC) by 1980 (see below). Working with ESA, Klaus Hasselmann and his wife Susanne published a fundamental paper, “On the nonlinear mapping of an ocean wave spectrum onto a SAR image spectrum and its inversion” [102]. A facsimile of this paper is presented below as an example of one of the major contributions the Hasselmann family made to the global retrieval of ocean wave spectra from the ERS-1 C-band SAR. Klaus and Susanne continued to contribute to the field of global SAR ocean wave spectrum research, but gradually shifted towards climate research. Their final contribution to the ERS SAR wave mode appears to have been an extensive review entitled “The ERS SAR wave mode: A breakthrough in global ocean wave observations”, published in “ERS missions – 20 years of Observing Earth” (ESA SP-1326, 2013) by Klaus Hasselmann as lead author with 15 co-authors [176].

In the latter half of the 1970s, Hasselmann was invited to join ESA’s thematic Scatterometer Expert Group (SEG) and later, in 1981, he became a member of the high-level Earth Observation Advisory Committee (EOAC), which was founded by the ESA Director General. Of course, he was invited because he had seen the opportunity of using SAR from aeroplanes or satellites for ocean, wind and wave observations and had headed the MARSEN experiment in 1979. This expert group and committee provided outstanding scientific support for, and made recommendations to, ESA in various areas, such as the definition of priority mission objectives, payload composition priorities, instrument performance specifications, in-orbit calibration requirements, the development of data processing algorithms, and geophysical product validation approaches. As a key contributor to these ESA expert groups, Hasselmann played a major role in the development of the mission objectives and the choice of the ERS-1 payload.

EOAC was given the following mandate by the ESA DG:

  • To review and, if necessary, revise, the mission objectives of the European remote-sensing satellite programme as defined in the 1970s.

  • To put forward an optimal configuration for the payload of the first ERS-1 mission.

The initial mission objectives, defined in the 1970s, had focused on the commercial and operational exploitation of remote-sensing applications. At the beginning of the 1980s, however, these objectives began to evolve within the Earth observation community with the advent of worldwide programmes to study the oceans and the climate, such as the World Climate Research Programme (WCRP), the World Ocean Circulation Experiment (WOCE), and the Tropical Ocean Global Atmosphere (TOGA) programme. These sought to answer the increasing concerns of the world scientific community, political decision-makers, and the general public over climate change and its possible interactions with human activities. The new situation also required a deeper scientific understanding of the climate system, and hence of its main components, namely the oceans, the polar regions, the continental land masses, and the atmosphere, including the interactions between them—a field in which Hasselmann was also an expert.

The EOAC recommended the following payload for ERS-1.

  • The Active Microwave Instrumentation (AMI), combining a SAR mode and a wind scatterometer mode in the C band. The SAR wave mode, a direct result of Hasselmann’s involvement, was to determine the wave spectrum from 5 × 5 km mini-images collected globally every 200 km along the ground track of the orbit. The AMI would also collect high-resolution SAR images (25 m resolution) over continental and coastal regions and the polar ice caps.

  • A radar altimeter operating in the Ku band.

  • A laser retro-reflector system for precise restitution of the satellite orbit.

Hasselmann also contributed to the selection of an “unusual” orbit scenario combining several successive orbit cycle periods (3 days, 35 days and twice 168 days repeat cycles), which would satisfy the various research communities (ice, ocean circulation, SAR land imagery, geodesy).

Following the launch of ERS-1, the exploitation of the resulting data via complex processing algorithms, some from the Hasselmann team dealing with the global spectrum of waves, led to the organisation of many ERS-1 symposia by ESA with ever increasing participant numbers (400 participants in Cannes in 1992, 500 in Hamburg in 1993, 700 in Florence in 1997…) and with specialised workshops on downstream application demonstrations (200 participants in Toledo in 1994, London in 1995 and Zurich in 1996…). These ERS-1 symposia provided Hasselmann with opportunities to present the results of his team’s work on wave mechanisms and global ocean wave spectra retrieved via the 5 × 5 km SAR images [e.g., 108, 115, 123, 124].

Hasselmann’s expertise and his ability to analyse and propose solutions to the issues raised during ERS-1 development were exceptional. He was admired by the entire ERS-1 team and was invited by ESA to attend the successful launch of ERS-1 in July 1991 in Kourou, French Guiana. This was an opportunity for ESA to thank him for his dedication and valuable contributions to the success of the mission.

ERS-1 and its successor ERS-2, launched in 1995, paved the way for the successful Envisat mission, launched in 2002. Together, these three missions provided some 20 years of continuous data, as recalled during ERS’ 20th anniversary celebration at ESRIN, Frascati, in 2011, which was attended by some of the pioneers, including Hasselmann [176]. These early missions were the precursors of the current joint ESA-EU Copernicus programme and the Sentinel mission series, making Europe a world leader in Earth observation and environmental monitoring.

As previously mentioned, Hasselmann was an extremely important contributor during the early days of the ERS development. He was a fast talker with strong opinions. Not everybody could follow all of his complicated theories. Within his own scientific “bubble” he may not have been accustomed to much opposition in a debate, but he would always be open to accepting the opinions of his opponents if supported by correct theories and/or empirical evidence.

In addition to his political support for the ERS mission, Hasselmann was also a very important advocate within the scientific community. This was badly needed because, in the early days, reactions within the ERS scientific user community were very negative and even hostile. Today, following decades of successful application development, that opposition has clearly vanished.

Hasselmann’s dedication is demonstrated by the following anecdote: in a SEG meeting he complained about the slow speed and high cost of industry studies, something “he could do in a couple of days with some of his students”. ESA said: “great, let’s do it. You have a week after which we’ll come to your Institute on Friday to review the results”. We found Hasselmann in his office submerged in paperwork, computer printouts, and graphics—but not quite with a conclusive answer despite his own efforts as well as those of his wife and some students who had reportedly spent several days and nights carrying out the research.

The SEG was a special group which included experts from ESA and scientific institutions as well as from industry. To avoid the complications involved in defining formal responsibilities, industry was no longer represented in the C/D phase. All members of the SEG were pioneers who had never been involved in a similar project before. The SEG host would normally present issues to the team, asking for answers and/or recommendations. The SEG, including Hasselmann, actively participated in the discussions about such matters as the required image size and the tracking distance between images needed to calculate global ocean wave spectra.

figure a

Scatterometer Expert Group (SEG) meeting with industry representatives in 1980: from left to right David Offiler (UK Met Office), Tim Tucker (UK National Oceanography Centre), Werner Alpers (University of Hamburg), Evert Attema (Technical University Delft—later with ESA), Gert Dieterle (ESA), Alf Long (ESA), Gerbrand Komen (KNMI), Klaus Hasselmann (Max Planck Institute for Meteorology), Laurence Gray (CCRS), Juan Guijarro (ESA), Dave Lancashire (formerly Marconi Space Ltd., currently Airbus Defence and Space)

figures h–x: facsimile of “On the nonlinear mapping of an ocean wave spectrum onto a SAR image spectrum and its inversion” [102]

3.3 Stochastic Climate Model

Prepared by Peter Lemke.

The difficulty in modelling the climate system is not only due to the variety of physical processes involved, but also to a large extent to the fact that the interacting components are characterised by rather different internal timescales: the atmosphere—several days, sea ice and the oceanic surface layer—several months, the deep ocean—several centuries, and the continental ice masses—many millennia. Even if all processes influencing climate variations were completely understood, the fact that the different sub-systems respond over different timescales would still cause considerable problems in numerical modelling.

All models of the individual sub-systems, such as atmosphere, ocean, or ice, with realistic geographical resolution have been designed for a single timescale range, so as to prognostically describe the typical fluctuations of that component. The influence of processes faster than the prognostic regime is parameterised in terms of the prognostic variables, using temporal averages over the rapid processes. Components that vary on timescales longer than the prognostic regime are treated as constant boundary values or external parameters.

In many of the climate models used in the 1960s and 1970s, the atmosphere was not explicitly included; it was placed in the model’s statistical-diagnostic regime and represented only through temporally averaged terms. However, in his seminal paper on stochastic climate models, published in Tellus in 1976 ([38]; see facsimile below), Klaus Hasselmann pointed out that the atmosphere’s influence is not limited to these temporally averaged terms and that its variability must also be considered. This results in differential equations for the slow components of the climate system which include stochastic forcing terms. These short-term atmospheric variations cause, analogously to Brownian motion, long-term fluctuations in the slow subsystems, which explains the observed red spectrum of the slow climate variables. The theory of Brownian motion had been discussed in many applications since Einstein’s paper of 1905, but had not yet been applied to geophysical systems such as the climate system.

In his stochastic climate modelling approach, Hasselmann made use of a timescale separation: a slowly varying climate variable evolves under the influence of short-term atmospheric variations represented as white noise. Applications of stochastic climate models typically use linearised dynamic equations that describe small fluctuations around an equilibrium state. This approach yields a first-order Markov process, characterised by a memory term and white-noise forcing, and it results in a red spectrum for the slow climate variables.
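The discretised prototype can be sketched in a few lines of Python (all parameter values are illustrative): the slow variable is an AR(1) process whose theoretical spectrum is red, and whose sample lag-1 autocorrelation recovers the memory term.

```python
import numpy as np

# Discretised, Hasselmann's prototype is the AR(1) process
#     x[t] = a * x[t-1] + w[t],   0 < a < 1,   w ~ white noise,
# with theoretical spectrum S(f) = s2 / (1 - 2a cos(2 pi f) + a**2):
# "red", i.e. variance accumulates at the low frequencies.
a, s2 = 0.95, 1.0  # memory term and noise variance (illustrative values)

def ar1_spectrum(f):
    return s2 / (1.0 - 2.0 * a * np.cos(2.0 * np.pi * f) + a ** 2)

print(ar1_spectrum(0.01) / ar1_spectrum(0.5))  # ~600: strongly red

# A seeded simulation; the sample lag-1 autocorrelation recovers a:
rng = np.random.default_rng(0)
w = rng.standard_normal(100_000)
x = np.zeros(w.size)
for t in range(1, x.size):
    x[t] = a * x[t - 1] + w[t]
print(np.corrcoef(x[:-1], x[1:])[0, 1])  # close to 0.95
```

Note that the forcing w is white (flat spectrum), yet the response x shows slow excursions on all timescales: exactly the Brownian-motion analogy of the 1976 paper.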

In two follow-up papers co-published in 1977 with Claude Frankignoul [39] and Peter Lemke,Footnote 3 the applicability of the concept was demonstrated through an analysis of sea surface temperatures and thermocline variability and with a global energy balance model. A large variety of different applications of this stochastic approach followed over the subsequent years.

One may of course ask whether all of this was really new. To some extent it was not, as such ideas typically float around within the scientific community. In his interview, Hasselmann himself refers to J.M. Mitchell and his 1966 paperFootnote 4: “the same concept on the generation of different frequency domains of climate variability by the successive forcing of longer timescales by shorter timescales”. Another physicist thinking about such concepts was Chuck Leith, but in hindsight these approaches received little attention and did not bring about the great epistemological step forward that Hasselmann’s physical approach and construction did.

figures y–ak: facsimile of the 1976 stochastic climate model paper [38]

3.4 Reducing the Phase Space: Signal-to-Noise Analysis and Detection and Attribution

Prepared by Hans von Storch incorporating comments by Peter Lemke.

Some would argue that the most significant part of Hasselmann’s legacy is the introduction of the stochastic dimension into the dynamical and analytical concept of the climate system. The first paper to receive a great deal of international attention was the one in which he introduced the “stochastic climate model” [38] in 1976 (see Sect. 3.3). Indeed, this first paperFootnote 5 (see facsimile in Sect. 3.3) made use of scale separation—a long-term dynamic, given by a climate variable—under the influence of short-term variations summarised as white (or red) noise. One can see this as a separation of the phase space into two parts, one defined by long-term fluctuations, with the remainder as short-term fluctuations. After a few assumptions and a discretisation, the prototype of the concept was encapsulated in a first-order autoregressive process, with the conclusion that even in the absence of any forcing acting upon the slow dynamics, the system would show variations on all timescales because of the white noise of the short-term variability.

In the 1980s, Hasselmann formulated a more general conceptFootnote 6 of “Principal Oscillation Patterns” (POPs) in an—again barely comprehensible, and never published—manuscript and asked Hans von Storch to “bring it to life”. He did so, but only after simplifying, or “vulgarising”, the concept, so that a workable version finally emerged, even if the basic idea became less visible [94]. Hasselmann saved the original concept by introducing a new term, “Principal Interaction Patterns” (PIPs), which, however, never became popular—at least not so far. His 1988 paper on “Principal Interaction Patterns” [86] (see facsimile below) spelled out the idea that the phase space can be divided into two sets, one with a finite number of dimensions, within which the core of the dynamics plays out. The basis spanning this space is formed by the PIPs. The rest, spanned by very many, if not infinitely many, dimensions, would contribute to the core dynamics, but in a kind of slave mode: either as independent noise, or as noise conditioned by the state of the PIPs (colloquially referred to as “parametrization”). This concept, of a low-dimensional dynamically active part of the phase space, with the high-dimensional part essentially operating as (conditional) noise, is at the core of his conceptualisation of the stochastic climate system.
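The workable POP version can be sketched briefly: for a multivariate AR(1) model the system matrix is estimated from the lag-0 and lag-1 covariance matrices, and its complex eigenvectors are the POPs, each conjugate pair describing a damped oscillation. The toy dynamics below (a damped rotation with a known 20-step period) are invented so the recovered period can be checked:

```python
import numpy as np

# POP sketch: for x[t] = A x[t-1] + noise, the POPs are the eigenvectors
# of the system matrix A, estimated from data as A = C1 @ inv(C0),
# where C0 and C1 are the lag-0 and lag-1 covariance matrices.
rho, theta = 0.9, 2 * np.pi / 20        # damping, 20-step rotation
A_true = rho * np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(1)
w = rng.standard_normal((50_000, 2))
x = np.zeros_like(w)
for t in range(1, len(x)):
    x[t] = A_true @ x[t - 1] + w[t]     # damped rotation plus white noise

C0 = x[:-1].T @ x[:-1] / (len(x) - 1)   # lag-0 covariance
C1 = x[1:].T @ x[:-1] / (len(x) - 1)    # lag-1 covariance
A_est = C1 @ np.linalg.inv(C0)          # estimated system matrix

lam = np.linalg.eigvals(A_est)[0]       # one of the conjugate pair
period = 2 * np.pi / abs(np.angle(lam)) # rotation period in time steps
print(period)  # close to 20
```

The modulus of the eigenvalue gives the damping time of the mode in the same way, so a POP analysis characterises each oscillatory mode of the low-dimensional dynamics by a period, a damping time, and a pattern.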

Hasselmann had thus introduced a new paradigm,Footnote 7 certainly worthy of a Director of a Max-Planck Institute, which enabled an understanding of climate variability based on a split between signals (related to specific causes) and background “noise”. The concept also led to a new practice in the analysis of climate variability and responses, namely the challenge of separating the two components: finding the relevant signal within the sea of noise. To achieve this, Hasselmann introduced the “detection and attribution” concept [54], a two-step process. First, in the detection step, the observed change is examined to see whether it falls within the range of natural variability; this is done via a conventional statistical hypothesis test. If the result of this first step is the successful rejection of the null hypothesis “consistent with unprovoked variability”, then, in a second step, the change is compared with one or several expectations derived from numerical experimentation, theoretical arguments, or independent statistical analysis. If a good fit is found, the conclusion is drawn that the change can be attributed to the corresponding factor(s). This attribution takes the form of a non-rejection of a null hypothesis and, as such, represents a weaker argument than the successful detection. Hasselmann added another level of sophistication by suggesting that one should optimize the potential signal, allowing for an a priori expectation of a favourable signal-to-noise ratio, but this elegant component was hardly ever used.
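A minimal sketch of the detection step, with entirely hypothetical numbers, might look as follows: the detection variable is the projection of the observed field onto the expected “fingerprint” pattern, and its null distribution is estimated from realisations of pure internal variability. (Hasselmann’s optimal fingerprint additionally rotates the pattern using the noise covariance to improve the signal-to-noise ratio; that refinement is omitted here.)

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: a spatial "fingerprint" g (the expected signal
# pattern) on 200 grid points, and observations consisting of the signal
# plus internal variability of standard deviation 0.8 (numbers invented).
n_grid = 200
g = rng.standard_normal(n_grid)
g /= np.linalg.norm(g)
obs = 5.0 * g + 0.8 * rng.standard_normal(n_grid)

# Detection step: project the observations onto the fingerprint and
# compare the result with its null distribution under pure noise.
d_obs = g @ obs
null = np.array([g @ (0.8 * rng.standard_normal(n_grid))
                 for _ in range(10_000)])
p_value = np.mean(np.abs(null) >= np.abs(d_obs))

# A small p-value rejects "consistent with unprovoked variability";
# attribution would then compare d_obs with amplitudes predicted by
# candidate forcing mechanisms.
print(d_obs, p_value)
```

The second, attribution, step works in the opposite logical direction: the estimated amplitude is tested for consistency with the model-predicted amplitude, and attribution is claimed when that null hypothesis is not rejected.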

Again, the original paper posed a challenge for the reader, but Hasselmann wrote an updated version about 15 years later, which was extremely clear and easy to comprehend [110]. Later still, he followed this with a version employing Bayesian concepts [138].

The concept was used successfully to investigate whether an external signal, as suggested by climate model simulations, would be detectable in the observational record of global temperatures, and to attribute the change to human emissions since the beginning of industrialization [125, 135]. The technique was adopted by the IDAGFootnote 8 group in various corners of the academic world and eventually became a cornerstone of IPCC reports: the fingerprint of human activity in changing the global climate.

The insight that there is internal variability, i.e., variability unrelated to an external force, sometimes simply called “noise”, had been floating around in scientific circles for some time: in the early 1970s, scientists in the USA had already noticed the omnipresence of internal variability, but not the constructive role played by this noise in the formation of gradual variations and signals. The need to discriminate between signal and noise when evaluating the outcome of numerical experiments with global atmospheric models was also introduced in the early 1970s.Footnote 9 The challenge of detecting a human signal in the observational record was gradually addressed by others in the 1980s, notably by Jerry North,Footnote 10 Ben Santer, and Tom Wigley.Footnote 11

figure al
figure am
figure an
figure ao
figure ap
figure aq
figure ar

3.5 Climate and Society

Prepared by Dmitry V. Kovalevsky, with some additions by Hans von Storch.

The insight that climate change is not merely a natural-science issue, together with its link to policy making and coordinated climate action, shaped Klaus Hasselmann’s interest in socioeconomic modelling and in modelling human decision-making. These modelling activities resulted in several publications, and also in his active contributions to recent major research projects.Footnote 12

Hasselmann was quite skeptical about certain dominant approaches and paradigms of mainstream economics, arguing that their basic assumptions do not adequately reflect the ways in which economic stakeholders interact and shape decision making, and seeing these conceptual shortcomings as a reason for their limited ability to describe and predict real-world economic processes, especially when things go worse than usual. For instance, during the 2008 global economic crisis, Hasselmann highlighted in his talks and papers the need to rethink certain paradigms of economic modelling, as no mainstream model had been able to foresee the coming crisis.

In particular, Hasselmann was skeptical about computable general equilibrium (CGE) models. His criticism was based primarily on two objections: on the one hand, at the conceptual level, he argued that real-world economic processes are fundamentally out of equilibrium; on the other, he pointed to the technical difficulties involved in calibrating and validating such models, and questioned the extent to which economic data could effectively support overly complex multi-regional, multi-sector applied CGE models.

Hasselmann also adopted a critical position towards another cornerstone of mainstream economic modelling: intertemporal optimisation, which is common to the majority of economic growth models and, accordingly, broadly used in the economics of climate change and in integrated assessment models (IAMs). From a conceptual perspective, in his opinion, neither individual nor collective decision making follows the mechanism that has been translated into the mathematics of intertemporal optimisation schemes. In addition, following the publication of the influential Stern Review in 2006, a lively discussion emerged concerning the economics of climate change, particularly in relation to the high sensitivity of IAMs to the value of the discount rate, a fundamental parameter of intertemporal optimisation models, and therefore about the ‘correct’ value of this parameter. Hasselmann participated in this debate, and it made him even more concerned about the extent to which the intertemporal optimisation approach could serve as a solid basis for informing climate action.

Concerns about excessive complexity and the related difficulties of calibration and validation, mentioned above with respect to CGE, were also the reasons for Hasselmann’s mixed feelings towards certain innovative modelling approaches, such as agent-based modelling (ABM), which has been very popular since the 1990s. Whilst acknowledging that ABM is conceptually much more satisfactory than, say, CGE in describing the decision-making processes of various stakeholders, he questioned whether the high level of disaggregation of ABMs, with their extremely large populations of individual agents, is really justified, and whether such strongly disaggregated models can be reliably supported by the available data.

Given his creativity and independence of mind, and for the reasons outlined above, Hasselmann followed his own original path when it came to modelling the dynamics of coupled climate-socioeconomic systems. He used the term actor-based system dynamics modelling to describe his approach to the construction of socioeconomic models. In essence, the approach describes a socioeconomic system via a dynamic model comprising a few interacting aggregate actors, each pursuing its own, often conflicting, goals. Mathematically, the system is described by ordinary differential equations, and stakeholder decision making is parameterised within this scheme through actor control strategies. Unlike in CGE, the socioeconomic dynamics are described as fundamentally out of equilibrium. Unlike intertemporal optimisation models, the result is mathematically a dynamical system, and the maximisation of goal functions is avoided. Unlike in ABM, only a few aggregate stakeholders are included in the model, rather than a very large population of individual actors, which greatly reduces the dimensionality of the model.
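As an illustration only, and emphatically not one of Hasselmann’s actual models, a toy actor-based system dynamics model might couple a single aggregate “producer” actor to a climate variable, with the actor’s control strategy (its reinvestment rate) responding to climate damages; all equations and parameter values below are invented.

```python
import numpy as np

def simulate(n_years=200, dt=0.1):
    """Toy actor-based system dynamics sketch (hypothetical equations):
    K -- capital of an aggregate 'producer' actor
    T -- temperature anomaly driven by production-related emissions
    The actor's control strategy: the reinvestment rate falls as climate
    damages rise, a simple parameterisation of decision making."""
    steps = int(n_years / dt)
    K, T = 1.0, 0.0
    out = np.zeros((steps, 2))
    for i in range(steps):
        damage = 0.05 * T**2                # climate damage fraction
        invest = 0.3 * (1.0 - damage)       # actor control strategy
        production = K**0.7
        dK = invest * production - 0.1 * K  # capital accumulation
        dT = 0.02 * production - 0.05 * T   # emissions warm, system relaxes
        K += dK * dt                        # forward-Euler integration
        T += dT * dt
        out[i] = K, T
    return out

traj = simulate()
print(traj[-1])  # final state of the coupled capital-temperature system
```

The point of the sketch is structural: a handful of ordinary differential equations, an explicit control strategy instead of a maximised goal function, and no imposed equilibrium, which is exactly the contrast with CGE and intertemporal optimisation drawn above.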

Some of the actor-based system dynamics models developed by Hasselmann are of moderate complexity, with relatively few variables and parameters. The simplest models of this kind could, in principle, be designed to partially allow analytical treatment. Yet although analytical work carried out with paper and pencil is still very much respected, for example in mainstream economics, Hasselmann sees actor-based system dynamics modelling as an essentially numerical approach. He developed these models for simulations and numerical experiments, not for the elegance of abstract thinking. For Klaus Hasselmann, the narrative told by a model and the results of ‘what-if’ simulations are what ultimately matter, as opposed to the rigorous propositions and proofs so popular in mathematical economics.

Another important element of the actor-based system dynamics approach, to which Hasselmann continually draws attention in his papers, is the strategy of developing a hierarchy of model families. A hierarchy should start with the simplest possible root model, which should be thoroughly explored via simulations to determine its strengths and limitations. Based on this experience, the complexity of the model can then be increased by adding new actors and processes. This yields one or several models at the next level of the hierarchy, after which the model-building process is reiterated. However, as Hasselmann stresses, there is no need for such a model tree to grow infinitely high: the complexity of models should not go beyond the level at which they can no longer be supported by the available data.

Hasselmann’s early thoughts on coupled climate-socioeconomic modelling are reflected in his 1991 conference paper “How Well Can We Predict the Climate Crisis?” [99], a facsimile of which is reproduced below. The term “climate crisis” in the title was already in use when the paper was published, but was not nearly as widespread as it is today: its permanent place in the climate-change lexicon came much later. Whilst most of the paper is devoted to the natural-science aspects of climate change, the first and last sections are particularly relevant to the current topic.

The introductory section calls for the development of a comprehensive Global Environment and Man (GEM) model and sketches out its conceptual design; Hasselmann later renamed it “Global Environment and Society” (GES). The section includes a review of the building blocks from which such a model could be assembled, both those already available and those still missing. In subsequent parts of the paper, Hasselmann also discusses the improvements required in some of these building blocks. Another remarkable point in the Introduction is the stress he places on the interplay between climate and environmental change problems.

figure b

Klaus Hasselmann’s GES model

In the concluding section, Hasselmann makes several important points about the development of GEM-type models, points that he later explored in more detail in his own socioeconomic modelling studies and that, one can now see, were (and still are) important for many other researchers in this field. Hasselmann highlights the dynamical and multi-time-scale nature of the GEM system, in both its natural-science and human parts. He argues that the inherent uncertainty of all GEM model components calls for developing GEM-like models as statistical optimisation models. He reminds us that the development of GEM should not be seen as “curiosity-driven science”: its ultimate objective is to inform climate-related policy making. Finally, he warns that the limits of our knowledge and the uncertainties inherent in the model are not an excuse for postponing coordinated climate action while waiting for a “perfect” model.

The GES approach has met with some critical reactions. One questions whether it would even be possible to define a “global welfare function”. Another was that the system reduced society, with its variety of cultures, to the choice of a global welfare function, whilst the determination of policies and measures, conditional upon that welfare function, would be a matter for experts only. It was argued that an objective determination of adaptation and abatement costs would not be possible; rather, these costs would pass through a filter of possibly interest-led experts, modified by a variety of social constructs, so that society would respond not to the state of the environment but to the perception of that state.Footnote 13

These comments did not target the concept or the mathematical implementation of the actor-based system dynamics approach proposed by Hasselmann; rather, they apply broadly to the overall architecture and design of such models of the economics of climate change and of integrated assessment models, and thus far they remain unanswered by the majority of mainstream models used in this area.

figure as
figure at
figure au
figure av
figure aw
figure ax
figure ay
figure az
figure ba
figure bb
figure bc
figure bd
figure be
figure bf
figure bg
figure bh
figure bi
figure bj
figure bk

3.6 Strategy in Climate Modelling at MPI

By Lennart Bengtsson.

As computers became more powerful, comprehensive modelling of the Earth’s climate system developed rapidly and increasingly came to be considered a key tool for gaining a better understanding of the climate system. Progress had been made in the use of global weather prediction models, in particular at ECMWF, which was established by a number of European member states and began operational forecasting in 1979. The parameterisation of many physical and fine-scale dynamical processes requires systematic experimentation in operational daily prediction to arrive at a well-functioning forecasting model. Using such a well-tested global model for climate simulation, in much longer integrations, proved a most useful and practical strategy, as many aspects of the weather forecast model could naturally be adapted to climate research.

Following discussions between Klaus Hasselmann and ECMWF, it was agreed that this was the approach to be taken. The ECHAM model was then combined with different ocean models developed at MPI and the University of Hamburg, in such a way that the exchange of energy and momentum fluxes between atmosphere and ocean could be handled consistently. Different subsystems, such as atmospheric transport and the full carbon cycle, were subsequently added and integrated into the coupled atmosphere-ocean model.

Thus, from the 1990s onward, the MPI had a comprehensive set of climate models that constituted a numerical laboratory for all kinds of climate studies, including climate change simulation studies made available for all IPCC assessments. The system was set up in a systematic and flexible way, which made all sorts of climate studies possible. It was used not only by scientists at the MPI and the University of Hamburg but also by a large number of research groups in Germany, by associated European groups, and by visiting scientists from all over the world. Important studies to understand and predict the ENSO phenomenon as well as tropical and extra-tropical cyclones were carried out successfully. We were happy to learn that the ECHAM model was found to be one of the most realistic in several evaluation studies, as its results came closest to the observed climate.

Of particular importance were the diagnostic systems developed by Klaus Hasselmann and his group, the aim of which was to identify anthropogenic climate change through a multi-dimensional search for a climate change “fingerprint” in the modelled data sets. This turned out to be a powerful tool, enabling the detection of climate change as early as the late 1980s. It played a very important role for the IPCC in its bid to convince the world that anthropogenic climate change is really happening. Today, 30 years later, the signal of climate change is obvious to everybody.

One crucial factor in the successful modelling work was the positive, unbureaucratic, and open atmosphere that was due in large part to Klaus Hasselmann’s clear mind and stimulating personality.

figure bl
figure bm
figure bn
figure bo
figure bp
figure bq
figure br

3.7 Metrons—Particle Physics

Prepared by Susanne Hasselmann.

Inspired by his work in the 1960s on interactions between ocean surface waves and other wave phenomena, which he approached using perturbation theory with the aid of Feynman diagrams, Klaus Hasselmann went on to develop a unified deterministic theory of fields and particles, thereby pursuing Einstein’s dream of a deterministic description of all elementary particles and their interactions [121, 122, 131, 132]. In this theory, quantum theory is regarded only as a first approximation.

The theory is based on soliton, or solitary-wave, solutions to an extension of Einstein’s vacuum equation with an additional attenuation tensor, which he set out in papers published in 1996 and 1997. These solutions were then verified in computational models.

In his twelve-dimensional theory (four space–time dimensions and eight extra dimensions, representing the non-gravitational interactions: the electromagnetic, strong, and weak forces), he takes a classical view of real particles and their guiding, force-free waves (de Broglie waves). He sees solutions in which solitons are trapped by waveguides, in accordance with the theory proposed by Theodor Kaluza and Oskar Klein, as exemplary.

His theory was rejected by particle physicists; reactions ranged from polite smiles to pronounced aggression, as noted above in the interview between Hasselmann, Olbers, and von Storch (Sect. 2.1). So he started writing a book, The Metron Model: A Classical Unified Theory of Fields and Particles. We present the first chapter below, which contains the basic theory and, in an overview section, a short description of the following chapters.

Chapters 5–8 have not yet been completed. The manuscript can be downloaded in its present state. The model programmes, which are written in Fortran V, are also available from Susanne Hasselmann.

Maybe someday, when the careers of young physicists no longer depend on the sheer volume of their publications but rather on their originality, some young physicist may feel inspired to flesh out and complete the metron theory: this is the hope of Klaus and Susanne.

figure bs
figure bt
figure bu
figure bv
figure bw
figure bx
figure by
figure bz
figure ca
figure cb
figure cc
figure cd
figure ce
figure cf
figure cg
figure ch
figure ci
figure cj
figure ck
figure cl
figure cm
figure cn
figure co