Towards a Calculus of Redundancy

Open Access
Part of the Qualitative and Quantitative Analysis of Scientific and Scholarly Communication book series (QQASSC)


In this chapter, I extend Shannon’s linear model of communication into a model in which communication is differentiated both vertically and horizontally (Simon, 1973). Following Weaver (1949), three layers are distinguished operating in relation to one another: (i) at level A, the events are sequenced historically along the arrow of time, generating Shannon-type information (that is, uncertainty); (ii) the incursion of meanings at level B is referential to (iii) horizons of meaning spanned by codes in the communication at level C. In other words, relations at level A are first distinguished from correlations among patterns of relations and non-relations at level B. The correlations span a vector space on top of the network of relations. Relations are positioned in this vector space and can then be provided with meaning. Different positions provide other perspectives and horizons of meaning. Perspectives can overlap, for example, in Triple-Helix relations. Overlapping perspectives can generate redundancies—that is, new options—as a result of synergies.

In the opening statements of A Mathematical Theory of Communication, Shannon (1948, at p. 3) emphasized that "the semantic aspects of communication are irrelevant to the engineering problem." Information can be defined as "uncertainty" and is not "informative" in the sense of reducing uncertainty. Although Shannon's coauthor Weaver called this definition "bizarre," he considered the change of perspective "potentially so penetratingly clearing the air that one is now, perhaps for the first time, ready for a real theory of meaning" (at p. 27). Weaver (1949, p. 8) emphasized that "information must not be confused with meaning." Varela (1979, at p. 266), however, argued for defining "information" in accordance with the semantic root of the word "in-formare." Bateson's (1973) aphorism of information as "a difference which makes a difference" defines information as "meaningful information" and has been widely accepted among cyberneticians (e.g., Scott, 2004).

In my opinion, meanings can be attributed to information from the perspective of hindsight and with reference to other possible meanings. Meaning is thus not added to the information, but events can be considered from different perspectives. Whereas Shannon-type information is generated in relations (between a sender and a receiver), meaning is provided from a position in a network of relations. Positions are based on correlations among patterns of relations and non-relations. The correlations span a vector space with dimensions (“eigenvectors”) on top of the network of relations. The vector space and the network graph can be considered as different evaluations of the events. First, information is generated operationally by links between senders and receivers. Second, providing meaning to information assumes a position in the network as an aggregate of nodes and links; and third, positions provide perspectives.

4.1 The Network Graph and the Vector Space

As a first step in the specification of a theory of meaning within the framework provided by information theory, Weaver (1949, at p. 26) proposed two “minor additions” to Shannon’s linear diagram of a communication channel (Fig. 4.1). 
Fig. 4.1

Weaver’s (1949) “minor” additions penciled into Shannon’s (1948) diagram of a communication channel. Source: Leydesdorff (2016), p. 282

Weaver explained these extensions—the box labeled “semantic noise” and the one labeled “semantic receiver”—as follows:

One can imagine, as an addition to the diagram, another box labeled “Semantic Receiver” interposed between the engineering receiver (which changes signals into messages) and the destination. This semantic receiver subjects the message to a second decoding, the demand on this one being that it must match the statistical semantic characteristics of the message with the statistical semantic capacities of the totality of receivers, or of that subset of receivers which constitute the audience one wishes to affect.

Similarly, one can imagine another box in the diagram which, inserted between the information source and the transmitter, would be labeled “semantic noise,” the box previously labeled as simply “noise” now being labeled “engineering noise.” From this source is imposed into the signal the perturbations or distortions of meaning which are not intended by the source but which inescapably affect the destination. And the problem of semantic decoding must take this semantic noise into account.

A “semantic receiver” recodes the information in the messages received from the engineering “receiver,” while the latter can only change signals into messages. The semantic receiver is able to distinguish the signals from the noise. However, “the semantic aspects” were defined by Shannon as external to the model. Therefore, the relation between the two newly added boxes cannot be considered as communication of Shannon-type information.

Can this semantic dimension of the communication be considered another (non-Shannon) transfer mechanism? Meanings cannot be communicated, but they can be shared and organized depending on positions and perspectives, even without requiring a direct communication relation. Semantics are based not on relations, but on patterns of relations or, in other words, correlations. For example, two firms (at the nodes of a network) may have similar patterns of relations with their clients without necessarily relating directly to one another (Burt, 1982). Two synonyms, analogously, can occupy a similar position in a vector space of word co-occurrences without any empirical co-occurrences in the domain under study.

In the case of a single relation, the relational distance is not different from the correlational one; but in the case of three (or more) interacting nodes (Fig. 4.2), distances in the vector space can be very different from distances in the network (e.g., geodesics).
Fig. 4.2

Relational distance between and structurally equivalent positions of A and B. Source: Leydesdorff et al. (2018, p. 1185)

The graph in the left-hand panel of Fig. 4.2, for example, represents a configuration of empirically observable nodes and links. The edges correspond to the ones (1s) in the matrix of the right-hand panel. However, the zeros in the right-hand panel are equally included when defining the vector space. The shortest distance between A and B in the left-hand panel is two links. The positional distance between A and B is zero, since the Pearson correlation rAB = 1.0: A and B occupy precisely the same position in this network.
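The contrast between relational (graph) distance and correlational (vector-space) distance can be made concrete in a few lines of code. The following sketch assumes a hypothetical configuration in the spirit of Fig. 4.2—A and B are not directly linked but share the neighbors C and D; the node names and adjacency structure are illustrative, not read off the figure:

```python
import math
from collections import deque

# Hypothetical adjacency structure: A and B are not directly linked,
# but both relate to C and D (structural equivalence).
nodes = ["A", "B", "C", "D"]
adj = {
    "A": {"C", "D"},
    "B": {"C", "D"},
    "C": {"A", "B"},
    "D": {"A", "B"},
}

def shortest_path_length(start, end):
    """Relational (geodesic) distance in the network graph, via BFS."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == end:
            return dist
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                queue.append((nb, dist + 1))
    return math.inf

def pearson(x, y):
    """Correlational (positional) similarity between two matrix rows."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Rows of the adjacency matrix, including the zeros that span the vector space
row = {n: [1 if m in adj[n] else 0 for m in nodes] for n in nodes}

print(shortest_path_length("A", "B"))   # relational distance: 2
print(pearson(row["A"], row["B"]))      # positional distance zero: r_AB = 1.0
```

The geodesic between A and B is two links, while the Pearson correlation between their rows—computed over the zeros as well as the ones—is 1.0; that is, their positional distance is zero.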

As against Shannon-type information which flows linearly from the sender to the receiver, one can expect meanings to loop, and thereby to develop next-order dimensionalities (Krippendorff, 2009a, 2009b). Horizons of meaning are spanned by codes evolving in the communication. Overlapping codes may generate redundancies by describing the same events from different perspectives.

Redundancy can be measured if the maximum entropy can be defined or, in other words, if the system of reference can be specified. Whereas information (Shannon's H) measures the number of options that have already been realized, redundancy measures the number of options that could alternatively have been realized. In other words, the zeros—such as the ones in the right-hand panel of Fig. 4.2—do not add to the information, but they add to the redundancy.

4.2 Dimensions and Dynamics of Information

A communication matrix is shaped when a vertical distinction—such as the levels distinguished by Weaver—is added to the horizontal channel (vector) of communications in the Shannon model (Fig. 4.1). A matrix can be considered as a two-dimensional aggregate of one-dimensional vectors. Whereas each vector models relations, a matrix can represent both relations and positions (see Fig. 4.2). The vectors are positioned in the matrix, for example, by a sequence number. However, a matrix also contains a structure different from, and orthogonal to, the sum of the vectors of relations. Structures can operate as selection environments, for example, by providing meanings to the variation.

In Fig. 4.3a, each slice represents a communication matrix at a specific time; the repetition over time adds the third dimension. The development of information in a three-dimensional array can be visualized as a historical trajectory; the uncertainty is then organized over time (depicted as a cylinder in the cube of Fig. 4.3a). A four-dimensional array or hyper-cube of information is more difficult to imagine or represent graphically. However, a four-dimensional array can, among other things, contain a regime as a next-order feedback on historical developments along trajectories (Fig. 4.3b; cf. Dosi, 1982).
Fig. 4.3

a left and b right: A three-dimensional array of information can contain a trajectory; a four-dimensional hypercube contains one more degree of freedom and thus a variety of possible trajectories. Adapted from Leydesdorff (1997, p. 29)

In other words: a regime has one degree of freedom more than a trajectory and can thus “select” among the possible trajectories as representations in three dimensions of the system’s history (Fig. 4.3b). The additional degree of freedom provides room for another selection within an emerging system: a selection of one sub-dynamic or another. When mutual selections are repeated, a trajectory can be shaped in a co-evolution or “mutual shaping.” Whereas a trajectory is organized in history, a next-order regime provides meta-historical selection pressure in terms of expectations. In this fourth (or higher) dimension, one trajectory can be “weighted” differently from another. Each selection can refine the self-organization of a system of selections. Refinements can be expected to add to the performativity of a system.

For an intuitive understanding, it may be helpful to consider ourselves as psychologies with the reflexive capacity to reconstruct possible representations of our personal histories from the perspective of hindsight. For example, one might tell a story at work differently from what one could say at home. I suggest reading Luhmann's model as a proposal to consider the social system of communications as a system without psychological consciousness, but with a similar complexity. Communications can be expected to entertain different representations of the history and organization of communications. Communications and consciousness are substantively different.

Whereas a psychological system operates in terms of individual consciousness and tends towards integration (Haken & Portugali, 2014), a communication system can be expected to remain distributed as a “dividuum” (Luhmann, 1984: 625; cf. Nietzsche, [1878] 1967: 76); this additional degree of freedom allows for the processing of more complexity at the supra-individual level than would be possible as the sum of individual processes. As a next-order system, the communications can thus provide a regime to the communicating individuals developing along historical trajectories at a one-lower level. Since communication systems are not biologically alive, they do not need to be integrated and constrained in terms of life-cycles.

In summary: whereas variation can be modeled as a one-dimensional vector, a two-dimensional matrix can represent selection and coordination mechanisms leading potentially to trajectories as stabilizations of the uncertainty over time. Codes in the communication add one more selection mechanism and make globalization at the regime level possible. Selections can be meta-selected for stabilization along trajectories, and some stabilizations can be selected for globalization at a regime level. Stabilizations are historical and can be at variance. They can thus be considered as providing a second-order variation; globalization functions analogously as a next-order selection. Because the second-order selections (regimes) select on the second-order variation (stabilizations along trajectories) in parallel to first-order variations and selection, the operations loop into themselves and one another with the resulting complexity and the possibility of self-organization, leading to unintended consequences. (What can be considered first- and second-order may change over time.) The loops are not hierarchically organized, but can interact and thus disturb one another.

Since the communication of information and the sharing of meanings operate in terms of recursive and incursive selections, the historical origin of the variation may no longer be visible in the present after a series of selective rewrites. Both the historical trajectories and the evolutionary regimes can be expected to change, but at different speeds or, in other words, without a priori synchronization. The two momenta of historical development (at the trajectory level) and evolutionary change (at the regime level) relate in dynamic trade-offs. The regime is instantiated as a meta-historical selection environment operating on the historical trajectories.

For example, airplane series such as the DC3 to the DC9 were developed along trajectories, but the introduction of the jet engine as a replacement for the propeller engine was a systems innovation (Frenken & Leydesdorff, 2000). While helicopters are developed in another regime, the discontinuity between propeller airplanes and jet aircraft can be a change at the trajectory and/or the regime level; answering this question requires empirical research. Dosi (1982, p. 152), for example, provided operational definitions for regimes (or paradigms) and historical trajectories, as follows:

In broad analogy with the Kuhnian definition of a “scientific paradigm,” we shall define a “technological paradigm” as “model” and a “pattern” of solution of selected technological problems, based on selected principles derived from natural sciences and on selected material technologies.[…].

As “normal science” is the “actualization of a promise” contained in a scientific paradigm, so is “technical progress” defined by a certain “technological paradigm”. We will define a technological trajectory as the pattern of “normal” problem solving activity (i.e. of “progress”) on the ground of a technological paradigm.

Note that Dosi (1982) articulated a model with three selection environments operating upon one another. This predates the neo-evolutionary version of the Triple-Helix model by two decades. However, Dosi did not elaborate specifically the evolutionary model (Andersen, 1994).

The metaphor of hill-climbing is also used in this context: hills are climbed along trajectories. However, climbing is different at night or during the day, and the difference between day and night is meta-historical for the hill-climbing agents. In terms of Dosi’s above definitions, the technological problems may be differently selected in daylight than during the night. In his article about “objectivity” in the social and cultural sciences, Max Weber used this same metaphor when he expressed change in the dynamics at the supra-individual level of a regime, as follows:

[…] at one moment or another, the color will change: the meaning of the perspective which was used without reflection, will become insecure; the road seems now to lead into zones of twilight. The light of the important problems of the culture has advanced. At such moments, the sciences have to provide themselves with the means of changing position and of changing their methodological apparatus, in order reflexively to grasp the higher grounds of reasoning from which to look down on the stream of history. Science follows the constellations which make it a meaningful enterprise. (Weber, [1904] 1968, p. 214)

Changes at the regime level happen beyond control; changes at the trajectory level can be organized by agency (e.g., entrepreneurs).

4.3 Levels B and C in the Shannon Diagram

In addition to proposing the two new boxes in Shannon's diagram (Fig. 4.1), Weaver (1949, p. 24) suggested adding to this diagram the levels B and C: meaning is conveyed at level B, and the received meaning can affect behavior at level C (because codes are genotypical and binding). Elaborating on Fig. 4.1, Fig. 4.4 shows a scheme for distinguishing among these three levels.
Fig. 4.4

Levels B and C added to the Shannon diagram (in red-brown and dark-blue, respectively). Source: Leydesdorff (2016), p. 283

As noted above, the relations between a semantic receiver and semantic noise at level B are based on correlations among sets of relations at level A. In the vector space thus constructed at level B, meanings can be shared, while information continues to be communicated via the links at level A. The use of language facilitates and potentially reinforces the options for sharing (and distinguishing!) meanings at level B. Natural languages provide opportunities to develop semantics; symbolic meanings, however, require codes to operate in the communications.

Codes of communication are invoked from level C for regulating the use of language. The codes enable us, among other things,1 to short-cut the communication; for example, by paying a price for something instead of negotiating in language. The codes make the communications far more efficient than is possible in natural languages. The communication can be differentiated both vertically and horizontally: horizontally in terms of different codes operating in parallel, and vertically between historical organization and evolutionary self-organization. In the following sections, these two differentiations are related.

4.4 Scholarly Discourse and Codification

The tension between historical organization and evolutionary self-organization is articulated in the sociology of science as the difference between “group” and “field”-level dynamics. Following up on his (1976) historical analysis of “Le champ scientifique,” for example, Bourdieu (2004, at p. 83) added a further reflection on the study of the sciences in his book, entitled Science of Science and Reflexivity. He formulated as follows:

Each field (discipline) is the site of a specific legality (a nomos), a product of history, which is embodied in the objective regularities of the functioning of the field and, more precisely, in the mechanisms governing the circulation of information, in the logic of the allocation of rewards, etc., and in the scientific habitus produced by the field, which are the condition of the functioning of the field […].

What are called epistemic criteria are the formalization of the “rules of the game” that have to be observed in the field, that is, of the sociological rules of interactions within the field, in particular, rules of argumentation or norms of communication. Argumentation is a collective process performed before an audience and subject to rules.

From a very different perspective, Popper (1972) denoted the domain of supra-individual codifications as World 3. Bourdieu (2004, at p. 78) called this transition from "objectivity" to "intersubjectivity" a "Kantian" or transcendental turn. However, the philosopher to be associated with this transition is Husserl, who criticized the empiricist self-understanding of the modern (European) sciences (Husserl, [1935/36] 1962). According to Husserl ([1929] 1960, at p. 155), the possibility to communicate expectations intersubjectively grounds the empirical sciences "in a concrete theory of science." In Chap. 2, I called this the communicative turn in the philosophy of science.

Neither Popper nor Husserl specified the evolutionary dynamics of expectations in terms of or in relation to communications. I shall argue that the dynamics of res cogitans can be further specified information-theoretically. The symbolically generalized codes in the communication enable us to multiply meanings at the intersubjective level—that is, within the communication—as new options. The proliferation of expectations can take place in a techno-cultural evolution at a speed much faster than in biological evolution.

The intersubjective layer of expectations codes and structures the communications. The different codes can be recombined and reconstructed in translations. At level B, meanings are instantiated in specific combinations of codes, while at level C the codes themselves evolve in response to the integrations in the instantiations as historical events. The superstructure of codes continues to be driven into differentiations by the need to cope with the increasing complexity of the communication at the bottom. At this level A, the probabilistic entropy (H) increases because of the coupling of information to entropy and the second law of thermodynamics.

4.5 Redundancy and Evolution

Shannon (1948) defined information (H) as probabilistic entropy, \(H = -\Sigma_{i}\, p_{i} \log (p_{i})\), in accordance with Gibbs's formula for thermodynamic entropy: \(S = k_{B}\, H\). In this equation, kB is the Boltzmann constant that provides the dimensionality Joule/Kelvin to the thermodynamic entropy S, whereas H is a dimensionless statistic: the uncertainty in a probability distribution. (When two is taken as the base of the logarithm, the measurement is in bits of information.)
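As a minimal numerical illustration (the probability distributions are hypothetical), H can be computed in bits as follows:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i * log(p_i); measured in bits when base = 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))            # 1.0
# A biased coin is less uncertain, i.e., less Shannon-type information:
print(round(shannon_entropy([0.9, 0.1]), 3))  # 0.469
```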

The second law of thermodynamics states that entropy increases with each operation. Because of the linear relation between S and H, historical developments unfold with the arrow of time; that is, from an origin to the future. However, models enable us to anticipate future states from our position in the present, that is, to use future states (xt+n) represented in the present (xt) against the arrow of time for the reconstruction. In other words, the dynamics of expectations are very different from the historical dynamics “following the actors.” In the remainder of this chapter, the focus will be on the interactions among differently coded expectations and how they can generate redundancy (against the second law).

Redundancy R is defined in information theory as the fraction of the capacity of a communication channel (Hmax) that is not used. In formula form:
$$R = \frac{{H_{\max } - H_{observed} }}{{H_{\max} }}$$
H is equal to the uncertainty in a relative frequency distribution (Σi pi = Σi [fi/N]) as follows:
$$H = -\Sigma _{{\text{i}}} p_{{\text{i}}} *\log_{2} \left( {p_{{\text{i}}} } \right)$$
When all N possible states are equally probable, each with probability 1/N, one can formulate the maximum information content Hmax as follows:
$$\begin{aligned} H_{\max } & = - \mathop \sum \limits_{i = 1}^{N} \left( \frac{1}{N} \right)\log \left( \frac{1}{N} \right) \\ & = - \frac{N}{N} \log \left( \frac{1}{N} \right) \\ & = \log (N) \\ \end{aligned}$$
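These quantities—Hobserved, Hmax = log(N), and the redundancy R—can be computed from any frequency distribution. A sketch with a hypothetical distribution over N = 4 categories:

```python
import math

def entropy_bits(freqs):
    """Hobserved in bits, from a list of raw frequencies."""
    n = sum(freqs)
    return -sum((f / n) * math.log2(f / n) for f in freqs if f > 0)

# Hypothetical frequency distribution over N = 4 categories
freqs = [40, 30, 20, 10]
h_obs = entropy_bits(freqs)
h_max = math.log2(len(freqs))   # log(N): all categories equiprobable
r = (h_max - h_obs) / h_max     # fraction of the channel capacity not used
print(round(h_obs, 3), h_max, round(r, 3))  # 1.846 2.0 0.077
```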
In the case of an evolving system—e.g., an eco-system in which new species can be generated—not only the observed information (Hobserved) of the system increases with time, but also Hmax, representing the number of possible states (N). The difference between Hmax and the observed information Hobserved is (by definition) equal to the redundancy R; that is, the options that are available but have not yet been realized. From the engineering perspective of information theory, these options are redundant. Redundancy can be used, among other things, for error-correction (Shannon, 1945).
Fig. 4.5

a The development of entropy (Hobs), maximum entropy (Hmax), and redundancy (Hmax − Hobs). b Hitherto impossible options are made possible because of cultural and technological evolution. Adapted from: Brooks & Wiley (1986, at p. 43)

Figure 4.5a shows Brooks & Wiley's (1986, at p. 43) illustration of the dynamics of a biological system. I have added green shading to the redundancy as part of the evolving capacity of this system. As noted, redundancy provides a measure of the options that were not realized but could have been realized; the exclusion of these options is "historical." Kauffman (2000), for example, called these in-principle possible realizations "adjacent." Above this (green) area, however, Brooks & Wiley (1986) placed the label categorically "impossible" in the legend of Fig. 4.5a.

In Fig. 4.5b, I have replaced the label “impossible” with “technologically made feasible” in order to introduce a model which includes the levels B and C. Unlike a biological system, the techno-cultural evolution of expectations can be expected to generate redundancy. An intentional system is able to add new options without necessarily realizing them; one can keep options in mind. The cycling of information on top of the linear flow generates redundancy (Maturana, 2000). Redundancy is generated when two (or more) perspectives on the same information are operating at an interface.

For example, when a new technology is introduced into a market, the market operates with a (supra-individual) logic different from the technological criteria. When both the economic and the technological logics can operate, innovation can be enhanced because of the options made visible by the cross-tabulation. (In the right-hand panel of Fig. 4.2, for example, five zeros were added to the representation in the left-hand panel.) The redundancy added to the green surfaces of Fig. 4.5b is generated by the recombination of different expectations, organized in terms of the variety of perspectives that can be entertained in the communication. Let me first specify this process in information-theoretical terms and then return to the interpretation. The reader who is less interested in the following derivations may wish to skip to Sect. 4.8.

4.6 The Generation of Mutual Redundancy

The total number of options available in a system is (by definition) equal to the sum of the realized options and the not-yet-realized but possible ones. This sum of realized and possible options determines the capacity of a system.

In information theory, one counts by using relative frequencies multiplied by their respective logarithms.2 This transformation is monotonic. For example, the two sets in Fig. 4.6 can be summed as follows:
Fig. 4.6

Set-theoretical representation of two sets of overlapping options

$$H_{12} = H_{1} + H_{2} - T_{12}$$
H1 and H2 can be used as labels for the information contents of the two sets, which overlap in T12. T12 is called the "mutual information" or "transmission" between H1 and H2. If T12 were not subtracted from (H1 + H2), the overlap would be counted twice; the second count would be redundant. This redundancy R12 is equal to −T12 or, in other words, negative, since the mutual information (T12) is itself Shannon-type information and therefore necessarily positive (Theil, 1972, p. 59f.).
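In code, T12 can be obtained by computing H1, H2, and H12 from a joint probability distribution; the joint probabilities below are hypothetical:

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint probabilities p(x, y) for two binary variables
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of x and y
px = [sum(v for (x, _), v in p.items() if x == i) for i in (0, 1)]
py = [sum(v for (_, y), v in p.items() if y == j) for j in (0, 1)]

h1, h2 = H(px), H(py)
h12 = H(list(p.values()))
t12 = h1 + h2 - h12     # mutual information (transmission), necessarily >= 0
print(round(t12, 3))    # 0.278
```

The mutual redundancy R12 = −t12 is then negative, as stated in the text.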

Weaver (1949) already noted that redundancy might be a prime candidate for the development of a theory of meaning. Using a different definition of information (as “a difference which makes a difference”; see Mackay, 1969), Bateson (1972, p. 420) argued that “the concept ‘redundancy’ is at least a partial synonym of ‘meaning’: […] if the receiver can guess at missing parts of the message, then those parts must, in fact, carry a meaning which refers to the missing part and is information about these parts.” Unlike information, redundancy is not observable; the maximum information has to be specified on theoretical grounds. This specification has the status of a hypothesis (which one may wish to update after the research process).

The same information can be appreciated differently by other agents or at different moments and other levels. Whenever information is appreciated, a system-specific meaning is generated. Whereas information can be communicated, meanings can be shared. Sharing can generate an intersubjective layer with a dynamic different from that of information processing. The redundancy in the overlaps can be measured as reduction of uncertainty at the systems level; that is, as negative bits of information. The relative uncertainty is reduced when the redundancy is increased. Whereas the events are historical and generate entropy along trajectories following the arrow of time, appreciations are analytical and can add redundancy or negative entropy from the perspective of hindsight—that is, against the arrow of time. One can also consider this redundancy as feedback or error correction against the arrow of time (Kline & Rosenberg, 1986; Krippendorff, 2009b).

In Fig. 4.7, Fig. 4.6 is extended to three sets. The two possible configurations in Fig. 4.7 indicate that T123 (the set in the centre) can be positive, negative, or zero. Redundancy provides a measure of these absent options (Bateson, 1972; Deacon, 2012): unlike the empty space outside the three circles, the gap among the three circles in the centre can be quantified.
Fig. 4.7

Overlapping uncertainties in three variables x1, x2, and x3: two configurations with opposite signs of T123

The formula for the entropy of the combined set H123 corrects the counts of elements using summations and subtractions, as in the case of overlapping sets:
$$H_{123} = H_{1} + H_{2} + H_{3} - T_{12} - T_{13} - T_{23} + T_{123}$$

In Eq. 4.6, the central overlap T123 is included three times in (H1 + H2 + H3) and then subtracted three times in (−T12 − T13 − T23). It follows that T123 has to be added once more after the subtractions. Since T123 is added, whereas T12 was subtracted (in Eq. 4.4), the sign of the last term, representing the mutual redundancy in three dimensions, is opposite to that in a model with an even number of dimensions: R12 = −T12 and R123 = +T123, etc.

By replacing T12 in Eq. 4.6 with \((H_{1} + H_{2} - H_{12})\) as in Eq. 4.5, and analogously for T13 and T23, one can formulate as follows:
$$\begin{aligned} H_{123} &= H_{1} + H_{2} + H_{3} - \left( {H_{1} + H_{2} - H_{12} } \right) - \left( {H_{1} + H_{3} - H_{13} } \right)\\ &\quad- \left( {H_{2} + H_{3} - H_{23} } \right) + T_{123} \end{aligned}$$
Or after reorganization of the order of the terms:
$$\begin{aligned} T_{123} &= H_{123} - [H_{1} + H_{2} + H_{3} ] + \left( {H_{1} + H_{2} - H_{12} } \right) + \left( {H_{1} + H_{3} - H_{13} } \right) \\&\quad+ \left( {H_{2} + H_{3} - H_{23} } \right) \\ T_{123} &= [H_{1} + H_{2} + H_{3} ] - \left[ {H_{12} + H_{13} + H_{23} } \right] + H_{123} \\ \end{aligned}$$

Using sets of relative frequency distributions—variables—the measurement of T123 is straightforward: all H-values can be computed after writing the data as relative frequencies. The value of T123 then follows from adding and subtracting H-values using Eq. 4.8.
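A sketch of this measurement on a hypothetical three-dimensional table of counts (the numbers are chosen for illustration only; in this particular configuration T123 turns out negative):

```python
import math
from itertools import product

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical counts f(x1, x2, x3) for three binary variables
counts = dict(zip(product((0, 1), repeat=3),
                  [10, 5, 5, 10, 5, 10, 10, 5]))
N = sum(counts.values())
p = {c: v / N for c, v in counts.items()}

def marginal(axes):
    """Relative frequencies marginalized onto the listed axes."""
    out = {}
    for c, v in p.items():
        key = tuple(c[a] for a in axes)
        out[key] = out.get(key, 0.0) + v
    return list(out.values())

h1, h2, h3 = (H(marginal([i])) for i in range(3))
h12, h13, h23 = H(marginal([0, 1])), H(marginal([0, 2])), H(marginal([1, 2]))
h123 = H(list(p.values()))

# Eq. 4.8: T123 = [H1 + H2 + H3] - [H12 + H13 + H23] + H123
t123 = (h1 + h2 + h3) - (h12 + h13 + h23) + h123
print(round(t123, 3))   # -0.082: negative in this configuration
```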

4.7 Generalization

The sign change of the mutual information with the number of dimensions was until recently an unsolved problem in information theory.3 However, Alexander Petersen has proven that this sign indeed changes with the addition of each next dimension.4 In other words, it can be shown that mutual redundancy is a consistent measure of negative entropy (Leydesdorff, Petersen, & Ivanova, 2017, p. 17).

Equation 4.8 can be rewritten as follows:
$$\begin{aligned} T_{123} & = H_{1} + H_{2} + H_{3} - H_{12} - H_{13} - H_{23} + H_{123} \\ T_{123} & = \left[ {\left( {H_{1} + H_{2} - H_{12} } \right) + \left( {H_{1} + H_{3} - H_{13} } \right) + \left( {H_{2} + H_{3} - H_{23} } \right)} \right]\\ &\quad + \left[ {H_{123} - H_{1} - H_{2} - H_{3} } \right] \\ T_{123} & = \left[ {T_{12} + T_{13} + T_{23} } \right] + \left[ {H_{123} - H_{1} - H_{2} - H_{3} } \right] \\ \end{aligned}$$

The terms in the first set of brackets in Eq. 4.10—[T12 + T13 + T23]—are Shannon-type information values and therefore non-negative. The second bracketed term—[H123 – H1 – H2 – H3]—makes a negative contribution, because of the subadditivity of the entropy: \(H\left( {x_{1} , \ldots ,x_{n} } \right){ } \le \mathop \sum \nolimits_{1}^{n} H(x_{i} )\), which holds for any dimension n ≥ 2. For example, H123 ≤ (H1 + H2 + H3). The sign of the resulting value of T123 thus depends on the empirical configuration of nodes (H-values) and links (T-values). Figure 4.7 shows the two opposite cases with positive and negative overlaps. This empirical trade-off can change over time and can also be considered as “the triple-helix dynamics” (Etzkowitz & Leydesdorff, 2000; see Chap. 5).
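The configuration-dependence of this sign can be illustrated with two toy distributions (my own examples, not the author's): three perfectly correlated bits yield a positive T123, whereas three pairwise-independent bits of which the third is the parity (XOR) of the first two yield a negative T123:

```python
import numpy as np
from itertools import product

def entropy(values):
    """Shannon entropy (bits) of an iterable of probabilities; zeros skipped."""
    p = np.array([v for v in values if v > 0])
    return float(-(p * np.log2(p)).sum())

def T123(joint):
    """Mutual information in three dimensions from a dict {(x1, x2, x3): prob}."""
    def H(axes):
        marg = {}
        for k, v in joint.items():
            key = tuple(k[a] for a in axes)
            marg[key] = marg.get(key, 0.0) + v
        return entropy(marg.values())
    return (H((0,)) + H((1,)) + H((2,))
            - H((0, 1)) - H((0, 2)) - H((1, 2))
            + H((0, 1, 2)))

# Configuration 1: three perfectly correlated bits (positive overlap).
copy = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

# Configuration 2: XOR parity; pairwise independent, jointly determined.
xor = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}

print(T123(copy))  # → 1.0
print(T123(xor))   # → -1.0
```

In the XOR case each pair of variables is statistically independent (all bilateral T-values vanish), yet the three variables jointly constrain one another; the negative T123 registers this synergy.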

It follows inductively that for any given dimension n, one can formulate combinations of mutual information corresponding to \(\mathop \sum \nolimits_{1}^{n} H(x_{i} ) - H\left( {x_{1} , \ldots ,x_{n} } \right)\) that are by definition positive (or zero in the null case of complete independence). For example (up to four dimensions) as follows:
$$\begin{aligned} 0 & \le \mathop \sum \limits_{i = 1}^{n = 2} H(x_{i} ) - H\left( {x_{1} , x_{2} } \right) = T_{12} \\ 0 & \le \mathop \sum \limits_{i = 1}^{n = 3} H(x_{i} ) - H\left( {x_{1} , x_{2} , x_{3} } \right) = \mathop \sum \limits_{ij}^{3} T_{ij} - T_{123} \\ 0 & \le \mathop \sum \limits_{i = 1}^{n = 4} H(x_{i} ) - H\left( {x_{1} , x_{2} , x_{3} , x_{4} } \right) = \mathop \sum \limits_{ij}^{6} T_{ij} - \mathop \sum \limits_{ijk}^{4} T_{ijk} + T_{1234} \\ \end{aligned}$$
where the sums on the right-hand side are over the \(\left( {\begin{array}{*{20}c} n \\ k \\ \end{array} } \right)\) combinations of the indices.
Equation 4.11 can be extended for general n as follows:
$$\begin{aligned} 0 & \le \mathop \sum \limits_{i = 1}^{n} H(x_{i} ) - H\left( {x_{1} , \ldots ,x_{n} } \right) \\ = & \mathop \sum \limits_{ij}^ {{{\left( {\begin{subarray}{*{20}c} n \\ 2 \\ \end{subarray} } \right)}}} T_{ij} - \mathop \sum \limits_{ijk}^{{\left( {\begin{subarray}{*{20}c} n \\ 3 \\ \end{subarray} } \right)}} T_{ijk} + \mathop \sum \limits_{ijkl}^{{\left( {\begin{subarray}{*{20}c} n \\ 4 \\ \end{subarray} } \right)}} T_{ijkl} - \cdots + ( - 1)^{1 + n } \mathop \sum \limits_{{ijkl \ldots \left( {n - 1} \right)}}^{{\left( {\begin{subarray}{*{20}c} n \\ {n - 1} \\ \end{subarray} } \right)}} T_{{ijkl \ldots \left( {n - 1} \right)}} \\ & \quad + ( - 1)^{n } \mathop \sum \limits_{ijkl \ldots \left( n \right)}^{{\left( {\begin{subarray}{*{20}c} n \\ n \\ \end{subarray} } \right)}} T_{ijkl...\left( n \right)} \\ \end{aligned}$$
where the last term on the right-hand side is equal to \(( - 1)^{n } T_{1234...n}\).
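The general identity of Eq. 4.12 can be checked numerically. The sketch below is my own (not from the text): each T is defined by inclusion–exclusion over its subsets, consistent with Eq. 4.8, and the identity is verified for a random four-dimensional distribution:

```python
import numpy as np
from itertools import combinations

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A random joint distribution over four binary variables (n = 4).
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2, 2))
p /= p.sum()
n = p.ndim

def H(subset):
    """Joint entropy of the variables in `subset` (marginalizing the rest)."""
    other = tuple(i for i in range(n) if i not in subset)
    return entropy(p.sum(axis=other).ravel()) if other else entropy(p.ravel())

def T(subset):
    """Mutual information of `subset` by inclusion-exclusion (cf. Eq. 4.8)."""
    return sum((-1) ** (len(u) + 1) * H(u)
               for k in range(1, len(subset) + 1)
               for u in combinations(subset, k))

# Left-hand side of Eq. 4.12: sum of H(x_i) minus H(x_1, ..., x_n), >= 0.
lhs = sum(H((i,)) for i in range(n)) - H(tuple(range(n)))

# Right-hand side: sum T_ij - sum T_ijk + T_1234, alternating by subset size.
rhs = sum((-1) ** k * sum(T(s) for s in combinations(range(n), k))
          for k in range(2, n + 1))

print(lhs >= 0, abs(lhs - rhs) < 1e-9)  # → True True
```

The left-hand side is the total correlation, which is non-negative by subadditivity; the check confirms that the alternating sums of T-values reproduce it exactly.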
Returning to the relation between \(R_{12}\) and \(T_{12}\), it follows (using the two-dimensional case instructively) that:
$$\begin{aligned} R_{12} & = - T_{12} \\ & = H\left( {x_{1} , x_{2} } \right) - \mathop \sum \limits_{1}^{2} H(x_{i} ) \le 0 \\ {\text{and}}\,\,\,\, & T_{12} \ge 0 \\ \end{aligned}$$
In other words, mutual information between two information sources is either positive or zero (Theil, 1972, p. 59f.). The relations for \(R_{123}\) and \(R_{1234}\) follow analogously from Eq. (4.12). In the general case of more than two dimensions (n > 2):
$$R_{n} = ( - 1)^{1 + n } T_{1234 \ldots n}$$
$$\begin{aligned} R_{n} & = \left[ {H\left( {x_{1} , \ldots ,x_{n} } \right) - \mathop \sum \limits_{1}^{n} H(x_{i}) } \right] \\ & \quad + \left[ {\mathop \sum \limits_{ij}^{{\left( {\begin{subarray}{*{20}c} n \\ 2 \\ \end{subarray} } \right)}} T_{ij} - \mathop \sum \limits_{ijk}^{{\left( {\begin{subarray}{*{20}c} n \\ 3 \\ \end{subarray} } \right)}} T_{ijk} + \mathop \sum \limits_{ijkl}^{{\left( {\begin{subarray}{*{20}c} n \\ 4 \\ \end{subarray} } \right)}} T_{ijkl} - \cdots + ( - 1)^{1 + n } \mathop \sum \limits_{{ijkl \ldots \left( {n - 1} \right)}}^{{\left( {\begin{subarray}{*{20}c} n \\ {n - 1} \\ \end{subarray} } \right)}} T_{{ijkl...\left( {n - 1} \right)}} } \right] \\ \end{aligned}$$

The left-bracketed term of Eq. 4.14—\(\left[ {H\left( {x_{1} , \ldots ,x_{n} } \right) - \mathop \sum \nolimits_{1}^{n} H(x_{i} )} \right]\)—is necessarily negative (or zero) because of the subadditivity of the entropy (see above), while the configuration of mutual-information relations contributes a second term on the right, which can be positive. This latter term represents the entropy generated by the realization of the network in terms of links. The links are historical and thus add information.

In summary, Eq. 4.14 models the generation of redundancy (with a negative sign) on the one side versus the historical generation of uncertainty in the relations (with a positive sign) on the other. A system with more than two codes (e.g., three alphabets; cf. Abramson, 1963, p. 127 ff.) can thus operate as an empirical (im)balance between these two terms. When the resulting Rn is negative, self-organization prevails over organization in the configuration under study, whereas a positive Rn conversely indicates a predominance of historical organization over evolutionary self-organization.
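For three dimensions this (im)balance can be made explicit in code, since R123 = T123 and the decomposition of Eq. 4.14 reduces to a subadditivity term plus a links term. The distribution below (three perfectly correlated bits) is an illustrative assumption of mine:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def balance(p):
    """Decompose R123 (= T123 for n = 3) into the two terms of Eq. 4.14."""
    H1 = entropy(p.sum(axis=(1, 2)))
    H2 = entropy(p.sum(axis=(0, 2)))
    H3 = entropy(p.sum(axis=(0, 1)))
    H12 = entropy(p.sum(axis=2).ravel())
    H13 = entropy(p.sum(axis=1).ravel())
    H23 = entropy(p.sum(axis=0).ravel())
    H123 = entropy(p.ravel())
    negative = H123 - (H1 + H2 + H3)  # subadditivity term: <= 0 (redundancy)
    positive = ((H1 + H2 - H12) + (H1 + H3 - H13)
                + (H2 + H3 - H23))    # links term: >= 0 (information)
    return negative, positive, negative + positive

# Three perfectly correlated bits: the links term outweighs the negative term.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5

neg, pos, R = balance(p)
print(neg, pos, R)  # → -2.0 3.0 1.0
```

The positive resultant R indicates, in the chapter's terms, a predominance of historical organization in this toy configuration; a distribution with weak bilateral links and strong joint constraint would instead yield a negative R.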

4.8 Clockwise and Anti-clockwise Rotations

When the relation between two subdynamics is extended with a third, the third may feed back or feed forward on the communication relation between the other two, and thus a system is shaped (Sun & Negishi, 2010). This principle is known in social network analysis as “triadic closure.” Triadic closure can be considered as the basic mechanism of systems formation (Bianconi et al., 2014; de Nooy & Leydesdorff, 2015). The cycling may take control as in a self-organizing vortex (Fig. 4.8). A cycle with the reverse order of the operations (counter-clockwise) is equally possible, stabilizing the dynamics in organizational formats.
Fig. 4.8

Schematic of a hypothetical three-component autocatalytic cycle. Source Ulanowicz (2009, at p. 1888, Fig. 3)

The two cycles can be modeled as two vectors PABC and QABC with three (or more) dimensions (A, B, and C), and this system can then be simulated in terms of the rotations of the two vectors (Ivanova & Leydesdorff, 2014b). One rotation can be understood as corresponding to the tendency of historical realization, and the other to the evolving self-organization of horizons of meaning. Using simulations, Ivanova & Leydesdorff (2014a) showed that the operation of these two (three-dimensional) vectors upon each other can be expected to generate an R. The value of R is determined by the network configuration, as are the values of T123…n in Eq. 4.14. A negative sign of R can be associated with clockwise and a positive sign with counter-clockwise rotations of the vectors in the simulation, while the values of the two terms in Eq. 4.14 measure the relative weights of the two rotations in empirical data. The theorizing, simulation, and measurement can thus be brought into a single and comprehensive framework of a calculus of redundancy as a complement to Shannon’s calculus of information (Bar-Hillel, 1955). The resulting value of R can be positive or negative, reflecting the possibility of an inversion along the time axis.

4.9 Summary and Conclusions

I first extended Shannon’s model of communication (at level A) with Weaver’s levels B and C. This changes Shannon’s linear model into a non-linear and potentially evolutionary one, since feedback and feed-forward loops among the levels become possible. The three levels distinguished in Fig. 4.3 correspond with Luhmann’s distinction among (i) interactions, (ii) organization on the basis of decisions, and (iii) self-organization among the fluxes of communications. At level A, information is communicated in interactions among senders and receivers; at level B, meanings can be shared to variable extents and thus meaningful information is organized into a vector-space. However, this vector space is constructed and therefore remains subject to reflexive reconstructions. The reconstructions, in terms of different weights of the codes of communication, open self-organizing horizons of meaning at level C.

The question central to the next chapters can now be formulated as follows: under what conditions can the different codes be expected to interact and co-evolve, and thus lead to new options? In this chapter, I have first focused on the coherence and tensions among the communication-, evolution-, and systems-theoretical perspectives with reference to Luhmann’s formulation of the program of theory construction (cited in Chap.  1). I have argued that redundancies can be generated at interfaces among sets of relations which are structured by codes.

In Luhmann’s theory, however, interactions among codes were a priori held to be impossible; the (sub)systems are defined as “operationally closed” (Luhmann, 1986a and b; cf. Maturana, 1978). In my opinion, this assumption leads to a meta-biology, since the analyst remains external to the closed systems under study which can only be “observed.” Whereas biological systems can gain in complexity by closing themselves operationally—for example, by shaping a membrane—expectations can disturb and penetrate one another “infra-reflexively” (Latour, 1988, at p. 169 ff.) and across domains in the second contingency. Neither the communication “systems” nor the codes “exist” as hardware (res extensa).

The reflexive layers (res cogitans)—at the individual and the above-individual levels—can be expected to operate with specific selection criteria upon one another and over time. Because of these reflexive couplings in terms of expectations, cultural evolution can be much faster than biological evolution, which operates in terms of realizations (over generations). Writing and rewriting in the hardware requires more energy and time than the exploration and codification of new combinations of expectations.

In other words, I draw a sharper line than Luhmann did between biology and sociology. Different from Luhmann, I do not make the assumption that systems “exist.” On the contrary, I assume that “systems” are analytical constructs. These constructs can eventually be tested as sets of expectations. Cognitive constructs are thus different from living systems. The philosophy in the background is fundamentally opposed to the holistic and biologically oriented ones nowadays prevailing in artificial intelligence (e.g., Damasio, 1994; Sherman, 2017). Theories of information and redundancy span different domains (Deacon, 2012). In the reflexive domain of the social sciences, we study our methods of studying, since these methods are the constraints of our respective perspectives.

Furthermore, Luhmann (e.g., 2013, p. 98) stated “avoidance of redundancy” as an objective. But it remained unclear why. From my perspective, this a priori makes it impossible to contribute to his original objective to specify “a form of selection that prevents the world from shrinking down to just one particular content of consciousness with each act of determining experience” ([1971] 1990, p. 27). The new options are redundant. The generation of redundancy proceeds in a domain of expectations about options that do not (yet) exist, but that one can imagine reflexively, refine, and (re)construct.

By turning away from an objectivistic self-understanding of the sciences in terms of “observers” and “observations,” room can thus be found for a theory of meaning and knowledge-generation as an extension of Shannon’s information theory (Fig. 4.3). Whereas Shannon felt the need to distance himself explicitly from this potential extension, his co-author Weaver understood this possible consequence as the proper intension of information theory (Bar-Hillel, 1955).


  1.

    Spelling rules, syntax, and pragmatics can also be considered as codes in the use of language, but we focus on the semantics.

  2.

    The counting rules in information theory (Shannon, 1948; cf. Leydesdorff, 1991; Theil, 1972; Yeung, 2008) are based on relative frequencies. Observed frequencies are divided by the grand total in order to obtain relative frequencies or, in other words, probabilities:

    \(p_{ijk \ldots } = f_{ijk \ldots } /\sum\limits_{ijk \ldots } {f_{ijk \ldots } } = f_{ijk \ldots } /N\)

    The probabilistic entropy of the distribution of relative frequencies is:

    \(\begin{aligned} H_{{{\text{observed}}}} & = - \sum\limits_{ijk \ldots } {p_{ijk \ldots } } \log_{2} p_{ijk \ldots } \\ & = - \sum\limits_{ijk \ldots } {\frac{{f_{ijk \ldots } }}{N}} \log_{2} \frac{{f_{ijk \ldots } }}{N} \\ & = \log_{2} N - \frac{1}{N}\sum\limits_{ijk \ldots } {f_{ijk \ldots } } \log_{2} f_{ijk \ldots } \\ \end{aligned}\)

    It follows that the maximum entropy is \(H_{\max } = \log_{2} N\). The relative uncertainty or information is \(H_{{{\text{observed}}}} /H_{\max }\). The redundancy is defined by Shannon (1948) as the relative value of the not-realized options:

    \({\text{Redundancy}} = \left[ {H_{\max } - H_{{{\text{observed}}}} } \right]/H_{\max }\)
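    As a minimal numerical sketch of these definitions (the list of events is hypothetical):

```python
import math
from collections import Counter

# Hypothetical observations: N = 8 categorical events of four types.
events = ["a", "a", "a", "a", "b", "b", "c", "d"]
counts = Counter(events)
N = sum(counts.values())

# Probabilistic entropy of the distribution of relative frequencies.
H_observed = -sum((f / N) * math.log2(f / N) for f in counts.values())

# Maximum entropy and Shannon's redundancy as the share of unused options.
H_max = math.log2(N)
redundancy = (H_max - H_observed) / H_max
print(H_observed, H_max, round(redundancy, 4))  # → 1.75 3.0 0.4167
```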

  3.
    Krippendorff (2009b, at p. 670; cf. Leydesdorff, 2010, at p. 68) provided a general notation for this alteration with changing dimensionality—but with the opposite sign as follows:
    $$Q(\Gamma ) = \sum\limits_{X \subseteq \Gamma } {( - 1)^{1 + |\Gamma | - |X|} } H(X)$$

    In this equation, Γ is the set of variables of which X is a subset, and H(X) is the uncertainty of the distribution; |Γ| is the cardinality of Γ, and |X| the cardinality of X.
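    Krippendorff's Q(Γ) can be implemented directly as a sum over subsets. The sketch below is my own (an arbitrary random distribution); it also verifies the opposite sign relative to T123 as defined in Eq. 4.8:

```python
import numpy as np
from itertools import combinations

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# An arbitrary joint distribution over three binary variables.
rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()
Gamma = (0, 1, 2)

def H(X):
    """Uncertainty of the marginal distribution of the subset X of Gamma."""
    other = tuple(i for i in Gamma if i not in X)
    return entropy(p.sum(axis=other).ravel()) if X else 0.0  # H(empty) = 0

# Q(Gamma) = sum over X of (-1)^(1 + |Gamma| - |X|) * H(X).
Q = sum((-1) ** (1 + len(Gamma) - len(X)) * H(X)
        for k in range(len(Gamma) + 1)
        for X in combinations(Gamma, k))

# T123 by inclusion-exclusion (Eq. 4.8); for |Gamma| = 3, Q = -T123.
T123_direct = sum((-1) ** (len(U) + 1) * H(U)
                  for k in range(1, len(Gamma) + 1)
                  for U in combinations(Gamma, k))
print(round(Q + T123_direct, 10))  # → 0.0
```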

  4.

    The sign change finds its origin in the subadditivity of the entropy: H12 ≤ H1 + H2.


  1. Abramson, N. (1963). Information theory and coding. New York, etc.: McGraw-Hill.
  2. Andersen, E. S. (1994). Evolutionary economics: Post-Schumpeterian contributions. London: Pinter.
  3. Bar-Hillel, Y. (1955). An examination of information theory. Philosophy of Science, 22(2), 86–105.
  4. Bateson, G. ([1934] 1972). Steps to an ecology of mind. New York: Ballantine.
  5. Bianconi, G., Darst, R. K., Iacovacci, J., & Fortunato, S. (2014). Triadic closure as a basic generating mechanism of communities in complex networks. Physical Review E, 90(4), 042806.
  6. Bourdieu, P. (1976). Le champ scientifique. Actes de la recherche en sciences sociales, 2(2), 88–104.
  7. Bourdieu, P. (2004). Science of science and reflexivity. Chicago: University of Chicago Press.
  8. Brooks, D. R., & Wiley, E. O. (1986). Evolution as entropy. Chicago/London: University of Chicago Press.
  9. Burt, R. S. (1982). Toward a structural theory of action. New York, etc.: Academic Press.
  10. Damasio, A. R. (1994). Descartes’ error: Emotion, reason, and the human brain. New York: Grosset/Putnam.
  11. de Nooy, W., & Leydesdorff, L. (2015). The dynamics of triads in aggregated journal–journal citation relations: Specialty developments at the above-journal level. Journal of Informetrics, 9(3), 542–554.
  12. Deacon, T. W. (2012). Incomplete nature: How the mind emerged from matter. New York/London: Norton & Company.
  13. Dosi, G. (1982). Technological paradigms and technological trajectories: A suggested interpretation of the determinants and directions of technical change. Research Policy, 11(3), 147–162.
  14. Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation: From national systems and ‘mode 2’ to a Triple Helix of university–industry–government relations. Research Policy, 29(2), 109–123.
  15. Frenken, K., & Leydesdorff, L. (2000). Scaling trajectories in civil aircraft (1913–1970). Research Policy, 29(3), 331–348.
  16. Haken, H., & Portugali, J. (2014). Information adaptation: The interplay between Shannon information and semantic information in cognition. Heidelberg, etc.: Springer.
  17. Husserl, E. ([1929] 1960). Cartesianische Meditationen und Pariser Vorträge [Cartesian meditations and the Paris lectures, translated by Dorion Cairns]. The Hague: Martinus Nijhoff.
  18. Husserl, E. ([1935/36] 1962). Die Krisis der Europäischen Wissenschaften und die Transzendentale Phänomenologie. Den Haag: Martinus Nijhoff.
  19. Ivanova, I. A., & Leydesdorff, L. (2014a). Rotational symmetry and the transformation of innovation systems in a Triple Helix of university–industry–government relations. Technological Forecasting and Social Change, 86, 143–156.
  20. Ivanova, I. A., & Leydesdorff, L. (2014b). A simulation model of the Triple Helix of university–industry–government relations and the decomposition of the redundancy. Scientometrics, 99(3), 927–948.
  21. Kline, S., & Rosenberg, N. (1986). An overview of innovation. In R. Landau & N. Rosenberg (Eds.), The positive sum strategy: Harnessing technology for economic growth (pp. 275–306). Washington, DC: National Academy Press.
  22. Kauffman, S. A. (2000). Investigations. Oxford, etc.: Oxford University Press.
  23. Krippendorff, K. (2009a). W. Ross Ashby’s information theory: A bit of history, some solutions to problems, and what we face today. International Journal of General Systems, 38(2), 189–212.
  24. Krippendorff, K. (2009b). Information of interactions in complex systems. International Journal of General Systems, 38(6), 669–680.
  25. Latour, B. (1988). The politics of explanation: An alternative. In S. Woolgar & M. Ashmore (Eds.), Knowledge and reflexivity: New frontiers in the sociology of knowledge (pp. 155–177). London: Sage.
  26. Leydesdorff, L. (1991). The static and dynamic analysis of network data using information theory. Social Networks, 13(4), 301–345.
  27. Leydesdorff, L. (1996). Is a general theory of communications emerging? Paper presented at Emergence—Complexité Hiérarchique—Organisation: Modèles de la boucle évolutive. Amiens: Actes du Symposium ECHO.
  28. Leydesdorff, L. (1997). The non-linear dynamics of sociological reflections. International Sociology, 12(1), 25–45.
  29. Leydesdorff, L. (2010). The knowledge-based economy and the Triple Helix model. Annual Review of Information Science and Technology, 44, 367–417.
  30. Leydesdorff, L. (2016). Information, meaning, and intellectual organization in networks of inter-human communication. In C. R. Sugimoto (Ed.), Theories of informetrics and scholarly communication: A festschrift in honor of Blaise Cronin (pp. 280–303). Berlin/Boston, MA: De Gruyter.
  31. Leydesdorff, L., Johnson, M., & Ivanova, I. (2018). Toward a calculus of redundancy: Signification, codification, and anticipation in cultural evolution. Journal of the Association for Information Science and Technology, 69(10), 1181–1192.
  32. Leydesdorff, L., Petersen, A., & Ivanova, I. (2017). The self-organization of meaning and the reflexive communication of information. Social Science Information, 56(1), 4–27.
  33. Luhmann, N. (1984). Soziale Systeme: Grundriß einer allgemeinen Theorie. Frankfurt a.M.: Suhrkamp.
  34. Luhmann, N. (1986a). Love as passion: The codification of intimacy. Stanford, CA: Stanford University Press.
  35. Luhmann, N. (1986b). The autopoiesis of social systems. In F. Geyer & J. v. d. Zouwen (Eds.), Sociocybernetic paradoxes (pp. 172–192). London: Sage.
  36. Luhmann, N. (2013). Theory of society, Vol. 2. Stanford, CA: Stanford University Press.
  37. Luhmann, N. (1990). Meaning as sociology’s basic concept. In N. Luhmann (Ed.), Essays on self-reference (pp. 21–79). New York/Oxford: Columbia University Press.
  38. Maturana, H. R. (1978). Biology of language: The epistemology of reality. In G. A. Miller & E. Lenneberg (Eds.), Psychology and biology of language and thought: Essays in honor of Eric Lenneberg (pp. 27–63). New York: Academic Press.
  39. Maturana, H. R. (2000). The nature of the laws of nature. Systems Research and Behavioral Science, 17(5), 459–468.
  40. MacKay, D. M. (1969). Information, mechanism and meaning. Cambridge/London: MIT Press.
  41. Nietzsche, F. ([1878] 1967). Menschliches, Allzumenschliches: Ein Buch für freie Geister. Berlin: Walter de Gruyter & Co.
  42. Popper, K. R. (1972). Objective knowledge: An evolutionary approach. Oxford: Oxford University Press.
  43. Scott, B. (2004). Second-order cybernetics: An historical introduction. Kybernetes, 33(9–10), 1365–1378.
  44. Sherman, J. (2017). Neither ghost nor machine: The emergence and nature of selves. New York: Columbia University Press.
  45. Shannon, C. E. (1945). A mathematical theory of cryptography (Memorandum MM-45-110-02). Bell Laboratories.
  46. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27 (July and October), 379–423 and 623–656.
  47. Simon, H. A. (1973). The organization of complex systems. In H. H. Pattee (Ed.), Hierarchy theory: The challenge of complex systems (pp. 1–27). New York: George Braziller Inc.
  48. Sun, Y., & Negishi, M. (2010). Measuring the relationships among university, industry and other sectors in Japan’s national innovation system: A comparison of new approaches with mutual information indicators. Scientometrics, 82(3), 677–685.
  49. Theil, H. (1972). Statistical decomposition analysis. Amsterdam/London: North-Holland.
  50. Ulanowicz, R. E. (2009). The dual nature of ecosystem dynamics. Ecological Modelling, 220(16), 1886–1892.
  51. Varela, F. J. (1979). Principles of biological autonomy. Amsterdam: North Holland.
  52. Weaver, W. (1949). Some recent contributions to the mathematical theory of communication. In C. E. Shannon & W. Weaver (Eds.), The mathematical theory of communication (pp. 93–117). Urbana: University of Illinois Press.
  53. Weber, M. (1904). Die Objektivität sozialwissenschaftlicher und sozialpolitischer Erkenntnis. In Gesammelte Aufsätze zur Wissenschaftslehre (pp. 146–214). Tübingen: Mohr (3rd ed., 1968).
  54. Yeung, R. W. (2008). Information theory and network coding. New York, NY: Springer.

Copyright information

© The Author(s) 2021

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Amsterdam School of Communication Research (ASCoR), University of Amsterdam, Amsterdam, The Netherlands
