1 Introduction

The signum of a living system is that, in contrast to the 2nd law of thermodynamics, which rules all material systems, it is able to exist as a temporary decrease of entropy.Footnote 1 Living systems are episodes that lend the 2nd law its probabilistic character: they are born and they die, which coincides with the usefulness of concepts like ‘consciousness’ and ‘time’ as they are experienced by living entities. As modern biology has discovered, the basic process that randomly emerged on earth as a living system some 4 billion years ago seems to have been a disequilibrium process able to reproduce itself in its local environment.Footnote 2 Reproduction is the emergence of a copy, of circumstances that allow for a mirroring of the original disequilibrium process. The spreading of copies, all of them in nearby places and usually imperfect mutations, is the precondition for evolutionary selection mechanisms to set in. The step from this first mirror mechanism, which provides a living mechanism, to the next one, the grand evolutionary jump forward to the human species, can be imagined as a second mirror mechanism, which projects – better: reflects – biological evolution into the brains of the individual members of the human species.Footnote 3 Again, the reflected copies are imperfect; a plethora of filters sorts out what has been evolutionarily learned as essential from the chaos of perceived impressions. Such a second mirror, such a set of filters, can only emerge if it is able to survive substantially longer than the lifespan of individual members of the species. In other words, the human species has been bound to use a shared language to become an enduring social entity. It is the exchange of perception filters, in the communicative capacity of tribes of human individuals, which enables and constitutes individual consciousness.Footnote 4 The existence of this second mirror is built on the primacy of the group.Footnote 5 Note that, like the first mirror, this second mirror also provides imperfect copies in individual brains. In this context, the variety of diverse copies is regulated by the syntax and semantics of the shared language. The first part of this paper develops these ideas in more detail. Then—having convinced the reader that all social science necessarily has to put the diversity of internal modelling of individuals and their communicative exchange at its centre, i.e. has to be ‘complex’—some proposals are provided on how to proceed (e.g. by agent-based modelling of essentials) and which problems will have to be overcome (e.g. algorithmic language evolution, the use of complex numbers and octonions).

2 The transdisciplinary character of complexity issues

One of the rare topics in classical theoretical physics that explicitly addresses the irreversibility of time is the 2nd law of thermodynamics. Since it states the increase of entropy as a prevailing long-run tendency with countervailing short-run episodes of decreasing entropy, it evidently needs a theory of probability. Thus Boltzmann’s use of probability theory (see Boltzmann, 1886) can be viewed as the last frontier that Newtonian physics could reach; the point where reversible laws governing non-living physical systems started to point at their own contradiction: basic indeterminacy.
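To fix ideas, Boltzmann’s relation can be recalled here (a standard formula, not part of the cited text): the entropy of a macrostate grows with the number of microstates that realize it, so the 2nd law holds only as an overwhelming statistical tendency, not as a deterministic rule.

```latex
% Boltzmann's relation: the entropy S of a macrostate is proportional to the
% logarithm of the number W of microstates realizing it (k_B: Boltzmann constant).
S = k_B \ln W
```

Macrostates with larger W are overwhelmingly more probable, which is precisely what leaves room for short-lived, improbable episodes of decreasing entropy.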

One of the first scientists to realize how deep the break in the development of theoretical physics really was, was Erwin Schrödinger. Being one of the central researchers in quantum theory, which brought Planck’s and Einstein’s breakthroughs to fruition, he felt that the next step was to address a broader scientific audience and to display the transdisciplinary role that the quantum revolution plays. In a series of lectures he gave in the forties, he addressed the question ‘What is Life?’ (Schrödinger, 1944). This clearly crossed the border between physics and biology, and since the humanities after Darwin were also just the latest stage of biology, Schrödinger’s question could hardly have been more transdisciplinary.

What he wrote had a massive impact on biology,Footnote 6 but was almost completely ignored by the social sciences. In principle, his perspective was that a different type of scientific formalism would be needed to describe episodes of decreasing entropy, i.e. living systems. He writes:

‘Life seems to be orderly and lawful behaviour of matter, not based exclusively on its tendency to go over from order to disorder, but based partly on existing order that is kept up.

It appears that there are two different “mechanisms” by which orderly events can be produced: the “statistical mechanism” which produces order from disorder and the new one, producing order from order.’ (Schrödinger, 1944).

What he calls ‘statistical mechanism’ is the process by which an enormously large set of very small connected systems governed by quantum mechanics exhibits a macro-behaviour conveniently close to Newtonian mechanics.

‘To the unprejudiced mind the second principle appears to be much simpler, much more plausible. No doubt it is. That is why physicists were so proud to have fallen in with the other one, the “order-from-disorder” principle, which is actually followed in Nature and which alone conveys an understanding of the great line of natural events, in the first place of their irreversibility. But we cannot expect that the “laws of physics” derived from it suffice straightaway to explain the behaviour of living matter, whose most striking features are visibly based to a large extent on the “order-from-order” principle. You would not expect two entirely different mechanisms to bring about the same type of law - you would not expect your latch-key to open your neighbour’s door as well. ... We must be prepared to find a new type of physical law prevailing in it. Or are we to term it a non-physical, not to say a super-physical, law?’ (Schrödinger, 1944).

And when Schrödinger considers the emergence of heredity—of memory—in the cell, a property not to be found in non-living matter, he states:

‘In the light of present knowledge, the mechanism of heredity is closely related to, nay, founded on, the very basis of quantum theory.’ (Schrödinger, 1944, Chapter 4).

Thus, the order-from-order principle can be identified:

‘… the new principle that is involved is a genuinely physical one: it is, in my opinion, nothing else than the principle of quantum theory over again.’

Remember that this now—in a new context—is the principle necessary to describe living systems! Schrödinger made these remarks 75 years ago, and a lot of scientific knowledge has been produced since then. But first, take a look at a graphical summary of his perspective.

In Diagram 1 the difference between non-living and living systems is shown as a function of the progress of the entropy law (2nd law of thermodynamics). For non-living systems, entropyFootnote 7 increases steadily but, because of what Schrödinger calls the ‘statistical principle’, a kind of order is established which allows certain configurations of smallest entities to achieve relative stability, e.g. molecules. Two basic elements of this argument are particularly important: (i) The world is built by an ensemble of discrete smallest units, which can be described in two different forms: as a set of particles or as a field of waves.Footnote 8 (ii) The number of material entities plays a decisive role; only with an enormous number of interacting particles do Newton’s laws become valid. This is exactly what he names the ‘statistical principle’, which is at work to produce order. Note also that it was the introduction of a certain kind of probability theory (Boltzmann’s contribution) which allowed this link from basic randomness to lawful behaviour to be established.Footnote 9 For living systems, he assumes that a second type of order production starts to play an essential role—order produced by order. Of course, the first principle has not vanished, but it is now supplemented by the capacity of molecules to produce copies of themselves. For Schrödinger, genes are just this type of molecule. They thus are not just relatively stable configurations, they are programs; programs which can produce programs that are mostly copies of themselves—plus a few random mutations. The latter property is then the starting point for Darwinian selection processes (Diagrams 2, 3 and 4).

Diagram 1: Emergence of order

Diagram 2: Overlapping copies

Diagram 3: The evolution of species

Diagram 4: Double mirrors
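Schrödinger’s ‘statistical principle’ can be illustrated computationally; the following is a minimal sketch with purely illustrative numbers: each particle moves erratically, yet the macro-average of the ensemble stabilizes as the number of particles grows, with fluctuations shrinking roughly like 1/sqrt(N).

```python
import random

# Sketch of the 'statistical principle': individual micro-motions are random,
# but the macro-average of a large ensemble is stable; fluctuations shrink
# roughly like 1/sqrt(N) as the number of particles N grows.
def mean_velocity(n_particles: int) -> float:
    # each particle gets an independent random velocity in [-1, 1]
    return sum(random.uniform(-1.0, 1.0) for _ in range(n_particles)) / n_particles

random.seed(42)
for n in (10, 1_000, 100_000):
    drift = sum(abs(mean_velocity(n)) for _ in range(20)) / 20
    print(f"N = {n:>7}: typical |macro drift| ~ {drift:.5f}")
```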

Biological entities produce copies,Footnote 10 i.e. offspring, usually during their lifetime. The amount of offspring evidently depends on the conditions of the environment: the worse these conditions, the more offspring are needed to have surviving children, to keep the population alive. Since the conditions of the environment change over time, mutations are a safeguard against a too uniform set of properties of the members of a population. This is the background of the necessity of diversity within a population. In other words, stronger mutations will be favourable in faster changing environments—and vice versa.Footnote 11
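This last claim can be made tangible with a toy simulation (all parameters hypothetical): a population of trait values tracks a drifting environmental optimum, selection keeps the better half, and mutation noise of varying strength is applied. Stronger mutation should pay off when the environment drifts faster.

```python
import random

# Toy model: which mutation strength best tracks a drifting optimum?
def tracking_error(mutation_sd: float, drift_per_gen: float,
                   pop_size: int = 200, generations: int = 300) -> float:
    random.seed(1)
    optimum = 0.0
    population = [0.0] * pop_size
    errors = []
    for _ in range(generations):
        optimum += drift_per_gen                          # environment changes
        population.sort(key=lambda t: abs(t - optimum))
        survivors = population[: pop_size // 2]           # selection
        population = [t + random.gauss(0.0, mutation_sd)  # imperfect copies
                      for t in survivors for _ in (0, 1)]
        errors.append(abs(sum(population) / pop_size - optimum))
    return sum(errors) / len(errors)

for drift in (0.01, 0.2):                                 # slow vs fast environment
    best = min((0.01, 0.05, 0.2, 0.5),
               key=lambda sd: tracking_error(sd, drift))
    print(f"drift {drift}: best mutation strength ~ {best}")
```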

The existence of different biological populations leads to the fact that for each single population, all the other ones are part of its environment. Again, the concept of relative stability, of a configuration of populations which is sustainable for a longer period of time, can be applied. Since such configurations empirically typically occurred only in certain geographical areas, at a certain topos, biologists called them biotopes. Darwin’s pivotal idea was to introduce progress, a sequence of species configurations in the history of observed species. The build-up of life on a planetary level clearly runs counter to the still dominating long-run increase of overall entropy. Schrödinger added another twist to this general idea by realizing that the relative stability of a configuration also needs an upper border, a less progressive threat that hinders it from sliding immediately into the next progressive stage. The discreteness of stages that emerges in this way—remember that the discreteness of smallest entities was a methodological revolution—therefore involves ‘revolutions’,Footnote 12 which overcome a short period of higher entropy by selecting a new, even more ordered configuration out of the rather finite set of possible new configurations.

Schrödinger answers his grand question ‘What is the characteristic feature of life?’ by writing:

‘It feeds upon negative entropy, attracting, as it were, a stream of negative entropy upon itself, to compensate the entropy increase it produces by living and thus to maintain itself on a stationary and fairly low entropy level.’

Our planet is open to negentropy import mainly from the sun and from its own motion; the rest is, in Marx’s terminology (Marx, 1867, chapter 26), exploitation of nature by man and of man by man. The environmental crisis in this view is just part of the more general problem of finding a new configuration that allows for an overall lower, more progressive, entropy level. As Schrödinger suggested, the transition to this level needs a revolution—a phase transition through chaos towards higher entropy—during which (eventually) such a better configuration is singled out; or life dies and the long-run entropy law proceeds.

Our solar system is an open system, but the scale of its closure in time and space is so large that for the problems of the human species it can safely be considered a closed system. In that sense, environmental concerns and problems of global political economy indeed converge. While a decrease of entropy in an open system does not violate the second law of thermodynamics, i.e. life can exist locally in space-time, the compensation that the openness of a smaller open system within a larger (approximately) closed system has to guarantee forces living systems to continuously invest in more ‘order from order’, to perpetuate progress.

Scientific efforts thus have to concentrate on finding and implementing such a new social configuration. And it is most important that these efforts be transdisciplinary. Schrödinger was a supreme mathematician; his mathematical skills also made him a star in theoretical physics. In retrospect, biology considers him to be the godfather of microbiology as well. A similar judgement could be made about John von Neumann, who started with the study of chemistry and then moved via mathematics and theoretical physics to game theory, informatics, and the modelling of economic growth theory. The explosion of scientific knowledge during the interwar period certainly is related to the quantum jump in modelling skills and the reaching out of exceptional researchers to neighbouring scientific fields. It is thus not only ‘multidisciplinarity’ which counts; what is needed is the transfer of highly developed scientific knowledge from one discipline into a second one, a process that can only be handled by outstanding individuals or smaller groups of scientists.

In the meantime, the boost in biology has proven Schrödinger’s suggestions to be essentially correct and has provided lots of detail. There now exist rather convincing theories on how life emerged on earth.Footnote 13 In the course of all these discoveries, the borderlines between physics, chemistry, biology, and the algorithmic tools they all use became more and more blurred. It therefore seems promising to take a closer look at this overarching modelling. It will reveal why and how the concept of complexity is important.

Let me start with a handful of strong proposals.

A model is a copy of some essential features of a process that took place; it therefore is a special type of mutation of the original process. Models are made because social entities assume that the dynamics observed in the past and captured as being essential will prevail in the future and—once known—will enable the social entity to improve its welfare. But what exactly is a copy? This question leads back to the emergence of life.

A copy of an algorithm, e.g. a gene, is a reproduction of this algorithm in a different place. One could imagine this process as a two-step mirror process: In a first step, an appropriate environment gets imprinted by the original algorithm and thus receives a negative mirror image. In a second step, this mould lets matter flow into its shape and with this second mirror process creates a copy of the original in another place. Since in a discrete world any algorithm can be described by a finite stream of bits, it is instructive to display this process as follows.

As is well known to every programmer, every algorithm can also be considered as a pattern of bits; therefore, this copying procedure might as well be interpreted as the usual copying of a pattern. And since some copying errors can never be avoided, the copy at the new place will always be a mutation. Plants often produce copies in nearby places, so mutations can be weak since environmental conditions stay almost the same. Animals often have a wider geographical range of activity; they can adjust their environment to their needs by seasonal movements. These movements therefore become part of their algorithm.
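A minimal sketch of this two-step mirror process on bit patterns (the function names are hypothetical): the first mirroring imprints a negative image on the environment, the second casts it back into a positive copy, and independent errors at either step leave the copy a mutation of the original.

```python
import random

# Two-step mirror process on a bit pattern: imprint a negative image,
# then cast it back; each step may fail to flip a bit (a copying error),
# so the resulting copy is almost always a mutation.
def imprint(pattern: list[int], error_rate: float) -> list[int]:
    # step 1: the environment receives the negative mirror image
    return [bit ^ 1 if random.random() > error_rate else bit for bit in pattern]

def cast(mould: list[int], error_rate: float) -> list[int]:
    # step 2: matter flows into the mould, mirroring it back into a copy
    return [bit ^ 1 if random.random() > error_rate else bit for bit in mould]

random.seed(7)
original = [random.randint(0, 1) for _ in range(32)]
copy = cast(imprint(original, 0.02), 0.02)
mutations = sum(a != b for a, b in zip(original, copy))
print(f"copy differs from original in {mutations} of {len(original)} bits")
```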

The human species, in contrast to the animal kingdom, is characterized by a second mirror process. It takes place in the human brain. In this second process, the large amount of human cells held together in the body of a human individual perceives itself as being different from its environment. This perception is done by special devices—somatosensation, audition, vision, olfaction, and gustation—rendering the raw material which then enters a filtering process that delivers the essential inputs to the second mirroring process. But for humans, the original to be copied in the brain is not just the inherited biological algorithm enriched with some essential sensory inputs. From their birth onwards, humans are trained to behave as members of a family, of a tribe, and of a larger social entity.Footnote 14 In a biological perspective, this primacy of the group is not a new feature of the human species; it is shared with higher animals. What is special is amplified consciousness, seeing oneself as part of the concerted action. Reproduction, i.e. feeding, sex, finding shelter in winter, and so on, is not only experienced as a physical biological system but also experienced in the brain, which collects the outside impressions and mixes them with the algorithms already stored in its memory.Footnote 15

It is only straightforward to consider the emergence of a shared language of human individuals as just another possibility to prolong memory. Again, first hints in that direction can be found in populations of other higher mammals. But to speak as humans do, one needs the existence of a very conscious ‘I am’. Soon, oral tradition was improved by written records; the transmission medium of air was substituted by stone and other enduring material. The surprise is that the second mirror process, the one that in the first place constituted the human ego, had immediately produced a reproductive algorithm which resided outside the human individuals! By behaving according to traditional algorithms, it was the group, the tribe, that constituted the self-consciousness of the individual member.Footnote 16

Traditions, or in Marx’s terminology production relations, are properties of a certain social entity, of a historically observed society. They break, they are revolutionized, when larger parts of society—usually organized as classes—realize that the traditional interpretation of what they perceive is in too great a discrepancy with what this interpretation had announced. A pivotal role then is also played by a possibly competing, more promising vision, one promising the fruits of a victorious class struggle. Thus, the way to a better world always takes place first and predominantly on the battlefield of ideologies.

If these strong propositions hold, the question arises: Can these ideological class struggles be modelled? The answer is ‘yes’, but Schrödinger’s suggestion of a new type of modelling (partly echoed in von Neumann’s attempts in game theory (Neumann and Morgenstern, 1944)) has to be followed—because social relations in human societies are ‘complex’.Footnote 17

To begin at the end: today, the global production system is extremely interwoven and interdependent. There is massive, necessary ignorance of this fact on the side of the overwhelming majority of the human population. In other words, we are living in an age of alienation. Ignorance breeds belief. What modelling in political economy would urgently need is an algorithmic model of belief dynamics re-introducing (and re-framing) the concept of classes. Moreover, belief formation of the masses currently is subject to a technology shock: the smartphone allowing access to social media has changed the rules of the game. Mainstream economic theory in this respect is already completely irrelevant, since it stubbornly refuses to include a more sophisticated communication model of agents at different institutional levels. But transdisciplinary research has enabled a rather well-developed branch of voting theory, sometimes intermingled with institutional economics, which revives the older research on optimal governance forms.Footnote 18 Connecting these approaches with today’s technological possibilities, which in turn would have to be supported by broad studies conducted in the yet underdeveloped area of information science, would hopefully contribute to the vision of a new ‘configuration’ of society.

The underlying emerging model will have to take the backbone of so-called global value chains seriously. Global division of labour should be adjusted to the needs of global populations, sure, but it cannot and should not be reversed. Using extended input-output techniques can be a starting point. Linking national I-O tables already is a complicated task, but this is just a starter for the emerging complexity (compare Hanappi, 2018b). When it comes to the drivers of economic forces, the specification of social agents, the task gets a lot more demanding. Agents at all levels communicate, i.e. they produce images of algorithms and they digest images of algorithms; they blow into the air what they want others to believe and they listen to the voices others are emitting. John von Neumann’s attempt to produce a new language for the social sciences, game theory, tried to tame this field of research with the help of analytical mathematics. Unfortunately, the generation of mathematicians that followed him tamed his game theory and developed it into a rather unexciting special field of analytical mathematics. But Neumann’s original approach is still inspiring and can be revived by researchers coming from computer science and from simulation theories, e.g. Hanappi et al. (2001).
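As a reminder of what extended input-output techniques start from, here is the elementary Leontief step with purely illustrative numbers; linking national tables multiplies the dimensions, not the principle.

```python
import numpy as np

# Minimal Leontief input-output sketch (illustrative numbers): A[i, j] is the
# amount of sector i's output needed per unit of sector j's output; gross
# output x must satisfy x = A x + d for a given final demand d.
A = np.array([[0.1, 0.3],     # hypothetical 2-sector technology matrix
              [0.2, 0.1]])
d = np.array([10.0, 5.0])     # hypothetical final demand
x = np.linalg.solve(np.eye(2) - A, d)   # x = (I - A)^(-1) d
print("gross outputs:", x)
```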

There is no doubt that future models will be complicated, but is complexity more than that? In my view, a sharp distinction between complicated and complex models is useful. Consider the following definition: A model is complex if:

1. It includes the model-building process of at least two agents, each knowing that the other agent is a model-builder (i.e. the game theory approach).

2. Agents are connected by the actions they take, communication between them possibly being such an action.

This attempt to define complexity evidently is built on von Neumann’s suggestion of game-theoretic modelling. Note that models can be complicated, like some applied macroeconomic (accounting) models which consist of more than 1000 equations, but still not complex, because they include no explicit internal model-building process of interdependent agents—the assumption of rational expectationsFootnote 19 was just a helpless excuse for ignoring communication. On the other hand, a rather small and simple-looking game—not complicated—can be complex, e.g. a 2-person game in its algorithmic form.Footnote 20 Therefore, being complicated and being complex are well-distinguished, independent properties.
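A minimal agent-based sketch of conditions (1) and (2), with all names and parameters hypothetical: each agent keeps an internal model of the other (here reduced to a single estimated probability of cooperation), updates it from observed actions, and acts on it.

```python
import random

# Minimal 'complex' model in the above sense: two agents, each holding an
# internal model of the other, updating it from observed actions (condition 1),
# and connected by the actions they take (condition 2).
class Agent:
    def __init__(self, name: str):
        self.name = name
        self.belief_other_cooperates = 0.5   # internal model of the other agent

    def act(self) -> bool:
        # cooperate the more likely the other is believed to cooperate
        return random.random() < self.belief_other_cooperates

    def observe(self, other_cooperated: bool, learning_rate: float = 0.2):
        # update the internal model from the observed action
        target = 1.0 if other_cooperated else 0.0
        self.belief_other_cooperates += learning_rate * (
            target - self.belief_other_cooperates)

random.seed(3)
a, b = Agent("A"), Agent("B")
for step in range(50):
    act_a, act_b = a.act(), b.act()
    a.observe(act_b)
    b.observe(act_a)
print(a.belief_other_cooperates, b.belief_other_cooperates)
```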

There also is an interesting connection between the use of equilibrium conditions and the explicit use of complex (in the above sense) relations. In mainstream economic theory, equilibrium conditions, e.g. the assumption that supply equals demand, often are used to circumvent complexity issues, e.g. how sellers and buyers use their internal model building in a bargaining process. Such a short-cut by assumption excludes all kinds of disequilibrium dynamics that in reality usually are stored in the memories of agents.Footnote 21 Justification of an equilibrium assumption usually refers to the ‘infinite’ speed with which the invisible disequilibrium dynamics will lead to equilibrium. This infinitely high speed, of course, then has to be interpreted as relative speed compared to the (slow) changes expressed by other dynamic equations of a model. A similarly obscure role often is played by the opposite extreme assumption concerning economic variables: zero speed of change, i.e. constancy, of certain (socio-psychological) variables, e.g. the ‘propensity to consume’ in Keynes’ work. In both cases, the most interesting part of the agents’ behaviour, namely what happens in disequilibrium, is lost.Footnote 22 As a consequence, complex models usually are disequilibrium models formulated as programs, which in fashionable terms today is called an agent-based model (see Hanappi, 2017). But this is not necessarily the case. It might well be that an equilibrium in the expectations of different agents is reached in communication over time and plays an important role for further dynamics. Nevertheless, the general thrust of the trajectories of variables in complex models looks very different from what is praised as the highest achievement in mainstream economics: a general equilibrium growth path.
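The difference the equilibrium short-cut makes can be seen in a toy tatonnement sketch (hypothetical linear schedules): with a finite adjustment speed the disequilibrium path is explicit and can be stored in agents’ memories, whereas the ‘infinite speed’ assumption would delete it.

```python
# Explicit disequilibrium dynamics that an equilibrium assumption hides:
# price adjusts with *finite* speed to excess demand, so the path towards
# equilibrium is itself part of the model.
def demand(price: float) -> float:
    return 100.0 - 2.0 * price        # hypothetical linear demand

def supply(price: float) -> float:
    return 10.0 + 1.0 * price         # hypothetical linear supply

price, adjustment_speed = 5.0, 0.05   # finite, not 'infinite', speed
for t in range(30):
    excess_demand = demand(price) - supply(price)
    price += adjustment_speed * excess_demand   # tatonnement-like step
print(f"price after 30 periods: {price:.2f} (equilibrium would be 30.00)")
```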

Instead of being afraid of diverging variables and widening disequilibria, complex algorithmic models should consider such processes as adequate representations of what actually happens in human societies. These properties of algorithmic systems were already discovered early in the history of computer science, when Alan Turing discussed the halting problem: there is no general way to predict whether a program will ever stop.Footnote 23 An analogous process happens in a 2-person game when my modelling of the other’s modelling, including a model of my modelling (and so on …), gets into an infinite regress. In all such complex models, divergence, exploding disequilibrium, unbearable time consumption, and the like are problems that indeed are archetypical for social entities. And social entities react to these observations mostly by being creative and innovative, by trying out something completely different. This disruptive practice always is risky; it might lead to an elimination of the agent, its agenda being dispersed to other agents or simply dropped. In the light of the earlier arguments, the build-up of order of a single entity within the evolution of global life always has a shorter time-span and contributes to the discovery of new survival mechanisms by taking innovative risk. It is remarkable that all such processes—the sequences of relatively stable oscillations, then the avalanche of disequilibrium leading to deterministic chaos and perception confusion, then innovation and selection of risky new configurations—can neatly be mimicked with agent-based complex models.
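The infinite regress of mutual modelling can be made tangible in a few lines (a sketch only, not a behavioural model): without an explicit depth limit the recursion below would never halt, which is the social analogue of Turing’s observation.

```python
# My model of you contains your model of me, and so on: an infinite regress.
# A finite agent must cut the recursion off at some level max_depth.
def model(who: str, of_whom: str, depth: int, max_depth: int) -> str:
    if depth >= max_depth:
        return f"{who} taking {of_whom}'s behaviour as given"
    return f"{who}'s model of " + model(of_whom, who, depth + 1, max_depth)

print(model("A", "B", 0, max_depth=4))
# -> A's model of B's model of A's model of B's model of
#    A taking B's behaviour as given
```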

Calculus, still the preferred formal tool of most economic theorists, has played a most important role for the understanding of non-living systems.Footnote 24 At the dawn of computer science, it certainly still was a useful inspiration for economic theory to express its proposals in a compact and stringent formal way. But since then, transdisciplinary research and some early pioneers have shown ways to formalize living systems, ways to overcome the limits that the nineteenth-century apparatus of marginalist tools inescapably constitutes.Footnote 25 It seems wise to follow Erwin Schrödinger in his personal opinion that the quantum revolution that pushed theoretical physics on a new track—leaving Newtonian physics as a special case appropriate as an approximation for many macroscopic relationships—has a lot of innovation power for the sciences of living systems in store.

3 Two essential difficulties to be mastered

It has turned out that the existing formal apparatus of economics supports only a distorted caricature of social dynamics, while the perspectives of developing complex political economy are indeed perplexing. We have only started to discover its theoretical potential. What is already visible is that its formalization, a language which provides the advantages of concise statements and compact formulation, will rely overwhelmingly on computer simulation languages.Footnote 26 If this is correct, then two future problems immediately appear on the scientific horizon.

3.1 The search for essentials

The first type of problem has been a central point of critique of mathematicians sceptical of the merits of computer simulations right from the start of informatics. It concerns the bendability of algorithmic formalizations. With a simulation approach, it is particularly easy to customize the formalized results to whatever prejudices the model-builder wants to support. You can have any result for any field of investigation if you just select the right set of variables and play around a bit with your algorithms. And it is true that the flexibility of this new methodological toolbox is surprising. For a hard-core analytical mathematician, who believes that truth in the end is encapsulated in provable mathematical truth,Footnote 27 such high flexibility, i.e. the feature of always having to rely on strong bonds to empirically observed phenomena, is the original sin. In defence of algorithmic approaches and their need for sound empirical rooting, and against the high aspirations of ‘pure mathematicians’—who make their abstinence from any empirical relationship a prime virtue—this alternative formalization tribe accused its opponents of merely moving around in tautological circles. Analytical mathematics in the end is just a language, and it cannot be expected that the grammar on which the users of a language have agreed will provide new knowledge.Footnote 28 But insisting on semantics still makes it particularly important to produce a scientifically sound semantic coupling between algorithms and reality—in particular when the links can so easily be modified. And exactly here the first of the two abovementioned problems comes into play. It can be called the search for essentials.

A model can never be an exact copy of its original. Even if it resembles it perfectly, it still has to exist in a different place. The way social entities use models suggests that this resemblance is of a particular form: the model should preserve the essential features of the original. In pre-scientific societies, these essential features were usually pre-determined by a belief in the rules of life produced by superior beings, i.e. by religion. Following the scientific revolutions that started in the seventeenth century, the search for essentials led to many surprises. One of the biggest surprises was quantum theory, a hundred years ago. On the other hand, the surge of algorithmic formalizations during the last 50 years, and the ease of formalization that came with it, has somewhat distracted researchers from the need to search for essentials. ‘Model whatever you need to model to get your research paper published’ became a slogan of the new generations of social scientists. Being scientific was reduced to the ability to correctly apply the toolbox of methods. This necessary condition often became a sufficient condition, leading whole schools of thought into an impasse.

The search for the essence is additionally handicapped by the fact that essentials are dynamically changing. Two examples: in the long run, the essential motivating concept of honour held by medieval knights has vanished today; in the short run, what is essential in the political elections of the last decade seems to vary with frightening speed. Thus, the need to anticipate what will be essential for social entities in the next 5 years—in the different parts of a globalized world—really is a very difficult task. Given the manipulative force of large media corporations clashing on one side with the forces set free by deteriorating living conditions in the global south, on another side with neo-fascist movements of ‘middle classes’ becoming impoverished in industrialized countries, and on a third side with the profit rate maximization of an already globally centralized financial capital accumulation centre (‘Wall Street’), it is indeed a mammoth task to single out a workable set of essential variables and relationships.Footnote 29 Unfortunately, the design of smaller modules of such a scientific project to a large extent hinges on such a master design, or at least on a rough sketch of it.

3.2 Towards a new formal language

As if this first problem of the search for essentials were not difficult enough, a second problem is waiting at the door. The way we formulate political economy dynamics, i.e. complexity modelling, has to acquire a quantum theoretic framework. Why? The reason is not that the success of quantum mechanics in non-living systems might be an example for a fashionable new application in social systems. The need for this scientific advance lies much deeper. It starts, as Schrödinger had anticipated, with the application of the complex analysis necessary to understand the most basic intricacies of non-living matter.

Quantum theory possesses an impressive set of consequences which seem to contradict what the consciousness we apply in everyday life would accept as true. Nevertheless, all these consequences spring from empirically performed experiments, which have subsequently confirmed the theory to be correct to an extremely high degree. Though these experiments focussed on the dynamics at the smallest achievable scales of space and time, there nevertheless is a priori no reason why they should not play an important role at the scales important for life on earth.Footnote 30 Actually, as a calculation device for non-living matter, they already affect our lives today severely.

The language in which we speak and think frames what we can think and what we can express. And as Paul Valéry once noted, ‘The universe is built on a plan the symmetry of which is somehow present in the inner structure of our intellect.’ Note that symmetry in the strict meaning it has in mathematics refers to a transformation that preserves the original shape of an object, i.e. to an exact copy of the original, distinguishable only by the passage of time that the copying procedure takes. Valéry’s statement therefore speculates that the basic features of the universe have produced their copies in our brains. If this is correct, then the transformation is the evolution of life, starting with the ‘inner structure’ of the smallest elements of the universe—as described by quantum mechanics—and leading, via endless mutations (copies with small modifications), to the biosphere. The special characteristic of the human species then consists of its ability to develop a finite (non-zero persistent) structural order with the help of a shared, externalized language of individual members. The emergence of subjective existence, the notions of consciousness and time, the availability of a trained personal memory—all these basic characteristics coincided and drew a rather sharp borderline to the animal kingdom. As the physicist Carlo Rovelli recently demonstrated (Rovelli, 2019), the notion of time, despite its overwhelming importance in our everyday lives, does not make much sense as a particularly important variable for non-living systems. Its special role firmly belongs to the domain of living systems. This has severe implications for the formalization of living systems. The discovery of causality built on time—an earlier event causes a later event—pervades all thoughts of social entities.Footnote 31 After Leibniz and Newton, time was encapsulated in a formalization called calculus. The essence of calculus is stupefyingly simple: taking infinitely decreasing small steps towards a finite limit.Footnote 32 Its success rests on being able to express a contradiction that appears before our eyes in reality. An object appears at a certain local position, a point, at a certain moment, and in this moment also has a property that is the opposite of being at a point, namely speed (or its geometric analogue, the slope of a position that is a function of time). Looking only at the formalism, at calculus, everything seems to be straightforward; but looking at a photograph that picks out only a moment in the movement of an object, it seems very strange that the object in this moment, in this definite location, also possesses its opposite, speed. Studying this relationship at the level of atoms and their ingredients showed that this duality of opposing descriptions is not just something that occurs due to the ‘eye of the observer’. It is not just grafted on an unambiguous reality by our own way of looking at it. Contradictions are a basic property of matter, and as the preliminary final outcome of the evolution of matter on earth, our knowledge acquisition process is an isomorphism of contradictory procedures—philosophically interpreted, this is Hegel’s heritage. With ever more sophisticated language developments, formalization attempts between intuition and stern grammatical deduction, human art and science try to grasp what seems to be contradictory. In a sense, this is what makes a model complex.
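For reference, the contradiction calculus domesticates is visible in the standard definition of instantaneous speed: a property of a single moment, defined only through a limit over ever smaller finite intervals.

```latex
% Instantaneous speed as the limit of ever smaller finite steps:
v(t) = \frac{dx}{dt} = \lim_{\Delta t \to 0} \frac{x(t + \Delta t) - x(t)}{\Delta t}
```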

Of course, static models can be large and they can be very complicated to handle. But difficulties of calculation should never be mistaken for a sign of complexity. When Heisenberg in 1926 considered the empirical evidence at the atomic level, he was already an outstanding mathematician. The force of calculation was with him, but what he needed was something different, namely a formal operation that was not commutative. This was how complex numbers connected by matrix multiplication entered his formalization of quantum mechanics (Heisenberg, 1926, p. 687).Footnote 33 Empirically observed reality had induced the use of more complicated language elements. What Heisenberg notes as important for quantum theory is the fact that he chooses a complex self-adjoint matrix to represent an observable quantity of a quantum system composed of one or more particles, and that the linking matrix operator is non-commutative. Without this property of the operation, quantum theory would again collapse into Newtonian mechanics, he suggests. Note that it needs both, a special type of entity (the matrix) and a special type of operation (matrix multiplication), to arrive at a non-commutative description. Furthermore, the elements of Heisenberg’s matrices are complex numbers. Complex numbers had started to play an important role for the revolution in theoretical physics already in 1912; Gauss had explored them with a completely different focus in 1799.Footnote 34 And in 1912, Einstein had discovered that Gauss’s geometric interpretation of complex numbers could serve him well in the formulation of his special relativity theory.Footnote 35 Complex numbers, losing the property of being scaled and ordered in one dimension, became necessary to embed the duality—better: contradiction—of particle and wave properties of electrons in a mathematical formalism. Since this is one and the same formalism, several contemporary theoretical physicists would hold that the semantically richer talk of ‘particle versus wave’ today is not too useful anymore. Another major ingredient of the new theory was to enrich it with the help of Ludwig Boltzmann’s probability theory, a device which had enabled him to deal with large numbers of events, shifting observed behaviour of particles to probabilities of average behaviour following certain assumed probability distributions. Together, the contradictions implicit in complex analysis and probability theory enabled Schrödinger’s famous wave equation.
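The formal point can be checked in a few lines, using the standard Pauli matrices as examples of complex self-adjoint matrices (an illustrative sketch, not Heisenberg’s own calculation): the order of matrix multiplication matters.

```python
import numpy as np

# With matrices as entities and matrix multiplication as the operation,
# order matters: two complex, self-adjoint (Pauli) matrices do not commute.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

print(sigma_x @ sigma_y)                        # [[ 1j, 0], [0, -1j]]
print(sigma_y @ sigma_x)                        # [[-1j, 0], [0,  1j]]
print(sigma_x @ sigma_y - sigma_y @ sigma_x)    # non-zero commutator
```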

Once this step was taken,Footnote 36 its consequences surprised the scientific community. In particular, everyday experience stores only what happens after an enormous amount of quantum mechanics usually has levelled out, leaving our senses with a world of Newtonian physics. For laymen, quantum mechanics remains counterintuitive. But even leading scientists in theoretical physics have in the last decades tried to improve on the somewhat agnostic Copenhagen interpretation provided by Niels Bohr almost a century ago.Footnote 37 The biologist Stuart Kauffman takes an even more daring viewpoint (Kauffman, 2019). He aims at the radically different consequences which a closer look at the combinatorial possibilities at the tiniest scale of matter opens up. A Boltzmann type of probability theory for him is still a straitjacket for a formalization of the evolution of the biosphere. He therefore introduces the notion of so-called possibles, which in each moment of time exist and consist of a set of paths into the future that, due to their enormous number, cannot be prestated. For him, it is the return of the ‘free will’, of the interference of a subject which is ‘doing’ actions that transform the set of ‘possibles’ into the set of ‘actuals’. Only the latter follow Aristotle’s rule of the excluded middle, e.g. that of two incompatible possible outcomes of a measurement, only one really exists. In the realm of the nevertheless really existing ‘possibles’, their mutual incompatibility does not hinder their parallel existence. Like several other authors, Kauffman is not satisfied with the Copenhagen interpretation and tries to link his alternative view with features of conscious living systems, which he—as a biologist—finds in the evolution of the biosphere. In that respect, his emphasis on ‘doing measurements’ is a special case of what is proposed here, namely to equip human systems with the capacity to copy the primary copying process of all living systems a second time, in their internal mirrors, by sharing language systems.

It thus is this second mirroring process—thinking, speaking, internal model building—which breeds complexity. The formalization tools needed to evolve this capacity follow our abilities to perceive. The diversification of perception capacities in human societies therefore implies that what appears to be ‘more’ complex for one social entity can be ‘less’ (or differently) complex for another one.Footnote 38 It evidently is important to take a closer look at the interaction between language and the non-language perceptions it deals with. When scientists became able to investigate sub-atomic processes, they enlarged the usual mathematical apparatus based on real numbers and started an extensive use of complex numbers,Footnote 39 which until then were just considered a playground for number theorists. Laboratory experiments thus activated deductively derived formal concepts. On the other hand, it needs a pre-existing scientific model to construct and to perform an experiment. This model determines what one looks for. Take a deep breath and consider what the scientific community of evolutionary political economy would need as a formal toolbox for its next step of complexification.Footnote 40

Following our ideas in Hanappi and Scholz-Wäckerle (2017), the kind of formalization needed would have to allow oscillations between time periods of longer relative stability (called ‘crystal growth’ stages there) and short revolutionary re-configuration periods. During the first type of stage, many dynamics are either already converging or quickly bump into an (exogenous, often institutionally secured) limit throwing them back into the neighbourhood of equilibrium growth. Nevertheless, during this stage some hidden variables slowly build up which are not taken care of by the protective belt of institutions maintained by ruling classes. Then, with a more or less sudden burst—John Casti’s ‘big scientific surprise’ (Casti, 2015) or László Barabási’s ‘bursts’ (Barabási, 2010) translated into evolutionary political economy—the ruling regime stumbles across the difficulties it had ignored for a long time. Dynamics suddenly have to include the previously missing variables in prominent positions, and most dynamics now are diverging. A rather radical re-configuration process sets in. As Kauffman explains in detail for molecular re-combinations in organic chemistry, the sheer number of his ‘possibles’ defeats any attempt at full enumeration. But the revolting social agents have only a very finite set of blueprints for a future setup of society; and they have no time to lose. This set clearly is a mix of historically grown visions enriched by contemporary technological possibilities and realized ecological limits.Footnote 41 In these turbulent times, a species makes a larger evolutionary jump, either upwards or downwards, possibly towards extinction (see Hanappi, 2020).

It is tempting to introduce complex numbers, in particular the notion of the circle of convergence in the Wessel plane, to describe the change from a more converging scenario to a diverging scenario (compare Penrose, 2016, pp. 448–458). Necessary small oscillations (Brownian motion) during the relatively stable period—think of equilibrium-destroying innovations—could also be elegantly introduced as complex wave dynamics.Footnote 42 An important side-effect of assigning complex numbers to the variables, again, is that a strict smaller-larger relation does not exist. This provides more room to adjust formal variables to what is happening outside language.Footnote 43
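A minimal sketch of this use of complex numbers (illustrative values only): one complex multiplier per period produces oscillation via its imaginary part, while its modulus, i.e. its position relative to the unit circle, decides between a converging and a diverging spiral.

```python
# Linear dynamic x_{t+1} = z * x_t with complex z: the trajectory spirals
# inward if |z| < 1 ('relative stability') and outward if |z| > 1 ('revolution').
def trajectory(z: complex, steps: int = 10) -> list[complex]:
    x = 1.0 + 0.0j
    path = [x]
    for _ in range(steps):
        x = z * x            # oscillation comes from the imaginary part of z
        path.append(x)
    return path

stable = trajectory(0.9 * complex(0.8, 0.6))     # |z| = 0.9 -> converging spiral
unstable = trajectory(1.1 * complex(0.8, 0.6))   # |z| = 1.1 -> diverging spiral
print(abs(stable[-1]), abs(unstable[-1]))        # ~0.35 vs ~2.59
```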

A somewhat more radical formal development concerns the use of computer programs. In the meantime, these devices are so common that their revolutionary aspects for scientific research are often overlooked. Operations involving complex numbers might as well (and in practice mostly are) be carried out by programs. But even without computer power, Heisenberg had noted in his pivotal paper that a formal property, namely the missing commutativity of matrix multiplication, is important for his argument. More complicated number systems that nevertheless are division algebras, like quaternions and octonions, are somewhat unhandy to manipulate analytically but also have interesting properties that might be analogues to perceptions in human societies. As the sequence of actions in real life most of the time plays a decisive role, i.e. actions are not commutative, it is obvious that quaternions or even octonions are a good element for the formal toolbox, starting right from non-living systems. Only due to the two reasons stated by Schrödinger was the earlier special case of Newtonian physics convincing enough to dominate science: (i) an enormous mass of copies of quantum mechanical programs allowed for the emergence of the simpler rules perceived by simpler social entities; (ii) order perception based on similarly ordered internal model building allowed for the emergence of ‘seeing’, the production of order within perceptions—this is the wide area of ‘order produced by order’.Footnote 44 With the use of octonions, the associative law falls. If in a sequence of three coupled actions one first puts together the first two and then performs the third, the result will differ from first putting together the second and third and then combining with the first.Footnote 45 Free association actually is a standard procedure in creative human brains as well as in more sophisticated strategic considerations of larger social entities. It can well be expected that octonions are elements of a formal toolbox that is tailored to our internal creative modelling habits (compare Wolchover, 2018, describing the work of Cohl Furey (Furey, 2016)). Programs, sequences of electronic actions performed in time to mimic sequences in real life, can be a main carrier of octonion dynamics. And again, the transdisciplinary character of this type of social research, of complex evolutionary political economy, is evident. And not to forget: the interaction between language and investigated phenomenon can certainly also help in the first problem area, the search for essentials.
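To make these formal claims concrete, the following sketch (a toy implementation, not a proposed modelling framework) builds quaternions, shows that their product is not commutative, doubles them into octonions via the Cayley-Dickson construction, and shows that associativity fails.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quat:
    # quaternion w + x*i + y*j + z*k
    w: float
    x: float
    y: float
    z: float

    def __mul__(self, o: "Quat") -> "Quat":
        # Hamilton's product: non-commutative
        return Quat(
            self.w * o.w - self.x * o.x - self.y * o.y - self.z * o.z,
            self.w * o.x + self.x * o.w + self.y * o.z - self.z * o.y,
            self.w * o.y - self.x * o.z + self.y * o.w + self.z * o.x,
            self.w * o.z + self.x * o.y - self.y * o.x + self.z * o.w,
        )

    def __sub__(self, o: "Quat") -> "Quat":
        return Quat(self.w - o.w, self.x - o.x, self.y - o.y, self.z - o.z)

    def conj(self) -> "Quat":
        return Quat(self.w, -self.x, -self.y, -self.z)

i, j = Quat(0, 1, 0, 0), Quat(0, 0, 1, 0)
print(i * j, j * i)              # k and -k: quaternions are not commutative

# Octonions as pairs of quaternions (Cayley-Dickson doubling):
# (a, b)(c, d) = (a*c - conj(d)*b, d*a + b*conj(c))
def omul(p, q):
    (a, b), (c, d) = p, q
    return (a * c - d.conj() * b, d * a + b * c.conj())

one, zero = Quat(1, 0, 0, 0), Quat(0, 0, 0, 0)
e1, e2, e4 = (i, zero), (j, zero), (zero, one)
print(omul(omul(e1, e2), e4))    # (e1*e2)*e4 -> (0, k)
print(omul(e1, omul(e2, e4)))    # e1*(e2*e4) -> (0, -k): associativity fails
```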

3.3 Afterthoughts

The development of the human species is deeply stunning. As a member of the general group of living systems, a member of the biosphere, it has grown into a new level of self-consciousness, a knowledge of itself as the dominating species. The main innovative idea of this paper is that it is this new level of living systems, achieved by the human species, which justifies calling our awareness of this fact complex—and not only complicated.

I usually avoid using the adjective ‘complex’ because everyday language as well as some semi-scientific jargon often misuses it, calling a relationship ‘complex’ as an excuse for one’s own inability to understand it. This is why I usually replace it by the more innocent word ‘complicated’. Nevertheless, the grain of truth in this use of the word is that ‘complex’ indeed is directly related to the social entity which uses it. And since social entities differ in their intellectual capacities, what looks complex to one might appear not complex to another. The view proposed here emphasizes the relation of complexity to a subject, a social entity, too. But here, the subject is the entire species, which via its scientific specialists investigates its own emergence. We are out not only to understand the world; we are out to understand our understanding of the world.Footnote 46 This is what the image of the ‘second mirror’ mentioned above is all about. It is the baseline of complexity from the standpoint of evolutionary theory, the perspective provided here.

Another example of a proposed meaning of complexity comes from the departments of mathematics.Footnote 47 In a nutshell, it relates a property of an object, e.g. a series of numbers, to the possibility of compressing it into a shorter statement, e.g. a simple rule to generate this series. The most complex objects then would be those that cannot be compressed. As mentioned earlier, this approach cleans the concept of complexity from all references to non-language elements and transforms it into a feature of the language’s grammar. It is interesting to see that as computer science became influential in mathematical logic, the question of compressing sets of numbers into generating rules turned into the task of pattern recognition, which in turn found its transdisciplinary partner in biologists studying patterns in natural phenomena.Footnote 48 Again, the platonic question pops up: Is nature following the pre-existing deep symmetries of mathematics, or is what we find in the double images of our formalizations just the trajectory of the Lévy flight we are developing? Though this paper opts for the second idea, in its scientific quest it nevertheless always needs its platonic counterpart for a creative dialogue.
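The compression view can be illustrated with an off-the-shelf compressor as a crude, computable proxy for the uncomputable Kolmogorov measure (a heuristic sketch only): a rule-generated sequence compresses well, a random one hardly at all.

```python
import random
import zlib

# Proxy for the compressibility notion of complexity: compare how well a
# rule-generated sequence and a random sequence compress.
regular = ("01" * 5_000).encode()                       # generated by a tiny rule
random.seed(0)
irregular = bytes(random.getrandbits(8) for _ in range(10_000))

for name, data in (("regular", regular), ("irregular", irregular)):
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name}: compressed to {ratio:.1%} of original size")
```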

Complexity in evolutionary political economy is species-based and stems from the foundational property of this species, i.e. the language it uses to communicate as well as to think in the brains of its individuals. This is not a definition, not a conclusion, but a preliminary afterthought. The intimidating amount of future transdisciplinary science at which it points might be an excuse for the somewhat ambiguous organization of this paper; the owl of Minerva has not yet found the place from where to start its evening flight. Some solace comes from an unexpected corner. Somewhere in his book ‘What is Life?’, the mathematician Erwin Schrödinger writes down a really perplexing sentence:

‘If a man never contradicts himself, the reason must be that he virtually never says anything at all.’