Science works! This is attested by the spectacular display of human technological prowess. Indeed, technological advancements, made possible by the scientific understanding of the universe, are becoming ever more disruptive and frequent. How is it then justifiable to speak of the crisis of science and even allude to the end of science? This chapter will explore between the poles of perceived knowledge and inescapable ignorance—between the illusion of certainty and limits of reason.

1 The Philosophy of Science

Even the simplest of questions can have the power to open Pandora’s box of existential dilemmas. All attempts to answer the innocent question “What can I know?” have been inconclusive at best. This question has a long history that has accompanied mankind during its efforts to scale the mountain of knowledge and has continued to eluded the conceptual grasp of our minds.

The journey begins with one of history’s first scientists (Grant 2002, p. 33):

No one in the history of civilization has shaped our understanding of science and natural philosophy more than the great Greek philosopher and scientist Aristotle (384–322 B.C.), who exerted a profound and pervasive influence for more than two thousand years [...].

As one of the first thinkers he introduce logic as a means of reasoning. He had a clear vision of what knowledge constitutes and he founded it on intuition (Ross 1963, Book VI, Chapter 6):

Scientific knowledge is judgment about things that are universal and necessary, and the conclusions of demonstration, and all scientific knowledge, follow from first principles [but] it is intuitive reason that grasps the first principles.

Nearly two thousand years later, not much had changed. But then, in 1620, the philosopher Francis Bacon presented modifications to Aristotle’s ideas (Bacon 2000). In essence, a new logic, a reductionist approach, the focus on inductive reasoning, and the aspiration that scientific knowledge should foster technology, introduced what has become known as the modern scientific method. Bacon paved the way for a new, contemporary understanding of scientific inquiry. Approximately at the same time, Robert Boyle, seen by some as one of the founder of modern chemistry, was instrumental in establishing experiments as the cornerstone of physical sciences, working with an air pump (Boyle 1682; Shapin and Schaffer 2011).

The folowing six sections are adapted and expanded from Glattfelder (2013).

1.1 Logical Empiricism

By the early 20th Century, the notion that science is based on experience (i.e., empiricism) and logic, where knowledge is intersubjectively testable, has had a long history. The philosophical school of logical empiricism (or logical positivism) tried to formalize these ideas. Notably, utilizing the tools of mathematical logic, which had matured extensively under the contributions of Betrand Russell.Footnote 1 The Vienna Circle , a group of philosophers, scientist, and mathematics meeting regularly from 1924 to 1936 at the University of Vienna, was a major social hub of the movement. Some notable proponents were Rudolf Carnap, Kurt Gödel, Otto Neurath, Karl Popper, Hilary Putnam, Willard Van Orman Quine, Hans Reichenbach, and Ludwig Wittgenstein (Creath 2013). See also Sect. 2.2.1.

In this paradigm, science is viewed as a building comprised of logical building blocks based on an empirical foundation. A theory is understood as having the following structure:

Observation \(\rightarrow \) Empirical concepts \(\rightarrow \) Formal notions \(\rightarrow \) Abstract laws

In essence, a sequence of ever higher abstractions. This notion of unveiling laws of nature by starting with individual observations is called inductive reasoning. Conversely, deductive reasoning starts with the abstract laws and seeks knowledge by finding a tangible factual description.

What started off as a well-founded and legitimate inquiry into the workings of nature soon faced serious difficulties and the opposition of influential scholars, some even from within the movement. As an example, Popper later claimed to have “killed” logical empiricism. Problems appeared on many fronts. For instance:

  1. 1.

    How can one construct pure formal concepts that solely reflect empirical facts without already anticipating a theoretical framework?

  2. 2.

    How does one link theoretical concepts (like electrons, inflational cosmology, Higgs bosons, utility functions in economics, ...) to experiential notions?

  3. 3.

    How can one distinguish science from pseudo-science?

Somewhat technical, these challenges highlight that the logical empiricists where engaging with the notion of knowledge at a very subtle level, invoking the proverb “the devil is in the details.” However, some glaring problems surfaced as well. One central issue concerns the legitimacy of inductive logic: can inductive reasoning lead to new knowledge? Not really, as deriving a generalization from multiple observations or repeated experiences is unjustified:

  1. 1.

    black swan: no matter how often I observe white swans, I cannot exclude the existence of a non-white one;

  2. 2.

    the future resembles the past: to assume that a sequence of events in the future will occur like it always has in the past, requires the complete knowledge of how the future evolves from the present according to laws of natureFootnote 2;

  3. 3.

    from the particular to the general: for instance, declaring that all wooden bodies float, based on the observation that a single piece of wood floats, is untenable without infusing additional, auxiliary knowledge, like Archimedes’ principle.

The problem of inductive reasoning in logic then also challenges notions of causality and causal relations. See Brun and Kuenzle (2008).

So, in 1967, the philosopher John Passmore declared: “Logical positivism, then, is dead, or as dead as a philosophical movement ever becomes” (as quoted in Creath 2013).

1.2 Critical Rationalism

While empiricism historically was shaped by the insights of John Locke and David Hume, that all knowledge stems from experience, rationalism was crucially influenced by René Descartes and Gottfried Wilhelm Leibniz: knowledge can have aspects that do not stem from experience, i.e., there is an immanent reality to the mind.

The critical rationalists believed they could fix the problems the logical empiricists had faced. Popper was a key figure advancing this epistemological philosophy (Popper 1934). The central theme, referred to by the adjective “critical,” revolves around the idea of falsifiability or fallibility. The idea, that insights gained by pure thought can never be justified but only critically tested by experience and thus discarded if discrepancies are observed. In a nutshell, no number of experiments can ever be used to prove a theory, but a single experiment suffices to contradict an established theory. Moreover, any claims of ultimate justification only lead to the so-called MünchhausenFootnote 3 trilemma (Albert 1968) . That is, one of the following consequences will necessarily be encountered by any attempt to prove a truth:

  1. 1.

    an infinite regress of justifications;

  2. 2.

    circular reasoning;

  3. 3.

    axiomatic or dogmatic termination of reasoning.

To address the problem of inductive reasoning, the critical rationalist turned the tables. Now, from a currently (not falsified) theory or premise, a logically certain conclusion is sought, which can be observed in nature. This top-down logic moves from the abstract to the empirical. As a result, science is no more understood as a linear accumulation of knowledge, metaphorically assembling the edifice of science. In its new incarnation, science is a construct that is invented by people who relentlessly test and adapt its contents. The progression of science is hence seen as an evolutionary, organic process.

Ultimately however, also the school of critical rationalism faced insurmountable challenges. In a nutshell (Brun and Kuenzle 2008):

  1. 1.

    How can basic formal concepts be derived from experience without the help of induction and how can they be shown to be true?

  2. 2.

    What parts of a theory need to be discarded once it is falsified?

But most crucially, the principle of deduction is also plagued by epistemic problems. For how can these principles of deduction be justified in the first place? Moreover (Markie 2015):

Intuition and deduction can provide us with knowledge of necessary truths such as those found in mathematics and logic, but such knowledge is not substantive knowledge of the external world. It is only knowledge of the relations of our own ideas.

A colorful account, of how the conclusion of even a simple deductive argument cannot be logically enforce, can be found in one of the writings of Lewis Carroll, the mathematician who conceived the Alice in Wonderland stories (Carroll 1895).

This is indeed a surprising turn of events. Induction and deduction ultimately fail as rigorous logical tools to generate knowledge of the outer world. Furthermore, many technicalities prevent, or seriously challenge, a clear understanding and justification of what is actually going on in the scientists’ minds when they engage in science. But perhaps science never was what we humans have idealized it to be for millennia. Perhaps science is a messier and murkier enterprise after all.

1.3 The Kuhnian View

Thomas Kuhn’s enormously influential work on the history of science is called the The Structure of Scientific Revolutions (Kuhn 1962). He irrevocably overthrew the idealized notion that science is an incremental process accumulating more and more knowledge. Instead, he identified the following phases in the evolution of science:

  1. 1.

    prehistory: many schools of thought coexist and controversies are abundant;

  2. 2.

    history proper: one group of scientists establishes a new solution to an existing problem which opens the doors to further inquiry and a so called paradigm emerges;

  3. 3.

    paradigm based science: unity in the scientific community on what the fundamental questions and central methods should be (generally a problem or puzzle solving process within the boundaries of unchallenged rules, analogous to solving a Sudoku challenge);

  4. 4.

    crisis: more and more anomalies and boundaries start to appear, questioning and challenging the established rules;

  5. 5.

    revolution: a new theory and Weltbild takes over solving the anomalies and a new paradigm is born.

Kuhn cites the Copernican revolution as an example. Puzzled by the movements of planets and stars in the night sky, ancient humans offered many colorful myths as explanation. Then, around 100 C.E., Claudius Ptolemy had a breakthrough with his geocentric model, building on insights gained by Aristotle and others. Suddenly, fairly accurate predictions could be made about celestial mechanics. With time it became more and more apparent, that the model needed to be adapted and adjusted to account for new observational data. Employing ever more epicycles, that is circles within circles, it was hoped that the model’s predictions would increase in accuracy. However, the mixing of epicycles led to a nearly unworkable system by the time Nicolaus Copernicus entered the scene, in the mid 16th Century. He boldly laid out a new paradigm by placing the sun at the center of the solar system (Copernicus 1543). Initially, the heliocentric model was not preciser than the geocentric model. Only the verification of novel predictions brought the final success, establishing the new paradigm.

Another core concept found in The Structure of Scientific Revolutions is called incommensurability . The term was introduced in the early 1960s by Kuhn and, independently, by the radical philosopher of science Paul Feyerabend (Preston 1997). Basically, if a scientist is too deeply embedded and invested in a specific conceptual framework, worldview, or paradigm, he or she will be unable to understand the reasoning of an outsider scientist, constrained by their own paradigm. More technically, every rule is part of a paradigm and there exist no trans-paradigmatic rules. The consequences are startling: the languages within different paradigms do not overlap enough to enable scientists to compare their theories. While Kuhn understood incommensurability as locally confined and only applicable to some terms and concepts, Feyerabend saw this as a global feature affecting every theory. In effect, scientists are plagued by blind spots and other cognitive biases (Sects. 11.3.2 and 14.4.2). More on Feyerabend and his unapologetic belief, see Sect. 9.1.6 below.

Perhaps a nice illustration of such myopia among scientists can be found in the controversy surrounding the reality of the (electromagnetic 4-vector) potential \(A_\mu \). This is the quantum field underlying the electric and magnetic fields and was introduced in the context of gauge theory. For nearly thirty years people believed that it could not produce any observable effects and was hence a fictitious filed. Then, however, a simple but ingenious experiment established the reality of the potential field, verified by the Aharonov-Bohm effect. In the words of Nobel laureate Richard Feynman (Feynman et al. 1964, p. 15–12):

The implication was there all the time, but no one paid attention to it. Thus many people were rather shocked when the matter was brought up. [...]. It is interesting that something like this can be around for thirty years but, because of certain prejudices of what is and is not significant, continues to be ignored.

See Sect. 4.2 for details on the potential \(A_\mu \), defined in (4.12), its role in gauge theory, and the Aharonov-Bohm effect.

Another disturbing consequence of Kuhn’s inquiries is that scientific revolutions are not at all rational processes governed by insights and reason. To the contrary, as Max Planck outlines in his autobiography (Planck 1950, pp. 33f.):

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.

Quite a dire analysis indeed, far removed from the idealized scientist’s eureka moment. In the same vein, the words of Nobel laureate Steven Weinberg (Weinberg 2003, p. 191):

Kuhn made the shift from one paradigm to another seem more like a religious conversion than an exercise in reason. He argued that our theories change so much in a paradigm shift that it is nearly impossible for scientists after a scientific revolution to see things as they had been seen under the previous paradigm.

Kuhn goes on to give additional blows to a commonsensical understanding of science with the help of Norwood Hanson and Quine:

  1. 1.

    every human observation of reality contains an a priori theoretical framework: this implies that two scientists looking at the same aspect of reality do not necessarily see the same things;

  2. 2.

    underdetermination of belief by evidence: any theory can be made compatible with recalcitrant observations by adaptions made to the background assumptions and thus data do not determine theories;

  3. 3.

    every experiment is based on auxiliary hypotheses: initial conditions, the proper functioning of the apparatus, the chosen experimental setup, the selected modes for interpreting the experimental data (what exactly are the scientists at CERN seeing, when they observe subatomic particles as peaks in a diagram, the abstract information the LHC relays from the quantum world?).

See Hanson (2010), Quine (1951), and also Brun and Kuenzle (2008).

What are the consequences of these unexpected and profound failings of the most simplest premises one would wish science to be grounded on? If logic, empiricism, objectivity, rationality, cohesion, structure, method, and a foundation are not inherently found in the way real humans conduct science, what are we left with? Indeed, people slowly started to realize the very serious consequences illuminated by the relentless but diligent inquiry led by the philosophy of science.

1.4 Postmodernism

Modernism describes the development of the Western industrialized society since the beginning of the 19th Century. Central ideas were the understanding that there exist objective true beliefs and that progression is always linear, steadily improving the status quo.

Postmodernism replaces these notions with the conviction that many different opinions and forms can coexist and even find acceptance. Core ideas are diversity, differences and intermingling. In the 1970s postmodernism is seen to enter cultural thinking, impacting art, music, and architecture. It is a notoriously hard concept to define due to its multifaceted nature. One attempt at a succinct definition centers around the idea of the meta-narrative. This is a narrative about narratives, relating to meaning, experience, and knowledge, which offers legitimation to a society. Then, in the words of the philosopher, sociologist, and literary theorist Jean-François Lyotard (Lyotard 1984. p xxiv):

I define postmodern as incredulity toward meta-narratives.

Abroader and more vivid characterization is offered by Tarnas (1991, excerpts from the Chapter “The Postmodern Mind”, p. 396f.):

What is called postmodern varies considerably according to context, but in its most general and widespread form, the postmodern mind may be viewed as open-ended, indeterminate set of attitudes that has been shaped by a great diversity of intellectual and cultural currents.

There is an appreciation of the plasticity and constant change of reality and knowledge, a stress on the priority of concrete experience over fixed abstract principles, and a conviction that no single a priori thought system should govern belief or investigation. It is recognized that human knowledge is subjectively determined by a multitude of factors; that objective essences, or things-in-themselves, are neither accessible nor possible; and that the value of all truths and assumptions must be continually subjected to direct testing. The critical search for truth is constrained to be tolerant of ambiguity and pluralism, and its outcome will necessarily be knowledge that is relative and fallible rather than absolute or certain.

Reality is not a solid, self-contained given but a fluid, unfolding process, an “open universe,” continually affected and molded by one’s actions and beliefs. [...]. Reality is in some sense constructed by the mind, not simply perceived by it, and many such constructions are possible, none necessarily sovereign.

Hence all meaning is ultimately undecidable, and there is no “true” meaning. No underlying primal reality can be said to provide the foundation for human attempts to represent truth. [...]. The multiplicity of incommensurable human truths exposes and defeats the conventional assumption that the mind can progress ever forward to a nearer grasp of reality.

Indeed, postmodernism can be understood as a call to cultivate one’s own inner authority (Tarnas 1991, p. 404):

The postmodern collapse of meaning has thus been countered by an emerging awareness of the individual’s self-responsibility and capacity for creative innovation and self-transformation in his or her existential and spiritual response to life.

Such personal exposure to postmodernism in everyday life is perhaps nicely captured by Sarah Kay’s experience. She is known for her spoken-word poetry and while teaching a class, Kay came up with the assignment to write a list of “10 Things I Know to be True.” By comparing one’s own list with the lists of enough other people one can observe the followingFootnote 4:

  1. 1.

    affirmation: someone has the exact same, or very similar, things as you have on your list;

  2. 2.

    dissonance: someone has the complete and total opposite to something you know is true;

  3. 3.

    novel thoughts: someone has something you have never even heard of before;

  4. 4.

    limited scope: someone has something you thought you knew everything about, but they are introducing a new angle to look at it or are offering an extended scope.

As expected, many scientists were unsympathetic to such an outlook on life and recoiled at most of the ideas associated with postmodernism. Physicist David Deutsch sees postmodern as “bad philosophy” and criticizes (Deutsch 2011, p. 314):

It [postmodernism] is a narrative that resists rational criticism or improvement, precisely because it rejects all criticism as mere narrative. Creating a successful postmodernist theory is indeed purely a matter of meeting the criteria of the postmodernist community—which have evolved to be complex, exclusive and authority-based. Nothing like that is true of rational ways of thinking.

But not only postmodernism, with its radical epistemology and ontology, faced fierce opposition from scientists, indeed the whole idea of philosophy in general. Some scientists believe it is an irrelevant enterprise, as echoed in a quip usually attributed to Feynman: “The philosophy of science is about as useful to scientists as ornithology is to birds.” Others have openly expressed their contempt, like the eminent physicist Freeman Dyson. In an article he wrote, reviewing the book of a philosopher (Holt 2012), he described today’s philosophers as “a sorry bunch of dwarfs” compared to the giants of the past and portrayed contemporary philosophy as “a toothless relic of past glories” (Dyson 2012). Such scornful attitudes can perhaps be understood as the aftershocks of the Science Wars of the 1990s. Scientists then accused certain philosophers of having effectively rejected realism, objectivity, and rationality. They believed the scientific method, and even scientific knowledge, to be under siege.Footnote 5

In this environment, some scientists sought to defend their cherished enterprise from postmodern attacks they perceived as anti-intellectual. In one incidence, called the Sokal hoax, physicist Alan Sokal got a nonsensical paper published in a journal of postmodern cultural studies. Already the grandiloquent title does not disappoint: “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity” (Sokal 1996). By flattering the editor’s ideology with nonsense that sounds scientific and meaningful, Sokal got his 35 page long article, with profuse citations, accepted for publication. Indeed, the text is mainly a conflation of academic terms and buzzwords with sociopolitical and economic notions, as illustrated by the following except (Sokal 1996, p. 242):

Thus, a liberatory science cannot be complete without a profound revision of the canon of mathematics. As yet no such emancipatory mathematics exists, and we can only speculate upon its eventual content. We can see hints of it in the multidimensional and nonlinear logic of fuzzy systems theory; but this approach is still heavily marked by its origins in the crisis of late-capitalist production relations. Catastrophe theory, with its dialectical emphases on smoothness/discontinuity and metamorphosis/unfolding, will indubitably play a major role in the future mathematics; but much theoretical work remains to be done before this approach can become a concrete tool of progressive political praxis.

Interestingly, modern physics has also suffered a similar embarrassment in 2002. Indeed, editors of scientific journals can just as easily succumb to imagining meaning where there is perhaps only empty jargon. The Bogdanov affair centers around the French twins and TV personalities Igor and Grichka Bogdanov. They enjoyed celebrity status and hosted a French science fiction television program. Today, they attract a lot of curiosity due to their physical appearance. What appears to be the result of extreme plastic surgery, gives the twins an eerie extraterrestrial look: drastically pronounced chins, cheekbones, and lips. The Bogdanov affair was an academic dispute regarding the legitimacy of the work produced by the twins. This included a series of theoretical physics papers published in reputable, peer-reviewed scientific journals, and their Ph.D. thesis, awarded by the University of Bourgogne in 2000. It was alleged that the contents was a meaningless combinations of buzzwords and the affair was covered in the mainstream media. The matter has also been referred to as the “reverse Sokal” hoax. To this day the Bogdanov twins have insisted upon the validity of their work, however, the controversy has prompted reflections upon the peer-review system. Declan Butler, a senior reporter for Nature Magazine, had the following to offer (Butler 2002):

Take a deep breath, and give the following sentence a go. “We demonstrate that the lorentzian signature of the space-time metric (\(+ + + -\)) is not fixed at the Planck scale and shows ‘quantum fluctuation’ between the lorentzian and euclidean (\(+ + +\)) forms until the 0 scale where it becomes euclidean (\(+ + + +\)).” Confused? Don’t worry, you’re in good company. Physicists around the world have been unable to agree on whether the Ph.D. thesis this line comes from is good, bad or a hoax. [...]. So are the papers good science or not? Enquiries by Nature show that few theoretical physicists, including some who reviewed the brothers’ Ph.D. theses, are completely certain.

Has modern physics itself really transformed into a postmodern narrative that defies meaning, clarity, and understanding? An edifice lingering in a state of undecidability? See Chap. 10.

On a more humorous note, the website http://www.snarXiv.org/ is aimed at spoofing the theoretical, high-energy physics section of the popular scientific archive for electronic preprints http://www.arXiv.org, by automatically generatingFootnote 6 meaningless titles and abstracts, infused with a barrage of buzzwords. As an example, consider the following:

figure a

Which title and abstract belongs to a legitimate contribution? For which one would you expect that there actually exists an article, building on the previous work of others, as highlighted by the many technical expressions introduced in the abstract, uncovering a novel detail of specialist knowledge? The author of the spoof website even based an online game on this idea, where the player is offered two titles, an actual high-energy physics paper from the arXiv, and a completely fake title randomly generated by the snarXiv. The aim of the game is to spot as many fake titles as possible.

However, postmodernism is only the tip of the iceberg of epistemic threats to knowledge and certainty. For one, it invokes the ghost of extreme skepticism in the form of solipsism, the belief that only one’s own mind is certain to exist while any other knowledge is necessarily unsure. A feeling epitomized by Descartes’ infamous sentence “cogito ergo sum” (Descartes 1937) and George Berkeley’s notorious denial of a material external reality in favor of a reality exclusively comprised of minds and their ideas (Downing 2013). But more unsettling, postmodernism opens the doors to the Scylla and Charybdis of constructivism and relativism, discussed next.

See Sect. 6.2.2 for the related notion of poststructuralism. Indeed, postmodernism (Cilliers and Spurrett 1999) and poststructuralism (Sect. 6.2.2) are the preferred philosophies of complexity. Moreover, note the debate on the foundations of mathematics and the roots of postmodern thought (Tasić 2001).

1.5 Constructivism

Kuhn’s analysis challenges the objective and universal nature of scientific knowledge. In essence, this knowledge is demoted to an edifice contingent upon the idiosyncrasies of human beings and their social and cultural imprintings. This predicament is lamented by Weinberg (2003, p. 190f.):

What does bother me on reading [The Structure of Scientific Revolutions] and some of Kuhn’s later writings is his radically skeptical conclusions about what is accomplished in the work of science. And it is just these conclusions that have made Kuhn a hero to the philosophers, historians, sociologists, and cultural critics who question the objective character of scientific knowledge, and who prefer to describe scientific theories as social constructs, not so different in this respect from democracy or baseball.

Furthermore (Weinberg 2003, p. 192):

If the transition from one paradigm to another cannot be judged by any external standard, then perhaps it is culture rather than nature that dictates the content of scientific theories.

This impact of the social and cultural conditions on science is today studied under the label of the sociology of scientific knowledge. Proponents of the University of Edinburgh and the University of Bath poularized this field of inquiry (Bloor 1976; Shapin et al. 1985; Collins 1985; Shapin 1994).

In effect, what is being proposed here is the notion that knowledge is effectively constructed: always a product of the different factors conditioning scientists. An example of such culturally influenced constructions is related to gender. Feminist critiques of science have been thematically linked to the sociology of scientific knowledge, namely the marginalization of points of view based on gender, ethnicity, socio-economic status, and political status. Given science’s long tradition of excluding women as practitioners, such critique is not unwarranted. The view that women are unfit for science, or vice versa, has been haunting the minds of male scientists since the beginning. For more on the feminist perspective on science, see Crasnow et al. (2015).

But more in general, constructivism maintains that all humans construct personal knowledge and meaning from the interactions of their subjective experiences and their ideas. Constructivist epistemology is a branch of the philosophy of science that argues that science is simply a product of such mental constructs, devised to explain the sensory experiences of the natural world. Essentially, scientific knowledge is merely constructed by the current scientific community, seeking to understand and build models of the world. More on constructivism can be found in Watzlawick (1984), Jonassen (1991), Perkins (1999).

The philosopher Ernst von Glasersfeld further added an uncompromising twist to theses ideas by introducing the notion radical constructivism (Von Glasersfeld 1984, 1989, 2002). In this relentless version, constructivism fully abandons objectivity. Or in the words of the physicist Heinz von Foerster (Schülein and Reitze 2002, p. 174, translation mine):

Objectivity is the illusion that observations are made without an observer.

At its heart, radical constructivism questions the validity of any external sensory input. The subjective observer is placed at the center of the experience, but without the means to probe the external world conclusively. An analogy would be the submarine captain who has to rely on instruments to indirectly gain knowledge about the outside world. In detail, it is understood that perception never yields a faithful image of outer reality but is always an inner construction, derived from sensory input but dependent on the cognitive apparatus of each individual. Indeed, radical constructivists are motivated and validated by modern insights gained by neuroscience. Instead of reality being passively recorded by the brain, it is thought to be actively constructed by it. In effect, our brains sample just a small bit of the surrounding physical world from which normal perception is constructed. This “normal” mode of experiencing reality hardly differs from hallucinations or dreams which are not at all anchored by external input. The enigma of perception, and neuroscience in general, will be discussed in greater detail in Chap. 11.

1.6 Relativism

Constructivism opens the door to the next epistemic threat: relativism. If knowledge is constructed and hence contingent, then it can be rational for a group A to believe a fact \(\mathcal {P}\), while at the same time it is rational for group B to believe in negation of \(\mathcal {P}\). Again, in the words of Weinberg (2003, p. 192):

If scientific theories can be judged only within the context of a particular paradigm, then in this respect the scientific theories of any one paradigm are not privileged over other ways of looking at the world, such as shamanism and creationism.

Relativism is the antipodal idea of absolutism. Whereas absolutism gives comforting certainty and clarity, offering solace to both scientifically and religiously minded people by invoking the idea of objective or absolute truth, relativism is a direct existential assault undermining and tainting any notion relating to meaning, truth or belief. Relativism summons disconcerting feelings of doubt, uncertainty, and ambiguity. Recall the long history of thinkers troubled by self-doubt and ambivalent knowledge, detailed in Sect. 8.1.

It is an interesting, albeit expected, observation that theologians and scientists alike show the same deep-seated disdain towards the idea of relativism: it is a double-edged sword threatening any institutionalized orthodoxy or scientific consensus. In the words of Joseph Ratzinger, delivered in his homily at the beginning of the conclave in 2005 from which he would emerge as Pope Benedict XVI:

We are building a dictatorship of relativism that does not recognize anything as definitive and whose ultimate standard consists solely of one’s own ego and desires.

On June 6, 2005, as Pope, he was quick to reiterate this point in the “Address of His Holiness Benedict XVI to the Participants in the Ecclesial Diocesan Convention of Rome”:

Today, a particularly insidious obstacle to the task of educating is the massive presence in our society and culture of that relativism which, recognizing nothing as definitive, leaves as the ultimate criterion only the self with its desires. And under the semblance of freedom it becomes a prison for each one, for it separates people from one another, locking each person into his or her own “ego”.

Also Pope Benedict XVI’s reformist and liberal successor, Pope Francis, adheres to this belief, viewing relativism as a vice which “makes everyone his own criterion and endangers the coexistence of peoples” (as seen in his address to the Diplomatic Corps accredited to the Holy See, on March 22, 2013).

In contrast, relativism is more prevalent in Eastern thought systems, where ideas are widespread that carry more holistic or pantheistic rings. An example can be found in Jainism, an ancient, radically non-violent Indian religion, that shares in its cosmology many of the elements of pre-Socratic Greek philosophies (see Nakamura 1998 and Sect. 3.1). In essence (Nakamura 1998, p. 167):

The fundamental standpoint of Jainism [...] signifies that the universe can be looked at from many points of view, and that each viewpoint yields a different conclusion. Therefore, no conclusion is decisive.

At its heart, “Jainism shows extreme caution and anxiety to avoid all possible dogma in defining the nature of reality” (Nakamura 1998, p. 169).

In Western philosophy, relativism can be attributed to the Greek thinkers Heraclitus and Protagoras. So, even before Aristotle would set out to revolutionize the way we think about reality, a process ultimately leading to science, the seeds of relativity were sown, which would, about two and a half millennia later, prompt thinkers to view science as just one of many ways of knowing the world. In this context we encounter the influential, colorful, and controversial philosopher of science, Paul Feyerabend. He was truly the enfant terrible of relativism (Kidd 2011):

Feyerabend was famously dubbed “the worst enemy of science” by Science, and even today philosophers of science will tend to associate his name with anti-science polemics, defences of voodoo and astrology, and more besides.

This is perhaps not surprising, looking at the titles of the books he published:

 

1975:

Against Method: Outline of an Anarchistic Theory of Knowledge (Feyerabend 2008);

1987:

Farewell to Reason (Feyerabend 1999);

1993:

The Tyranny of Science (Feyerabend and Oberheim 2011).

 

Indeed, Feyerabend initially gained notoriety for his anarchistic rallying cry “anything goes,” a vision of scientific anarchy that would send shivers down the spins of many scientists. In his own words (Feyerabend 2008, p. 9):

The following essay is written in the conviction that anarchism, while perhaps not the most attractive political philosophy, is certainly excellent medicine for epistemology, and for the philosophy of science .

In essence, he observed (Feyerabend 2008, p. 1):

The events, procedures and results that constitute the sciences have no common structure.

In Feyerabend (2008) he outlined his program for the philosophy of science. It is a very sympathetic, open-minded, idiosyncratic, and personal exposé. In an analytical index, Feyerabend offers a sketch of the main argument, summarized as a few sentences per chapter (Feyerabend 2008, p. 5f.):

 

Intro.:

Science is an essentially anarchic enterprise: theoretical anarchism is more humanitarian and more likely to encourage progress than its law-and-order alternatives.

Chapter 1:

This is shown both by an examination of historical episodes and by an abstract analysis of the relation between idea and action. The only principle that does not inhibit progress is: anything goes .

Chapter 2:

For example, we may use hypotheses that contradict well-confirmed theories and/or well-established experimental results. We may advance science by proceeding counterinductively.

Chapter 3:

The consistency condition which demands that new hypotheses agree with accepted theories is unreasonable because it preserves the older theory, and not the better theory. Hypotheses contradicting well-confirmed theories give us evidence that cannot be obtained in any other way. Proliferation of theories is beneficial for science, while uniformity impairs its critical power. Uniformity also endangers the free development of the individual.

Chapter 4:

There is no idea, however ancient and absurd, that is not capable of improving knowledge. [...]

Chapter 5:

No theory ever agrees with all the facts in its domain, yet it is not always the theory that is to blame. Facts are constituted by older ideologies, and a clash between facts and theories may be proof of progress. [...]

Chapter 11:

[...] Copernicanism and other essential ingredients of modern science survived only because reason was frequently overruled in the past.

Chapter 17:

Neither science nor rationality are universal measures of excellence. They are particular traditions, unaware of their historical grounding.

Chapter 18:

Yet it is possible to evaluate standards of rationality and to improve them. The principles of improvement are neither above tradition nor beyond change and it is impossible to nail them down.

Chapter 19:

Science is neither a single tradition, nor the best tradition there is, except for people who have become accustomed to its presence, its benefits and its disadvantages. [...]

 

Of course, Feyerabend also espoused very radical and provocative ideas. For instance Feyerabend (2008, p. 238):

In a democracy it [science] should be separated from the state just as churches are now separated from the state.

As expected, such utterances helped draw the wrath of the scientific community. Feyerabend was heavily criticized and vilified. However, accusations were often countered by him pointing out where he had been misinterpreted and by reemphasizing his ruthless commitment to open-mindedness. Like in the following example (Feyerabend 2008, p. 122):

A few years ago Martin Gardner, the pitbull of scientism, published an article with the title “Anti-Science, the Strange Case of Paul Feyerabend” Critical Inquiry, Winter 1982/83. The valiant fighter seems to have overlooked these and other passages [in Against Method]. I am not against science. I praise its foremost practitioners and (next chapter) suggest that their procedures be adopted by philosophers. What I object to is narrow-minded philosophical interference and a narrow-minded extension of the latest scientific fashions to all areas of human endeavor—in short what I object to is a rationalistic interpretation and defense of science.

Perhaps what infuriated Feyerabend’s critics the most was the adaptability of his beliefs, as one might expect from a relativist (Feyerabend 2008, p. 268):

In a critical notice of my book Farewell to Reason Andrew Lugg suggests “that Feyerabend and likeminded social critics should treat relativism with the disdain that they normally reserve for rationalism” . This I have now done, in Three Dialogues of Knowledge where I say that relativism gives an excellent account of the relation between dogmatic world-views but is only a first step towards an understanding of live traditions, and in Beyond Reason: Essays on the Philosophy of Paul Feyerabend, where I write “relativism is as much of a chimera as absolutism (the idea that there exists an objective truth) , its cantankerous twin”. [...] In both cases I raise objections against relativism, indicating why I changed my mind and mention some of the remaining difficulties.

With refreshing candidness he confessed (quoted in Horgan 1997, p. 50):

I have opinions that I defend rather vigorously, and then I find out how silly they are, and I give them up.

Furthermore, as one might expect, Feyerabend adhered to no systematicity in his work, often emphasizing the ad hoc and random nature of his undertakings. For instance, as seen in the analytical index (Feyerabend 2008, p. 8f.):

 

Chapter 20:

The point of view underlying this book is not the result of a well-planned train of thought but of arguments prompted by accidental encounters.

 

Or more generally (Feyerabend 2008, p. 159):

“Anything goes” does not mean that I shall read every single paper that has been written—God forbid!—it means that I make my selection in a highly individual and idiosyncratic way, partly because I can’t be bothered to read, partly because I can’t be bothered to read what doesn’t interest me—and my interests change from week to week and day to day—partly because I am convinced that humanity and even Science will profit from everyone doing his own thing [...].

Perhaps Feyerabend was indeed profoundly misunderstood, a fact he himself would probably never worry about and try to amend. To illustrate, a more sympathetic reading of Feyerabend (Kidd 2011):

The Tyranny of Science should therefore be interpreted as Feyerabend’s attempts to dissolve conflicts and establish harmony between science, society, and philosophy, on the one hand, and between scientists, philosophers, and the public, on the other. The concerns and alarms that concerned Feyerabend are not the exclusive preserve of any of those domains—scientific, public, or philosophical—and to properly understand and address them each must cooperate with the other. Tyranny only arises when one of those would try to dominate the others, and Feyerabend’s book offers an engaging and entertaining case against such tyranny.

2 The Evolution of Science

During the meandering evolution of science many pressing issues have been raised, relating to truth, knowledge, and beliefs. Inconspicuous and commonsensical ideas of rationality, objectivity, and universality came under siege. However, more drastically, and despite the remarkable success of science in continually uncovering knowledge of the world, our idea of reality itself emerged as conceptually flawed. A paradox surfaced (Tarnas 1991, p. 333):

For at the same time that modern man was vastly extending his effective knowledge of the world, his critical epistemology inexorably revealed the disquieting limits beyond which his knowledge could not claim to penetrate.

The envisaged role of the human being, taken to be that of the detached onlooker observing and interpreting a world of mind-independent objects, began to pose a problem. Indeed, our common sense and intuition, longing for an objective reality which can be comprehended by the human mind through unambiguous, justified knowledge of true facts, started to appear misguided. Today we know that “fundamental physics has a long history of disregarding our common sense notions” (Gefter 2012).

Ironically, harmless-looking anomaliesFootnote 7 led to the absolutely unexpected discoveries of new realms of reality, fundamentally and irreversibly disrupting the prevailing classical worldview. One was the uncovering of the discrete nature of reality, as revealed by quantum phenomena (Sect. 4.3.4), the other was the finding of the malleability of space and time (special and general relativity, discussed in Sects. 3.2.1 and 4.1 and 10.1.2, respectively). The deeper the human mind probed reality, the more outlandish the stories became that it has to tell itself about these new planes of existence. Troublingly (Tarnas 1991, p. 358):

[...] the concepts derived from the new physics not only were difficult for the layperson to comprehend, they presented seemingly insuperable obstacles to the human intuition generally: a curved space, finite yet unbounded; a four-dimensional space-time continuum; mutually exclusive properties possessed by the same subatomic entity; objects that were not really things at all but processes or patterns of relationships; phenomena that took no decisive shape until observed; particles that seemed affect each other at a distance with no known causal link; the existence of fundamental fluctuations of energy in a total vacuum.

These issues are discussed in Chap. 10. The possible implications are truly unsettling to the Western mind and many of humanity’s century old concepts and beliefs appear to be in danger. The ramifications of such uprooting discoveries left scars in the psyche of scientists (Tarnas 1991, p. 356):

By the end of the third decade of the twentieth century, virtually every major postulate of the earlier scientific conception had been controverted: the atoms as solid, indestructible, and separate building-blocks of nature, space and time as independent absolutes, the strict mechanistic causality of all phenomena, the possibility of objective observation of nature. Such fundamental transformation in the scientific world picture was staggering, and for no one was this more true than the physicists themselves. Confronted with the contradictions observed in subatomic phenomena, Albert Einstein wrote: “All my attempts to adapt the theoretical foundation of physics to this knowledge failed completely. It was as if the ground had been pulled out from under one, with no firm foundation to be seen anywhere upon which one could have built.” Heisenberg similarly realized that “the foundations of physics have started moving... [and] this motion has caused the feeling that the ground would be cut from science.”

See Chap. 10 for an overview of the struggles of physicsc in general and Sect. 10.3.2 for specific introduction to the bizarre realm of the quantum.

What did other practitioners of science have to say to all of this? At first, the philosophical conundrums of the novel physical theories were acknowledged. Indeed, any understanding of the newly discovered subatomic reality appeared to require the reintegration of philosophy into science (Kaiser 2011, p. 2):

Most of its [quantum mechanics] creators—towering figures like Niels Bohr, Werner Heisenberg, and Erwin Schrödinger—famously argued that quantum mechanics was first and foremost a new way of thinking. Ideas that had guided scientists for centuries were to be cast aside Bohr constantly spoke of the “general epistemological lessons” of the new quantum era.

However, after World War II, the philosophically inspired attempts at understanding the quantum world quickly faded, nearly vanishing during the Cold War with the emerging new rallying cry: “Shut up and calculate!” (see Sect. 2.2.1). With the scope of physics steadily increasing, mathematical prowess became the most vital skill, leaving not much room for grander musings. The question of what the mathematical symbols being manipulated really mean were ignored as was their relationship to reality. The attitudes scientists adopted towards philosophy now ranged between indifference and hostility. As mentioned above, some eminent physicists are on record expressing their contempt for philosophy. Needles to say, philosophers insisted on analyzing these conceptual puzzles, pestering the scientists. Understandably, it is hard to accept that the sanctity of science can be soiled by the very human nature of scientists (recall Sect. 9.1.5). Moreover, it is not easy to admit that (Tarnas 1991, p. 358):

Physicists failed to come to any consensus as to how the existing evidence should be interpreted with respect to defining the ultimate nature of reality. Conceptual contradictions, disjunctions, and paradoxes were ubiquitous, and stubbornly evaded resolution. A certain irreducible irrationality, already recognized in in the human psyche, now emerged in the structure of the physical world itself.

Perhaps it is consoling to some scientists that philosophy itself also suffer in the modern era (Tarnas 1991, p. 354):

As philosophy became more technical, more concerned with methodology, and more academic, and as philosophers increasingly wrote not for the public but for each other, the discipline of philosophy lost much of its former relevance and importance for the intelligent layperson, and thus much of its former cultural power.

2.1 The Comprehensible Universe

Of all the magnificent capabilities of the modern human mind, one is especially curious: the ability to be unimpressed by existence. While as children we are dumbfounded by the unfathomable reality encompassing us, as adults we are so often caught up in our bland daily routines that we cease to wonder. But not everyone. The philosopher Alan Watts confessed (Watts 1971 p. 23):

As Aristotle put it, the beginning of philosophy is wonder. I am simply amazed to find myself living on a ball of rock that swings around an immense spherical fire. I am more amazed that I am a maze—a complex wiggliness, an arabesque of tubes, filaments, cells, fibers, and films that are various kinds of palpitation in this stream of liquid energy.

In a similar vein, Einstein’s musings (Einstein 2007, p. 5):

The fairest thing we can experience is the mysterious. It is the fundamental emotion which stands at the cradle of true art and science. He who knows it not and can no longer wonder, no longer feel amazement, is as good as dead, a snuffed-out candle.

Ranking second, after blissful wonder, is perhaps the realization that reality can be comprehended. Indeed, it is a striking hidden assumptions of science that the universe is understandable to the human mind. Why should the mysterious workings of the grand universe find a correspondence in our minds and hence a correlate in our brains? Why should the formal abstract thoughts systems the human mind can access—even while sitting in Plato’s cave—relate to anything in the outer world? Why is there an overlap between inner and outer structures? In other words, why does a Book of Nature exist at all and why is it written in a language the human mind can read? These issues are detailed in Part I.

This astounding fact has had an enchanting effect on some eminent scientists. For instance, the theoretical physicist, mathematician, and Nobel laureate Eugene Wigner. In 1960 he published an article with the striking title: The Unreasonable Effectiveness of Mathematics in the Natural SciencesFootnote 8 (Wigner 1960). There he observed:

[T]he enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and [...] there is no rational explanation for it.

[I]t is not at all natural that “laws of nature” exist, much less that man is able to discover them.

[T]he two miracles of the existence of laws of nature and of the human mind’s capacity to divine them.

[F]undamentally, we do not know why our theories work so well.

Also Einstein did not hide his bewilderment (Isaacson 2007, p. 462):

The eternal mystery of the world is its comprehensibility.

The fact that it is comprehensible is a miracle.

Even as one of the greatest minds in physics he did not resist the temptation to express his views on science in an unscientific way which would have been scorned by many scientists had they been uttered by a lesser colleague (Einstein 1918):

The supreme task of the physicist is to arrive at those universal elementary laws from which the cosmos can be built up by pure deduction. There is no logical path to these laws; only intuition, resting on sympathetic understanding of experience, can reach them.

Einstein continues (Einstein 1918):

The state of mind which enables a man to do work of this kind [science] is akin to that of the religious worshiper or the lover; the daily effort comes from no deliberate intention or program, but straight from the heart.

Also the great cosmologist Stephen Hawking was tempted to dive deep into the metaphysical underbelly (Hawking 2008, p. 142):

What is it that breathes fire into the equations and makes a universe for them to describe?

To escape such metaphysical and existential challenges, scientists have been know to invoke the concept of beauty or some kind of divinity. Recall the words of the theoretical physicist and Nobel laureate Steven Weinberg from Sect. 4.4:

We believe that, if we ask why the world is the way it is and then ask why that answer is the way it is, at the end of this chain of explanations we shall find a few simple principles of compelling beauty.

Again, Einstein (Isaacson 2007, p. 388f.):

I believe in Spinoza’s God, who reveals himself in the harmony of all that exists, not in a God who concerns himself with the fate and the doings of mankind.

The rationalist philosopher Baruch Spinoza offered a vision of God as the essence of the universe (Nadler 2016):

God is the infinite, necessarily existing (that is, uncaused), unique substance of the universe. There is only one substance in the universe; it is God; and everything else that is, is in God.

Some have described such an understanding of God as pantheistic while others have seen links to Hinduism (Van Bunge and Klever 1996). In Einstein’s own words (Frankenberry 2008, p. 147):

Every one who is seriously involved in the pursuit of science becomes convinced that a spirit is manifest in the laws of the Universe—a spirit vastly superior to that of man, and one in the face of which we with our modest powers must feel humble.

Einstein would later define a principle of cosmic religion, see Sect. 15.3.1.

Even though Einstein’s intuition was so acute that it allowed him to access and uncover new facets of reality that were unimaginable before him, he appeared to have hit a dead end while pondering quantum phenomena. Not only did he reject their reality, tragically, his attempts at an alternative formulation would preoccupy his mind in vain until his death (see Sects. 4.3.4, 10.3.2.1, and 4.3.5). Perhaps this next quote from him best summarizes the inner turmoil felt by the practitioners of science (quoted in Hoffmann and Dukas 1973, p. vii):

One thing I have learned in a long life: That all our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.

Moreover, Einstein held the following personal conviction (quoted in Dukas and Hoffmann 2013, p. 39):

What I see in Nature is a magnificent structure that we can comprehend only very imperfectly, and that must fill a thinking person with a feeling of humility.

Returning to philosophy, one may ask the following question: What if the observable and comprehensible universe is only a slice of the totality of reality? What if the fabric of reality is vastly richer than we can perceive and fathom? Such a concession would allow for the notions of teleology and entelechy to enter the picture as explanatory templates, without the need to invoke the divine. Such musings are entertained in Part III.

2.2 The End of Science?

In the history of science, there have been many occasions where it was believed that nearly all of the workings of the universe had been decoded. Again and again, tempted by the dream of being only a small step away from a complete description of nature, scientists have made exuberant claims. For instance Graham et al. (1983, p. 38):

Indeed, it seemed to some physicists in the closing year of the nineteenth century that taken together, Newton’s celestial mechanics and Maxwell’s equations indicated that the prospect of completing physics was in sight.

In the words of the eminent experimentalist Albert A. Michelson in 1894 (quoted in Graham et al. 1983, p. 38):

While it is never safe to affirm that the future of physical science has no marvels in store even more astonishing than those of the past, it seems probable that most of the grand underlying principles have been firmly established and that further advances are to be sought chiefly in the rigorous application of these principles to all the phenomena which come under our notice. It is here that the science of measurement shows its importance—where quantitative work is more to be desired than qualitative work. An eminent physicist remarked that the future truths of physical science are to be looked for in the sixth place of decimals.

Then, in 1920 (Hawking 1980, p. 1):

[...] Max Born told a group of scientists visiting Göttingen that “Physics, as we know it, will be over in six months.”

In 1980, Stephen Hawking gave his inaugural lecture as Lucasian professor of mathematics at the University of Cambridge, England, titled Is the End in Sight for Theoretical Physics? He opened with (Hawking 1980):

In this lecture I want to discuss the possibility that the goal of theoretical physics might be achieved in the not too far future, say, by the end of the century. By this I mean that we might have a complete, consistent and unified theory of the physical interactions which would describe all possible observations.

Ten years later Hawking updated his prediction: “Give it twenty or twenty-five years” (Ferguson 2011, p. 214; see also Sect. 4.3.2). However, in 1998 the optimism started to diminish (Smith 2016):

It doesn’t look as if we are going to quite make it.

Finally, in 2002 (Hawking 2002):

Some people will be very disappointed if there is not an ultimate theory, that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind.

The end of science, in the sense that everything there is to know became known to the human mind, never transpired. The science journalist John Horgan took a more sinister take on the end of science in his book of the same name (Horgan 1997). He argued that science is losing its momentum to uncover knowledge, slowly grinding to a halt. For the book, he interviewed prominent scientists and philosophers: the likes of Popper, Kuhn, Feyerabend, Daniel Dennett, Hawking, Weinberg, Feynman, Dyson, Roger Penrose, Murray Gell-Mann, Sheldon Glashow, Edward Witten, John Wheeler, David Bohm, Philip Anderson, Ilya Prigogine, Mitchell Feigenbaum, Gregory Chaitin, John Casti, Francis Crick, Richard Dawkins, Stuart Kauffman, and Edward O. Wilson. Horgan identifies the demise of progress in

  • the end of philosophy;

  • the end of physics;

  • the end of cosmology;

  • the end of evolutionary biology;

  • the end of social science;

  • the end of neuroscience;

  • the end of chaoplexity (the portmanteau of chaos and complexity);

  • the end of machine science.

Naturally, many people were not amused. The biologist Lynn Margulis perhaps captured this best (quoted in Horgan 2015):

He’s a very nice guy and he wrote a very bad book.

Looking back, Horgan assesses (Horgan 2015):

The re-launch [Basic Book’s 2015 edition of The End of Science] has stirred up many memories—and forced me to evaluate my thesis. My book has now sustained almost two decades worth of attacks, some triggered by genuine scientific advances, from the completion of the Human Genome Project to the discovery of the Higgs boson. So do I take anything back?

Hell no.

In a nutshell, taken from the preface of the new edition (Horgan 2015):

Our descendants will learn much more about nature, and they will invent gadgets even cooler than smart phones. But their scientific version of reality will resemble ours, for two reasons: First, ours [...] is in many respects true; most new knowledge will merely extend and fill in our current maps of reality rather than forcing radical revisions. Second, some major remaining mysteries—Where did the universe come from? How did life begin? How, exactly, does a chunk of meat make a mind?—might be unsolvable.

Indeed, today, we are still waiting for a coherent and unified theory describing the physical world. Or, at least, a theory of quantum gravity (Sect. 10.2). Unfortunately, the outlook is as bleak as ever and the understanding of ourselves and the world we live in continues to run into dead ends. It is as if nature enjoys teasing the human mind, by pretending to reveal her workings, only to present us with the next enigmas and then turn away. Science has become like the Red Queen in Lewis Carroll’s Through the Looking-Glass: by running faster and faster she stays in the same place. Similarly, science discovers more and more knowledge without fundamentally progressing anymore. For an assessment of the status of modern theoretical physics, see Baggott (2013), Unzicker and Jones (2013).

Furthermore, science has been beset by various crises. For instance, the reproducibility crisis (Baker 2016):

More than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature’s survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research.

The application of bad statistics (Nieuwenhuis et al. 2011) has also raised questions. An analysis of 513 papers published in five prestigious neuroscience journals over two years identified 157 studies in which a particular statistical fallacy could have been committed. Out of these publications, 50% indeed contained the error. Generally, the whole notion of statistical significance can be called into question (Ziliak and McCloskey 2008). Moreover, as science advances, it relies ever more heavily upon complex machinery and highly sophisticated software. This can be another source of error. For instance, a bug in the software used by researchers to interpret fMRI data was found to produce false-positive rates of up to 70%, calling 15 years of research into question (Reynolds 2016). Some researchers have even alleged that “most published research findings are false” (Ioannidis 2005). The observation that the number of scientific retractions is increasing (Steen et al. 2013) could be a sign of scientific self-correction or simply the result of poor scientific practices (Smaldino and McElreath 2016).

However, perhaps the biggest threat to science is the scientists themselves. Like any other social human endeavor, academia can be plagued by blind obedience to authority,Footnote 9 groupthink, corruption, and fraud. Furthermore, the unrelenting pressure to “publish or perish” expects scientists to be inexhaustible creative content-providers—with possible negative consequences (Smaldino and McElreath 2016). The following anecdotes highlight some of these problems.

A publication in the prestigious Proceedings of the National Academy of Sciences (PNAS) claimed to have detected a universal pattern in how complex systems organize (Preis et al. 2011). Specifically, the authors reported on scaling laws found in financial data.Footnote 10 The study became famous not only among quantitative analysts. However, when the physicist, econophysicist, and complexity scientist Didier Sornette reproduced the study utilizing purely random data, surprisingly, the same patterns emerged. A “selection of biased statistical subsets of realizations in otherwise featureless processes such as random walks” (Filimonov and Sornette 2012) was responsible for the signal. In other words, the publication was meaningless, as the researchers did not reject the null hypothesis. To his utter dismay, when Sornette submitted these findings to PNAS, his paper was rejected. In an open letter he vented his frustrationFootnote 11:

Dear Editor,

As a coauthor of the paper Spurious switching processes in financial markets that you just rejected, I cannot remain silent and have to express my concern with how science is handled in general in journals such as PNAS. You are not alone as Science and Nature have in general the same reactions. I know that you will not change your mind in this instance, but I do hope that, little by little, the whole editorial community may become a bit wiser over time.

In a nutshell, your policy stating “it would have to go beyond a simple refutation of the earlier work and significantly add to the field”, implies

  1. a fundamental error can remain published as “truth” in PNAS without the normal debate that should be the domain of real science. In my opinion, this is especially harmful to Science, given that this specific spurious claim for discovery has been highly publicized in different journals, in the media and many conferences.

  2. a paper that does the solid work to demonstrate that the spurious claim is unsupported will most likely be considered as “not adding significantly to the field”. In other words, we can add “shit” to the field but we cannot correct and remove “shit” from the field, and in so doing teach how to develop better statistical tests. [...]

I hope that any editor could realize the moral hazard and wrong incentives permeating more and more the sociology of science encouraged by editors such as you (no personal attack, I know that you are just following “orders” of a general stance dictated by editorial boards of journals), in a way analogous to a graft from the scandalous behaviors observed in the financial industry.

Excuse my strong colorful words, but I consider that they convey my shock and repulsion to what I consider a violation of good scientific endeavor.

Sincerely,

Prof. D. Sornette

In 1956, two researchers applied a recently developed technique for analyzing human cells and counted 46 chromosomes. This was puzzling, as everyone familiar with biology knew that the correct answer had been, since 1912, 48. After consulting with colleagues, it emerged that, surprisingly, other researchers had encountered the same problem. Some had even stopped their work prematurely, as they could only find 46 of the 48 chromosomes which had to be there. Not our two researchers, who boldly, and correctly, claimed that everyone else was wrong. See Arbesman (2012). In a similar vein, albeit more trivial, how many scientists know the following (Dicken 2018, back cover):

When Galileo dropped cannonballs from the top of the Leaning Tower of Pisa, he did more than overturn centuries of scientific orthodoxy. At a stroke, he established a new conception of the scientific method based upon careful experimentation and rigorous observation [...].

The problem is that Galileo never performed his most celebrated experiment in Pisa. In fact, he rarely conducted any experiments at all.

Also recall Sect. 5.3.1 describing the initially favorable relationship Galileo and the Church enjoyed.

Finally, it is worth noting that science has no intrinsic aim, other than blind advancement, and is also not goal-driven. Kuhn famously and influentially argued that science progresses by sudden, unforeseeable disruptions. Initially, he viewed these paradigm shifts in science (recall Sect. 9.1.3) as being based on faith, fashion, and peer pressure, where evidence and reason play only a minor role. Moreover, he believed science was largely a non-rational activity. Kuhn later moderated his tone and offered a less radical vision of his ideas. For instance, the notion that there exists no algorithm for theory choice in science—scientific progress is inherently opaque. See Okasha (2002). In any case, how free are scientists really to steer the direction of research? As an example (Harari 2015, p. 303):

During the past 500 years modern science has achieved wonders thanks largely to the willingness of governments, businesses, foundations and private donors to channel billions of dollars into scientific research.

[...]

Why did the billions start flowing from government and business coffers into labs and universities? In academic circles, many are naive enough to believe in pure science. They believe the government and businesses altruistically give them money to pursue whatever research projects strike their fancy.

Essentially (Harari 2015, p. 304):

Scientists themselves are not always aware of the political, economic and religious interests that control the flow of money; many scientists do, in fact, act out of pure intellectual curiosity. However, only rarely do scientists dictate the scientific agenda.

Today, many scientists feel a lot of pressure, as they see the amount of scientific funding declining globally. More and more time is spent drafting funding proposals, which can drain substantial resources from research (Powell 2016). Aspects relating to marketing and bureaucracy become relevant. Will science simply come to an end because the general population fails to see its benefits anymore and many politicians are thus happy to pull the plug? A very real concern in our post-truth world, where a climate of rising populism sees experts as a threat.

2.3 The Fractal Nature of Knowledge

The epitome of scientific progress is recounted by the theoretical physicist Sidney Coleman (quoted in Moriyasu 1983, p. 119):

There is a popular model of a breakthrough in theoretical physics. A field of physics is afflicted with a serious contradiction. Many attempts are made to resolve the contradiction; finally, one succeeds. The solution involves deep insights and concepts previously thought to have little or nothing to do with the problem. It unifies old phenomena and predicts unexpected (but eventually observed) new ones. Finally, it generates new physics: the methods used are successfully extended beyond their original domain.

While such upheavals were common in the past, today they have become exceedingly rare events. The increments at which science progresses appear to be becoming infinitesimal. Knowledge seems to possess a fractal-like nature, akin to an abstract space into which the human mind can zoom indefinitely without the richness of structure ever diminishing.

This paradox has been observed by some thinkers. In the words of Deutsch (2011, p. 64):

The deeper an explanation is, the more new problems it creates. That must be so, if only because there can be no such thing as an ultimate explanation: just as “the gods did it” is always a bad explanation, so any other purported foundation of all explanations must be bad too.

Similarly, Popper’s eloquent prose, taken from Popper (1992, p. 8):

I think there is only one way to science—or to philosophy, for that matter: to meet a problem, to see its beauty and fall in love with it; to get married to it and to live with it happily, till death do ye part—unless you should meet another and even more fascinating problem or unless, indeed, you should obtain a solution. But even if you do obtain a solution you may then discover, to your delight, the existence of a whole family of enchanting, though perhaps difficult, problem children.

In his widely acclaimed bestselling novel Zen and the Art of Motorcycle Maintenance, which was rejected by over one hundred publishers, Robert M. Pirsig addresses related issues. The book “should in no way be associated with that great body of factual information relating to orthodox Zen Buddhist practice,” neither is it “very factual on motorcycles” (Pirsig 1981, Author’s Note). It is, however, a miniature study of the art of rationality itself. Pirsig argues that although thought may find truth, it may not be valid for all experiences. Again, the closer one examines a phenomenon, the more perplexing it becomes, as every explanation seems to open the door to countless new puzzles (Pirsig 1981, p. 101):

The more you look, the more you see. [A]s you try to move toward unchanging truth through the application of scientific method, you actually do not move toward it at all. You move away from it! It is your application of scientific method that is causing it to change!

Finally (Pirsig 1981, p. 101):

Through multiplication upon multiplication of facts, information, theories and hypotheses, it is science itself that is leading mankind from single absolute truths to multiple, relative ones.

These words have a distinct postmodern ring to them. To close, Wheeler offers a haunting paradox (quoted in Horgan 1997, p. 84):

At the heart of everything is a question, not an answer. When we peer down into the deepest recesses of matter or at the farthest edge of the universe, we see, finally, our own puzzled faces looking back at us.

3 The Practitioners of Science

Usually, scientists aren’t very vocal about their personal experiences of practicing science. Science is a craft not to be burdened with intangible and immaterial overhead—in stark contrast to philosophers, whose trade it is to unearth vexing issues relating to the nature of knowledge, reality, and the human mind. The problem with knowing what beliefs scientists hold dear is that, by definition, this information is non-scientific. Hence, one searches the peer-reviewed literature for such revelations in vain.Footnote 12 Sometimes, though, scientists will give a glimpse of their inner worlds in popular science books they author. Other times, they are explicitly asked to reveal their very personal ideas about the universe.

In 1988, the literary agent and author John Brockman founded the Edge Foundation.Footnote 13 It has become an intellectual platform for scientists and deep thinkers to directly convey their thoughts to the public in a manner readily accessible to non-specialists. The tag-line on their website reads:

To arrive at the edge of the world’s knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.

Since 1998, the Edge poses an annual questionFootnote 14 to a diverse group of physicists, mathematicians, biologists, computer scientists, philosophers, etc. In 2005, the question was: What do you believe is true even though you cannot prove it? The compilation of the answers was published as a book edited by Brockman, titled What We Believe but Cannot Prove: Today’s Leading Thinkers on Science in the Age of Certainty (Brockman 2006). Ever since, the annual question has resulted in a book:  

  • 2006: What is your dangerous idea? (Brockman 2007a)

  • 2007: What are you optimistic about? (Brockman 2007b)

  • 2008: What have you changed your mind about? (Brockman 2009)

  • 2009: What will change everything? (Brockman 2010)

  • 2010: How is the Internet changing the way you think? (Brockman 2011)

  • 2011: What scientific concept would improve everybody’s cognitive toolkit? (Brockman 2012)

  • 2012: What is your favorite deep, elegant, or beautiful explanation? (Brockman 2013)

  • 2013: What should we be worried about? (Brockman 2014)

  • 2014: What scientific idea is ready for retirement? (Brockman 2015a)

  • 2015: What do you think about machines that think? (Brockman 2015b)

  • 2016: What do you consider the most interesting recent [scientific] news? (Brockman 2016)

  • 2017: What scientific term or concept ought to be more widely known? (Brockman 2017)

  • 2018: What is the last question?

Browsing through these books will give the reader insights into the amazing diversity and creativity of ideas. Naturally, many of the revealed beliefs are polar opposites—divergence and contradictions abound. Nonetheless, in such moments of honesty and intimacy we can glimpse behind the scenes and gauge the minds of contemporary intellectuals.Footnote 15 What becomes apparent is that many thinkers can acknowledge limits in knowledge and accept uncertainty and ambiguity—and even ignorance. Looking back at the spectacular success of human knowledge generation (see Part I), we are, somewhat anxiously, anticipating a future in which radical and transcendent new visions of the true nature of reality and consciousness are expected to emerge from the borders of knowledge (see Part III).

3.1 On Philosophy

Sam Harris (Brockman 2015a):

Search your mind, or pay attention to the conversations you have with other people, and you will discover that there are no real boundaries between science and philosophy.

We must abandon the idea that science is distinct from the rest of human rationality.

Rebecca Newberger Goldstein (Brockman 2015a):

You can’t argue for science making philosophy obsolete without indulging in philosophical arguments. You’re going to need to argue, for example, for a clear criterion for distinguishing between scientific and non-scientific theories of the world.

A triumphalist scientism needs philosophy to support itself.

Paul Bloom (Brockman 2012):

Scientists can reject common wisdom, they can be persuaded by data and argument to change their minds. It is through these procedures that we have discovered extraordinary facts about the world, such as the structure of matter and the evolutionary relationship between monkey and man.

The cultivation of reason isn’t unique to science; other disciplines such as mathematics and philosophy possess it as well. But it is absent in much of the rest of life.

Melanie Swan (Brockman 2013):

Therefore some of the best explanations may have the parameters of being intuitively beautiful and elegant, offering an explanation for the diverse and complicated phenomena found in the natural universe and human-created world, being universally applicable or at least portable to other contexts, and making sense of things at a higher order. Fields like cosmology, philosophy, and complexity theory have already delivered in this exercise: they encompass many other science fields in their scope and explain a variety of micro and macro scale phenomena.

3.2 On Objectivity, Truth, Knowledge, and Certainty

Gavin Schmidt (Brockman 2015a):

We continually read about the search for the one method that will allow us to cut through the confusion, the one piece of data that tell us the “truth” , or the final experiment that will “prove” the hypothesis. But almost all scientists will agree that these are fool’s errands—that science is [a] method for producing incrementally more useful approximations to reality, not a path to absolute truth.

Mihaly Csikszentmihalyi (Brockman 2015a):

What needs to be retired is the faith that what scientists say are objective truths, with a reality independent of scientific claims. Some are indeed true, but others depend on so many initial conditions that they straddle the boundary between reality and fiction.

Scott Sampson (Brockman 2015a):

One of the most prevalent ideas in science is that nature consists of objects.

Yet this pervasive, centuries-old trend toward reductionism and objectification tends to prevent us from seeing nature as subjects, though there’s no science to support such myopia.

Alan Kay (Brockman 2006):

When we guess in science we are guessing about approximations and mappings to languages, we are not guessing about “the truth” (and we are not in a good state of mind for doing science if we think we are guessing “the truth” or “finding the truth”) . This is not at all well understood outside of science, and there are unfortunately a few people with degrees in science who don’t seem to understand it either.

Timothy Taylor (Brockman 2006):

If science fetishized truth, it would be religion, which it is not.

Michael Shermer (Brockman 2006):

Our knowledge of nature remains provisional because we can never know if we have final Truth.

In science, knowledge is fluid and certainty fleeting.

Clifford Pickover (Brockman 2014):

Should we be so worried that we will not really be able to understand subatomic physics, quantum theory, cosmology, or the deep recesses of mathematics and philosophy? Perhaps we can let our worries slightly recede and just accept our models of the universe when they are useful.

Nicholas G. Carr (Brockman 2017):

But what if our faith in nature’s knowability is just an illusion, a trick of the overconfident human mind?

Lawrence M. Krauss (Brockman 2017):

Nothing feels better than being certain, but science has taught us over the years that certainty is largely an illusion. In science, we don’t “believe” in things, or claim to know the absolute truth.

3.3 On Laws of Nature, Reality, and Science

Lawrence M. Krauss (Brockman 2015a):

[T]he laws of nature we measure may be totally accidental, local to our environment (namely our Universe), not prescribed with robustness by any universal principle, and by no means generic or required.

[T]here may be nothing fundamental whatsoever about the “fundamental” laws we measure in our universe. They could simply be accidental. Physics becomes, in this sense, an environmental science.

Gregory Benford (Brockman 2009):

I once thought that the laws of our universe were unquestionable, in that there was no way for science to address the question. Now I’m not so sure. Can we hope to construct a model of how laws themselves arise?

Charles Seife (Brockman 2009):

Science is about freedom of thought, yet at the same time it imposes a tyranny of ideas.

Colin Tudge (Brockman 2009):

I have changed my mind about the omniscience and omnipotence of science. I now realize that science is strictly limited, and that it is extremely dangerous not to appreciate this.

Haim Harari (Brockman 2009):

The public thinks, incorrectly, that science is a very accurate discipline where everything is well defined.

Donald D. Hoffman (Brockman 2015a):

Observation is the empirical foundation of science. The predicates of this foundation, including space, time, physical objects and causality, are a species-specific adaptation, not an insight.

Ian Bogost (Brockman 2015a):

To think that science has a special relationship to observations about the material world isn’t just wrong, it’s insulting.

But ironically, in its quest to prove itself as the supreme form of secular knowledge, science has inadvertently elevated itself into a theology. Science is not a practice so much as it is an ideology.

Satyajit Das (Brockman 2015a):

While not strictly a scientific theorem, anthropocentrism, the assessment of reality through an exclusively human perspective, is deeply embedded in science and culture.

Like a train that can only run on tracks that determine direction and destination, human knowledge may ultimately be constrained by what evolution has made us.

Science, paradoxically, seems to also have inbuilt limits. Like an inexhaustible Russian doll, quantum physics is an endless succession of seemingly infinitely divisible particles. Werner Heisenberg’s uncertainty principle posits that human knowledge about the world is always incomplete, uncertain and highly contingent. Kurt Gödel’s incompleteness theorems of mathematical logic establish inherent limitations of all but the most trivial axiomatic systems of arithmetic.

Sarah Demers (Brockman 2015a):

Of course, including aesthetic considerations in the scientific toolbox has resulted in huge leaps forward.

At this stage, with 96% of the universe’s content in the dark, it is a mistake for us to put aesthetic concerns in the same realm as contradictions when it comes to theoretical motivation. With no explanation for dark energy, no confirmed detection of dark matter and no sufficient mechanism for matter/antimatter asymmetry, we have too many gaps to worry about elegance.

Max Tegmark (Brockman 2009):

After all, physical reality has turned out to be very different from how it seems, and I feel that most of our notions about it have turned out to be illusions.

From your subjectively perceived frog perspective, the world turns upside down when you stand on your head and disappears when you close your eyes, yet you subconsciously interpret your sensory inputs as though there is an external reality that is independent of your orientation, your location and your state of mind.

Jean Paul Schmetz (Brockman 2006):

[...] our body of scientific knowledge is surely full of statements we believe to be true but will eventually be proved to be false.

Donald D. Hoffman (Brockman 2016):

Nobel Laureate David Gross observed, “Everyone in string theory is convinced...that spacetime is doomed. But we don’t know what it’s replaced by.” Fields medalist Edward Witten also thought that space and time may be “doomed.” Nathan Seiberg of the Institute for Advanced Study at Princeton said, “I am almost certain that space and time are illusions. These are primitive notions that will be replaced by something more sophisticated.”

Tor Nørretranders (Brockman 2010):

The visual world, what we see, is an illusion, but then a very sophisticated one. There are no colours, no tones, no constancy in the “real” world, it is all something we make up. We do so for good reasons and with great survival value.

3.4 On Ignorance and Irrationality

Paul Saffo (Brockman 2015a):

The science establishment justifies its existence with the big idea that it offers answers and ultimately solutions. But privately, every scientist knows that what science really does is discover the profundity of our ignorance.

Robert Provine (Brockman 2015a):

We fancy ourselves intelligent, conscious and alert, and thinking our way through life. This is an illusion. We are deluded by our brain’s generation of a sketchy, rational narrative of subconscious, sometimes irrational or fictitious events that we accept as reality.

Tom Griffiths (Brockman 2015a):

And when psychology experiments show that people are systematically biased in the judgments they form and the decisions they make, we begin to question human rationality.

Alex Pentland (Brockman 2015a):

It is time that we dropped the fiction of individuals as the unit of rationality, and recognized that we are embedded in the surrounding social fabric.

Margaret Wertheim (Brockman 2006):

In truth our ignorance is vast—and personally I believe it will always be so.

Dylan Evans (Brockman 2017):

If we could represent the knowledge in any given brain as dry land, and ignorance as water, then even Einstein’s brain would contain just a few tiny islands scattered around in a vast ocean of ignorance. Yet most of us find it hard to admit how little we really know.

3.5 On the Mind

Susan Blackmore (Brockman 2015a):

Consciousness is not some weird and wonderful product of some brain processes but not others. Rather, it is an illusion constructed by a clever brain and body in a complex social world. We can speak, think, refer to ourselves as agents and so build up the false idea of a persisting self that has consciousness and free will.

Jerry A. Coyne (Brockman 2015a):

In short, the traditional notion of free will—defined by Anthony Cashmore as “a belief that there is a component to biological behavior that is something more than the unavoidable consequences of the genetic and environmental history of the individual and the possible stochastic laws of nature”—is dead on arrival.

Tania Lombrozo (Brockman 2015a):

In our enthusiasm to find a scientifically-acceptable alternative to dualism, some of us have gone too far the other way, adopting a stark reductionism. Understanding the mind is not just a matter of understanding the brain.

Bruce Hood (Brockman 2015a):

We know that the self is constructed because it can be so easily deconstructed through damage, disease and drugs. It must be an emergent property of a parallel system processing input, output and internal representations. It is an illusion because it feels so real, but that experience is not what it seems.

Daniel Goleman (Brockman 2009):

Science found that, compared to novices, highly adept meditators generated far more high-amplitude gamma wave activity—which reflects finely focused attention—in areas of the prefrontal cortex while meditating.

The seasoned meditators in this study—all Tibetan lamas—had undergone cumulative levels of mental training akin to the amount of lifetime sports practice put in by Olympic athletes: 10,000 to 50,000 hours. Novices tended to increase gamma activity by around 10 to 15 percent in the key brain area, while most experts had increases on the order of 100 percent from baseline. What caught my eye in this data was not this difference between novices and experts (which might be explained in any number of ways, including a self-selection bias), but rather a discrepancy in the data among the group of Olympic-level meditators.

Although the experts’ average boost in gamma was around 100 percent, two lamas were “outliers” : their gamma levels leapt 700 to 800 percent. This goes far beyond an orderly dose-response relationship—these jumps in high-amplitude gamma activity are the highest ever reported in the scientific literature apart from pathological conditions like seizures. Yet the lamas were voluntarily inducing this extraordinarily heightened brain activity for just a few minutes at a time—and by meditating on “pure compassion,” no less.

I have no explanation for this data, but plenty of questions. At the higher reaches of contemplative expertise, do principles apply (as the Dalai Lama has suggested in dialogues with neuroscientists) that we do not yet grasp? If so, what might these be? In truth, I have no idea. But these puzzling data points have pried open my mind a bit as I’ve had to question what had been a rock-solid assumption of my own.

Lutz et al. (2004) is the publication Goleman is referring to here. See also Sect. 7.4.2.1 for an account of Matthieu Ricard, a molecular geneticist turned Buddhist monk, displaying exceptional powers of self-awareness and control, in the context of compassion and meditation.

3.6 And More

W. Daniel Hillis (Brockman 2015a):

The cause-and-effect paradigm works particularly well when science is used for engineering, to arrange the world for our convenience. In this case, we can often set things up so that the illusion of cause-and-effect is almost a reality.

The notion of cause-and-effect breaks down when the parts that we would like to think of as outputs affect the parts that we would prefer to think of as inputs. The paradoxes of quantum mechanics are a perfect example of this, where our mere observation of a particle can “cause” a distant particle to be in a different state. Of course there is no real paradox here, there is just a problem with trying to apply our storytelling framework to a situation where it does not match.

Nigel Goldenfeld (Brockman 2015a):

If the stuff that makes the universe is strongly connected in space and not usefully thought of as the aggregate of its parts, then attributing a cause of an event to a specific component may not be meaningful either. Just as you can’t attribute the spin of a proton to any one of its constituents, you can’t attribute an event in time to a single earlier cause. Complex systems have neither a useful notion of individuality nor a proper notion of causality.

Marcelo Gleiser (Brockman 2015a):

The trouble starts when we take this idea too far and search for the Über-unification, the Theory of Everything, the arch-reductionist notion that all forces of nature are merely manifestations of a single force. This is the idea that needs to go. And I say this with a heavy heart; my early career aspirations and formative years were very much fueled by the impulse to unify it all.

Why do so many insist in finding the One in Nature while Nature keeps telling us it’s really about the Many? For one thing, the scientific impulse to unify is crypto-religious. The West has bathed in monotheism for thousands of years [...].

The belief is that nature’s ultimate code exists in the ethereal world of mathematical truths and we can decipher it. Recent experimental data has been devastating to such belief [...].

We may hold perfection in our mind’s eye as a sort of ethereal muse. Meanwhile nature is out there doing its thing. That we manage to catch a glimpse of its inner workings is nothing short of wonderful.

Marcelo Gleiser (Brockman 2009):

The model of unification, which is so aesthetically appealing, may be simply this, an aesthetically appealing description of Nature, which, unfortunately, doesn’t correspond to physical reality. Nature doesn’t share our myths.

4 The Limits of Mathematics

The metaphor of the Book of Nature relies on the assumption that mathematics is the sole source of all exact knowledge of the world. By translating aspects of the physical world into formal abstractions, the human mind can unlock novel understanding of the workings of the universe (Sect. 2.1). Indeed, mathematical beauty was understood as a guiding principle in physics and a seemingly simple principle of symmetry unearthed some of the deepest understanding of reality (Chaps. 3 and 4). Platonism is the notion that a realm of perfect abstractions exists where all mathematical entities reside. In other words, mathematics has its own reality. In this sense, mathematics is discovered and not invented by the human mind. Notwithstanding the philosophical issues which are implied (Sect. 2.2.1), many of the greatest mathematicians were and are self-proclaimed Platonists (Sect. 2.2).

4.1 Inherent Randomness

Similar to the decline of science and philosophy chronicled in this chapter, mathematics also experienced a demotion. Ironically, at a time when mathematicians were establishing the foundations of mathematics, based on a complete set of consistent axioms,Footnote 16 disaster struck. Out of nowhere, Kurt Gödel—also a defender of mathematical Platonism—destroyed any hopes of establishing a solid foundation of mathematics. His incompleteness theorems proved that every consistent formal axiomatic system containing basic arithmetic is incomplete, and cannot even prove its own consistency (Sect. 2.2). In other words, the basic expectation that every true statement can be proved within the system is—to everyone’s dismay—untenable.
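
Stated slightly more formally (this is the standard textbook formulation, added here for precision rather than taken from the source): for any theory \(T\) that is consistent, recursively axiomatizable, and contains basic arithmetic, there exists a sentence \(G_T\) such that

$$\begin{aligned} T \nvdash G_T \qquad \text {and} \qquad T \nvdash \lnot G_T , \end{aligned}$$

and, by the second incompleteness theorem, \(T \nvdash \mathrm {Con}(T)\), i.e., \(T\) cannot prove its own consistency.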

Building on Gödel’s work, Alan Turing expanded the scope of the conundrum to computation (Turing 1936). In essence, the uncertainty discovered by Gödel now spread and plagued the mathematical foundations of the newly emerging computer science. Turing’s so-called halting problem is about undecidability. It is impossible for a computer to decide in advance whether a given program will ever finish its task and halt. The only way to find out if a program will ever halt is to run it and wait—ten minutes, ten billion years, or forever. See also Sect. 13.1.2.
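
The impossibility can be made tangible with the classic self-referential argument. The following Python sketch is added here purely as an illustration (the function names are hypothetical, and the point is precisely that halts cannot be implemented):

```python
# Illustrative sketch of Turing's diagonal argument (hypothetical names; the
# whole point is that no general 'halts' oracle can exist).

def halts(program, argument):
    """Hypothetical oracle: True iff program(argument) eventually halts."""
    raise NotImplementedError("Turing proved this function cannot exist in general.")

def paradox(program):
    """Do the opposite of whatever the oracle predicts about running program on itself."""
    if halts(program, program):
        while True:   # loop forever if the oracle says "it halts"
            pass
    else:
        return        # halt immediately if the oracle says "it loops forever"

# Feeding paradox to itself creates a contradiction either way:
# if paradox(paradox) halts, the oracle must have said it loops, and vice versa.
# Hence no general halting checker is possible; one can only run and wait.
```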

Decades later, the mathematician and computer scientist Gregory Chaitin continued where Turing left off, yet again extending Gödel’s haunting legacy. He translated Turing’s question, of whether a program halts, into a real number between 0 and 1. In essence, this uncomputable number—called Omega—reflects the probability that an arbitrary computer program will eventually halt (Chaitin 1975). “It’s the outstanding example of something which is unknowable in mathematics,” Chaitin says (quoted in Chown 2001).
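
For reference, the following compact definition is standard (it is not quoted from the source): for a prefix-free universal machine \(U\), Chaitin’s halting probability can be written as

$$\begin{aligned} \Omega _U = \sum _{p \,:\, U(p) \text { halts}} 2^{-|p|}, \end{aligned}$$

where \(|p|\) is the length of the program p in bits; the prefix-free condition ensures that \(0< \Omega _U < 1\).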

Unfortunately, Omega is more than an academic curiosity. It is not some esoteric number appearing at the fringes of mathematics. Chaitin’s halting probability is intimately linked to simple mathematical operations, such as the addition and multiplication of whole numbers. Randomness lurks at the heart of mathematics. After decades of fundamental research, the verdict is in (Calude and Chaitin 1999):

[...] randomness is as fundamental and as pervasive in pure mathematics as it is in theoretical physics. In our opinion it also gives further support to “experimental mathematics”, and to the “quasi-empirical” view of mathematics which says that although mathematics and physics are different, it is more a matter of degree than black and white.

For millennia, people have regarded mathematics as an outstanding intellectual construction of humankind. Mathematics was viewed as the pinnacle of rational thinking and human reasoning. Alas, today we know, as explained in the words of Chaitin, that (quoted in Chown 2001):

Mathematicians are simply acting on intuition and experimenting with ideas, just like everyone else. Zoologists think there might be something new swinging from branch to branch in the unexplored forests of Madagascar, and mathematicians have hunches about which part of the mathematical landscape to explore. The subject is no more profound than that.

Most of mathematics is true for no particular reason. Maths is true by accident.

Chaitin’s mathematical curse grows worse. There exist even more disturbing numbers, called Super-Omegas (Becher et al. 2001). All these “incalculable numbers reveal that mathematics is not simply moth-eaten, it is mostly made of gaping holes. Anarchy, not order, is at the heart of the Universe” (Chown 2001). This is a truly unexpected and devastating blow to the supremacy of mathematics and any intellectual tradition building upon it. Indeed (Chaitin 2005, p. 146):

Let me repeat: formal axiomatic systems are a failure!

There exists a real-world problem related to randomness. In 1930, the philosopher, mathematician, and economist Frank P. Ramsey proved an innocuous theorem in graph theory (Ramsey 1930). In detail, the proof concerned itself with the relationship between groups of points in a network. This turned out to have deep implications, as a “network” can be a collection of all manner of things, from computers in a network to people at a dinner party or stars in the night sky. In essence, pattern-free randomness is impossible. Every random collection of things will contain patterns: mysterious order emerges from apparent randomness (indeed, recall Benford’s law from Sect. 6.4.2). Ramsey theory says that this order is not only likely, but becomes inevitable as the number of nodes in the network increases. Adding insult to injury, our minds suffer from apophenia, a cognitive bias describing the tendency to perceive connections and meaning between unrelated things (see Sect. 11.3.2 for more on cognitive biases). We are truly exposed to a profound randomness/pattern dichotomy: from the fundamental randomness in the theories the human mind devises, to the pattern-formation emerging out of randomness, to the mind’s propensity to see patterns everywhere—the illusion of order in a random universe.
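
The smallest nontrivial case can serve as a concrete illustration (the following script is an added sketch, not taken from the source): among any six people, there are always either three mutual acquaintances or three mutual strangers, since the Ramsey number R(3,3) equals 6. A brute-force check over all \(2^{15}\) two-colorings of the edges of the complete graph \(K_6\) confirms that a monochromatic triangle is unavoidable:

```python
from itertools import combinations, product

# Vertices and the 15 edges of the complete graph K6.
vertices = range(6)
edges = list(combinations(vertices, 2))

def has_monochromatic_triangle(colouring):
    """colouring maps each edge to 0 ('acquainted') or 1 ('strangers')."""
    for triple in combinations(vertices, 3):
        triangle_edges = combinations(triple, 2)
        if len({colouring[e] for e in triangle_edges}) == 1:
            return True   # all three edges carry the same colour
    return False

# Exhaustively test all 2**15 = 32768 possible two-colourings.
print(all(
    has_monochromatic_triangle(dict(zip(edges, bits)))
    for bits in product((0, 1), repeat=len(edges))
))  # True: order is inevitable once the network reaches six nodes
```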

A final example of the failings of mathematics relates to politics. Specifically, clear and justified rules for apportionment in a political system produce results which are unexpected and appear to violate common sense. Such deficiencies are summarized in the apportionment paradoxes and call rational decision-making into question (Arrow 1950; Balinski and Young 1975, 2001). Indeed (Deutsch 2011, p. 333):

Sometimes politicians have been so perplexed by the sheer perverseness of apportionment paradoxes that they have been reduced to denouncing mathematics itself. Representative Roger Q. Mills of Texas complained in 1882, ’I thought ...that mathematics was a divine science. I thought that mathematics was the only science that spoke to inspiration and was infallible in its utterances [but] here is a new system of mathematics that demonstrates the truth to be false.’

4.2 Losing Meaning

In Sect. 9.1.4 above, the following question was asked:

Has modern physics itself really transformed into a postmodern narrative that defies meaning, clarity, and understanding?

The same question can be posed for mathematics. As the discipline becomes ever more technical, detailed, and abstract, fewer and fewer people can understand its subtlety and profundity.

Consider Fermat’s Last Theorem of 1637 (Wiles 1995):

Theorem 9.1

There are no non-zero solutions to the equation \(x^n+y^n=z^n\), where x, y, z, and n are integers with \(n > 2\).

It took over 350 years before, in 1995, Andrew Wiles presented a proof. It ran to over a hundred pages and employed novel, previously unrelated, mathematical methods (Wiles 1995). What could be more demanding than a 100-page proof? Perhaps a proof carried out by a computer. The four-color theorem (Sect. 5.4.1) was proved in such a way. This raises the question “about whether a ‘proof’ that no one understands is a proof” (Colyvan 2012).

Then there is the tale of Shinichi Mochizuki (Castelvecchi 2015):

Sometime on the morning of 30 August 2012, Shinichi Mochizuki quietly posted four papers on his website.

The papers were huge—more than 500 pages in all—packed densely with symbols, and the culmination of more than a decade of solitary work. They also had the potential to be an academic bombshell. In them, Mochizuki claimed to have solved the abc conjecture, a 27-year-old problem in number theory that no other mathematician had even come close to solving. If his proof was correct, it would be one of the most astounding achievements of mathematics this century and would completely revolutionize the study of equations with whole numbers.

Everyone—even those whose area of expertise was closest to Mochizuki’s—was just as flummoxed by the papers [...]. To complete the proof, Mochizuki had invented a new branch of his discipline, one that is astonishingly abstract even by the standards of pure maths. “Looking at it, you feel a bit like you might be reading a paper from the future, or from outer space,” number theorist Jordan Ellenberg, of the University of Wisconsin-Madison, wrote on his blog a few days after the paper appeared.

Then, in 2016 (Castelvecchi 2016):

Nearly four years after Shinichi Mochizuki unveiled an imposing set of papers that could revolutionize the theory of numbers, other mathematicians have yet to understand his work or agree on its validity—although they have made modest progress.

Finally, in 2017 (Revell 2017):

“A small number of those close to Mochizuki claim to understand the proof, but they have had little success in explaining their understanding to others,” wrote Peter Woit at Columbia University in a blog post.

It does not help that mathematicians can be strange creatures (Castelvecchi 2015):

Mochizuki, however, did not make a fuss about his proof. The respected mathematician [...] did not even announce his work to peers around the world. He simply posted the papers, and waited for the world to find out.

Adding to the enigma is Mochizuki himself. He has so far lectured about his work only in Japan, in Japanese, and despite being fluent in English, he has declined invitations to talk about it elsewhere. He does not speak to journalists; several requests for an interview for this story went unanswered.

This tendency to work in isolation and avoid interaction with the world has similarities to the conduct of another ingenious mathematician, which also alienated the community. Grigori Perelman shot to fame in 2003 after solving the century-old Poincaré conjecture (Singer 2004). For this achievement he was awarded the prestigious Fields Medal, considered to be the greatest accolade in mathematics. In addition, he was awarded the $1 million Millennium PrizeFootnote 17 in recognition of his proof. Perelman, unprecedentedly, declined both prizes and noted (quoted in BBC News 2010):

I’m not interested in money or fame.

I don’t want to be on display like an animal in a zoo. I’m not a hero of mathematics. I’m not even that successful; that is why I don’t want to have everybody looking at me.

Mathematics not only defies meaning when we don’t understand what is going on, but, more seriously, also when we do understand. For instance, the deeply counterintuitive Banach–Tarski theorem (Banach and Tarski 1924) states that (Colyvan 2012, p. 152):

[...] a solid sphere can be decomposed into a finite number of pieces, the pieces moved around via rigid rotations and translations, and recombined into two spheres, each equal in volume to the first.

As a further counterintuitive example, consider the following equation

Theorem 9.2

$$\begin{aligned} 1+2+3+4+5+6 + \cdots = -\frac{1}{12}. \end{aligned}$$
(9.1)

How can the sum of all positive integers be a negative fraction?

Proof

Let \(S_1 = 1 -1 +1-1+1-1+ \cdots \) Then

$$\begin{aligned} \begin{aligned} 1 - S_1&= 1 - ( 1 - 1 + 1 -1 +1 -\cdots )\\&= 1 - 1 + 1 -1 +1- 1+\cdots = S_1. \end{aligned} \end{aligned}$$
(9.2)

Hence

$$\begin{aligned} S_1 = \frac{1}{2}. \end{aligned}$$
(9.3)

Let \(S_2 = 1-2+3-4+5-6+ \cdots \) Then

$$\begin{aligned} \begin{aligned} 2 S_2&= 1-2+3-4+5-6+ \cdots \\&+1-2+3-4+5- \cdots \\&= 1-1+1-1+1-1+\cdots = S_1 = \frac{1}{2}. \end{aligned} \end{aligned}$$
(9.4)

Hence

$$\begin{aligned} S_2 = \frac{1}{4}. \end{aligned}$$
(9.5)

Now consider

$$\begin{aligned} \begin{aligned} S- S_2 =&1+2+3+4+5+6+ \cdots \\ -(&1-2+3-4+5- 6+ \cdots )\\ =&0 + 4 + 0 +8 + 0 +12 +\cdots \end{aligned} \end{aligned}$$
(9.6)

Or, equivalently

$$\begin{aligned} S- S_2 =4 ( 1+2+3+4+5+6+\cdots ) = 4 S. \end{aligned}$$
(9.7)

Substituting (9.5) yields

$$\begin{aligned} S - \frac{1}{4} = 4 S, \end{aligned}$$
(9.8)

or \(S = -\frac{1}{12}.\)

\(\square \)

This bizarre proof highlights the failure of human intuition when faced with infinite sums. A rigorous proof of Theorem 9.2, looking less like arithmetic sleight-of-hand, can be found using the Riemann zeta function (Stopple 2003)

$$\begin{aligned} \zeta (s) =\sum _{n=1}^\infty \frac{1}{n^s}, \end{aligned}$$
(9.9)

with

$$\begin{aligned} S=\zeta (-1) = -\frac{1}{12}. \end{aligned}$$
(9.10)

However, even more puzzlingly, Theorem 9.2 is relevant in modern theoretical physics, as it constrains bosonic string theory to 26-dimensional space-time (Polchinski 2005, p. 22).
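
As a small added aside (using the mpmath library, which is not mentioned in the source), the value in (9.10) can be checked numerically via the analytic continuation of the zeta function:

```python
from mpmath import mp, zeta

mp.dps = 30              # work with 30 significant digits
print(zeta(-1))          # -0.0833333..., i.e. -1/12, matching (9.10)
print(-1.0 / 12.0)       # -0.08333333333333333, for comparison
```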

A further example of a deep and eerie connection between mathematics and reality is the number \(\pi \). It is defined as the ratio of the circumference of a circle to its diameter. It is an irrational number (i.e., it cannot be expressed as a fraction) and it is also a transcendental number (i.e., it is not the solution of any polynomial with rational coefficients). \(\pi \) has an infinite number of digits in its decimal representation and no repeating pattern ever occurs. Magically, a formula for \(\pi \) appears in a basic calculation in the physics of the hydrogen atom (Friedmann and Hagen 2015). Then there is the claim that the distribution of the prime numbers follows the energy levels of a quantum system (Bender et al. 2017).
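
In the standard reading of that result, the formula in question is the classic Wallis product for \(\pi \). The following short numerical sketch is added here as an illustration (it is not part of the source) and shows the slow but steady convergence of the partial products:

```python
# Partial Wallis product: pi = 2 * prod_{n=1..N} (2n)^2 / ((2n-1)(2n+1)).
def wallis_pi(terms):
    prod = 1.0
    for n in range(1, terms + 1):
        prod *= (2.0 * n) ** 2 / ((2.0 * n - 1.0) * (2.0 * n + 1.0))
    return 2.0 * prod

for n in (10, 1000, 100000):
    print(n, wallis_pi(n))   # slowly approaches 3.14159... as n grows
```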

Initially, mathematics gave structure and order to human thinking. In the realm of the abstract, clear rules inexorably dictated its inner workings. The domain of relevance of mathematics exploded with the discovery of Volume I of the Book of Nature (Chaps. 2, 3, 4, and 5). This unprecedented and extraordinary success is overshadowed by the discovery of irreparable incompleteness and randomness in the foundations of mathematics. Moreover, how legitimate is a discipline which can only be comprehended by a handful of initiated people?