1 Introduction

Widespread dissatisfaction with Humean and Neo-Humean projects has led to a revival of interest in Aristotle-inspired theories of causal powers. This revival has great potential to illuminate issues in the philosophy of science and of nature. In particular, an Aristotelian perspective on the import of the quantum revolution would open up new avenues of thought. In this paper, I will sketch one such perspective.

In the first section, I describe the basic elements of a powers ontology, in contrast to its principal competitors, and I propose that two distinct philosophies of nature correspond to two of these ontologies (the Aristotelian and the Humean). Then, in Section 2, I argue that the quantum revolution has taken science in the direction of an Aristotelian metaphysics and philosophy of nature, a fact that has been noted by some (including Planck and Heisenberg) but which has not yet been widely recognized in contemporary philosophy of science. This new direction includes three components: potentiality, processes, and (most importantly) the need for a fundamentally real domain (beyond the microphysical) that includes experimenters and their instruments.

I explain in Sections 3, 4, and 5 why the Aristotelian philosophy of science offers an alternative to the reduction of special sciences to microphysics. An Aristotelian philosophy of nature rejects the modern prejudice in favor of the microscopic, a rejection that is crucial if we are to penetrate the mysteries of the quantum world.

The remainder of the paper is a defense of the Aristotelian model that draws on two areas of contemporary science: quantum chemistry and thermodynamics (Section 6) and the measurement problem (Section 7). I argue that the distinction between non-commuting (quantal) and commuting (classical) properties in quantum theory (a distinction that appears only when models are taken to the thermodynamic or continuum limit) provides the basis for a new version of the Copenhagen interpretation, an interpretation that is realist, holistic, and hylomorphic in character. This new version allows for the attribution of fundamental causal powers (both active and passive) to meso- and macroscopic entities, including human observers and their instruments.

My project encompasses three phases, three goals of increasingly ambitious character.

  1. Phase 1: sketch a hylomorphic, powerist interpretation of modern quantum theory, arguing that it represents a genuine and stable location in logical space.

  2. Phase 2: argue that there is no empirical evidence against the hylomorphic interpretation—that it is at least as well supported by data and scientific practice as is the microphysicalist, modern alternative.

  3. Phase 3: argue that the empirical evidence supports the hylomorphic interpretation over the other alternatives, including old Copenhagen, Bohm, objective-collapse, and Everett interpretations.

I will argue for Phase 1 in Sections 4 and 5, and for Phase 2 in Section 6, with special consideration of quantum theories of chemistry and thermodynamics. I’ll take up the case for Phase 3 in the concluding Section 7.

2 Four metaphysical options and two philosophies of nature

There is a natural class of phenomena that at least appears to involve a sort of physical or natural modality. This class includes three sub-classes: subjunctive and counterfactual conditionals, dispositions and causal powers, and causal laws of nature (see Koons and Pickavance 2017). It would be quite surprising if all three sub-classes included metaphysically fundamental facts, since it seems that some can be defined by or grounded in the others. Consequently, there are four ontological options:

  1. Powerism. Causal powers and dispositions are fundamental.

  2. Hypotheticalism. Facts expressed by means of subjunctive conditionals are fundamental.

  3. Nomism. Causal laws of nature are fundamental.

  4. Neo-Humeanism. None of these are fundamental, but all are grounded in the Humean mosaic of categorical qualities distributed across spacetime.

Hypotheticalism and Nomism have largely fallen out of favor. Hypotheticalism has waned because of the implausibility of the idea that anything fundamentally real corresponds to the world-selection function needed for the semantics of the subjunctive conditional. The relative closeness of two worlds seems too subjective and anthropocentric to be a metaphysical primitive. Nomism has faded because of the difficulty of bridging the gap between facts about laws and facts about particular patterns of fact. Bridging this gap means attributing an odd sort of causal power to the laws themselves. Thus, the two main competitors today are Powerism (or the powers ontology) and Neo-Humeanism.

Neo-Humeanism has gradually declined somewhat in popularity as it failed to provide adequate accounts of the directionality of time and causality, of dispositions and powers, of objective probability, and of scientific theory choice and induction (again, see Koons and Pickavance 2017). Hence, there has been increasing interest in a Powerist alternative. (Of course, I am not denying that the other three views have their contemporary defenders, nor am I claiming that the issue is a settled one.)

A viable powers ontology must include two additional elements: forms and processes. It is processes that manifest powers, and it is forms that ground them. Causal powers come in two kinds: active and passive. An active power initiates a process of change (kinesis) in some entity, and a passive power is the potentiality for undergoing such a process.

Powers appear in nature in natural clusters, and these power-clusters are the expression of the presence of Aristotelian forms (Inman 2018). Functionally equivalent or interchangeable forms constitute the basis of natural kinds of substances, whether essential or accidental. Without forms as the common ground of these repeatable clusters of powers, we would be left with a large number of massive brute coincidences. The substantial form of water explains why the active and passive powers associated with all instances of water are found so regularly in concert.

Active causal powers initiate ongoing processes of change. Without such processes, it would be impossible to explain how the past influences the future, unless we were to posit immediate action at a temporal distance. Processes of change in turn presuppose the existence of fundamentally enduring entities, the fundamental participants in these processes, and these participants must be subject to substantial forms that determine their persistence-conditions and their liabilities to accidental change or motion. Nature’s repertoire of forms determines what kinds of entities are metaphysically fundamental.

In contrast, the Neo-Humean ontology requires no fundamental processes or fundamentally enduring entities (with their substantial forms). Instead, what is fundamental is a framework of spacetime (or spatiotemporal relations), with regions occupied by one or more kinds of qualities or stuffs (the Humean mosaic). Time is metaphysically prior to change, since change is simply a matter of the appearance of different qualities at different times (Russell’s At-At theory). Laws of nature are grounded in brute-fact patterns of qualitative succession. On the Mill-Ramsey-Lewis model, a mathematical function counts as a law of nature just in case it is a theorem of the simplest axiomatization of the mosaic’s patterns.

The two ontologies of causation correspond closely to two philosophies of nature, philosophies that have been in competition since the later Middle Ages. We can call these the perennial (or scholastic) and the modern philosophies. On the perennial philosophy of nature, the task of science is to identify the substantial and accidental forms in nature, from which flow things’ active and passive capacities, which manifest themselves (in turn) in the form of activities and processes of change. Mathematics can be a useful tool in describing these capacities and processes, but science is primarily concerned with discovering the real definitions of natural kinds. In addition, the realm of potentiality is real and inescapable, even if in some sense dependent on the actual. The reality of potentiality (powers) corresponds to the reality of a kind of teleology: the natural intentionality (in George Molnar’s phrase) of the real but unmanifested potentialities of nature.

The perennial philosophy of nature is pluralistic, in that each kind of form could give rise to a distinct set of active and passive powers. This allowed for the possibility of fundamental entities studied in distinct theoretical domains, including chemistry and biology as well as physics. In fact, I will go even further and argue that the quantum revolution requires us to demote the status of microphysical entities, including particles and fields. We should reverse the usual understanding of emergence: it is microphysical phenomena that emerge from the more fundamental domain of chemistry, thermodynamics, and solid-state physics, not vice versa.

On the modern view, science is primarily about discovering fundamental mathematical relations that explain and in some sense govern observable phenomena. The task is to find increasingly general and simple formulas, from which all such mathematical relations can be derived through calculation. The realm of potentiality is unreal or imaginary, merely a result of human thought experiments. Natural reality is exhausted by what actually happens. The modern philosophy of science aspires to be absolutely unitary, discovering a single set of laws that apply to all interactions at all scales. In practice, this translates into the priority of the microscopic realm, since large-scale structures and patterns are nothing more than the sum of their small-scale components.

3 The quantum revolution

Perhaps the most important and yet often overlooked aspect of the quantum revolution is its elevation of physical potentialities to a level of indispensability, as Heisenberg recognized (Heisenberg 1958, p. 41). In modern philosophy of nature, the realm of potentiality can be treated as something unreal, as a mere mental construction or thought experiment. In quantum mechanics, however, what is merely potentially so has a real impact on what actually happens. This comes out very clearly in Richard Feynman’s sum-over-histories or path-integral formulation of QM. In order to predict what will actually happen, one must compute the probability amplitude corresponding to every possible path of the system from initial to final states.
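To make the point concrete, the path-integral prescription can be stated schematically as follows (standard textbook notation, supplied here for illustration rather than drawn from Feynman’s own text): the amplitude for a system to pass from an initial configuration \(x_i\) at \(t_i\) to a final configuration \(x_f\) at \(t_f\) is a sum of phase contributions, one for every kinematically possible trajectory,

\[ \langle x_f, t_f \mid x_i, t_i \rangle = \int \mathcal{D}[x(t)]\, e^{\,i S[x(t)]/\hbar}, \qquad S[x(t)] = \int_{t_i}^{t_f} L(x,\dot{x})\, dt, \]

and the probability of the actual outcome is obtained from the squared modulus of this amplitude. Merely possible paths thus enter ineliminably into the calculation of what actually happens.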

Since the time of Newton and Leibniz, physicists have had two sets of mathematical techniques for explaining and predicting the motion of bodies. One model, the Newtonian, treats force, mass, and instantaneous acceleration as the metaphysically fundamental properties, relying on vector addition (the parallelogram of forces) to work out the rate and direction of acceleration for each body. This model takes states and events as the primary reality, with a Russellian at-at theory of motion, and binary forces of attraction and repulsion between simple bodies as the ultimate drivers of physical action. This fits nicely with the microscopic or microphysicalist commitments of modern philosophy of science.

The second, the analytical or Hamiltonian model, gives primacy instead to energies and processes (trajectories) over instantaneous forces, relying on the conservation of energy and principles of least action, instead of Newton’s laws of motion (McDonough 2008, McDonough 2009). The alternative model begins with the Lagrangian formulation of mechanics, in which whole trajectories are explained via some form of ‘least-action’ or ‘extremal’ or ‘variational’ principle (Yourgrau and Mandelstam 1979, pp. 19-23, 164-7; Lindsay and Margenau 1957, pp. 1336; Lanczos 1986, pp. xxvii, 345-6).
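In schematic form (a standard formulation, added here for illustration), the variational principle explains a whole trajectory at once: among all kinematically possible paths \(q(t)\) between fixed endpoints, the actual one renders the action stationary,

\[ S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt, \qquad \delta S[q] = 0, \]

so that the explanandum is the trajectory as a whole rather than the body’s instantaneous response to impressed forces.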

In classical mechanics, theorists had a free choice between a Newtonian and a Lagrangian/Hamiltonian model, with each being derivable from the other. With the quantum revolution, the second model becomes obligatory, since the fundamental entities can no longer be imagined to be moving in response to the composition of forces exerted at each moment from determinate distances. Teleology reigns supreme over mechanical forces, as Max Planck noted (see Planck 1949, pp. 131-5; Planck 1960; Dusek 2001; Thalos 2013, pp. 84-6). This provides a second point of contact between quantum mechanics and the perennial philosophy.

Finally, quantum mechanics represents the microscopic domain as incomplete, in that it ascribes to microscopic entities only a probability of being observed or measured in various states, but it leaves the notions of observation or measurement without any microscopic definition. This is in sharp contrast to classical mechanics, in which there is no essential reference to anything beyond the locations and momenta of the individual particles. This creates a severe problem for the microphysicalist commitments of modern philosophy of nature, a problem that has come to be known as the measurement problem. As we shall see, there is no such problem for the scholastic philosophy of nature and its attendant powers ontology.

4 The fundamentality of composite things

The perennial or Aristotelian philosophy of nature has the resources to deny the primacy of mereologically simple entities, whether these are so-called “fundamental” particles or field values at spatiotemporal points. In contrast, the modern philosophy of nature consciously or unconsciously identifies mereological simplicity with metaphysical fundamentality.

I will use the term substance to refer to the mereologically composite and metaphysically fundamental entities that are posited by the perennial philosophy. These substances can exist at many different scales: microscopic, mesoscopic, macroscopic, or even cosmic. They are not, however, among the very smallest things in nature, since they have proper parts that are smaller than they are. Unlike quantum particles, Aristotelian substances always have a definite location and trajectory. Crucially, the substances have definite locations even though their quantum parts do not! Substances also have a full complement of determinate, classical properties (corresponding to superselection sectors in algebraic QM). These classical properties include chemical form, chirality, temperature, entropy, and chemical potential.

It is when we look at composite substances (including macroscopic ones) that we see the need for Aristotelian hylomorphism, and not merely the so-called powers ontology of such recent philosophers as C. B. Martin, George Molnar, or John Heil. For example, Heil holds that the only substances that exist are simple and microscopic, corresponding to the fundamental particles of contemporary physics (Heil 2012, pp. 18-22). Such a non-hylomorphic version of the powers ontology is in real tension with the apparent holism of quantum mechanics. In addition, as I will argue in Section 7 below, it fails to provide any solution to the quantum measurement problem. I will defend a hylomorphic account of substances that is precisely the opposite of Heil’s: instead of saying that only particles are substances, I will claim that only non-particles are substances, i.e., that no “fundamental” particles are substances at all.

There are several reasons for denying quantum particles the status of metaphysically fundamental substances (see Koons 2019, Section 2.4). First of all, when particles are entangled, they lose their individual identities, in much the same way that dollars do when deposited in a bank account. This is reflected in the anti-haecceitistic bias of quantum statistics, in both the Bose-Einstein (for bosons) and Fermi (for fermions) forms (see the chapters in Part I of Castellani 1998). Second, in relativistic quantum field theory, even the number of fundamental particles is not an absolute fact but varies according to one’s frame of reference (see Fraser 2008). Third, particles are wavelike in nature: they are merely excitations in fields, not entities in their own right. In standard (non-Bohmian) versions of quantum mechanics, particles typically lack spatial location and spatiotemporal trajectories. Any particle at any time has a finite probability of being detected anywhere in the universe (Clifton and Halvorson 2001). Finally, if particles were substances, then explaining the Einstein-Podolsky-Rosen correlations (which violate Bell’s inequality) would require super-luminal causation between widely separated particles: effectively, instantaneous action at great distances.

Aristotelian substances, being composite, come in two kinds: homogeneous and heterogeneous. The most prominent examples of heterogeneous substances are living organisms. Organisms and other heterogeneous substances (if there are any) have clear spatial boundaries. In the case of homogeneous substances, like water or hydrogen gas, the spatial individuation of individual substances would seem to be a matter of convention or speculation. It might be the case that for each natural kind of homogeneous substance, there is at each point in time just a single scattered individual, one that exists as long as some of the substance exists somewhere. Local substantial change at the level of homogeneous substances is, however, an empirical matter. Wherever symmetries are broken spontaneously, there is a local substantial change from one substance to another (see Section 6.2).

On the Aristotelian model, parts of substances are metaphysically dependent on the whole. Applying this to quantum mechanics would result in the supposition that the states and locations of quantum particles are wholly grounded in the natures and states of the bodies to which they belong (and not vice versa). We could even go so far as to say that quantum particles have only a virtual existence until they come to be manifested in interactions between substances. This accords nicely with the fact that quantum particles lack any individual identity. Quantum statistics (in both the Fermi and Bose-Einstein versions) treats indistinguishable particles as lacking ontological distinctness, in contrast to classical statistics.

Quantum mechanics assigns to particles vectors in a state space, with projections of the vectors onto various properties corresponding (via Born’s rule) to the probability of our observing the particle’s exhibiting the property in question. From the perennial perspective, the quantum representation is a representation of a certain active power of the whole substance—a power to manifest a particulate part with certain features in interactions with other substances (in this case, the experimenters and their instruments). The Kochen-Specker theorem of quantum mechanics entails that it is impossible to attribute a full range of determinate properties to these merely virtual entities at all times.
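Schematically, and in standard notation not tied to any particular source: if \(P_a\) is the projection associated with a property \(a\) and \(\psi\) is the (normalized) state vector of the relevant system, Born’s rule assigns

\[ \Pr(a) = \langle \psi \mid P_a \mid \psi \rangle = \lVert P_a \psi \rVert^{2}, \]

which, on the perennial reading just sketched, measures the strength of the substance’s power to manifest a particulate part with feature \(a\) in a suitable interaction.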

5 Against microphysical reduction

The perennial philosophy depends on denying that sciences like chemistry, thermodynamics, and biology are reducible to particle or field physics, since entities that are reduced to other entities cannot be metaphysically fundamental, and it is chemical and biological substances and not particles or fields that are fundamental.

Most philosophers of science assume that one theory can be reduced to another if the dynamical laws of the former can be derived from those of the latter under certain constraints or conditions (the so-called ‘classical’ or ‘Nagelian’ model of reduction). However, this common assumption overlooks the fact that every scientific explanation appeals to two factors: dynamical laws and a phase space (including a manifold of possible initial conditions). Consequently, every scientific theory comprises two elements: a set of dynamical laws and a space of possible initial conditions. The structure of this space implicitly encodes crucial nomological information.

In order to secure a metaphysical conclusion about dependency between the domains of two theories, it is not enough to derive the dynamical laws of one theory from the dynamical laws of the other, supposedly more fundamental theory. We must also prove that the structure of the phase space and of the manifold of possible initial conditions of the supposedly reducing theory is not itself grounded in the structure or laws of the reduced theory.

Suppose, for example, that we have two theories, T1 and T2. Theory T1 consists in a set of dynamical laws D1 together with a phase space S1, and T2 similarly consists of laws D2 and space S2. Let’s suppose that we have a Nagelian reduction of T1 to T2: a translation ∗ from the vocabulary of T1 into T2 such that D2 entails (D1)∗ with respect to space S2, but (D1)∗ does not entail D2 with respect to S2: that is, the set of trajectories (the flow) through S2 that are logically consistent with D2 is a proper subset of the set of trajectories through S2 that are consistent with (D1)∗.

Would this narrow or Nagelian “reduction” give us grounds for taking the entities and properties of T1 to be wholly grounded in those of T2? Not necessarily: we have to take into account the role of the phase spaces S1 and S2. Suppose, for example, that the structure of S2 (the space of the supposedly reducing theory) is metaphysically grounded in the structure of S1: it is facts about the natures posited by the supposedly reduced theory T1 that explain the structure of the space of possibilities used to construct explanations in terms of theory T2. It may be, for example, that the structure of S1 is “tighter” or more restrictive than the structure of S2 (under any metaphysically sound translation between the two), and this tighter structure might be inexplicable in terms of D2, theory T2’s dynamical laws. Space S1 could have additional structure, in the form of new, irreducible properties. In addition, there might be no natural restriction on space S2 that would close the modal gap between S1 and S2. On these hypotheses, the Nagelian reduction of the dynamical laws of T1 to T2 would carry no metaphysical implications.
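The schematic situation of the last two paragraphs can be compressed as follows (writing \(\vDash_{S_2}\) for entailment with respect to the space \(S_2\), and \(\mathrm{Flow}(D, S)\) for the set of trajectories through \(S\) consistent with \(D\); both notations are introduced here purely for convenience). A Nagelian reduction of \(T_1\) to \(T_2\) gives us

\[ D_2 \vDash_{S_2} (D_1)^{*} \quad\text{and}\quad (D_1)^{*} \nvDash_{S_2} D_2, \qquad\text{i.e.}\qquad \mathrm{Flow}(D_2, S_2) \subsetneq \mathrm{Flow}((D_1)^{*}, S_2), \]

but it leaves entirely open the further, metaphysical question of whether the structure of \(S_2\) is itself grounded in that of \(S_1\).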

It was easy to overlook this fact, so long as we took for granted the ungrounded and even universal nature of the microscopic or microphysical phase space. In classical mechanics, the space of possible boundary conditions consists in a space each of whose “points” consists in the assignment (with respect to some instant of time) of a specific location, orientation, and velocity to each of a class of micro-particles. As long as we could take for granted that this spatial locatedness and interrelatedness of particles is not metaphysically grounded in any further facts (including macroscopic facts), reduction of macroscopic laws to microscopic dynamical laws was sufficient for asserting the complete grounding of the macroscopic in the microscopic, and therefore for asserting the ungroundedness (fundamentality) of the microphysical domain. However, this ungroundedness of the spatial locations of microscopic particles is precisely what the quantum revolution has called into question. As I will argue in Sections 6 and 7 below, the phase space of macroscopic objects involves classical properties that cannot be derived from the non-commuting, quantal properties of pure quantum mechanics. The introduction of the thermodynamic or continuum limit introduces new mathematical structure to the phase space of thermodynamics, rendering the metaphysical reduction of thermodynamics to particle physics impossible, even though the dynamic law governing thermodynamics (the Schrödinger equation) is wholly derived from particle physics.

6 Thermochemical powers and potentialities

From the 1950s onward, quantum theory moved from the pioneer period to that of generalized quantum mechanics. Generalized QM moved away from the Hilbert-space representation of pure quantum systems to an algebraic representation, in which both quantum and classical observables can be combined in a single formal framework. The algebras of generalized QM can have non-trivial cores, consisting of the classical properties that commute with every other property, representing exceptions to the mutual complementarity of the quantum variables. In practice, this means representing the classical properties of complex systems (like molecules or experimental instruments) as ontologically fundamental, on a par with the quantum properties of the smallest particles.
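In standard algebraic terms (a textbook gloss, not a quotation from any of the authors discussed here), the “core” of an algebra of observables \(\mathcal{A}\) is its center,

\[ \mathcal{Z}(\mathcal{A}) = \{ C \in \mathcal{A} : CA = AC \ \text{for all } A \in \mathcal{A} \}, \]

and the classical observables of generalized QM are the (self-adjoint) elements of this center. A “trivial” core is one containing only multiples of the identity.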

In addition, by moving to the “thermodynamic” or continuum limit, which involves treating a system with apparently finitely many parameters or degrees of freedom as though there were infinitely many such degrees, algebraic QM enabled theorists to introduce superselection rules, which could be used to distinguish the different phases of matter that can co-exist under the same conditions (such as gas, liquid, solid, ferromagnetized, superconducting). I will argue in the following sub-sections that the use of the continuum limit can best be interpreted as representing an ontological difference between two irreducibly macroscopic conditions, providing strong evidence against reduction.

6.1 The continuum limit: a mark of ontological fundamentality

In applied physics, it is common to take some parameter to infinity: that is, to replace the original model having some finite parameter with a new model in which that parameter takes the value of infinity. For example, in the so-called “thermodynamic” limit, a system containing n molecules and a fixed volume V is replaced by one in which both the number of molecules and the volume go to infinity, while keeping the density n/V constant. As Compagner explains (Compagner 1989), this thermodynamic limit is mathematically equivalent to the continuum limit: keeping the volume constant and letting the number of molecules go to infinity, while the size of each molecule shrinks to zero. In many applications, such as the understanding of capillary action or the formation of droplets, the continuum limit is the right way to conceptualize the problem, since infinite volumes have no external surfaces and cannot interact with their containers.
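Stated compactly (and schematically), the thermodynamic limit is the replacement

\[ n \to \infty, \qquad V \to \infty, \qquad \frac{n}{V} = \text{constant}, \]

while the mathematically equivalent continuum limit holds \(V\) fixed, lets \(n \to \infty\), and shrinks the size of each molecule to zero.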

As Hans Primas has pointed out (Primas 1983), there are three reasons for taking infinite limits in physics: for mathematical convenience, in order to isolate some factors from others, and in order to introduce new structure into the representation. The continuum limit in generalized quantum mechanics is an example of the third reason. In 1931, John von Neumann and Marshall Stone proved that finite systems admit of only one irreducible Hilbert-space representation, up to unitary equivalence (von Neumann 1931). Infinite systems, in contrast, admit of infinitely many inequivalent Hilbert-space representations. This apparent embarrassment of riches in the infinite case turns out to be crucial for the representation of phase transitions, entropy, and thermodynamic phenomena. As Geoffrey Sewell explains:

For infinite systems, the algebraic picture is richer than that provided by any irreducible representation of observables… Furthermore, the wealth of inequivalent representations of the observables permits a natural classification of the states in both microscopic and macroscopic terms. To be specific, the vectors in a [single Hilbert] representation space correspond to states that are macroscopically equivalent but microscopically different, while those carried by different [inequivalent] representations are macroscopically distinct. Hence, the macrostate corresponds to a representation and the microstate to a vector in the representation space. (Sewell 2002, pp. 4-5)

Thus, at the thermodynamic limit, algebraic quantum mechanics gives us exactly what we need: a principled distinction between quantal and classical (non-quantal) properties. In addition, the non-quantal properties do not supervene on the quantal properties of a system, since the latter always consist of a finite number of facts, while the thermodynamic limit requires an infinite number of virtual sub-systems. The classical features are real and irreducible to the quantum particle basis. As I will argue in Section 7, this is exactly what is needed to resolve the quantum measurement problem.
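For readers who want the technical criterion (a standard definition, not taken from Sewell): two representations \(\pi_1\) and \(\pi_2\) of the observable algebra \(\mathcal{A}\) on Hilbert spaces \(\mathcal{H}_1\) and \(\mathcal{H}_2\) are unitarily equivalent just in case there is a unitary map \(U : \mathcal{H}_1 \to \mathcal{H}_2\) with

\[ U \pi_1(A) U^{-1} = \pi_2(A) \quad \text{for all } A \in \mathcal{A}. \]

On Sewell’s picture, states carried by inequivalent representations of an infinite system are macroscopically distinct, so the classical, macroscopic features of such a system are encoded at the level of the representation rather than in any single state vector.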

Franco Strocchi (Strocchi 1985) has shown that the continuum limit is needed to explain any spontaneous symmetry breaking in quantum-mechanical terms. In classical mechanics, symmetry breaking could always be explained by small perturbations with non-linear consequences. These small perturbations or prior asymmetries can be ignored for the sake of convenient, approximate representations. In quantum mechanics, this simply does not work. Strocchi points out that in many cases “it is impossible to reduce symmetry breaking effects to asymmetric terms in the Hamiltonian” (Strocchi 1985, p. 117). The dynamics have to be defined in terms of a symmetric Hamiltonian. Consequently, we need true emergence of asymmetry, not simply the apparent emergence that results from suppressing slight asymmetries in the prior situation (as in classical mechanics). This is possible only for infinite quantum-mechanical systems. Any finite system retains any symmetry that it possesses.
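Schematically (my gloss on the standard algebraic formulation, not a quotation from Strocchi): spontaneous symmetry breaking occurs when the dynamics commutes with the action \(\alpha_g\) of every element \(g\) of a symmetry group \(G\), and yet the physically relevant equilibrium or ground state \(\omega\) fails to be invariant,

\[ \omega\big(\alpha_g(A)\big) \neq \omega(A) \quad \text{for some } g \in G \text{ and some observable } A, \]

a situation that can arise only at the continuum limit, since for a finite quantum system the Gibbs equilibrium state is unique and therefore inherits every symmetry of the Hamiltonian.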

In addition to symmetry breaking, infinite algebraic models are also crucial to the representation of irreversibility, which, in turn, is essential to thermodynamics (as noted by Woolley 1988, p. 56). This reflects work by Ilya Prigogine and his collaborators, who demonstrated that molecular motions in any finite quantum system are always perfectly reversible. This is not the case for infinite systems, which can show irreversible behavior and thus can validate the Second Law of Thermodynamics as a fundamental law of nature.

6.2 Thermodynamics and phase transitions: infinite algebraic models

The infinite algebraic models of generalized QM provide, for the first time, the possibility of rigorous and non-arbitrary definitions of the basic thermodynamic properties of entropy, temperature, and chemical potential (see Sewell 2002). Contrary to what many philosophers believe, science does not suppose that temperature is the mean kinetic energy of molecules! (Vemulapalli and Byerly 1999, pp. 28-30; see also Primas 1983, pp. 312-3)

If the system is not at equilibrium, temperature is not well-defined, though the mean kinetic energy is… . Temperature is a characteristic of equilibrium distribution and not of either individual molecules or their kinetic energy. When there is no equilibrium between different kinds of motion (translations, rotations, and vibrations), as in the case of molecular beams, temperature is an artificial construct. (Vemulapalli and Byerly 1999, pp. 31-2)

Since thermal equilibrium is not defined at the level of statistical mechanics, temperature is not a mechanical property but, rather, emerges as a novel property at the level of thermodynamics. (Bishop and Atmanspacher 2006, p. 1769)

If temperature could be defined as mean kinetic energy, then temperature would always be defined for any collection of molecules, since the kinetic energy of each molecule is always well-defined. In fact, many physical bodies have no well-defined temperature, as Vemulapalli and Byerly point out in the above quotation. Temperature emerges only once a thermodynamic equilibrium has been established between different modes of kinetic energy. Thus, without the thermodynamic limit as a faithful representation of real systems, we would have to dismiss all talk of ‘temperature’ as merely a useful fiction.
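The familiar textbook relation brings out the point (the example is mine, not Vemulapalli and Byerly’s): for a monatomic ideal gas at equilibrium, the mean kinetic energy per molecule satisfies

\[ \langle E_{\mathrm{kin}} \rangle = \tfrac{3}{2} k_B T, \]

but this relation presupposes a single Maxwell-Boltzmann distribution governing all the molecular degrees of freedom. Out of equilibrium, the left-hand side is still perfectly well defined, while the right-hand side simply fails to refer.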

In addition, phase transitions, such as those between the solid, liquid, and gas states, and between the conditions before and after the onset of coherent ferromagnetism or superconductivity in metals, require the use of infinite models (models involving the continuum limit): see (Liu 1999), (Ruetsche 2006), and (Bangu 2009). Phase transitions are an important case of spontaneous symmetry breaking (Sewell 1986, p. 19).

6.3 Molecular structure

Generalized quantum mechanics attributes both classical and quantum properties to objects. The modern quantum theory of molecular structure is a classic example. The structure of a molecule, that which distinguishes one isomer from another, including right-handed chiral molecules from left-handed ones, depends entirely on classical properties, namely the precise locations of the atomic nuclei. As Hans Primas put it, “Every chemical and molecular-biological system is characterized by the fact that the very same object simultaneously involves both quantal and classical properties in an essential way. A paradigmatic example is a biomolecule with its molecular stability, its photochemical properties, its primary, secondary, and tertiary structure.” (Primas 1983, p. 16). The quantal properties of a system correspond to the wavefunctions associated with each of its constituent particles. These wavefunctions play a crucial role in explaining the behavior of bonding or valence electrons in molecules, as well as such phenomena as superconductivity (Cooper pairs of electrons) and superfluidity.

7 Powers and the measurement problem

Pioneer quantum mechanics is pure quantum mechanics, in the sense that all (or nearly all) observables are quantum observables—mutually complementary (in Bohr’s sense), satisfying the superposition principle. A classical observable is a property that commutes with all other properties, meaning that it can be conjoined, in a physically meaningful way, with any other observable. An entity can have a determinate value of a classical observable at all times, while it is impossible for it to have determinate values for two mutually non-commuting quantum observables. As an expression of this pioneer viewpoint, John von Neumann laid down the irreducibility postulate (von Neumann 1931): no observable (apart from multiples of the identity) commutes with all the others.
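In the algebraic notation of Section 6 (my gloss, not von Neumann’s own formulation), the postulate says that the center of the observable algebra is trivial,

\[ \mathcal{Z}(\mathcal{A}) = \mathbb{C}\,\mathbf{1}, \]

so that there are no non-trivial classical observables and no superselection sectors: every observable fails to commute with some other observable.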

Irreducibility gives rise inevitably to the so-called “measurement problem”: experiments invariably take place in a context defined in terms of classical observables, like location and temperature. If the theory includes no classical observables, then there is an unbridgeable conceptual gap between the world of theory and the world of the experimenter. The different responses to the measurement problem produced the different “interpretations” of the formalisms of Pioneer Quantum Mechanics. Here are the five most common and well-defended interpretations:

  1. The Copenhagen interpretation or family of interpretations, comprising a variety of pragmatic, operationalist, perspectivalist, and anti-realist interpretations, including that of Niels Bohr. Quantum states are defined in terms of experimental results and have no independent existence.

  2. Dualist interpretations: Eugene Wigner, John von Neumann. Human consciousness causes a “collapse of the wave packet”: a discrete transition from a superposed quantum state into a state in which the system possesses some definite value of the appropriate classical property (position, momentum, etc.). This involves positing two distinct dynamics in the world—one occurring autonomously, the other existing in response to interactions with consciousness.

  3. David Bohm’s interpretation (Bohm 1951), building on Louis de Broglie’s 1925 pilot wave account. The pure quantum world exists with a unified, uninterrupted dynamics. The universe consists of point particles with definite locations at all times, guided by the wave function, and forming a single, indivisible and non-localizable dynamical system.

  4. Hugh Everett’s (Everett 1957) “relative state” or “many worlds” interpretation, developed by Bryce De Witt, R. Neill Graham, David Deutsch (Deutsch 1996), and David Wallace (Wallace 2008). The classical world of experiments is merely an appearance, a product of the limited perspective of human and other organisms. When performing experiments involving interaction with systems in superposed quantum states, the observer splits into multiple versions, one corresponding to each possible state. Each split state involves no awareness or memory of states experienced in parallel branches.

  5. Objective collapse theories, such as GRW (Ghirardi et al. 1985). These interpretations are like the dualist versions, except that the collapse of the wave packet is triggered by certain physical events and not by consciousness. At this point, these theories go beyond interpretation, postulating a new, so-far merely speculative collapse-triggering mechanism; as yet there is no specific theory of this mechanism and no empirical confirmation. In addition, objective collapse theories require still further ontological interpretation, such as John Bell’s “flash ontology” (Bell 1987) or the matter density model.

Hylomorphism, with its powers ontology, can be offered as a sixth interpretation, an interpretation inspired by some remarks of Heisenberg (Heisenberg 1958) and defended by Nancy Cartwright (Cartwright 1999) and Hans Primas. Interaction between the quantum powers of one substance and the substances making up the experimenters and their instruments precipitates an objective collapse of the quantum object’s wavefunction, as a result of the joint exercise of the relevant causal powers of the object and the classical instruments, and not because of the involvement of human consciousness.

How is this a solution to the measurement problem? Why haven’t I merely re-stated the problem by referring to ‘observers’ and their ‘classical instruments’? My answer is this: according to hylomorphism, observers and their instruments are substances (or made of substances), and substances are not composed of quantum particles. The states of substances are not reducible to the quantum states of their particles. Thus, there is no inconsistency in supposing that substances have properties (‘classical’) that are exempt from superposition and that, therefore, always constitute definite outcomes. I will explain how this works in more detail in Section 7.2 below, following the work of Hans Primas.

Do we need the perennial philosophy and not just some version of contemporary powers ontology? Yes, because if we try to solve the measurement problem with powers alone, we will have to attribute those powers to quantum particles and only to quantum particles. This would include both active and passive powers. Solving the measurement problem requires observers and their instruments to have non-quantal passive powers, through which they can register definite results and not merely enter into an extended superposition. As I have argued above, Aristotelian substances have the capacity to bear irreducible chemical and thermodynamic properties (as represented in the non-trivial centers of infinite algebraic models). Quantum particles do not have that capacity: they are fully characterized by vectors in a single Hilbert space in a finite algebra with only a trivial center and no superselection sectors.

7.1 Epistemological constraints on a solution to the measurement problem

To solve the measurement problem, it is not enough for an interpretation of quantum mechanics merely to save the phenomena, in the sense of explaining how it is possible for us to experience the appearance of a macroscopic world (with objects instantiating mutually commuting, classical observables like actual position). We must distinguish between explaining and explaining away. A credible scientific theory must explain most of our apparent data, in the sense of both treating it as objectively known fact and providing a satisfactory causal account of its genesis. A scientific theory that explains the data by entailing that it is all a mere appearance, without objective reality, destroys its own empirical foundations.

More specifically, here are some epistemological constraints that must be satisfied (see Simpson 2020, Chapter 8; Simpson 2019):

E1. Perception: The theory must endorse the fact that our sensory perception of physical events and objects is mostly reliable.

E2. Memory: The theory must endorse the fact that our memory of past observations is mostly reliable.

E3. Induction: The theory must endorse the fact that the physical events and facts that we observe (currently and in memory) are an inductively reliable sample of the whole.

As we shall see, each of the new interpretations of QM fails one or more of these tests, in contrast to the powers ontology of hylomorphism.

The non-locality of quantum mechanics, as exemplified by Bell’s theorem, threatens condition E1. If we embrace a Neo-Humean account of causation, the immediate consequence is that causation in the quantum domain is radically non-local. By radically non-local, I mean that the intensity of the influence of distant bodies does not decrease as distance increases. Very remote objects (if entangled with something in our neighborhood) can have effects every bit as significant as other objects in that same neighborhood. In principle, at least, this raises questions about the reliability of our sensory perception of our immediate environment, since our brains or our sense organs might be entangled with distant objects in a way that makes them unreliable as indicators of local conditions.

Hylomorphists can secure the justifiability of reliance on perception by positing receptive causal powers that, when not interfered with by abnormal conditions (whether internal or external), actualize themselves in the form of veridical impressions of one’s environment. Since Neo-Humeans forgo such a robust Aristotelian theory of causal powers, with its distinction between normal and abnormal conditions, they are left with a situation in which the fallibility of the sensory process makes it unreasonable to treat any sensory impression as knowledge-conferring.

7.2 The Neo-Copenhagen (hylomorphic) programme

The old Copenhagen view of Niels Bohr suffered from being too narrowly dualistic, distinguishing the classical world from the quantum world. In contrast, the hylomorphic interpretation embraces a salutary kind of ontological pluralism, recognizing that the non-quantum or supra-quantum world is itself a “dappled” world (as Nancy Cartwright puts it), dividing naturally into multiple domains at multiple scales. This fits well with the actual practice of scientists, who are in practice ontological pluralists, as Cartwright has documented.

The measurement problem arises from the formulation of quantum mechanics as a theory about the probabilities of certain measurement results. The quantum wavefunction evolves in a deterministic manner, by the unitary dynamics of Schrödinger’s equation. In order to test the theory, some observable results must be deduced from the theory. It is Born’s rule that enables us to move from some parameter value in the wavefunction (the wave amplitude) to something testable: namely, certain probabilities about the result of measuring one or other classical parameter (such as position or momentum). This early model (as developed by Bohr and Heisenberg) assumed that we could continue to use classical language in describing the experimental setup and the measurement devices. Critics have argued that this involves an implicit inconsistency, since physicists assume that these classical instruments are wholly composed of quantum systems and so should be, in principle, describable in purely quantum and not classical terms.
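For a single particle and a position measurement, for example, Born’s rule takes the familiar form (standard notation, added here for concreteness):

\[ \Pr(x \in \Delta) = \int_{\Delta} \lvert \psi(x) \rvert^{2}\, dx, \]

the probability of finding the particle in the region \(\Delta\), which is testable only because ‘finding the particle’ is itself described in classical terms.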

This charge of inconsistency falls flat when lodged against the hylomorphic version of the Copenhagen programme. Observers and their instruments are not reducible to their quantum constituents; instead, quantum particles have only virtual existence, corresponding to certain powers of thermochemical substances. Theoretically, this depends (as I showed in the last section) on the use of algebraic formulations of quantum mechanics with infinite models (at the continuum limit). The additional structure afforded by such models represents the irreducible fundamentality of these substances.

Bohr’s interpretation required that reality be divided into two disjoint realms, the classical and the quantum, with a measurement involving any setup in which a quantum system is made to act upon a classical observer or instrument. This foundered on the fact that some systems, like supercooled fluids or quantum computer chips, bridge the gap between the two realms. We cannot consistently describe all macroscopic objects in purely classical terms, as Bohr’s program seems to require, since it is interaction with the classically described realm of measurement devices that collapses the wavefunction in Bohr’s model. In contrast, on the Primas model, we could postulate that the wave packet associated with a quantal property has “collapsed” whenever it becomes correlated with a fundamental classical property of a disjoint system. Even though entities cannot be neatly divided into two disjoint domains, this is not true of physical properties. Substances have both classical properties and (by virtue of their virtual parts) quantal properties. Infinite algebraic models represent quantal properties as vectors in individual spaces and classical properties as disjoint spaces or superselection sectors.

Primas demonstrates (Primas 1990) that interaction with the classical properties of entities in the environment will drive quantal vectors to eigenstates with a high probability in a short period of time. The Primas solution is, consequently, one of continuous rather than discrete collapse (unlike, for example, most versions of the GRW model of objective collapse). The Primas model can be incorporated into a powers ontology, by attributing to substances the power to collapse the wavefunctions associated with quantum parts of other substances.

Bell characterized the measurement problem succinctly in this way: either the Schrödinger equation isn’t right, or it isn’t everything. Most solutions to the problem fall squarely on one side or the other: the Copenhagen interpretation and the many-worlds interpretation insist that the equation isn’t everything, while the GRW and other objective collapse theories suppose that it isn’t right. On which side does hylomorphism stand? I’ve described it as a neo-Copenhagen view, while Primas offers a model of objective collapse.

Of course, Bell’s alternatives are not exclusive. In fact, the Schrödinger equation is neither everything nor right. It is right insofar as it describes the evolution of the quantal aspects of a substance sans interaction with other substances. However, this is not everything, since thermal substances also possess determinate, non-quantal properties. And it is incorrect, even as a description of those quantal aspects, whenever the quantum potentialities are actualized through interaction with other substances. At that point, a form of objective collapse takes place, in a way described by Primas’s model.

7.3 The Everettian programme

There are three defects in the Everett (relative-state or branching-world) programme, each of which hylomorphism avoids. First, hylomorphists can give a straightforward, intuitive, and natural account of the probabilities associated with the quantum wavefunction: the square of the wave’s amplitude associated with some precise state represents the probability that the quantum particle will interact in a corresponding way with some classical measurement instrument. So, for example, if we use a photographic plate to register the location of a photon, then the quantum probability associated with a particular location will give us the probability that the photon will interact with the plate at that location. In contrast, the Everett interpretation requires that we radically modify our naïve conception of probability, assigning fractional probabilities to various states, even though it is certain that each of the states will in fact be realized (although on different “branches” of the world). See (Kent 2010; Price 2010). I have argued that the sophisticated, neo-pragmatist solution to this problem developed by David Wallace and other “Oxford Everettians” fails, because it overlooks the possibility of a rational agent’s utility depending on inter-branch comparisons (Koons 2018a).

The second drawback to the Everett interpretation is that it, like the Bohm interpretation, renders our classical interactions with the quantum world illusory. There are, on the Everett interpretation, no inter-actions at all. The evolution of the world is simply the autonomous unfolding of a single object, the universe, according to a global Schrödinger equation. Entities like you and me and our experimental instruments are merely simulated by aspects of the universal wavefunction, as a kind of “virtual reality” (see Albert 2015, Halliwell 2010, Maudlin 2010). The world has all the causal oomph there is, leaving nothing over for mere parts of the world to exercise. This means that the Everett interpretation must lose all of the epistemological advantages that a causal-powers account of scientific experimentation can provide.

In effect, the Everett interpretation (in its modern, Oxford-school form, as developed by David Wallace 2008 and his collaborators) almost perfectly duplicates Plato’s allegory of the cave from Republic Book VII: we are forced to watch the mere shadows (the classical observables) cast by the quantum wavefunction, which lies always outside our field of vision. In fact, we are in an even worse predicament than the prisoners in the cave, since we (the observers) are ourselves mere shadows on the cave wall. The classical world consists of mere shadows shadow-observing other shadows, with no real entities to whom the appearances can appear. In contrast, the hylomorphic interpretation is fully compatible with attributing real and fundamental causal powers both to classical and to purely quantum objects.

Is this really fair to the Oxford Everettians? They could plausibly claim that, on their view, the manifest or classical world is real although not fundamentally so. It seems unfair to compare the manifest world on their account with virtual reality or with the shadows in Plato’s cave. The manifest world is a real pattern (to use Daniel Dennett’s phrase, Dennett 1991), one that is functionally realized by the underlying quantum reality. As we shall see (when we turn to my third objection), however, there are many patterns to be found in the quantum wavefunction. Every logically consistent story with the right cardinality is functionally realized by the quantum world. Therefore, the classical world of experimenters and their instruments is no more real than any fiction.

Thirdly and finally, the Everett interpretation leads to global skepticism via both Putnam’s paradox (Putnam 1980, Lewis 1984) and Goodman’s grue/bleen paradox (Goodman 1954, Lewis 1983), as I have argued elsewhere (Koons 2018a). Putnam’s paradox is an argument that purports to show that our words and concepts cannot pick out determinate properties, since the finite class of actual attributions of those words and concepts radically under-determines their extension with respect to not-yet-encountered instances. The standard response to this paradox is to appeal to the relative naturalness of properties whose relevant sub-extension matches our actual use: our words or concepts pick out that most natural property (if there is one) whose extension and anti-extension best fit our actual use of the word or concept in particular affirmations and denials. However, the Everett interpretation is committed to the radical non-naturalness of all the properties that putatively apply to entities in our familiar spacetime world. Hence, our concepts and words can be matched to the truly natural properties (those instantiated by the quantum wavefunction) in an infinite number of equally natural ways. (This is a generalization of an argument by Bradley Monton against wavefunction realism: Monton 2002 and Monton 2006.)

Suppose that we have two Everettian models of the universe, M1 and M2, with the same cardinality, where each model assigns a Hilbert vector in the same space to each moment of time. (I’ll assume that the spacetimes of the two models are isomorphic.) Let’s suppose that M1 represents the underlying microphysical reality of our actual universe and M2 that of an alternative, fantastical universe (like Tolkien’s Middle-Earth). Let’s also suppose that the unitary time-operators and the Schrödinger equations for the two models are both linear and deterministic, although they may be otherwise quite different. Then there are models \(M_{1}^{*} \) and \(M_{2}^{*}\) and homomorphisms H1 and H2 from \(M_{1}^{*}\) to M1 and \(M_{2}^{*}\) to M2 (respectively), where \(M_{1}^{*}\) consists of the representation of an approximately classical, macroscopic, 3 + 1-dimensional world that corresponds to the common-sense history of our phenomenal world, and \(M_{2}^{*}\) a similar representation of the fantastical history (with terms in the Hamiltonian representing the effects of wizardry, for example).

There will be a bijective function J (given the linearity and determinism of the dynamics of quantum mechanics) between the vectors of M1 and M2, which preserves the underlying dynamics (in the sense that a dynamically possible trajectory in M1 will be mapped onto a dynamically possible trajectory in M2). Mapping J will then preserve the truth-values of the microscopic counterfactual conditionals of the two models, so long as the antecedents of the conditionals specify complete states of the universe. In addition, the composition of H2 and J will be a homomorphism from \(M_{2}^{*}\) into M1. Let’s assume, further, that the closeness of two world-states (from a macroscopic perspective), for the purposes of evaluating counterfactual conditionals relevant to \(M_{1}^{*}\) and \(M_{2}^{*}\), is indifferent to the underlying microscopic models. If so, we can adopt a measure of counterfactual closeness on the states of M1 that perfectly preserves, under H2 composed with J, all of the phenomenal and macroscopic counterfactuals true in \(M_{2}^{*}\) (see Lewis 2001). Hence, our actual universe will contain implicitly a representation of the fantastical history \(M_{2}^{*}\), in exactly the same sense in which it contains a representation of our common-sense history \(M_{1}^{*}\).
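Schematically, and reading the bijection \(J\) in the direction the argument requires (from the vectors of \(M_2\) to those of \(M_1\)):

\[ H_1 : M_1^{*} \to M_1, \qquad H_2 : M_2^{*} \to M_2, \qquad J : M_2 \to M_1, \qquad J \circ H_2 : M_2^{*} \to M_1, \]

so that the fantastical history \(M_2^{*}\) is carried into the actual microphysical model \(M_1\) by a dynamics-preserving and (by hypothesis) counterfactual-preserving mapping, just as the common-sense history \(M_1^{*}\) is carried into it by \(H_1\).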

If the only conditions on the extraction of a phenomenal or quasiclassical world from the wavefunction are mathematical (i.e., the existence of some isomorphism and some measure of closeness that jointly preserve dynamics and the truth-values of conditionals), then any imaginable world can be extracted from any wavefunction. The world of Greek mythology, The Matrix, The Lord of the Rings, or Alice in Wonderland would be every bit as real as the world represented in our science and history textbooks. There would be minds experiencing an infinite variety of phenomena, the vast majority of which would have no correspondence whatsoever to the classical physics of Newton and Maxwell. Inhabitants of these non-classical phenomenal worlds would have no hope of ever discovering the fundamental laws of physics.

The only way to block these conclusions is to claim that the homomorphism H1 preserves the naturalness of macro properties, the real causal connections between macroscopic things, or the real closeness between states of the world in a way that the composition of H2 with J does not. However, on the Everett view, there are no natural properties and no real connections in our phenomenal world, and the laws of quantum mechanics do not dictate which pairs of states are really closer than others for the purposes of evaluating macroscopic conditionals, and hence there is no basis for preferring one homomorphism over another.

Reflection on these facts would, in turn, provide us with an effective defeater of our own scientific beliefs, since the vast majority of minds would be radically deceived about the deep nature of the world they (and we) really inhabit, and we would have no non-circular grounds for believing that we inhabit one of the few epistemically “lucky” phenomenal worlds.

Everettians could respond by insisting that the only real branches (the only ones inhabited by really conscious beings) are those that approximate the dynamics of classical physics. In fact, many recent Everettians have implicitly made just such a stipulation (Albert 1996, pp. 280-1; Gell-Mann and Hartle 1996; Lewis 2004, p. 726). However, this would be a purely ad hoc move, with no plausible rationale. It would be outrageously parochial and anthropocentric, given our own entirely derivative status in the Everettian universe.

The problem of multiple domains also puts at risk the rationality of induction as a guide to the future. Even assuming that our own domain has been approximately classical up to this point in time, there are many equally natural extensions of that domain into the future, most of which invalidate our inductive expectations. This involves the application of Nelson Goodman’s grue/bleen paradox to the problem of extracting domains from the wavefunction. In Goodman’s thought-experiment, we are to imagine a possible future in which emeralds continue to be grue, rather than green, after the year 2020 (where ‘grue’ is defined as ‘green if discovered before 2020, and blue otherwise’). Goodman argues that our inductive experience with emeralds before 2020 gives us equally good reason to believe the hypothesis that all emeralds are green and the hypothesis that all are grue.
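Spelled out in the standard way (the formalization is mine, not Goodman’s own wording), the gerrymandered predicate is:

\[
\mathrm{Grue}(x) \;\equiv\; \bigl(\mathrm{Ex}(x) \wedge \mathrm{Green}(x)\bigr) \vee \bigl(\neg \mathrm{Ex}(x) \wedge \mathrm{Blue}(x)\bigr),
\]

where \(\mathrm{Ex}(x)\) abbreviates ‘x is discovered before 2020’. Every emerald observed before 2020 confirms ‘all emeralds are green’ and ‘all emeralds are grue’ equally well; the two hypotheses diverge only over emeralds discovered afterward.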

When the paradox is transferred to the Everettian scenario, we can find, as Alberto Rimini (Rimini 1979) has shown, actual domains in which objects shift in their behavior with respect to a standard set of observables but remain the same with respect to some gerrymandered, “gruesome” observables. Each consistent branch in the Everett multiverse has multiple extensions into the future, corresponding to different observable-operators. Some of these extensions are intuitively unnatural, in the sense of treating grue-like objects as qualitatively the “same” before and after the crucial transition. Yet these alternative future branches of our domain are equally natural from the perspective of the underlying quantum wavefunction. Hence, the Everettian has no grounds for privileging what we would deem the more natural branch, since true naturalness must be wholly grounded in what is metaphysically fundamental.

The link between naturalness and fundamentality

If instantiations of F and G are wholly grounded in instantiations of (respectively) fundamental properties \(F^{\prime }\) and \(G^{\prime }\), then if F is more natural than G, so too \(F^{\prime }\) must be more natural than \(G^{\prime }\).
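In schematic form (writing Nat(·) for comparative naturalness and ‘\(F \Leftarrow F^{\prime }\)’ as an abbreviation, introduced here only as a gloss, for ‘instantiations of F are wholly grounded in instantiations of \(F^{\prime }\)’):

\[
\bigl(F \Leftarrow F^{\prime }\bigr) \wedge \bigl(G \Leftarrow G^{\prime }\bigr) \wedge \bigl(\mathrm{Nat}(F) > \mathrm{Nat}(G)\bigr) \;\Rightarrow\; \mathrm{Nat}(F^{\prime }) > \mathrm{Nat}(G^{\prime }).
\]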

Goodman’s grue/bleen paradox can be taken as a special case of the Putnam paradox: one in which it is indeterminate how to extend our empirically well-confirmed hypotheses into the future, across an arbitrarily chosen boundary.

These grue/bleen-like paradoxes pose a dilemma for the Everettians. If they suppose that there is no natural mapping from our concepts to features of the real wavefunction, then they have to embrace a radical indeterminacy of interpretation that deprives nearly all of our assertions and beliefs of determinate truth-value. If, alternatively, they suppose that there is some brute semantic matter of fact about the correspondences, then they have to embrace a scenario in which our inductive practices are radically unreliable: each empirical generalization will be falsified in many such interpretations, and the Everettians have no grounds for supposing that the one “correct” interpretation is one that verifies the majority of our inductive inferences. This brings the Everett interpretation into conflict with E3.

But what about Dennett’s real patterns (Dennett 1991)? Couldn’t we insist that our classical world is a real pattern, and that all of these other fictions are merely unreal? What makes a pattern real, in Dennett’s account? Dennett says that a pattern is real when it is “readily discernible” or “recognizable” (Dennett 1991, p. 33). The reality of a pattern depends on “perceivers’ capacities to discern patterns” (Dennett 1991, p. 34). We create real patterns by bringing our pattern-making perspectives to the blooming, buzzing confusion of data (Dennett 1991, p. 36). Finding real patterns enables us to engage in efficient and reliable prediction (Dennett 1991, p. 42). There is one central problem with all of this: we, with our pattern-recognizing and pattern-making capacities, are also part of the very manifest world that we are trying to distinguish from merely fictional patterns. Dennett’s account is either viciously circular or tacitly dualistic, assuming that we exist as real observers outside of the quantum reality whose patterns we recognize. Hylomorphism enables us to avoid such implausible mind/body dualism.

7.4 The Bohmian programme

Like the Bohm view, the hylomorphic interpretation assumes a broadly realist stance toward the classical world. Bohm takes classical objects to be composed of particles really located (for the most part) in the regions of space that they appear to occupy in our experience. A deterministic version of Bohm’s theory would seem to offer Neo-Humeans and microphysicalists their best chance at surviving the quantum revolution. Each particle in Bohm’s theory has a definite location at each time, and these locational states are indeed fully separable. Each particle has its own unique identity, blocking any quantum fusion.

However, there are real concerns about whether Bohm’s theory can underwrite the reliability of our perception of the positional states of our measuring devices. Our subjective impressions would seem to depend on the contemporaneous states of our brains, not the positions of particles in our measuring devices (or even in our sense organs, like the retina). Bohm’s theory is certainly capable of generating false sense impressions and false memories about particle positions, since particles do not influence one another’s positions directly but are always guided by the cosmic wavefunction.

Here’s the form of the argument:

  1.

    To be empirically adequate, Bohm’s theory must give an account, not just of the “pointer settings” of measuring instruments, but also of our perceptions of those settings (as Bohm himself admitted, Bohm 1951, p. 583).

  2.

    There is good reason to think that mental states aren’t determined by particle positions within the brain alone. We must include all of the functional features of the brain.

  3.

    But this requires that the basis of mental states includes the state of the cosmic wavefunction, which leads to the radical non-locality of the relevant brain state.

  4.

    In the absence of pervasive and stable decoherence linking brain states and sensible objects, the functional states of those objects in relation to the brain do not fix particle positions (in either the object or the brain): two pairs of brain-object relational states can be functionally indistinguishable, even though they involve radically different particle positions and trajectories. Therefore, in the absence of effective decoherence, one and the same system (e.g., the person’s brain plus his sense organs) cannot be reliable both at tracking functional states and at tracking particle positions.

  5.

    Non-local quantum effects threaten to destroy any reliable correlation between the functional states of the environment and local particle positions, and therefore to destroy any correlation between brain states and particle positions.

  6.

    This could be avoided only if we had good grounds for assuming that environmental interaction secured (through decoherence) the effective classicality of the brain-environment interaction, but that is very much in dispute. In addition, Bohm’s theory raises special technical problems for the widespread application of decoherence (see Schlosshauer 2005, pp. 1297–8 and Simpson 2019).

  7.

    Evolution would explain our ability to track reliably the relevant functional aspects of our environment, not our ability to track particle positions. Evolution cares about whether we can survive and reproduce; it is completely indifferent to whether we can reliably track particle positions.

Brown and Wallace explain why the perceptual state must be fixed by the functional state of the brain, not just by its configuration of particles (premise 2):

Observables in the context of Bell’s remark are defined relative to sentient observers, and it is a tenet of the de Broglie-Bohm picture that such observers are aware of corpuscles in a way that fails to hold for wavefunctions. Of course, there is an obvious sense in which the corpuscles are also “hidden,” and Dürr et al. emphasized in 1992 (Dürr et al. 1993) that the only time we can have sure knowledge of the configuration of corpuscles is “when we ourselves are part of the system.” But how exactly is this supposed to work? Stone correctly pointed out in 1994 (Stone 1994) that this claim certainly fails if our knowledge is based on measurements which one part of our brain makes on another… (Brown and Wallace 2005, p. 534)

In support of premise 5 (the lack of a simple correlation between brain states and particle positions), Brown and Wallace point out:

Suppose we accept that it is the [particle positions] that determine the outcome of the measurement. Is it trivial that the observer will confirm this result when he or she “looks at the apparatus”? No, though one reason for the nontriviality of the issue has only become clear relatively recently. The striking discovery in 1992 of the possibility (in principle) of “fooling” a detector in de Broglie–Bohm theory (Englert et al. 1992, Dewdney et al. 1993, Hiley et al. 2000, Brown et al. 1995) should warn us that it cannot be a mere definitional matter within the theory that the perceived measurement result corresponds to the “outcome” selected by the hidden corpuscles (Brown and Wallace 2005, p. 523).

As premise 6 indicates, Bohmians might respond to this problem by appealing to the theory of decoherence. Decoherence involves considering how the interaction of two systems (thought of as the measuring apparatus and the object under study) with the wider environment can enable them to become approximately classical in their relation to each other, in such a way that they can be assigned stable properties (such as location) that evolve in roughly the way prescribed by classical, pre-quantum physics.

However, it is not at all clear that decoherence will work in the intended way in a Bohmian setting. Sanz and Borondo (Sanz and Borondo 2003) studied the double-slit experiment in the framework of Bohmian mechanics in the presence of decoherence. They showed that even when coherence is fully lost, and thus interference is absent, nonlocal quantum correlations remain that influence the dynamics of the Bohmian particles, so that in this example decoherence does not suffice to achieve the classical limit in Bohmian mechanics. See also Schlosshauer (2005, p. 1298).

Is this problem of perceiving pointer settings any greater for the Bohmians than it was in classical, Newton-Maxwell physics? Yes, it is, precisely because of the radically non-local character of Bohmian dynamics. In Newtonian mechanics, distant bodies have a negligible influence on local phenomena, an influence that falls off with the square of the distance. This is not the case in Bohmian mechanics. There are, therefore, real grounds for doubting whether we can reliably detect the actual positions of Bohmian particles, contrary to principle E1.

7.5 The GRW/objective collapse programme

The hylomorphic interpretations of quantum mechanics have several advantages over GRW and other non-hylomorphic objective collapse theories. First, hylomorphism does not require speculation about some as-yet-unknown mechanism by which quantum waves collapse into precise states. Consequently, hylomorphists can give a much simpler account of the internal dynamics of the quantum world: the quantum world proceeds without exception according to the dynamics of the Schrödinger equation. Instead of postulating some unknown quantum trigger of wave collapse events, the hylomorphic pluralist simply relies on our actual practice of using instruments with classical features to precipitate precise measurement events. For hylomorphic pluralists, to learn more about how quantum waves collapse is simply to learn more about macroscopic and mesoscopic systems themselves—to learn more chemistry and thermodynamics and biology. This is in fact the direction taken by generalized quantum mechanics (as I described in Section 5).

In addition, the hylomorphist can take the objects of the ‘mesoscopic’ world (including molecules and cellular structures) as persisting in stable states through time, while the objective collapse view has to be combined with a further account of the ontology of the macroscopic world. For example, the GRW theory might be combined with John Bell’s “flash ontology” (Bell 1987; Maudlin 2011, pp. 23–57), in which the macroscopic world consists of a number of widely separated and intermittent “flashes” (like the blinking of a swarm of fireflies), with each flash representing a wavepacket collapse. However, the Bell flash ontology can provide only a relatively small number of “flashes” of determinacy, too small a number to ground the existence of stable molecules and organisms.

The alternative version of GRW theory is the matter density interpretation. On this view, objective collapses result in relatively dense concentrations of expected mass in spacetime regions that resemble the objects of our classical world. The matter density interpretation shares with Bohmian theory the problem of verifying the reliability of our sense perception, and for similar reasons (both theories involve a high degree of causal non-locality). As Schlosshauer has pointed out, decoherence is of relatively little help to objective collapse theories (Schlosshauer 2005, pp. 1293-6).

In addition, as Alexander Pruss has recently argued (Pruss 2015), non-hylomorphic objective collapse theories face a problem with respect to the epistemological constraint E2, the reliability of memory. GRW is not really a single theory but a family of theories. The family has a single free parameter, which we can call (following Pruss) f, the hitting frequency. The hitting frequency gives us the probability, per unit time, of the collapse of any system of entangled particles, as a function of the total mass of those particles. We can put an upper bound on the hitting frequency: if f were too high, then we would never observe the kind of entanglement that is characteristic of the quantum realm. However, this experimental data puts no lower bound on f. The frequency could be so low that it is very unlikely that any system should ever collapse. The argument against such a low frequency has to be philosophical and phenomenological rather than scientific: if the frequency were that low, human observations would never have definite or delimited outcomes, contrary to our experience.
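To see how one-sided the empirical constraints on f are, suppose (as in the standard GRW models) that hits arrive as a Poisson process at rate f per constituent particle, so that a system of N entangled particles has an expected waiting time of roughly 1/(Nf) between collapses. Then, purely for illustration:

\[
f \approx 10^{-16}\,\mathrm{s}^{-1},\; N \approx 10^{23} \;\Rightarrow\; \tfrac{1}{Nf} \approx 10^{-7}\,\mathrm{s}; \qquad f \approx 10^{-30}\,\mathrm{s}^{-1},\; N \approx 10^{23} \;\Rightarrow\; \tfrac{1}{Nf} \approx 10^{7}\,\mathrm{s}.
\]

The first value of f is the one originally proposed by Ghirardi, Rimini, and Weber, and it yields near-instantaneous collapse for macroscopic pointers; the second is a merely hypothetical value chosen to illustrate Pruss’s worry, since interference experiments on small systems constrain f from above but not from below.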

Pruss suggests that we take such low frequencies seriously:

But imagine f is so low that typically a collapse in something the size of my immediate environment occurs only every hour. On its face this is ruled out by my memory of the past five minutes. But suppose, as seems reasonable on GRW, that consciousness occurs only when there is no superposition of brain states that constitute consciousness. Nonetheless, even when consciousness does not occur, my brain states will be evolving in superposition, and when they collapse they will give rise to conscious false memories of having had conscious states over the past period of time. We thus have no way of empirically ruling out such low values of f.

In other words, the proponents of GRW can rule out such low hitting frequencies only by assuming (without argument) that our memories are veridical. However, the GRW family of theories, if true, would give us good reason to doubt that veridicality. If GRW were true and the hitting frequency were low, my present experience would be exactly the same. I could know that I have just now experienced a collapse of the wave function, but I could not have any confidence that any of my apparent memories of precise observations in the past are veridical. It isn’t just that proponents of GRW are, like all of us, subject to Cartesian doubts. It’s rather that the GRW program provides positive support for the skeptic’s worries. If the hitting frequency is low enough, my memories are radically unreliable as manifestations of the actual past. Some degree of reliability is a condition of knowledge.

The defenders of GRW might object to this reduction to skepticism by arguing that it is legitimate for them to take into account the need to secure the reliability of our memory in fixing the value of the hitting frequency parameter. Why can’t they simply build a sufficiently high hitting frequency into their theory as a way of blocking the argument for skepticism?

I have two responses. First, since f is a free parameter of the theory, the only legitimate way to settle its value is empirically. However, its value cannot be settled empirically without presuming (at least implicitly) that our memories are indeed reliable. Hence, it would be viciously circular to set the frequency high enough to ensure the reliability of our memory. In contrast, the hylomorphist treats the reliability of our memory as a fundamental fact about the human form, with no free parameters whose values must be determined empirically.

Second, the GRW theorist is vulnerable to epistemic defeat, along the lines developed by Alvin Plantinga (1993, 2003, 2011). In the absence of any physical or metaphysical constraints on the value of f, we have to take seriously the possibility that the value of f might be extremely low. We know that our memory is very unreliable, on the assumption that f is low (most of our apparent memories are illusory). In that situation, we cannot appeal to our memory of the past to verify the reliability of our memory without obvious vicious circularity. Thus, we cannot justify continued rational belief in the reliability of our memory, given the real possibility of an undercutting defeater which cannot itself be defeated.

In contrast, there is no similar consideration forcing the hylomorphist to recognize any possibility of the unreliability of our powers of memory.

Finally, even if we were to grant that the probability of such false memories is extremely low, this is not sufficient for our memory-based beliefs to constitute knowledge. A very high probability of truth is not sufficient for knowledge, as the famous lottery paradox illustrates. I can know that the probability of each ticket’s winning is extremely low (in a hypothetical lottery with an astronomical number of tickets, fantastically low). However, such a low probability of falsity is not sufficient to give us knowledge of truth, since if I could know that each ticket is a loser, I could also know that they all are, which in fact I know to be false. What’s needed for knowledge is the exercise of some cognitive power which, if exercised in normal circumstances and without external interference, guarantees absolutely the truth of the belief formed. Given GRW without hylomorphic powers, our memory-based beliefs can never meet that standard.

Therefore, GRW theories and other objective collapse theories fail epistemological constraint E2.

GRW theories also fail constraint E1, perception, for reasons noted by David Albert and Lev Vaidman (Albert and Vaidman 1989; Albert 1990). The human visual system is quite sensitive to small numbers of photons: as few as six or seven suffice. However, such a small collection of photons has a vanishingly small probability of inducing a wavefunction collapse under GRW models. Aicardi et al. (Aicardi et al. 1991) responded by arguing that the movement of ions in the human nervous system that corresponds to the apparent perception of photons is sufficient to guarantee a collapse with high probability within the time frame of conscious perception. However, this is not sufficient to satisfy E1, since it means that almost all of our visual perceptions are factually inaccurate. They represent events occurring in our environment, events that are ontologically independent of the movement of ions in our optic nerves and brains. If GRW is correct, however, what we see when we see something is actually an event occurring within our own nervous systems. There would be no corresponding external event consisting of the emission of a localized photon that we were able to detect. Once again, GRW can save the phenomena, but only at the expense of undermining human knowledge.
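A rough calculation of the same kind as above (again assuming the standard per-particle hitting rate, and treating the handful of detected photons and directly affected particles as the relevant N) makes the Albert-Vaidman point vivid:

\[
f \approx 10^{-16}\,\mathrm{s}^{-1},\; N \approx 10 \;\Rightarrow\; \tfrac{1}{Nf} \approx 10^{15}\,\mathrm{s},
\]

that is, tens of millions of years between expected hits, far beyond the sub-second time frame of conscious perception. The response of Aicardi et al. works only by relocating the collapse to the vastly larger number of correlated ions inside the perceiver’s own nervous system, which is precisely what generates the difficulty for E1 described above.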

8 Conclusion

Power ontology provides us with a metaphysical framework that is sufficiently flexible to accommodate fundamental modes of causation at the level of thermodynamics, chemistry, and solid-state physics. By doing so, we can circumvent the usual measurement problem, which presupposes that an exhaustive description of the world at a fundamental level can be given in terms of pioneer quantum mechanics, with no non-trivial center of classical properties.

Additional work needs to be done in exploring the relationship between a purely quantal description of particles (taken either individually or as definite pluralities of discrete entities) and the metaphysically more fundamental level of substances and their causal powers. In particular, should we assume that there is a quantum wavefunction that embraces all the particles of the world, simultaneously characterizing the quantum potentialities of all substances, or should we suppose instead that quantum wavefunctions are always local and contingent affairs, part of what Nancy Cartwright has described as a dappled world (Cartwright 1999)? The hylomorphic view can be developed in either direction. If we assume a global wavefunction, then we get the traveling forms interpretation of Alexander Pruss, in which substantial forms of interacting substances induce global collapses of the wavefunction (Pruss 2018). The dappled world alternative has been developed by William Simpson in his dissertation (Simpson 2020), and it is that model that is tacitly presupposed by Primas’s model of collapse. It also underlies recent work by Barbara Drossel and George Ellis (Drossel and Ellis 2018).

This issue corresponds to a further question about the extent of entanglement in nature. The global wavefunction picture would suggest that entanglement is pervasive in nature, arising with the Big Bang and never fully disappearing. On the dappled world picture, entanglement occurs only under special circumstances, when complex systems are prepared in a way that is isolated from the surrounding environment. Local collapses destroy these fragile entanglements.