Generalised Quantum Theory—Basic Idea and General Intuition: A Background Story and Overview
Walach, H. & von Stillfried, N. Axiomathes (2011) 21: 185. doi:10.1007/s10516-010-9145-5
Science always presupposes some basic concepts that are held to be useful. These absolute presuppositions (Collingwood) are rarely debated and form the framework for what Kuhn termed a “paradigm”. Our currently accepted scientific model is predicated on a set of presuppositions that have difficulty accommodating holistic structures and relationships and are not geared towards incorporating non-local correlations. Since the theoretical models we hold also determine what we perceive and take as scientifically viable, it is important to look for an alternative model that can deal with holistic relationships. One approach is to generalise algebraic quantum theory, which is an inherently holistic framework, into a generic model. Relaxing some restrictions and definitions of quantum theory proper yields an axiomatic framework that can be applied to any type of system while keeping the core of the quantum theoretical formalism. This framework is capable of handling complementary observables, i.e. descriptors which are non-commuting, incompatible and yet collectively required to describe certain situations fully. It also predicts a generalised form of the non-local correlations known in quantum theory as entanglement. This generalised version is not quantum entanglement but an analogous form of holistic, non-local connectedness of elements within systems, predicted to occur whenever elements of a system are described by observables which are complementary to the description of the system as a whole. While a considerable body of circumstantial evidence supports the plausibility of the model, we are not yet in a position to use it for clear-cut predictions that could be experimentally falsified. The series of papers offered in this special issue is the beginning of what we hope will become a rich scientific debate.
Keywords: Holism · Non-locality · Entanglement · Complementarity · Generalised quantum theory
If there is any consensus at all about what science is, then it is a consensus about the social nature of science (Kuhn 1955; Laudan 1977; Suppe 1977; Canguilhem 1979; Fleck 1980; Gutting 1980; Toulmin 1985; Collins and Pinch 1993; Latour 1999). Neither is there any preset law on the nature of science nor is there only one way of doing science (Feyerabend 1976; Feyerabend 1980). What science is, how it progresses, what rules it has to adopt in order to be efficient and most importantly, what goals it is going to set itself, is determined through processes of social negotiation by those who do science, by the “scientific community”. Whoever fulfils the requirements for acceptance by this community—nowadays usually a PhD and some publications to start with—can participate in the discourse. Another important truth about the scientific mode of operation is that it is immensely historical (Oeser 1969, 1979a, b, 1987, 1988). What science believes to be true, the methods it has adopted, the topics it considers worthwhile, cannot be understood unless we understand, at least partially, the history of science. That science also has a natural core of true findings that describe nature in some respects truthfully and usefully goes without saying. Airplanes fly, solar panels produce electricity and pharmacological agents influence the body in quite predictable ways because science has discovered some true relationships within nature that can be used for engineering purposes. This was largely due to a canon of methods that has developed since the beginnings of our natural sciences in the sixteenth century. We often forget that these scientific findings—the content of scientific discoveries—are only possible because we abstract from the multitude of potential viewpoints, connections and perspectives and adopt one as preeminent over and above others. This perspective—the paradigm (Kuhn 1955), the absolute presuppositions (Collingwood 1998, orig. 
1940), the style of thinking (Fleck 1980), all the assumptions behind the everyday operation of science (Ziman 2006)—is not normally discussed or reflected upon. Rather it emerges out of the zeitgeist of a period. It is this “scientific worldview” students are being educated into that determines what one finds to be “scientific”. The social negotiation processes around those paradigms, to adopt the Kuhnian term for ease of speaking, are highly complex and not well understood.
A standard model of doing science prescribes certain tacit or implicit presuppositions. In the case of quantum physics versus standard Newtonian physics it was the old adage “natura non facit saltus”: nature does not make jumps. Such guidelines are helpful in that they channel scientific methodology and inquiry. In the historical case of quantum theory, however, it was exactly this sentence that created problems for Planck. Thus, at some point the current model no longer works.
Observed phenomena or logical contradictions within the paradigm necessitate that such implicit dogmas are questioned. In Planck’s case it was necessary for him to trust his empirical findings in the face of the dogma.
Once this is done and a new meta-theory or model of doing science starts to emerge, completely new viewpoints, vistas and even empirical phenomena can be seen that were not accessible through the old model. Quantum mechanics has produced a wealth of insights and findings that cannot be gleaned from classical physics.
The relationship of the new to the old model is often that of inclusion. Classical physics is contained within quantum physics as a special case. Sometimes one can also observe that old regulative ideas are just dropped and a completely new model emerges. This was the case, for instance, with the concept of ether, which was never disproved, but simply dropped as a concept. Sometimes some old guiding principles are revised and revisited in a new format. This is the case with the old model of the mixture of humours that produce different typologies of characters. It was dropped, and revived in nineteenth century psychiatry as constitutional types by Kretschmer and others, dropped again, and is now being revived in a completely different shape as genetic individual metabolic types.
What is important to note here is that, while questions arising within a paradigmatic framework can be discussed and debated using accepted methodology that can decide between competing alternatives, for instance through an experiment, this is not always possible when it comes to deciding between different paradigmatic frameworks. Often the presuppositions needed for the alternative framework question the presuppositions of the old methodology. One prominent example within the social sciences is the debate between behaviourist and psychoanalytic theorising. If one accepts the premise of unconscious processes as a background motive for overt behaviour, it is insufficient to employ methods that can only access conscious states. Hence testing psychoanalytic predictions on the battleground of behaviourist science is unfair. In the case of this particular debate, these days we are witnessing an interesting amalgamation. The solution is coming from a completely different strand of thinking, from cognitive neuroscience, which by its very nature requires, and has documented, implicit processing (Giampieri-Deutsch 2002). By virtue of this new vista the old psychoanalytic findings can now be understood differently and from a new perspective.
This also opens up a third insight: often new paradigmatic viewpoints teach us to see things differently, or rather, to see for the first time things that we were unable to see before. A theoretical framework is not only a way of helping us to solve puzzles and decide about phenomena in the world. It is also a certain way of looking at the world, and by this token it determines what we can “see”. Atomic and subatomic particles we could only “see” once we had the appropriate theory which also helped construct the appropriate instruments. Some galactic phenomena, such as quasars, black holes or certain kinds of stars were only seen once a certain theory was adopted and predictions deduced from it.
The lesson to be learned from such episodes is: we often only perceive what we have a theory for. This is related to the way our cognitive system is structured. We project an expectation of what the world is like based on our past experience, and suppress the perception of deviations from this expectation unless the deviation becomes strong enough (Gray 1991, 1995). The effects of these strong top-down attentional processes can be seen as similar to those that theoretical models have on the scientific discourse.
In this series of essays we suggest the virtues of a new axiomatic way of looking at our world. We suppose that by adopting a new viewpoint within science we will be able to see more phenomena, see old phenomena differently and be able to deal with our world more appropriately. We are not suggesting a new type of theory among others, but rather a new meta-theoretical framework or paradigm. This framework is quite tentative at this point. It is not completely elaborated. But it already makes some very clear predictions and allows reconsideration of some phenomena that are evident but are not considered scientific facts for lack of an appropriate theory which explains their deviation from what is expected by the current paradigm. We suggest that a paradigm of holism that uses insights from quantum mechanics proper (QM—whenever we speak of the physical systems traditionally considered as quanta, such as ions, atoms, molecules etc., we will use this term) can actually be applicable to other systems as well. We call this approach Generalised Quantum Theory (GQT). In order to formalise it, the formal structure of quantum theory proper (QT) was generalised. Some definitions and restrictions were dropped, while keeping the basic structural core (see below and the paper by Filk and Römer, in this issue as well as e.g. Atmanspacher et al. 2002; Römer 2004; Atmanspacher et al. 2006). We find that GQT deals with situations that are similar to those that are called complementarity and entanglement (or non-local correlations) in QT.
The classical picture is too narrow. It cannot incorporate a host of phenomena that, by their very phenomenology, suggest they are non-local.
The classical picture is built on a set of preconditions that predicate analysis and a break-down of holistic structures into individual elements as the most promising avenue of discovery. We think that this picture needs complementing in order to gain a scientifically viable theory of holism.
The classical picture is derived from an analogue to classical physics. This serves us well for most purposes. But as classical physics had to be broadened to quantum physics, similarly there needs to be a broadening of the paradigmatic framework to incorporate the basic insights of quantum theory. This is what our model does.
Before we go into a gross description of the model of generalised quantum theory (for a formal treatment see e.g. the paper by Filk and Römer, in this issue as well as e.g. Atmanspacher et al. 2002; Römer 2004; Atmanspacher et al. 2006) it might be useful to explicate these limitations of the current paradigm. Then we will elaborate on the generalised quantum theoretical model and how it can actually integrate some phenomena. This will be followed and concluded by an outlook and some ideas about the future agenda, should this model prove viable.
2 The Current Paradigm and Its Limitations
The currently accepted paradigm of doing science shares, across disciplines, some basic presuppositions, which Collingwood termed absolute presuppositions (Collingwood 1998, orig. 1940). These presuppositions are necessary for the paradigm to work, yet they cannot be derived or proven by the methods of the paradigm itself. Any formal system, no matter how logically consistent internally, must be based on at least one assumption, and usually more, that cannot be proven within the formal system itself. In other words: every model, every paradigm has to fall back on basic axioms whose truth is assumed and whose usefulness is taken for granted for reasons other than results derived from the paradigm itself. These basic assumptions or absolute presuppositions come out of the general philosophy of the time, the zeitgeist. They are normally not openly discussed, and they are generally taken for granted. Only rarely are they reflected upon by those doing science. They form a mutually self-supporting circular system with the results produced by the system. Often, findings that do not fit the model are excluded from discourse, not because they are wrong or scientifically implausible, but because taking them seriously would mean questioning basic assumptions (Fischer 2007).
Materialism: All basic elements of the universe are ultimately material in nature; whatever appears to be non-material can be reduced to or shown to be derived from material elements.
Atomism: All complex systems are built of small constituents that can be analysed (Whyte 1961).
Analysis: It is useful to break down complex systems into their constituent elements and first understand those basic systems.
External relations: All systems and elements in the universe are related to each other by a set of relations that can be described from the outside in terms of exchange of energy and matter (Blanshard 1989).
Reductionism: It is a sensible strategy to reduce unknown phenomena to phenomena that are known and can be explained already. In the same vein, it might be a promising heuristic to use theory domains of simpler phenomena to understand more complex phenomena (Agazzi 1991).
Locality: Relationships between systems have to obey the framework set by special relativity. They are related to each other by material signals limited by the speed of light. Whatever falls within a given light cone can be in relationship, whatever does not will be unrelated by design, except in QM.
Causality: It is one of the goals of science to analyse and demonstrate the causes of events. These are ultimately always material causes that obey the classic prerequisites for being a cause: they precede their effect temporally, they are contiguous, and they regularly produce the same effects. All causes can be reduced to efficient causes in Aristotelian terms: causes that mediate an effect through contiguous contact and exchange of energy (Lucas 1984; Epperson 2009).
Binary logic/logic of the excluded middle: If there are two mutually exclusive descriptions of a phenomenon, it is commonly assumed that at most one of them can be right and the other must then be wrong, except in QM.
Whether these sentences are indeed absolute presuppositions can be tested by asking the counterfactual question of what would happen if we were to doubt them. Whenever we find ourselves saying “Of course, how else could it be?”, we have reached the dead end of an absolute presupposition. Whenever doubting such an assumption provokes a kind of aversive affect whose final argument ends in “But this is unscientific”, we know that we are doubting an absolute presupposition.
We suggest that, while the current paradigm is very useful in many respects, it is also deficient. It fails to account for holism, i.e. for the understanding of whole structures and their interrelationships. It fails to account for non-local relations and complementarity outside QM proper. And by virtue of its innate reductionism and materialism the current model fails to account satisfactorily for a series of phenomena.
One obvious example of such a limitation is the phenomenon of subjective consciousness and the connection to its physiological correlates (e.g. Chalmers 1995, 1997). None of the attempts to find a satisfying place for subjective consciousness within the current paradigm has met widespread acceptance. This links to the whole arena of subjective meaning and values which, by their very definition, are difficult to understand within a materialist, reductionist framework, while at the same time we are becoming painfully aware of their significance for society. Another example where we think the current paradigm fails is the realm of non-local, i.e. non-causal, phenomena. The current paradigm does not really allow for non-locality outside of quantum physics. This is related to the assumption that any regular relatedness can only be mediated via material causal signals, and these have to obey the framework of special relativity and of Maxwell’s theory of electromagnetism. If we want to transmit a cause, e.g. a signal, from earth to the moon, we will have to take into account the distance of roughly 384,000 km. A light signal will take approximately 1.3 seconds to get there, and as long again to get back. Also, we will need to take into account the fact that the signal strength decreases with increasing distance. Hence we will need to power our signal sufficiently for it to survive the distance. This situation is the reason why all instances of non-local relatedness, such as telepathy, clairvoyance, precognition or telekinesis, have been viewed with suspicion by mainstream science. Such phenomena carry an a priori implausibility, based on the presuppositions of locality and causality.
Moreover, science demands that such phenomena be demonstrated using the same kinds of methods that are used for the analysis and vindication of causal signals. For instance, if a physicist wants to see whether at a certain location in the universe there is a certain type of radio source, she trains a telescope on this location, tunes in to the frequency, and then keeps receiving and isolating the signal from background noise. If there is a reliable signal, then after a time it can be reliably isolated, not only by her but by all colleagues in the world who are sufficiently trained and have the right equipment. In other words, standard science uses repeated experiments and measurements to isolate causal signals.
Parapsychologists, likewise assuming that these phenomena are due to some kind of causal signal, have produced a wealth of experiments which have failed to isolate one. A significant number of authors have meanwhile noted this “reproducibility problem” as a characteristic feature of parapsychological research (Bierman 1980; Braud 1985; Stevenson 1990; von Lucadou 1991; Batcheldor 1994; Beloff 1994; Houtkooper 1994; White 1994; Haraldsson and Houtkooper 1995; Blackmore 1999; Bierman 2001; Kennedy 2001; Atmanspacher and Jahn 2003; Dunne and Jahn 2003; Kennedy 2003, 2006; Walach et al. 2009). This finding provides indirect evidence for the hypothesis that if these phenomena exist they are likely not of a causal nature in the standard sense. At the same time there is such a wealth of anecdotal and phenomenological evidence that it would be unscientific to deny that these phenomena are a transcultural and transhistorical part of human experience (Walach and Schmidt 2005). Hence it might be necessary to use a completely different pathway to approach these phenomena. We suggest here that adopting a new paradigm of generalised non-locality would do exactly that: it would allow these phenomena to be understood scientifically, as instances of non-local relatedness, and it would also make plausible why standard experimental procedures have not succeeded in reproducibly demonstrating their occurrence. Such standard procedures assume stable causal signals that can be extracted from background noise. This is precisely what cannot be achieved, simply because there is only relatedness, but no signal (von Lucadou 2006; von Lucadou et al. 2007). Hence a new model is needed.
Once such a model is adopted, we may suddenly see that a lot of other instances which are difficult to explain in the existing paradigm might fall under the category of non-local relatedness (see the last section of this paper) and we might possibly even be able to develop new empirical procedures which are more appropriate for dealing with macroscopic generalised non-locality.
This model of generalised non-locality which we are advocating is not meant to do away with the standard model of doing science. It can be considered a complement. Just as in normal processes we can assume that classical local and non-classical non-local processes interact in order to produce an optimum of coordination, we can also assume that on the theory level, standard classical reasoning using local causal analysis and non-classical reasoning following the model of generalised non-locality can complement each other to find an optimal balance of parsimony and comprehensiveness in accounting for observations, “saving the phenomena” as the Platonist astronomers in antiquity used to say (Mittelstrass 1962; Walach and Schmidt 2005).
At this point it might be useful to move one step forward and sketch the model and what it does, in order to understand its potential merit.
3 The Model of Generalised Quantum Theory: Generalised Complementarity and Non-Locality
3.1 Why a Formalism?
GQT uses the formalism of algebraic quantum theory (QT). A formalism as an axiomatic system has the benefit of symbolic clarity. It can be used, if appropriately filled with definitions and restrictions, to actually calculate predicted results or model empirical results. That way, empirical phenomena can be related to a theoretical structure, and the theoretical structure can be recursively used to predict new phenomena which can then be empirically tested. This has been the case with quantum theory since its inception and its application to physics. A rich framework of concrete definitions in conjunction with axiomatic formulations and relationships of operations to natural constants, such as the Planck constant, makes it possible to calculate expectation values for certain operations and measurements.
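The kind of calculation alluded to here is, in QT proper, the computation of expectation values. In the standard textbook formalism (quoted here only as the familiar reference point, not as part of GQT itself), the expectation value of an observable represented by an operator acting on a state reads:

```latex
\langle \hat{A} \rangle_{\psi} \;=\; \langle \psi \,|\, \hat{A} \,|\, \psi \rangle
```

It is this quantitative machinery, anchored in constants such as Planck's constant, that gives QT its predictive precision; GQT, as described below, retains the algebraic structure but not the numerical anchoring.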
3.2 QT and Its Applicability
QT has produced many such predictions and many experimental concretisations and verifications of these predictions, more than 300 to date, and none of the predictions has been falsified. This is why QT is considered to be one of the most empirically justified and precise theories we have. While it is conceivable that an even more fundamental theory can be constructed, especially one that marries the realm of quantum physics with that of relativity, this has not happened so far. Thus, QT, we hold, is the most solid theoretical framework available to us; hence the intuition that it is a good starting point from which to formulate a more general theory. However, QT proper is confined to special realms. Planck’s constant limits it to the very small or to the very cold or to the very hot. Whenever we enter realms of the order of 10⁻⁹ (meters, kilograms, Kelvin, etc.) we must be ready to expect quantum phenomena, as QT becomes applicable. But for the normal range of our everyday life such quantum processes proper do not play a role. This is the reason why for all normal human affairs standard classical physics will do, at least normally and under regular circumstances.
3.3 An Important Heuristic: The Principle of Isomorphy
Another important intuitive heuristic comes into play when pursuing a generalisation of quantum theory: whenever we see some very basic principle we can assume that it plays a role not only in a particular domain, but at all systemic levels. This is the principle of isomorphy, used by general systems theory and its founding father Ludwig von Bertalanffy to explain why similar principles should hold in systems of different make-up (Bertalanffy 1968). For instance, the fact that all systems have boundaries and form a unit of elements which, by virtue of being part of the system, enter into different relationships and give the system certain properties can be observed everywhere (Maturana and Varela 1975; Varela 1979). Whether we use as the unit of analysis a stellar system with planets, a family system consisting of family members, a system of social insects consisting of millions of ants or bees, an organismic system consisting of billions of cells, substructures of cells consisting of billions of molecules, or molecules consisting of millions of atoms, we always find the same abstract structure: elements that outside of the system have certain defined properties gain other properties when they are joined together in a system. The principle of isomorphy holds that a certain abstract relationship will be found over and over again in different settings, provided the basic relationship is enacted.
3.4 Generalising Quantum Theory
We now have all the elements in place to understand what Generalised Quantum Theory (GQT) does. It is a formal framework based on the algebraic formalism of quantum theory proper (for details see Atmanspacher et al. 2002; Atmanspacher et al. 2006; Römer and Filk, in this issue). It answers the question: what is the minimum requirement for a formal model to be meaningful, describe the quantum nature of a system and yet be as widely applicable as possible? In other words, it results from posing the question: if we assume that under certain circumstances all systems exhibit some behaviour which is isomorphic to quantum behaviour, what would such behaviour be?
In order to achieve this we have to do two things: we have to keep the structural core of the formal description of quantum systems intact, while at the same time relaxing and dropping all conditions that restrict the applicability to the quantum realm proper. We do this by reducing the QT formalism to its bare minimum. All definitions that are not strictly necessary are dropped. All restrictions that make the formalism work specifically for QM, but also restrict its usage, are dropped, such as the precise value of the commutator (Planck’s constant). The result of this is twofold: On the one hand, the formalism’s applicability is widened to all systems that can be meaningfully isolated from a background and called a system. On the other hand, the quantitative precision that is so typical for QT is lost.
But what is kept is a relationship that is typical for QT: the capacity of the formalism to handle non-commuting operations. Put formally: the type of multiplication used by the QT and GQT formalism is non-Abelian. This means that the sequence in which operations are carried out matters for the end result. In ordinary algebra 2 times 3 equals 6 in the same way as 3 times 2 equals 6; it is completely irrelevant whether we first take 2 and multiply it by 3 or first take 3 and multiply it by 2. In the non-Abelian algebra used by QT and preserved in GQT this is not the case. Here, abstractly speaking, p times q is not equal to q times p. In QM proper this inequality is precisely quantified by Planck’s constant divided by 2 pi. This is in fact the Heisenberg uncertainty relationship, which defines the limit of precision of two incompatible measurements. In GQT this quantitative precision is dropped, but the formal handling of non-commuting operations is kept. In everyday language: we stipulate that GQT is useful for describing situations in a system where non-commuting operations might be necessary.
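Non-Abelian multiplication is easy to exhibit concretely. The following minimal sketch (not from the original text; the two 2×2 matrices are the standard Pauli matrices, chosen purely as an illustration, since any pair of non-commuting matrices would do) shows that for such operations the order of multiplication changes the result:

```python
# Minimal illustration of non-Abelian (non-commuting) multiplication.
# Ordinary numbers commute (2 * 3 == 3 * 2); these matrix operations do not.

def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Two standard non-commuting operators (Pauli matrices sigma_x and sigma_z).
sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]

xz = matmul(sigma_x, sigma_z)  # apply z first, then x
zx = matmul(sigma_z, sigma_x)  # apply x first, then z

print(xz)        # [[0, -1], [1, 0]]
print(zx)        # [[0, 1], [-1, 0]]
print(xz != zx)  # True: the sequence of operations matters
```

The non-vanishing difference between the two products is exactly what a commutator measures; GQT keeps this qualitative feature while dropping its fixed quantitative value.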
3.4.1 Non-Commuting Operations, Incompatibility, Complementarity
Non-commuting operations are operations where the sequence in which they are applied is not irrelevant. In QM, for example, the end result differs depending on whether we measure the location of a particle first and then its momentum or the other way round: Depending on the sequence of these measurements, we might have a precise knowledge of the particle’s position at the cost of precise knowledge of its momentum or vice versa. This is the experimental meaning of the Heisenberg uncertainty relationship. Generalising this relationship into GQT means that we assume that such types of relationships also occur in other systems. Some situations in which this might apply are quite trivial: the sequence in which we do some things is essential and cannot be reversed. For instance it makes a difference whether we first open the door and then walk through it or the other way round. But we are more concerned about non-trivial cases. These are associated with situations where we need incompatible descriptions to accurately and comprehensively describe one and the same system.
This is the prototype case in which QM requires non-commuting operations, such as the description of the momentum and location of a particle. In addition to being non-commuting these are incompatible operations: they mutually exclude each other but are both necessary to fully describe one and the same particle. Niels Bohr adopted the term “complementarity” from early psychology (Rosenfeld 1963; Bohr 1966, 1997) to describe the relationship between such mutually exclusive yet collectively required descriptors. He thereby meant situations in which two maximally incompatible descriptions are necessary to describe one and the same thing (Meyer-Abich 1965).
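In QM proper this prototype has a compact standard form (textbook notation, given here only for orientation): the commutator of position and momentum is fixed by Planck's constant, and the uncertainty relation follows from it:

```latex
[\hat{x}, \hat{p}] \;=\; \hat{x}\hat{p} - \hat{p}\hat{x} \;=\; i\hbar
\qquad\Longrightarrow\qquad
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi}
```

GQT keeps the non-vanishing of such commutators, i.e. the incompatibility itself, while dropping the fixed quantitative bound.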
As an example of generalised cases of non-trivial complementarity we can consider e.g. the description of a system in terms of process and substance (Römer 2006), or, relatedly, in terms of function and structure. Along these lines, we can also reconstruct the relationship between our mental and physical nature as a complementary relationship between two mutually exclusive yet collectively required descriptions of one and the same individual (Walach and Römer 2000; von Stillfried 2010). (This would offer, in analogy to the situation in QM, an explanation for their operating in strict parallelism without the need for a “mechanism” of coordination or interaction that has been the stumbling block of dualistic theories since Descartes.)
The special characteristic of complementary or incompatible observables in QM is that they are both necessary to describe a situation completely, yet they cannot be reduced to each other, expressed in terms of each other, or subsumed under a coherent unifying meta-description. If we have a simple pair of opposites, one can be expressed in terms of negating the other: if we take warm and cold as opposites, we can express cold as a negation of warm. We cannot do this with complementary or incompatible descriptions. Location is not the negation of momentum; substance is not the negation of process. Another way of putting this is to say they are orthogonal in a topological-geometrical sense. One cannot be reduced to the other, but both are needed to describe a situation.
(Potential) candidates for complementarity relationships:
Description of a particle ↔ description of a wave (descriptions of a quantum / of a quantum event)
Description of an individual ↔ description of a system (descriptions of a human being and human systems)
Description of a system ↔ description of reality as a whole (descriptions of a human being and human systems)
In the physical context it is clear that complementary observables are maximally incompatible. In the generalised context this is not always the case. There may be some intermediate types of relationships where some incompatibility will be present, but some domains are also compatible. This means that the generalised version is applicable to domains where we most likely will see a mixture of classical and non-classical features.
However, the core of quantum theory is preserved in the generalised formalism, and for a reason: We assume that there will be enough situations in contexts outside physics proper where similar relationships of incompatibility and thus non-commutativity hold as are typical for QM. If this is the case, the formalism of GQT is applicable. And if this is so, the GQT formalism also makes a very important prediction, namely that in any system defined by a global observable which is complementary to local observables defining subsystems of this system, there will be entanglement-like (i.e. non-local) correlations between these local observables.
3.4.2 Non-Locality and Entanglement
In QM proper, as early as 1935, Schrödinger had found that in certain situations the formalism predicts that quantum systems cannot be completely separated into parts but have to be described as one singular system until measured. In these situations, the results of measurements carried out on two or more subsystems show correlations, even if no causal signal could link these subsystems or the measurement events (Nadeau and Kafatos 1999). Schrödinger coined the term “Verschränkung”, “entanglement”, for this situation (Schrödinger 1935). For a long time it was not clear whether this was just a theoretical artefact resulting from an incompleteness of QT or a real property of quantum systems. Only when John Bell introduced a line of reasoning, and its operationalisation, by which it could be decided whether parallel measurements were locally connected or not (Bell 1987) did an experimental test become possible in principle. The actual experiments conducted in the 1980s (Aspect et al. 1982a, b), together with a long series of ever more refined replications, have shown beyond doubt that quantum entanglement is real (Salart et al. 2008). In other words, Bell’s inequalities made it possible to differentiate between classical correlations, which are due to local causal influences, and non-classical correlations due to non-local and in this sense non-causal entanglement (see also Filk in this issue). Entanglement in QT can be described as follows: an entangled quantum system resides in a state of superposition, meaning that the overall state of the system is clearly defined but it is uncertain how the subsystems or elements of the system contribute to it, because these are not in definite states. However, as soon as the state of one of the elements is measured, it takes on a precise value. This value is, to some extent, random.
But we know that the overall result must have a certain fixed value, and thus, as soon as one element has assumed a fixed value, the other one takes on a corresponding and correlated value, even if there was no possible classical causal communication between these two events. At this point, however, the superposition of the system as a whole no longer exists. There is hence a mutual uncertainty relation between the state of the system as a whole and the states of its parts.
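As a concrete numerical illustration (ours, not part of the original argument), the singlet state of two qubits exhibits exactly this mutual uncertainty between whole and parts: the pair as a whole is in a perfectly definite (pure) state, each part on its own is maximally uncertain, and the measurement correlations exceed the classical Bell/CHSH bound of 2.

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2): the whole is perfectly
# defined (a pure state), while neither qubit has a definite state.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Reduced state of qubit 1: tracing out qubit 2 leaves the maximally
# mixed state I/2, i.e. complete uncertainty about the part.
rho = np.outer(psi, psi)
rho1 = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho1)  # [[0.5, 0.0], [0.0, 0.5]]

def spin(theta):
    """Spin observable along angle theta in the x-z plane (outcomes +/-1)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

def E(a, b):
    """Correlation <psi| A(a) x B(b) |psi> of the two local measurements."""
    return psi @ np.kron(spin(a), spin(b)) @ psi

# CHSH combination: any local-causal ("classical") account obeys |S| <= 2;
# the entangled singlet reaches 2*sqrt(2) for these measurement angles.
S = E(0, np.pi/4) - E(0, 3*np.pi/4) + E(np.pi/2, np.pi/4) + E(np.pi/2, 3*np.pi/4)
print(abs(S))  # 2.828...
```

The definite global state with maximally indefinite local states is the quantitative content of the “mutual uncertainty relation” between whole and parts described above.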
The formalism of QT and GQT captures this precondition for entanglement as complementarity between global and local observables: whenever the local descriptions of elements of a system are complementary (non-commuting and incompatible) with the global description of the system, those elements will be non-locally correlated, i.e. entangled. Thus, entanglement is a special case of complementarity, namely the complementarity between local and global observables.
Wherever we have a system in which a global description of the system is complementary to local descriptions of subsystems within the system, we can expect non-local correlations or some type of entanglement between these elements. We call this type of non-local correlation generalised entanglement.
Let us consider the meaning of this. There is always a multitude of ways to partition any whole into subsystems. In fact, it has been shown that entanglement can also be seen as a consequence of how a system is partitioned, and depending on the coarse graining of the procedures with which we look at a system, complementary relationships can be expected (beim Graben and Atmanspacher 2006). Whenever we partition a system, we create a whole that is constituted of parts (von Stillfried and Walach 2006). If the description of these parts is complementary to the description of the whole, and if the GQT formalism is reasonable, then we can expect entanglement correlations that are non-local, i.e. not mediated by signals, between those constituent parts.
What we do not know at this time is what exactly constitutes complementarity between whole and parts in general systems. We know that it would have to entail non-commutativity and incompatibility. That means there has to be a reciprocal uncertainty relationship such that precise knowledge of the state of the parts entails uncertainty about the state of the whole and vice versa. This could, for example, be realised in situations where there is a fixed quantity that defines the system as a whole, e.g. a strongly determined overall system dynamic, and at the same time a high degree of freedom in the behaviour of the system’s components, which collectively constitute this overall quantity. Systems-theoretical analyses of suspected instances of generalised entanglement, e.g. in the parapsychological arena, have repeatedly been shown to display characteristics which can be framed in these terms (see e.g. Matschuk in this issue; von Lucadou and Zahradnik 2004; Kleinberens 2007; von Stillfried 2008, 2010).
The next question is whether this complementarity is only a necessary or already a sufficient condition. Are other conditions necessary for such entanglement to arise? What are they? (See the contribution of Gernert in this issue and Gernert 2005).
Another important issue here is the following: in QT proper, Planck’s constant limits non-commutativity and thus the strength and scope of entanglement. Is there something similar in GQT? If so, what would it be? How can we derive it? Theoretically the commutator could take on different values in different systems, thus leading to weaker or stronger degrees of non-local correlation. This might allow for an explanation of some very strong cases of macro-psychokinesis, as in poltergeist phenomena, for instance (von Lucadou and Zahradnik 2004).
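One way to picture a “tunable” degree of incompatibility, purely as a toy illustration of our own (the interpolation parameter eps is hypothetical, not something GQT specifies), is to interpolate between a commuting and a maximally non-commuting observable and watch the size of the commutator change:

```python
import numpy as np

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def comm_norm(eps):
    """Spectral norm of the commutator [Z, A_eps] for the interpolated
    observable A_eps = eps*X + (1 - eps)*Z (a hypothetical toy parametrisation)."""
    A = eps * X + (1 - eps) * Z
    C = Z @ A - A @ Z
    return np.linalg.norm(C, 2)

for eps in (0.0, 0.5, 1.0):
    print(eps, comm_norm(eps))
# eps = 0 -> commuting (fully compatible observables);
# eps = 1 -> maximal incompatibility; the commutator norm grows in between.
```

In such a picture, a larger commutator would correspond to a stronger reciprocal uncertainty and hence, on the GQT hypothesis, to stronger non-local correlations.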
Thus, the most important feature of GQT is that it generalises complementarity and entanglement to other systems that are not quantum systems in the strict sense. It is of prime importance to note here that this type of generalised entanglement is purely hypothetical at this point. It is not identical to macroscopic effects due to amplified quantum entanglement correlations. If our reasoning is correct then we would assume that isomorphic types of correlations can also be found in the macro world under certain conditions, in all sorts of systems.
Seen from the viewpoint of philosophy of nature, the model proposed here would assume that a principle as fundamental as entanglement can be expected to hold in all types of systems, not just in quantum systems proper, as long as the relevant systemic parameters are fulfilled. If this is a viable position, then it is reasonable to assume that entanglement was first detected in quantum systems because QT is such a precise, quantitative theory that allows for experimental testing. But by no means does the phenomenon have to be restricted to the quantum realm. It is likely to be more general in nature. However, similar to QM, it can only be seen under certain provisions. In QM the experimental set-up calls for very strict procedures of isolation and shielding because as soon as an entangled quantum system interacts with its environment, the entanglement correlations drown in noise (decoherence). Similarly, we cannot expect to see generalised entanglement in natural systems without special effort or specific preconditions. We will need to investigate special systems under certain conditions. Exactly what those conditions are in the generalised case we do not yet know for sure. One of the most decisive factors is likely again the issue of isolation, meaning that tightly controlled experimental conditions might not be suitable. Related to this is the matter of non-replicability due to the fact that we are not dealing with a signal, which will be discussed again in the following sections.
We do know, however, that if the reasoning is sound and GQT offers a valid perspective, then we can expect a generalised form of entanglement to be operative, probably on a much broader basis than previously thought. The following section offers a few loosely knit examples.
4 Potential Instances of Generalised Entanglement and the Explanatory Power of the Model
In fact, the phenomenology of some situations shows structural similarity to quantum non-locality, and this was one of the motivations to look for an alternative to the standard model. Hence, it does not come as a surprise to find potential instantiations of generalised entanglement among such situations.
4.1 Psychic Phenomena
Telepathy, clairvoyance, psychokinesis and precognition are hotly debated areas. Proponents claim that as phenomena they have been documented throughout human history and all cultures, and that modern experimentation has yielded enough evidence to grant them the status of scientific fact (Radin 1997). Critics point out that more than 100 years of experimentation have not produced one instance of reliably replicable evidence, and that those tiny vestiges of effect sizes that can be distilled out of meta-analyses can relatively easily be explained by publication and small-sample bias (Alcock 2003; Jeffers 2003; Bösch et al. 2006; Ertel 2006; Radin et al. 2006; Timm 2006; Boller 2007). They also point towards the human tendency to misattribute causality and the fact that we tend to see patterns where only random fluctuations are in fact occurring (Brugger et al. 1994; French 2003). More importantly, there is no potential theory within a purely causal worldview that could explain how such influences—from one mind to another, from one mind to a material system, from one mind to its future states and events—might possibly occur. It is probably the lack of a consistent theoretical model that poses the greatest difficulties, together with the fact that there is indeed a problem with experimental replicability (Walach and Schmidt 2005; Walach et al. 2009). The model of generalised non-locality might be able to tackle both problems at once (Lucadou et al. 2007; von Stillfried 2010). It provides a theoretical model which explains anomalous cognition as instances of non-local correlations between two mental systems (telepathy, clairvoyance), between a mental and a physical system (psychokinesis), or between a system and a future state of the world (precognition). The detailed analysis might be different in each case and a good methodology for it remains to be worked out (for some efforts in this direction see also the contribution by Siccardi in this issue).
The generic thrust is, however, always the same: In these situations there is some overall global systems parameter which is relatively fixed and local parameters pertaining to the subsystems which have a high degree of freedom. The global and local parameters are in some reciprocal relation of uncertainty. Thus, as in QT, non-local non-causal entanglement correlations occur between the local parameters. In this way, a model of generalised non-locality can provide plausible explanations of how such things as telepathic connection across distances in space or time might happen, or how correlations between a psychological state and a physical state can occur, as e.g. observed in hauntings and poltergeist phenomena and possibly even in psychosomatic symptoms (von Lucadou and Zahradnik 2004; von Lucadou et al. 2007, von Lucadou in this issue). More importantly, the model also clarifies why a traditional experimental examination of those phenomena must fail to provide replicable evidence for their existence: Experiments are always attempts at extracting causal signals from a system. Where there are no such signals, this attempt must fail. Since the model of generalised entanglement assumes that the relationships are regular, but purely correlational, without the exchange of signals, an attempt to isolate such an alleged signal is futile. What is more, as we know from QM, entanglement correlations cannot be used to transmit signals (Eberhard 1978) because as soon as there is the possibility to affect one side of the correlation, the entanglement breaks down due to decoherence. Thus, if someone replicates an experiment, say in telepathy, repeatedly and under controlled conditions, one would assume that initial effect sizes might be sizeable, but would either turn negative or dwindle (von Lucadou 1990, 1991). This exactly matches the empirical situation (Walach and Jonas 2007; Walach et al. 2009; von Stillfried 2010). 
This is why we think that the experimental model in parapsychology has utterly failed or is doomed to failure and will be abandoned in the long run (Walach et al. 2009). This is also why we think that GQT and the model of generalised entanglement might offer a new purview here. This would probably turn more towards the phenomenological and qualitative study of real-world occurrences, thus indirectly documenting their occurrence without subjecting them to a signal transfer paradigm.
4.2 Healing and Alternative Medicine
In the same vein, some cases of anomalous healing might fall under this category of lawful yet correlational relatedness according to GQT (Walach 2005). Spiritual healing, even over large distances, is well documented in the ethnographic literature. Modern research in alternative and complementary medicine has also documented anomalous cases of healing well enough to make it plausible that at least sometimes such instances can happen (Walach 2006). Again, we assume that these cases can be reconstructed as instances of generalised non-local correlations. Exactly how would be a question for careful analysis. But broadly speaking it could be conceptualised along the lines of a system being generated which includes the healer and the healee, with a strong overall definition combined with a high degree of flexibility among the members of the system. In this way nonlocal correlations could occur. That is, the healer’s behaviour, the healee’s behaviour, and the symptoms are coordinated in such a way that the defined overall system dynamic (i.e. healing) is realised, without, however, local causal interactions alone being responsible for it. Since the formation of systemic contexts is of prime importance in this respect, we can intuit the effect of ritualistic elements such as those used by magical traditions. Homeopathy is a good example of a modern ritualistic tradition which generates and employs a generalised form of non-locality (Walach 2000). Its effects can be construed along the lines of generalised entanglement (Walach 2003).
There are some empirical signatures that point in that direction: in the phenomenology of healing we quite frequently find extraordinarily quick effects. It does not always work and, in fact, it may only work infrequently. But when it works, effects can be seen rapidly (Heine 1990). Another signature is the well-known fact that it is extremely difficult to pinpoint specific effects of homeopathy and other complementary therapies using experimental research procedures (Walach et al. 1997, 2001, 2008). Such studies are notorious for being unsuccessful in demonstrating superiority of the true therapy over a sham control. However, when compared with standard therapy (Diener et al. 2006; Scharf et al. 2006; Haake et al. 2007), or in uncontrolled practice (Keil et al. 2008; Witt et al. 2009), these therapies can be highly effective. It is as if the effect disappears when probed for causal stability, but can be freely and quite reliably used in normal practice. This poses quite some methodological paradoxes and conundrums.
4.3 Transference, Constellation Work, and Family Systems
During intensive therapeutic encounters, in long-term psychotherapy for instance, we often see a unique type of counter-transference. This happens when the therapist experiences states of consciousness related to the patient (Heimann 1950; Zwiebel 1992; Leuzinger-Bohleber and Pfeifer 2002; Kleinberens 2007; Matschuk in this issue). A more detailed analysis can reconstruct such cases as instances of generalised non-locality within a highly ritualised system. Emotional content is shared not only through traditional causal signals of language and paralinguistic signs, but also non-locally along the lines of a generalised analogue of quantum teleportation (Walach 2007). And since the reverse is also possible, therapy and healing themselves could be conceptualised along those lines.
In a similar vein, it is not difficult to understand how the increasingly popular instances of family constellation work could be effective (Hellinger 1999; Payne 2005). In this type of systemic therapy, a person selects, from a group of strangers who gather for the event, representatives of his or her own family system, including dead members of the family, sometimes going back several generations. These representatives have practically no information about the person they represent or about the person whose system they are embodying. Nevertheless, it is a frequent experience that such representatives capture, within a very short period of time and without any guidance, hints or cues, the essence of the system, as it were (Varga von Kibéd 1998). Again, in terms of a generalised form of entanglement it is not difficult to construe the events along the lines of non-local correlations between the representatives and the real family members. In the same vein, this reasoning might help understand various phenomena in families: it has been pointed out how trans-generational motifs might be transferred across generational boundaries, even without those concerned being aware of it (Ancelin Schützenberger 1998). Thus, grandchildren might pick up topics or unsolved problems that are later discovered to be their grandparents’. Along the same lines, it is known from family therapy that children express and enact unconscious problems of their parents or their relationship. This reasoning also explains why rituals can form new entities. A married couple is more than just two people agreeing to share their lives. They become a new entity by way of the ritualistic binding. It becomes difficult to distinguish the ownership of mental content—ideas, impulses, emotions. And one can also expect that what one partner does, even though the other might not overtly and consciously know of it, will have reverberations on the whole system.
This is also the reason why we would likely need equally strong rituals of disengagement and separation. We have such rituals for death and mourning. But we have not developed analogous rituals for couples that separate while still alive. By virtue of our model we would expect that this is bound to lead to various problems, if the separation is not mutually welcomed.
4.4 Synchronistic Events and Spirituality
Carl Gustav Jung and Wolfgang Pauli, in an exchange of letters covering nearly 30 years, worked towards a model in which psychological and physical reality are aligned (Jung 1952; Pauli 1952; Atmanspacher et al. 1995; Meier 2001; Atmanspacher and Primas 2006). In fact, Pauli thought that physics is incomplete until it can incorporate psychology. They jointly coined the phrase “synchronicity” for events where the material world behaves in a way that is meaningful with respect to some psychological event. We all have such experiences: we urgently need a solution to a personal problem, and suddenly someone calls and brings us something, quite by accident, that helps solve the problem. Or we need contact with a person for an important reason and have misplaced the contact details, and by chance and accident the person calls. In all these situations we see a correspondence between a psychological state—in Jung’s terminology the activation of an archetype—and a physical reality. It seems rather implausible to assume that the psychological state has influenced reality or that we have “made” the person call. It is a simple, but meaningful correlation.
GQT can describe such instances of synchronicity as non-local correlations between a mental and a physical system (Lucadou et al. 2007). In fact, we would assume that a mental system which is well in tune with its environment would in fact experience quite a few such micro- and macro-instances of non-local correlations or synchronicity. This would be phenomenologically experienced as felicitous events, some kind of “flow”, serendipity. The reconstruction along the lines of our model would entail that by coordinating ourselves with our environment or the whole, in fact, we enhance the non-local connectivity between our mind or mental states, and the way the physical world evolves around us.
This would be a natural reconstruction of what is normally called “spirituality”: an alignment of the individual with the whole through repeated acts of entrainment, through, for instance, meditation or other ritualistic acts of aligning individual consciousness with the larger whole (Walach et al. 2009).
4.5 Other Cognitive Processes
It has been shown that the model of GQT can be used to describe some processes in the cognitive system, in particular with regard to bistable perception (Atmanspacher et al. 2004, 2008, 2009; Atmanspacher and Filk 2010; Filk, in this issue). Many other working groups using different theoretical frameworks have shown quantum-like characteristics of various cognitive processes (Aerts et al. 2000, 2009), in particular probabilistic inference problems (Busemeyer and Trueblood 2009), thinking in polarities (e.g. Bitbol in this issue), lexical processes (Bruza et al. 2009a and Kitto in this issue), judgment and decision making processes (Franco 2009) and economic processes (Cockshott 2009; Haven 2009). For an overview see Bruza et al. (2008, 2009b).
4.6 Speculations About Relevance in Other Fields
We might want to extend the discussion to history, the performing and creative arts, political movements, macro-shifts in social and economic phenomena and similar areas. That is something to be done in the future and by people with specific knowledge of the respective fields.
One very little explored arena is the field of biological processes. If synchronistic correlations between chance events occur in macroscopic systems, we might suspect that this is a feature that organisms have adapted to or employ. We could, for example, imagine that there is a second line of coordination and entrainment, in parallel to all the causal systems that support the physical systems of our body. The large-scale systems within our organism might be non-locally correlated: the activity of neuronal assemblies, for instance, can be supposed to be non-locally correlated with the widely distributed workload across the brain. The coordination of processes in movement and motor activity is likely to be not only causally, but also non-locally coordinated. The way the immune system acts and finds antigens, for instance, and launches an immune response might also be non-locally correlated. Two butterflies find each other because they have a good sense of smell for the few molecules of pheromones that hover in cubic kilometres of air. But perhaps some non-local coordination mechanism also plays a role, either in the form of generalised entanglement or even through real quantum processes, as discussed in the paper by Summhammer in this issue. Similarly, a predator will likely find its prey because it has good causal sensory systems—good knowledge of the habitats and behaviour of the prey, good visual, auditory, olfactory and tactile senses to locate it—but it might also be the case that some non-local correlation plays a role, prompting the initial moves, the direction in which to go, the time at which to become active, etc. In general, we could expect all biological systems to be non-locally correlated with their respective environment to some extent, a question pursued in part by Fels in his contribution to this issue.
By the same token, we can expect the whole of evolution and the optimising of ecological niches and collaboration to also have some aspects of generalised non-local correlations (Margulis and Lovelock 1974; Toussaint and Schneider 1998; Stamos 2001; Auffray and Nottale 2008; Goswami 2008; Nottale and Auffray 2008; Aerts et al. 2010). Naranjo (in this issue) exposes the idea that all organisms might inescapably display quantum-like properties because they are self-organisational autopoietic systems closed to efficient causation, according to the analysis by Robert Rosen.
5 Final Remarks
We have not touched upon a number of issues, and they may become even more problematic on closer reflection. Our model is very generic at this point. Hence we have no way of predicting exactly when we are to expect generalised entanglement correlations and exactly how strong and between which elements they are to be expected, except in very general terms. Hence, the model is useful to explain phenomena after the fact but it is not precise enough at this point to lend itself to prediction. For that we would probably have to find some easy-to-study system and generate the respective parameters from empirical findings. But even in this case, as pointed out above, the nature of the phenomenon may be intrinsically unsuitable for detection under controlled conditions. On the one hand, this is clearly a major drawback of this theory. Would any scientist be interested in a theory which postulates that it is not provable (or rather falsifiable) by controlled experiments? On the other hand it is important to remind ourselves of two aspects:
Firstly, as explicated in the beginning, what this theory proposes is an expansion of the paradigm rather than an additional model within it. In other words, it is primarily suggesting a shift of perspective rather than a new object to look at. In this respect, what counts for a successful theory is not only its predictive power in controlled experiments but also its explanatory, descriptive and integrative power regarding phenomena which are difficult to accommodate in other models of reality. GQT offers a logically consistent way of understanding why e.g., parapsychological experiments fail even though all throughout history and in all human cultures the respective phenomena are widely reported. In the current paradigm this is only possible by accusing those who report the experiences of fraud or hallucination.
Secondly, as our understanding of the theory grows, we might develop new ways in which empirical plausibility can also be generated, for example by indirect detection of the correlations, the use of correlation matrix analysis as suggested by von Lucadou (von Lucadou 2006, in this issue), or new experimental paradigms which give a more prominent place to first-person qualitative judgement.
A related area where more work is needed is in finding out how to best translate the formal requirement of complementarity, non-commutativity and incompatibility into parameters relevant and applicable in general macroscopic systems. There is a precise definition in physics proper. But as yet we do not have a good-enough analysis of these notions in our everyday language and in human affairs. Perhaps not all of the instances we take to be complementary at this time actually are. Perhaps we need to explore how complete an incompatibility we need to have. Does it have to be maximal? Would partial incompatibility do? How do we know when two descriptions are incompatible?
Do we need to add additional requirements to make the model work?
Is this model of complementarity and non-local correlations in any way different from good old dialectics? If so, what is the difference?
This is only a selection of important questions, and many more might arise once we dive into those. But putting the questions aside for a moment, we might be able to see that what we have here with this model of generalised entanglement and generalised complementarity is a meta-theoretical, paradigmatic viewpoint that is applicable to all types of systems, independent of make-up, size, scope and distribution and which allows us to consider the possibility of generalised non-local correlations occurring in these systems. We feel that, although thus far we cannot prove the theory, it is sufficiently plausible and its potential implications sufficiently far reaching to warrant much more in depth study. The papers in this issue are part of a collaborative effort that just makes a start. We hope that it stimulates a wide debate as to the viability of the concept and invite others to participate in it.
In previous formulations of the formalism we used the terminology “Weak Quantum Theory” (Atmanspacher et al. 2002). This was to signal that the model makes use of Quantum Theory but is a weaker version in the sense that it is less restricted. However, since there seem to be misunderstandings related to this terminology, we have decided to adopt the terminology of “Generalised Quantum Theory”. Here it refers to the very same program that used to be known under the heading “Weak Quantum Theory”. At the same time there may also exist other more or less formalised approaches to generalising QT, which might in this sense also be called generalised quantum theories.
The authors gratefully acknowledge funding from the Fetzer-Franklin-Fund and helpful comments from one anonymous reviewer.
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.