
Postdigital Science and Education, Volume 1, Issue 1, pp 29–42

Critical Philosophy of the Postdigital

  • Michael A. Peters
  • Tina Besley
Original Articles

Abstract

This paper draws on the authors’ recent work on cybernetics, complexity theory, quantum computing, Artificial Intelligence, deep learning, and algorithmic capitalism, and brings these ideas together to develop a critical philosophy of the postdigital. Quantum computing is based on quantum mechanics and offers a radically different approach from classical computing based on classical mechanics. Cybernetics and complexity theory provide insight into systems that are too complex for their futures to be predicted. Artificial Intelligence and deep learning promise the final stage of automation, which is not compatible with a welfare state based on full employment. We have thus arrived in the age of algorithmic capitalism, and its current phase, the ‘biologization of digital reason’, is a distinct phenomenon, still in an emergent form, that springs from the application of digital reason to biology and the biologization of digital processes. Rejecting a fully mechanical universe, a critical philosophy of the postdigital is therefore closely related to Whitehead’s process philosophy, a form of speculative metaphysics that privileges the event and processes over and above substance. A critical philosophy of the postdigital is dialectically interrelated with theories such as cybernetics and complexity theory, and with processes such as quantum computing, complexity science, and deep learning. These processes constitute the emerging techno-science global system, perpetuate (algorithmic) capitalism, and offer an opportunity for techno-social change.

Keywords

Postdigital · Critical philosophy · Process philosophy · Cybernetics · Complexity theory · Quantum computing · Deep learning · Algorithmic capitalism

Introduction

The postdigital does not describe a situation, condition or event after the digital. It is not a chronological term but rather a critical attitude (or philosophy) that inquires into the digital world, examining and critiquing its constitution, its theoretical orientation and its consequences. In particular, it addresses the conditions of digitality and the ideology of digitalism, the idea that everything can be understood without loss of meaning in digital terms (see Jandrić et al. 2018). We call this the ‘critique of digital reason’, which has application not only in social theory and the theory of hyper-control but also in music, art and esthetics, where it is concerned to humanize digital technologies. The critique of digital reason has two elements: first, the mathematico-technical control systems that are part of the emerging global digital infrastructure within which we now exist, and second, the political economy of these systems – their ownership, acquisition and structure. It also refers to the convergence and marriage of the two dominant world historical forces of digital and biological systems and the ways in which together they constitute the unsurpassable horizon for existence and becoming – the species evolution of homo sapiens and life in general, and the colonization of space.

Postdigital esthetics is a term that has gained a certain currency after the collection of the same title, Postdigital Aesthetics: Art, Computation and Design, edited by David M. Berry and Michael Dieter (2015), on a new esthetic of resistance against the digital and the return to modernism and old media. Christian Ulrik Andersen, Geoff Cox, and Georgios Papadopoulos (2014b), in their joint editorial to a Special Issue on postdigital research in A Peer-reviewed Journal About – which is “an open-access research journal that addresses the ever-shifting thematic frameworks of digital culture” (APRJA 2018) – provide a common working definition of the postdigital:

Post-digital, once understood as a critical reflection of “digital” aesthetic immaterialism, now describes the messy and paradoxical condition of art and media after digital technology revolutions. “Post-digital” neither recognizes the distinction between “old” and “new” media, nor ideological affirmation of the one or the other. It merges “old” and “new”, often applying network cultural experimentation to analog technologies which it re-investigates and re-uses. It tends to focus on the experiential rather than the conceptual. It looks for DIY agency outside totalitarian innovation ideology, and for networking off big data capitalism. At the same time, it already has become commercialized. (Andersen et al. 2014a)

As Florian Cramer puts it in his article ‘What is “post-digital”?’, included in the same issue (and later in the book) (Cramer 2015), post-digital is ‘a term that sucks but is useful’, and he goes on to provide a list of characteristics:
  1. disenchantment with ‘digital’,
  2. revival of ‘old’ media, followed by a number of headings (numbered here but not in the original) that are just as revealing:
  3. post-digital = postcolonial; post-digital ≠ post-histoire,
  4. ‘digital’ = sterile high-tech?, ‘digital’ = low-quality trash?,
  5. post-digital = against the universal machine,
  6. post-digital = post-digitisation,
  7. post-digital = anti-‘new media’,
  8. post-digital = hybrids of ‘old’ and ‘new’ media,
  9. post-digital = retro?,
  10. DIY vs. corporate media, rather than ‘new’ vs. ‘old’ media. (Cramer 2015)

Cramer is outlining a new esthetics in terms of “semiotic shift to the indexical” (although technically, he notes, “there is no such thing as ‘digital media’” or “digital esthetics”) and, most importantly, “the desire for agency”. This certainly is retro and modernist, and represents a critical rejection of the anonymous digital systems driven by the logic of big data and AI that can easily eclipse the agency of the individual artist.

We espouse a postdigital philosophy built on the radical interaction of the ‘new biology’ and informationalism, which we refer to as bio-informational capitalism (Peters 2012a; see also Peters and Jandrić 2018) and which incorporates configurations of quantum computing, complexity theory, algorithmic capitalism, and deep learning. This paper is an amalgam of an evolving agenda and draws on work from recent papers on the postdigital (Jandrić et al. 2018), cybernetics (Peters 2014), deep learning (Peters 2018) and algorithmic capitalism (Peters 2017), bringing these ideas together with new material at the beginning and end of the essay. In reality these aspects are part of a broader and interconnected perspective.

Quantum Computing

Quantum computing uses the laws and processes of quantum mechanics to process information. While traditional computers operate through instructions that use a binary system represented by the numbers 0 and 1 (representing the “off” or “on” of a logic gate on an integrated circuit), quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time, representing the control of the flow of energy through the circuit. The superposition of states in quantum computing, together with both entanglement and tunneling, allows quantum computers to manipulate enormous combinations of states at any moment. Quantum theory is the attempt to describe the behaviour of matter and energy at this subatomic scale. Experiments in the early twentieth century suggested that very small particles like photons and electrons can behave either like a wave or like a particle under different circumstances, and that there are precise limits to how accurately certain pairs of quantities, such as position and momentum, can be known simultaneously (the Uncertainty Principle). Quantum theory has no entirely satisfactory interpretation. The Copenhagen interpretation, first proposed by Niels Bohr and Werner Heisenberg in 1925–27, holds that the nature of quantum mechanics is probabilistic and will never be replaced by a deterministic theory, thus threatening the classical idea of causality of physical systems and the notion of scientific realism.
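The claim that a qubit encodes 0, 1, or a weighted combination of both can be made concrete with a few lines of linear algebra. The sketch below is our illustration (using only numpy; it is not drawn from the sources cited here): it represents a qubit as a two-component complex state vector, applies a Hadamard gate to place it in an equal superposition, and recovers the measurement probabilities from the squared amplitudes (the Born rule).

```python
import numpy as np

# Computational basis states of a single qubit.
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# A Hadamard gate places a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ zero                           # the state (|0> + |1>) / sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(psi) ** 2
print(np.round(psi, 3))                  # [0.707+0.j 0.707+0.j]
print(probs)                             # [0.5 0.5] -- '0' and '1' equally likely
```

Classically the register would hold either 0 or 1; here both amplitudes are carried through the calculation until measurement collapses the state.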

As the U.S. House of Representatives Committee on Energy and Commerce hearing entitled ‘Disrupter Series: Quantum Computing’ (2018) puts it:

The wave-particle duality described in the HUP lies at the heart of quantum mechanics. A consequence of the theory is that at a fundamental level matter and light can only be described probabilistically; it is impossible to know both the position and momentum of a quantum object because the object exists in all possible states simultaneously until it is measured (or observed) – a concept known as superposition.

The background report also briefly mentions the core concept of quantum entanglement, which posits that a change of state in one particle will necessarily involve a change in its twin or related particle, an understanding that has led to string theory.
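Entanglement can be illustrated in the same elementary way. The sketch below is again our own illustration rather than anything in the committee's background report: it prepares the standard Bell state, in which the only joint measurement outcomes that can occur are '00' and '11', so measuring one qubit fixes the outcome for its partner.

```python
import numpy as np

# Two-qubit basis |00>, |01>, |10>, |11> as a 4-component state vector.
ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# CNOT flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit, then CNOT: the Bell state (|00> + |11>) / sqrt(2).
bell = CNOT @ np.kron(H, I) @ ket00
print(np.round(bell, 3))       # [0.707 0. 0. 0.707]
print(np.abs(bell) ** 2)       # only '00' and '11' have non-zero probability,
                               # so one measurement determines the other
```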

Applying quantum theory, quantum computers can perform certain computational tasks exponentially faster than classical computers. The report cites Joseph Altepeter:

Quantum computers are fundamentally different from classical computers because the physics of quantum information is also the physics of possibility. Classical computer memories are constrained to exist at any given time as a simple list of zeros and ones. In contrast, in a single quantum memory many such combinations—even all possible lists of zeros and ones—can all exist simultaneously. During a quantum algorithm, this symphony of possibilities split and merge, eventually coalescing around a single solution. (Altepeter 2010)

The upshot is that both quantum mechanics and quantum computing differ fundamentally from classical mechanics and classical computing around questions of indeterminacy/determinacy and anti-realism/realism that are highlighted by a probabilistic universe.

Globally, some $2.2 billion has been invested in quantum computing by IBM, QxBranch LLC, IonQ, Google, MagiQ Technologies, Rigetti Computing, and Microsoft’s Station Q. The EU recently sponsored research funding of over $1 billion for quantum computing, and China has committed $20 billion to a national laboratory for quantum sciences. The report continues:

The Chinese companies Baidu, Alibaba Group Holdings, and Tencent Group Holdings are also working on quantum computers, with Alibaba announcing in February 2018 the opening of a quantum computing cloud platform for researchers that operates on an 11-qubit processor. (U.S. House of Representatives Committee on Energy and Commerce 2018)

The potential of quantum computing “to process multiple calculations simultaneously makes it particularly well suited to some of the most complex problems faced by programmers”, including the “optimization” problem and machine learning for detecting patterns in large datasets.

We are at the edge of postdigitality in quantum computing, which is very different from classical computing, with multiple new uses based on fundamental differences and a fundamentally different perception of the world. This is how Michael Brett, Chief Executive Officer of QxBranch, Inc., expresses the point in his testimony to the Committee on Energy and Commerce:

Quantum computers are not just a faster computer. They enable an entirely different approach to performing calculations – an approach that asks the question, what if we go beyond the limit of “classical” computers and into the subatomic, or quantum realm, to perform computational work? It turns out that this is possible, and there are some incredible and surprising phenomena like superposition and entanglement that allow us to solve some interesting - and practically unsolvable - problems like simulating the interactions among molecules as they grow in size, since they exhibit exponential growth in complexity. (U.S. House of Representatives Committee on Energy and Commerce 2018)

Brett indicates that “there are broadly three classes of application that become possible in the near-term”:
  1. Optimization problems – like transport and logistics routing, production streamlining, and financial portfolio optimization;
  2. Machine learning – accelerating the most computationally expensive part of training artificial intelligence systems to detect patterns in large and complex data; and
  3. Chemical simulation – using a quantum computer to simulate the behavior of molecules and materials, a quantum process that is extremely challenging to simulate using classical computers. (U.S. House of Representatives Committee on Energy and Commerce 2018)

There is little doubt about the promise that quantum information science holds for next-generation computing and processing. Most effort has gone into harnessing development at the level of apps for business in the globally competitive economy. Little if any thought has gone into the broader philosophy and the ways in which quantum information science will fundamentally alter the conditions of society.

Complexity Theory¹

Cybernetics is also broadly related to systems philosophy and theory, and as Charles François (1999: 203) notes, both function as “a metalanguage of concepts and models for transdisciplinarian use, still now evolving and far from being stabilized”. François (1999) provides a detailed history of systemics and cybernetics in terms of a series of historical stages. First, Precursors (before 1948) – the “Prehistory of Systemic-Cybernetic Language” – going back to the Greeks and to Descartes in the modern world and ranging across the disciplines, with important work in philosophy, mathematics, biology, psychology, linguistics, physiology, chemistry and so on (Hartmann, Leibniz, Bernard, Ampère, Poincaré, Konig, Whitehead, Saussure, Christaller, Losch, Xenopol, Bertalanffy, Prigogine).

Second, “From Precursors to Pioneers (1948–1960)”, beginning with Wiener, who aimed to address the problem of prediction and control and the importance of feedback for corrective steering, and mentioning Shannon and Weaver’s (1949) Mathematical Theory of Communication, Von Bertalanffy’s 1950 paper ‘An outline of general system theory’, Kenneth Boulding’s (1953) ‘Spaceship Earth’, von Neumann’s theory of automata, von Foerster’s biological computer and his collaborators like Ashby (1956), Pask (1975) and Maturana, who pursued questions in human learning, autopoiesis and cognition (Maturana and Varela 1980). François (1999) rightly devotes space to Prigogine (1955) on systemics and his escape from the assumptions of thermodynamic models towards understanding dissipative structures in complex systems. Prigogine’s interest in time derived from the philosopher Bergson, and later from the physicists Boltzmann and Planck; he developed a theorem on systems which were highly organized and irreversible and applied it to the energetics of embryological evolution. His work in irreversible phenomena theory also led him to reconsider their insertion into classical and quantum dynamics and to the problem of the foundations of statistical mechanics (Prigogine 1977).

Third, ‘Innovators (After 1960)’, beginning with Simon’s (1962) discussion of complexity, Miller’s (1978) work on living systems, Maturana’s work on autopoiesis, i.e. self-production (Maturana and Varela 1980), Mandelbrot’s (1977) work on fractal forms, Zadeh’s (1965) work on fuzzy sets and fuzzy logic, Thom’s work on the theory of catastrophes, and the development of chaos theory. As François (1999: 214) writes:

Chaos theory as the study of the irregular, unpredictable behaviour of deterministic non-linear systems is one of the most recent and important innovations in systemics. Complex systems are by nature non-linear, and accordingly they cannot be perfectly reduced to linear simplifications.

François also details important work in ecology and economics, mentioning Odum (1971), Daly (1973) on the steady-state economy, and Pimentel (1977) on the energy balance in agricultural production, among others working in the field. Fourth and finally, François (1999) examines ‘Some Significant Recent Contributions (After 1985)’, mentioning the Hungarian Csanyi’s (1989) work on the ‘replicative model of self-organization’, Langton (1989) on artificial life, Sabeili’s (1991) theory of processes, and McNeil (1993) on the possibility of a better synthesis between the physical sciences and living systems. He ends by referencing Prat’s (1964) work on the ‘aura’ (traces that remain after the demise of the system), Grassé on ‘stigmergy’ (indirect communication taking place among individuals in social insect societies; see more on stigmergy and massive online collaboration in Susi and Ziemke (2001), Gregorio (2002) and Robles et al. (2005)), and Zeeuw (2000) on ‘invisibility’.

If modern cybernetics was a child of the 1950s, catastrophe theory developed as a branch of bifurcation theory in the study of dynamical systems, originating with the work of the French mathematician René Thom in the 1960s and developed by Christopher Zeeman in the 1970s. Catastrophes are bifurcations between different equilibria, or fixed-point attractors, and the theory has been applied to the capsizing of boats at sea and to bridge collapse. Chaos theory also describes certain aspects of dynamical systems, i.e. systems whose state evolves over time, that exhibit high sensitivity to initial conditions (the ‘butterfly effect’) even though they are deterministic (e.g. the weather). Chaos theory goes back to Poincaré’s work and was taken up mainly by mathematicians who tried to characterize reiterations in natural systems in terms of simple mathematical formulae. Both Edward Lorenz and Benoît Mandelbrot studied recurring patterns in nature—Lorenz on weather simulation and Mandelbrot (1975, 1977) on fractals in nature (objects whose irregularity is constant over different scales). Chaos theory, which deals with non-linear deterministic systems, has been applied in many disciplines but has been especially successful in ecology for explaining chaotic dynamics. Victor MacGill provides a nontechnical account of complexity theory: “Complexity Theory and Chaos Theory studies systems that are too complex to accurately predict their future, but nevertheless exhibit underlying patterns that can help us cope in an increasingly complex world” (in Peters 2014). In the theoretical foundations of computer science, complexity is concerned with the study of the intrinsic complexity of computational tasks and rests on understanding the central role of randomness.
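What ‘highly sensitive to initial conditions yet deterministic’ means can be seen in the logistic map, a one-line recurrence long used to illustrate chaotic dynamics. The example below is our illustration, not MacGill's: two starting values that differ by one part in ten million soon produce completely divergent trajectories.

```python
# Sensitive dependence on initial conditions in a simple deterministic system:
# the logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).

def logistic_trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2000001)   # a perturbation of one part in ten million

for n in (0, 10, 20, 30):
    print(n, round(a[n], 6), round(b[n], 6), round(abs(a[n] - b[n]), 6))
# Within a few dozen iterations the two trajectories bear no resemblance to one
# another, even though the rule that generates them is fully deterministic.
```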

AI and Deep Learning²

Goodfellow et al. (2016) identify three waves in the development of deep learning: deep learning known as cybernetics in the 1940s–1960s, which appeared with biological theories of learning; deep learning known as connectionism in the 1980s–1990s, which used ‘back-propagation’ to train neural networks with multiple layers (a minimal illustration follows the quotation below); and the current resurgence under the name deep learning, beginning in 2006 and only appearing in book form in 2016. They argue that the current deep learning approach to AI goes beyond the neuroscientific perspective, applying “machine learning frameworks that are not necessarily neurally inspired”. Deep learning, then, is “a type of machine learning, a technique that allows computer systems to improve with experience and data”. Morris et al. (2017), writing a guest editorial for IEEE Transactions on Automation Science and Engineering, report on the remarkable “take-off” of artificial intelligence and, with this resurgence, the return of the machinery question posed almost 200 years ago in the context of the Industrial Revolution. They note the upbeat analysis of the mainstream press in 2016 and document the publication of several US and UK reports that suggest not only that “AI has arrived” but also that it offers “huge potential for more efficient and effective business and government”. The economists they cite welcome AI for its productivity gains. They ask “What triggered this remarkable resurgence of AI?” and they answer:

All evidence points to an interesting convergence of recent advances in machine learning (ML), big data, and graphics processing units (GPUs). A particular aspect of ML—called deep learning using artificial neural networks— received a hardware boost a few years ago from GPUs, which made the supervised learning from large amounts of visual data practical. (Morris et al., 2017: 407)
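To make the terms ‘back-propagation’ and ‘supervised learning’ used above a little more concrete, the following sketch, ours and not code from Goodfellow et al. (2016) or Morris et al. (2017), trains a tiny two-layer network on the XOR function: the error at the output is propagated backwards through the layers to adjust the weights by gradient descent.

```python
import numpy as np

# A minimal multi-layer network trained by back-propagation on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for step in range(5000):
    # Forward pass through two layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the output error back through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # approaches [[0], [1], [1], [0]]
```

The GPU advances the editorial describes matter because, at scale, the matrix multiplications in the forward and backward passes above are repeated over millions of examples and parameters.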

The popularity of ML, they note, has been enhanced by machines out-performing humans in areas taken to be prime examples of human intelligence: “In 1997, IBM’s Deep Blue beat Garry Kasparov in chess, and in 2011, IBM’s Watson won against two of Jeopardy’s greatest champions. More recently, in March 2016, Google’s AlphaGo defeated Lee Sedol, one of the best players in the game of Go” (ibid: 407). Following this popular success, as Morris et al. (2017) note, the private sector took up the challenge. They note, in particular, that IBM developed its cognitive computing in the form of its system called Watson, a DeepQA system capable of answering questions in a natural language. The Watson website makes the following claim: “Watson can understand all forms of data, interact naturally with people, and learn and reason, at scale” (NDB 2018). It also talks of “Transforming learning experience with Watson”, taking personalised learning to a new level.

The autonomous learning systems of AI, increasingly referred to as deep learning, theoretically have the capacity to introduce autonomy into machine learning with the same dramatic impact that mechanization first had in agriculture, with the creation of an industrial labour force and the massive rural–urban migration that built the mega-cities of today. Fordist automation, which utilized technologies of numerical control (NC), continuous process production and production processes using modern information technology (IT), introduced the system of mass production and later the ‘flexible system of production’ based on Japanese management principles. When Fordism came to a crisis in the 1960s with declining productivity levels, as Taylorist organizational forms of labour reached their limits, the search for greater flexibility diversified into new forms of automation, especially as financialization took hold in the 2000s and high-frequency trading ensued on the basis of platforms of mathematical modeling and algorithmic engines.

A working hypothesis, and a dark scenario, is that in an age of deep learning (the final stage of automation) the welfare state based on full employment might seem a figment of a quaint and romantic past when labour, together with the right to withdraw one’s labour, and labour politics all naturally went together and had some force in the industrial age (Peters et al. 2018b). In retrospect, and from the perspective of an ‘algorithmic capitalism’ in full swing, the welfare state and full employment may seem like a mere historical aberration. Without giving in to technological determinism, given current trends and evidence it seems that deep learning as a form of AI will continue the process of automation apace and that, while it will create some new jobs, it will do so much more slowly than it disestablishes existing ones.

Algorithmic Capitalism³

Increasingly, cybernetics and its associated theories have become central to understanding the nature of networks and distributed systems in energy, politics and knowledge, and they are also significant in conceptualizing the knowledge-based economy. Economics itself as a discipline has come to recognize the importance of understanding systems rather than rational agents acting alone, and pure rationality models of economic behaviour are being supplemented by economic theories that use complexity theory to predict and model transactions. More critical accounts of globalization emphasize a new form of global capitalism. The ‘financialization of capitalism’ is a process that seems to have accompanied neoliberalism and globalization, representing a shift from production to financial services, the proliferation of monopolistic multinational corporations and the financialization of the capital accumulation process (Foster 2007). Nassim Taleb (2018) and Benoit Mandelbrot (Mandelbrot and Hudson 2004) joined forces to criticize the state of financial markets and the global economy, highlighting some of the key fallacies that prevented the financial industry from correctly appreciating risk and anticipating the 2008 crisis, including large and unexpected changes in dynamical systems that are difficult to predict, the difficulty of predicting risk based on historical experience of defaults and losses, and the idea that the consolidation and merger of banks into larger entities makes them safer when in reality it imperils the whole financial system (Taleb and Mandelbrot 2008).

Cybernetic capitalism is a system that has been shaped by the forces of formalization, mathematization and aestheticization beginning in the early twentieth century and associated with developments in mathematical theory, logic, physics, biology and information theory. Its new forms now exhibit themselves as finance capitalism, informationalism, knowledge capitalism and the learning economy, with incipient nodal developments associated with the creative and open knowledge (and science) economies (Peters et al. 2018a). The critical question in the wake of the collapse of the global finance system and the impending eco-crisis concerns whether capitalism can promote forms of social, ecological and economic sustainability.

“Cognitive capitalism” (CC) is a theoretical term that has become significant in the critical literature analyzing a new form of capitalism, sometimes called the “third phase of capitalism” after the earlier phases of mercantile and industrial capitalism (Boutang 2012). CC purportedly comprises a new set of productive forces and an ideology that focuses on an accumulation process centred on immaterial assets, utilizing immaterial or digital labor processes and the co-creation and co-production of symbolic goods and experiences in order to capture the gains from knowledge and innovation, which are considered central to the knowledge economy. It is a term that focuses on the fundamental economic and media shift ushered in with the Internet as platform and post-Web 2.0 technologies, which have impacted the mode of production and the emergence of digital labor. The theory of cognitive capitalism has its origins in French and Italian thinkers, particularly Gilles Deleuze and Felix Guattari’s Capitalism and Schizophrenia (2009), Michel Foucault’s biopolitics (1997), Hardt and Negri’s trilogy Empire (2001), Multitude (2005) and Commonwealth (2009), as well as the ‘Autonomist’ Marxist movement that has its origins in Italian Operaismo (‘workerism’) of the 1960s.

More recently, CC emanates from a group of scholars centred around the journal Multitudes (after Hardt and Negri), established by Boutang in 2000. The multitude is a political concept at the limits of sovereign power, dating from Machiavelli and Spinoza, naming a population that has not entered into a social contract and retains its capacity for political self-determination and, after Hardt and Negri, resistance against global systems of power. The journal offers the following description: “The concept of “multitudes” refers to the immanence of subjectivities (rather than “identities”) acting in opposition to established power structures and mapping the way for new futures” (Multitudes 2018).

As an epistemology related to systems and systems philosophy, ‘cybernetics’ functioned as an approach for investigating a wide range of phenomena in information and communication theory, computer science and computer-based design environments, artificial intelligence, management, education, child-based psychology, human systems and consciousness studies. It was also used to characterize cognitive engineering and knowledge-based systems, ‘sociocybernetics’, human development, emergence and self-regulation, ecosystems, sustainable development, database and expert systems, as well as hypermedia and hypertext, collaborative decision-support systems, and World Wide Web studies. It has also been used in talk of neural nets, software engineering, vision systems, global community, and individual freedom and responsibility.

In a paper entitled “Algorithmic Capitalism and Educational Futures: Informationalism and the Googlization of Knowledge”, Peters (2012b) commented upon the rise of a new kind of capitalism that Agger had been one of the first to name and whose social consequences he had begun to scrutinize:

Algorithmic capitalism and its dominance of the market increasingly across all asset classes has truly arrived. It is an aspect of informationalism (informational capitalism) or "cybernetic capitalism," a term that recognizes more precisely the cybernetic systems similarities among various sectors of the post-industrial capitalist economy in its third phase of development - from mercantilism, industrialism to cybernetics - linking the growth of the multinational info-utilities (e.g., Google, Microsoft, Amazon) and their spectacular growth in the last twenty years, with developments in biocapitalism and the informatization of biology, and fundamental changes taking place with algorithmic trading and the development of so-called financialization. (Peters 2012b)

Speed and velocity are the main aspects of a new finance capitalism that operates at the speed of light based on sophisticated “buy” and “sell” algorithms. Already researchers have demonstrated that data transfer using a single laser can send 26 terabits per second down an optical fiber and there are comparable reports that lasers will make financial “high-frequency” trading even faster.
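What a rule-driven ‘buy’/‘sell’ algorithm amounts to can be suggested with a deliberately simple sketch. This is our illustration, and nothing like an actual high-frequency system, which operates on microsecond timescales with far more sophisticated models: a moving-average crossover that emits trading signals from a price series.

```python
# A toy illustration of an algorithmic 'buy'/'sell' rule: a moving-average
# crossover signal over a short price history.

def moving_average(prices, window):
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signals(prices, short=3, long=5):
    # Align the short and long moving averages on the same time index.
    short_ma = moving_average(prices, short)[long - short:]
    long_ma = moving_average(prices, long)
    signals = []
    for t in range(1, len(long_ma)):
        if short_ma[t - 1] <= long_ma[t - 1] and short_ma[t] > long_ma[t]:
            signals.append((t, "buy"))    # short average crosses above the long
        elif short_ma[t - 1] >= long_ma[t - 1] and short_ma[t] < long_ma[t]:
            signals.append((t, "sell"))   # short average crosses below the long
    return signals

prices = [10, 10.2, 10.1, 10.4, 10.8, 11.0, 10.9, 10.5, 10.2, 10.0, 10.3, 10.7]
print(crossover_signals(prices))
```

The point of the sketch is only that such rules are mechanical and therefore limited by nothing except the speed at which data arrives and orders can be executed, which is why latency and optical-fibre bandwidth matter to finance capital.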

Western modernity (and developing global systems) exhibit long-term tendencies of increasing abstraction, described in terms of the formalization, mathematicization, aestheticization and biologization of life. These tendencies are characteristic of otherwise seemingly disparate pursuits in the arts and humanities as much as in science and technology, and are driven in large measure by the development of logic and mathematics, especially in digital systems. Much of this rapid transformation of the properties of systems can be captured in the notion of “bio-informational capitalism”, which builds on the literatures on “biocapitalism” and “informationalism” (or “informational capitalism”) to articulate an emergent form of capitalism that is self-renewing in the sense that it can change and renew the material basis for life and capital as well as program itself. Bio-informational capitalism applies and develops aspects of the new biology to informatics to create new organic forms of computing and self-reproducing memory that in turn have become the basis of bioinformatics (Peters and Jandrić 2019).

The third phase of “cybernetic capitalism” has itself undergone further development from a first to a fifth generation. We described above the first four generations to the point of complexity theory. The fifth is what Peters calls “bioinformationalism”, representative of bio-informational capitalism (Peters 2012a, b; see also Peters and Jandrić 2019), which articulates an emergent form of capitalism that is self-renewing in the sense that it can change and renew the material basis for life and capital as well as program itself. This represents a massive change to the notion of digital reason as also a biological notion—the biologizing of digital reason. Bio-informational capitalism applies and develops aspects of the “new biology” to informatics to create new organic forms of computing and self-reproducing memory that in turn have become the basis of bioinformatics.

Our speculation is that the ‘biologization of digital reason’ is a distinct phenomenon, still in an early emergent form, that springs from the application of digital reason to biology and the biologization of digital processes. In this space, we might also talk of digital evolution, evolutionary computation, and genetic algorithms; a minimal sketch of the last follows below.
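The genetic algorithms mentioned above can be indicated in a few lines: candidate solutions are selected by fitness, then recombined and mutated over successive generations. The sketch below is our illustration on the standard 'OneMax' toy problem (maximize the number of 1s in a bit string); it is not a method proposed in the paper.

```python
import random

# A minimal genetic algorithm on the 'OneMax' toy problem.
random.seed(0)
LENGTH, POP, GENERATIONS = 20, 30, 40

def fitness(bits):
    return sum(bits)                     # count of 1s: higher is fitter

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

def crossover(a, b):
    point = random.randrange(1, LENGTH)  # single-point recombination
    return a[:point] + b[point:]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]      # selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]   # variation: crossover + mutation
    population = parents + children

print(fitness(max(population, key=fitness)))          # converges towards 20
```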

Notes Towards the Postdigital as Process Philosophy

A critical philosophy starts from the rejection of a purely mechanistic universe—the universe of classical mechanics that is echoed by traditional deterministic computing. The move to something new – to a philosophy that emphasizes nondeterministic states – might also find solace in the concept of reality and philosophy as process, or as a web of interrelated processes, as developed by Whitehead in his Gifford Lectures (1927–28), published as Process and Reality (1929). Whitehead’s event-based process ontology offers an ‘ecological’ and relational approach to a wide range of studies as well as serving as common ground for Eastern and Western religious and cultural traditions. Indeed, theology was one of the first disciplines to develop Whitehead’s process thought, with the Claremont School of Theology and the establishment of the Center for Process Studies, founded in 1973 by John B. Cobb and David Ray Griffin.

Whitehead’s metaphysics was interpreted as a fundamental challenge to scientific materialism and as offering a view of reality not fully determined by classical mechanics or causal determinism: creativity is the principle of existence, and there is a degree of originality in the way in which entities respond to other entities. At least at first glance Whitehead’s philosophy seems consonant with the postdigital and with quantum physics as we have described them. For Whitehead (1920/2009: 166) “nature is a structure of events and each event has its position in this structure and its own peculiar character or quality”. In Chapter 2 of The Concept of Nature (1920/2009: 173) he clearly rejects what he calls “the bifurcation of nature” (minds and matter) and criticizes “the concept of matter as the substance whose attributes we perceive”, arguing that “The character of the spatio-temporal structure of events can be fully expressed in terms of relations between these more abstract event-particles”. He goes on to explain that “Each event-particle lies in one and only one moment of a given time-system” and is characterised by its extrinsic character, its intrinsic character and its position (Whitehead 1920/2009: 191). Thus, all entities for Whitehead are temporal – they are occasions of experience – and nothing exists in isolation but only in its relations.

Nicholas Rescher, the great American pragmatist much influenced by Whitehead’s philosophy of the organism, writes:

What is characteristically definitive of process philosophizing as a distinctive sector of philosophical tradition is not simply the commonplace recognition of natural process as the active initiator of what exists in nature, but an insistence on seeing process as constituting an essential aspect of everything that exists — a commitment to the fundamentally processual nature of the real. For the process philosopher is, effectively by definition, one who holds that what exists in nature is not just originated and sustained by processes but is in fact ongoingly and inexorably characterized by them. On such a view, process is both pervasive in nature and fundamental for its understanding. (Rescher 2006: 3)

The resuscitation of his work has much to do with Deleuze (see Robinson 2009) and with Isabelle Stengers’ book Thinking with Whitehead: A Free and Wild Creation of Concepts (Stengers 2011). Deleuze, like Whitehead, opposes substance metaphysics, the dominant paradigm since Aristotle, rejecting the notion that being is a simple unchangeable substance in favour of a becoming that is always occurring and undergoing a dynamic process of self-differentiation. ‘Substances’ might be thought to be a grammatical feature of Indo-European languages that prioritizes static entities.

Johanna Seibt (2018), in her entry ‘Process Philosophy’ in the Stanford Encyclopedia of Philosophy, concludes that

contemporary process philosophy holds out the promise of offering superior support for the three most pressing tasks of philosophy at the beginning of the 21st century. First, it provides the category-theoretic tools for an integrated metaphysics that can join our common sense and scientific images of the world. Second, it can serve as a theoretical platform upon which to build an intercultural philosophy and to facilitate interdisciplinary research on global knowledge representation by means of an ontological framework that is no longer parochially Western. Third, it supplies concepts that facilitate interdisciplinary collaboration on reflected technology development, and enable the cultural and ethical imagination needed to shape the expectable deep socio-cultural changes engendered by the increased use of technology, especially automation.

Process philosophy provides us with what Whitehead called “a philosophy of the organism” – it is a form of speculative metaphysics that privileges the event and processes over and above substance, with the consequence that we are released from the mechanistic, deterministic universe that is a product of classical physics. It is also a clear rejection of scientific realism, substituting a relational process ontology that points towards an indeterministic universe at the sub-atomic level and a form of quantum philosophy based on quantum mechanics and computing, characterizing an era we are just entering. It will be a transformative, dynamic, systems-based ontology very different from our understanding of the digital, which has itself only just got underway. A critical philosophy of the postdigital must be able to understand the processes of quantum computing, complexity science, and deep learning as they constitute the emerging techno-science global system and its place within a capitalist system that is itself transformed by these developments.

Footnotes

  1. This section draws on Peters, M. A. (2014). The university in the epoch of digital reason: Fast knowledge in the circuits of cybernetic capitalism. In P. Gibbs, O.-H. Ylijoki, C. Guzmán-Valenzuela, & R. Barnett (Eds.), Universities in the time of flux: An exploration of time and temporality in university life. London: Routledge.

  2. This section draws on Peters, M. A. (2018). Deep learning, education and the final stage of automation. Educational Philosophy and Theory, 50(6–7), 549–553. https://doi.org/10.1080/00131857.2017.1348928

  3. This section draws on Peters, M. A. (2017). Algorithmic capitalism in the epoch of digital reason. Fast Capitalism, 14(1).

References

  1. Altepeter, J. B. (2010). A tale of two qubits: how quantum computers work. Ars Technica, 18 January. https://arstechnica.com/science/2010/01/a-tale-of-two-qubits-how-quantum-computers-work/. Accessed 12 May 2018.
  2. Andersen, C. U., Cox, G., & Papadopoulos, G. (2014a). Postdigital research—editorial. A Peer-Reviewed Journal About, 3(1).
  3. Andersen, C. U., Cox, G., & Papadopoulos, G. (2014b). Editorial. A Peer-reviewed Journal About Postdigital Research. http://www.aprja.net/post-digital-research/.
  4. APRJA. (2018). About. http://www.aprja.net/about/. Accessed 12 May 2018.
  5. Ashby, W. R. (1956). An introduction to cybernetics. London: Chapman and Hall.
  6. Berry, D. M., & Dieter, M. (Eds.). (2015). Postdigital aesthetics: Art, computation and design. New York: Palgrave Macmillan.
  7. Boulding, K. (1953). Toward a general theory of growth. Canadian Journal of Economics and Political Science, 19, 326–340.
  8. Boutang, Y. M. (2012). Cognitive capitalism. Cambridge: Polity.
  9. Cramer, F. (2015). What is ‘post-digital’? In D. M. Berry & M. Dieter (Eds.), Postdigital aesthetics: Art, computation and design (pp. 12–26). New York: Palgrave Macmillan. https://doi.org/10.1057/9781137437204.
  10. Csanyi, V. (1989). The replicative model of self-organization. In G. J. Dalenoort (Ed.), The paradigm of self-organization. New York: Gordon & Breach.
  11. Daly, H. (1973). Towards a steady-state economy. San Francisco: Freeman.
  12. Deleuze, G., & Guattari, F. (2009). Anti-Oedipus: Capitalism and schizophrenia. London: Penguin.
  13. Foster, J. B. (2007). The financialization of capitalism. Monthly Review, 1 April. https://monthlyreview.org/2007/04/01/the-financialization-of-capitalism/. Accessed 12 May 2018.
  14. Foucault, M. (1997). Technologies of self. In P. Rabinow (Ed.), Ethics (pp. 223–325). London: Penguin Books.
  15. François, C. (1999). Systemics and cybernetics in a historical perspective. Systems Research and Behavioral Science, 16, 203–219.
  16. Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. Cambridge: MIT Press. http://www.deeplearningbook.org. Accessed 12 May 2018.
  17. Gregorio, J. (2002). Stigmergy and the world-wide web. http://bitworking.org/news/Stigmergy. Accessed 12 May 2018.
  18. Hardt, M., & Negri, A. (2001). Empire. Cambridge and London: Harvard University Press.
  19. Hardt, M., & Negri, A. (2005). Multitude: War and democracy in the age of empire. London: Penguin.
  20. Hardt, M., & Negri, A. (2009). Commonwealth. Cambridge: Harvard University Press.
  21. Jandrić, P., Knox, J., Besley, T., Ryberg, T., Suoranta, J., & Hayes, S. (2018). Postdigital science and education. Educational Philosophy and Theory, 50(10), 893–899. https://doi.org/10.1080/00131857.2018.1454000.
  22. Langton, C. (Ed.). (1989). Artificial life. Reading: Santa Fe Institute for Studies in the Sciences of Complexity and Addison-Wesley.
  23. Mandelbrot, B. (1975). The fractal geometry of nature. New York: Freeman.
  24. Mandelbrot, B. (1977). Fractal forms, change and dimensions. San Francisco: Freeman.
  25. Mandelbrot, B., & Hudson, R. L. (2004). The (mis)behavior of markets: A fractal view of risk, ruin, and reward. New York: Basic Books.
  26. Maturana, H., & Varela, F. (1980). Autopoiesis and cognition. Boston: Reidel.
  27. McNeil, D. H. (1993). Architectural criteria for a general theory of systems. Proceedings of the 37th ISSS Conference. Hawkesbury, Australia: University of Western Sydney.
  28. Miller, J. G. (1978). Living systems. New York: McGraw-Hill.
  29. Morris, K., Schenloff, C., & Srinivasan, V. (2017). Guest editorial: A remarkable resurgence of artificial intelligence and its impact on automation and autonomy. IEEE Transactions on Automation Science and Engineering, 14, 407–409.
  30. Multitudes. (2018). About. http://www.multitudes.net/. Accessed 12 May 2018.
  31. NDB. (2018). Welcome to the Cognitive Era. http://www.ndb.bg/index.php/watson/. Accessed 12 May 2018.
  32. Odum, H. (1971). Environment, power and society. New York: Wiley.
  33. Pask, G. (1975). The cybernetics of human learning and performance. London: Hutchinson.
  34. Peters, M. A. (2012a). Bio-informational capitalism. Thesis Eleven, 110(1), 98–111.
  35. Peters, M. A. (2012b). Algorithmic capitalism and educational futures: Informationalism and the Googlization of Knowledge. Truthout, 4 May. https://truthout.org/articles/algorithmic-capitalism-and-educational-futures-informationalism-and-the-googlization-of-knowledge/. Accessed 12 May 2018.
  36. Peters, M. A. (2014). The university in the epoch of digital reason: Fast knowledge in the circuits of cybernetic capitalism. In P. Gibbs, O.-H. Ylijoki, C. Guzmán-Valenzuela, & R. Barnett (Eds.), Universities in the time of flux: An exploration of time and temporality in university life. London: Routledge.
  37. Peters, M. A. (2017). Algorithmic capitalism in the epoch of digital reason. Fast Capitalism, 14(1).
  38. Peters, M. A. (2018). Deep learning, education and the final stage of automation. Educational Philosophy and Theory, 50(6–7), 549–553. https://doi.org/10.1080/00131857.2017.1348928.
  39. Peters, M. A., & Jandrić, P. (2018). The digital university: A dialogue and manifesto. New York: Peter Lang.
  40. Peters, M. A., & Jandrić, P. (2019). Posthumanism, open ontologies and bio-digital becoming. In K. Otrel-Cass (Ed.), Utopia of the digital cornucopia. Singapore: Springer.
  41. Peters, M. A., Besley, T., & Jandrić, P. (forthcoming, 2018a). Postdigital knowledge cultures and their politics. ECNU Review of Education.
  42. Peters, M. A., Jandrić, P., & Hayes, S. (2018b). The curious promise of educationalising technological unemployment: What can places of learning really do about the future of work? Educational Philosophy and Theory, OnlineFirst.
  43. Pimentel, D. (1977). America's agricultural future. The Economist, 8 September.
  44. Prat, H. (1964). Le champ unitaire en biologie. Paris: Presses Universitaires de France.
  45. Prigogine, I. (1955). Thermodynamics of irreversible processes. Springfield: Thomas Press.
  46. Prigogine, I. (1977). Ilya Prigogine – Biographical. https://www.nobelprize.org/nobel_prizes/chemistry/laureates/1977/prigogine-bio.html. Accessed 12 May 2018.
  47. Rescher, N. (2006). Process philosophical deliberations. Heusenstamm: Ontos Verlag.
  48. Robinson, K. (2009). Deleuze, Whitehead, Bergson: Rhizomatic connections. London: Palgrave Macmillan.
  49. Robles, G., Merelo, J. J., & Gonzalez-Barahona, J. M. (2005). Self-organized development in libre software: A model based on the stigmergy concept. In D. Pfahl, D. M. Raffo, I. Rus, & P. Wernick (Eds.), Proceedings of the 6th International Workshop on Software Process Simulation and Modeling. Stuttgart: Fraunhofer IRB Verlag.
  50. Sabeili, H. (1991). Process theory: A biological model of open systems. Proceedings of the 35th ISSS Meeting (pp. 219–225). Ostersund, Sweden.
  51. Seibt, J. (2018). Process philosophy. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy. https://plato.stanford.edu/archives/sum2012/entries/process-philosophy/. Accessed 12 May 2018.
  52. Shannon, C., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.
  53. Simon, H. A. (1962). The architecture of complexity. Proceedings of the American Philosophical Society, 106(6), 467–482.
  54. Stengers, I. (2011). Thinking with Whitehead: A free and wild creation of concepts. Cambridge: Harvard University Press.
  55. Susi, T., & Ziemke, T. (2001). Social cognition, artefacts, and stigmergy: A comparative analysis of theoretical frameworks for the understanding of artefact-mediated collaborative activity. Cognitive Systems Research, 2(4), 273–290.
  56. Taleb, N. N. (2018). Nassim Nicholas Taleb's Home Page. http://www.fooledbyrandomness.com/. Accessed 12 May 2018.
  57. Taleb, N. N. & Mandelbrot, B. (2008). Nassim Taleb & Benoit Mandelbrot on 2008 Financial Crisis [Video recording]. http://nassimtaleb.org/tag/benoit-mandelbrot/. Accessed 12 May 2018.
  58. U.S. House of Representatives Committee on Energy and Commerce. (2018). Hearing entitled “Disrupter Series: Quantum Computing”. https://docs.house.gov/meetings/IF/IF17/20180518/108313/HHRG-115-IF17-20180518-SD002.pdf. Accessed 12 May 2018.
  59. Von Bertalanffy, L. (1950). An outline of general system theory. British Journal for the Philosophy of Science, 1(2), 134–165.
  60. Whitehead, A. N. (1920/2009). The concept of nature. Ithaca: Cornell University Library.
  61. Zadeh, L. (1965). Fuzzy sets. Information and Control, 8, 338–353.
  62. Zeeuw, G. de (2000). Some problems in the observation of performance. In F. Parra-Luna (Ed.), The performance of social systems: Perspectives and problems (pp. 61–70). New York: Springer Science+Business Media.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Faculty of Education, Beijing Normal University, Beijing, China
