1 Introduction

For medieval thinkers, the famous puzzle of how many angels can dance on the head of a pin was a question about the relationship of the infinite god to the finite world. If god the creator was all-powerful, he should, logically, be able to bid an infinite number of angels to dance on the head of a pin. At the same time, medieval thinkers believed that no infinite collection could, in fact, arise in the finite world, or appear to finite human beings.Footnote 1 Apart from shedding light on the rift between the realm of ideality and that of materiality and corporeality, characteristic of the two-world theory derived from Western metaphysics (Nishitani 1991), this paradox also exposes the human inability to process complexity as multiplicity, collapsed orders of magnitude, virtuality and/or vertiginous speed. Unlike the infinite god who, we could say, ‘operates’ with complexity, multiplicity, and polychromaticity as well as with unity, finite human beings need common denominators, unifying factors, and, often, also, reduction. In the two-world theory—where finite beings are created from a primordial overflowing, or emanation, from the One, the source of all being—the world of essence, infinity, and eternity is separate from the world of appearance, finitude, and temporality. Moreover, the degree of separation is marked by the logic of wholes and parts. A whole (the infinite One) is a superior term because it denotes plenitude, the presence of all parts; a part (the finite human) is inferior because it denotes incompleteness.

There are many reasons why finite, corporeal, actually existing humans—seen as infinitesimal parts of a vast plenum or not—cannot see an infinite number of non-corporeal, virtual beings dancing in the impossibly small space of a pinhead.Footnote 2 I’ll focus on the most pertinent one for the purposes of this discussion. ‘Ordinary’ human consciousness, also called thetic or phenomenal consciousness, cannot perceive—or conceive of—infinity due to the limited nature of human perception and the ingrained ideas about the possible and impossible. Shaped by what Drew Leder has called the ‘phenomenological vector’, human perception is confined by the ‘culturally formatted structure of experience’, which makes possible ‘certain practical or interpretative directions’ while discouraging others (Leder 1990, 150). Despite the fact that there are, of course, certain ‘existential/biological invariants that shape human experience in general’—humans do not have eyes in the back of their heads, which is why they do not have 360° vision—the phenomenological vector is created from an ambiguous, potentially infinite ‘set of possibilities and tendencies’ that take on definite shape ‘within a cultural context’ (151). In other words, the embedded-ness of the organism in the environment and its day-to-day praxis constitute the organism’s perceptual world, which, in the case of humans, cannot be separated from their culture’s/cultures’ dominant ideologies. As we know from Andy Clark’s work, the mind is embodied, embedded, extended, and enactive (Clark 2010). Conceptual demarcations and daily practices are not separate. For example, the Aivilik, who do not conceptually separate space and time, do not view space as a static enclosure but rather as a dynamic situation, a direction in operation. When handed a copy of an illustrated magazine, they will not turn it ‘right side up’ and are ‘highly amused when the Westerners do so’ since, for the Aivilik, images can be viewed regardless of whether they are horizontal, vertical or diagonal—all directions are ‘right side up’ (Montagu 1986, 300). The inability to view/grasp images from multiple positions is like saying: I can only watch television when standing up. If I sit down, the visual information becomes incomprehensible to me. Clearly, ideas about the possible and impossible are shaped by usage and praxis, much like practices are, in turn, shaped by the inherited, taken-for-granted ideas about what exists and does not exist. This is also the reason why the ability to penetrate the ‘perceptual unworld’, so to speak—as spatial, temporal, energetic and material complexity unformatted by culture, dominant ideas, education and/or cumulative experience—has often been associated with mysticism.

For Karen Barad, as for many quantum theorists,Footnote 3 time, like space and matter, is diffractive and indeterminate. Temporal and spatial indeterminacy manifests in many ways: a quantum entity can behave like a wave or a particle depending on how it’s measured; a particle can be in two places at once, in a state of superposition. Time is discontinuous; multiple temporalities exist at the same time, as a superposition of all possible histories. The past is present in the present, not only as a result of past actions—we break a vase and the next time we come into the room the vase’s smithereens are on the floor—but as perpetual change. Much like space and matter are not definitively ‘there’, in one place, at one time, the vacuum is neither ‘(determinately) empty, nor is it (determinately) not empty’ (Barad 2017, 54). Its virtual particles are ‘non/existences that teeter on the edge of the infinitely fine blade between being and nonbeing’ (54). To explain that the virtual particles are not in the void but are ‘of the void’—because the void is not an enclosure but, rather, ‘a lively tension […] bursting with innumerable imaginings of what-could-be/might-yet-have-been’—a liveliness she calls the ‘dance of the vacuum’ (56)Footnote 4—Barad resorts to the medieval Torah commentator Rashi’s explanation of the Hebrew version of the first verse of Genesis. That verse opens with the word B’reishit, and it does not say: in the beginning god created xyz.

It says something very different because, grammatically, B’reishit is the first of two nouns in a row, as in the phrase B’nei Yisrael—the children of Israel (Rashi quoted in Barad 2017, 42). If the phrase read B’nei—‘the children of’—without Yisrael, it would be incomplete. Likewise, B’reishit means ‘in’ or ‘at’ ‘the beginning of’, but there is no second noun to indicate what B’reishit is the beginning of. In other words, the beginning itself is here an ‘of’ relation, as is time. This is very different from the concept of time evident in the phrase ‘in the beginning of time there was x’, where we know that time is a dimension or a medium which has (or had) a beginning. For Rashi, as for Barad (42), the incompleteness of B’reishit is a move that opens time to infinite interpretation, not only in the sense of quiddity—‘what is time?’—but also quality—‘how is time time?’—as well as quantity—‘how many times are there?’ and ‘is this number finite or infinite?’ B’reishit is a relation of irreducible complexity, which invites multiperspectivalism.

But even if we replace the dancing angels with the dancing virtual particles, can we say that we have a clear idea of virtual infinity and/or complexity? That we know it? Most likely, we’ll agree that we know about it rather than know it. Yet—to return to the Aivilik example—if we have the experience of navigating environments like water (in swimming or diving) or snow (in bobsleighing) and are visually familiar with the Aivilik territory (Nunavut, part of the Canadian Arctic archipelago), it is possible to understand, by way of aesthetic analogy, what it means for space–time to be a direction in operation. Aesthetics, in this context, is not a beauty or harmony ideal or a pleasing arrangement of objects. It’s the sum total of that which can be experienced, a modality of comprehension that enables us to understand one thing in terms of another, or of two or more others. As such, an aesthetic analogy is neither abstract, like applying a formula to different contexts, nor is it so particular as not to be relatable to other contexts.

In what follows, indeterminate artistic procedures (which are themselves based on deterministic parameters) are used as an aesthetic analogy to enable a qualitative understanding of ‘alien’ thought in a non-formulaic way. First coined by AI pioneer Joseph Weizenbaum, and subsequently theorised by Ian Bogost, Yuk Hui, and Luciana Parisi, ‘alien’ thought stems from the simple fact that, for Weizenbaum, the domain of thinking and intelligence is, with the exception of formal problems, ‘determined by man’s [sic] humanity’ (Weizenbaum 1976, 223). Weizenbaum concludes that ‘every other intelligence, however great, must necessarily be alien to the human domain’ (223; emphasis original). In the last decade, the expression ‘alien’ thought has been used to discuss the spatio-temporal and interactional trajectories of (what are usually referred to as) inanimate objects and the resulting need for an ‘alien’ phenomenology (Bogost 2012); developments based on recursive behaviours of technical objects (Hui 2019); and circuits of reproduction based on learning architectures and the contextual use of content that defy the servo-mechanical model of machinic operations (Parisi 2019).

My contention is that indeterminate artistic procedures are particularly well suited to understanding the otherness of ‘alien’ thought for two reasons. First, self-supervised (rather than humanly supervised) machine learning is stochastic, as ‘only the input x is given from which an unknown pattern y must be discovered’ (Pasquinelli 2019, 5). To a much smaller degree, this is also true of other algorithmic procedures that rely on vast amounts of data, since their results can, in fact, never be predicted with complete accuracy. Second, indeterminate artistic procedures operate as diagrammatic machines. For Gilles Deleuze and Félix Guattari, a diagrammatic machine is neither abstract nor particular. It is neither a transcendental idea that is determining in the supreme instance nor an infrastructure that is determining in the last instance; nor does a diagrammatic machine have a representational function (Deleuze and Guattari 1987, 142). Rather, it plays a ‘piloting role’, constructing a nascent ‘real’—a real yet to come (142).
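
Pasquinelli’s first point can be made concrete with a deliberately minimal sketch. The unsupervised clustering below is a simple stand-in for the far more complex self-supervised architectures at issue, not a description of any system discussed here: only the input x is given, and the pattern y is discovered from x alone, dependent on a stochastic initialisation.

```python
import numpy as np

# Minimal unsupervised sketch: only the input x is given; the 'pattern' y
# (cluster assignments) must be discovered from x alone, and depends on a
# stochastic initialisation. All numbers are invented for illustration.
rng = np.random.default_rng(0)
x = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])

k = 2
centres = x[rng.choice(len(x), size=k, replace=False)]   # stochastic start
for _ in range(20):
    # assign each point to its nearest centre ...
    y = np.argmin(np.linalg.norm(x[:, None] - centres[None], axis=2), axis=1)
    # ... then move each centre to the mean of its assigned points
    centres = np.array([x[y == j].mean(axis=0) for j in range(k)])

print(y)   # the discovered pattern: another seed can yield another labelling
```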

In the third decade of the twenty-first century, there is an urgent need to understand the paradox of increasing over-determination and indeterminacy unleashed by machinic and algorithmic operations (where over-determination stands for the application of shortcutting procedures to decision processes, resulting in ‘automatic’ account termination or ‘automatic’ health insurance claim rejection). Phenomenologically speaking, we are thrown into a perceptual world that is simultaneously instrumentarian and chaotic, to use Shoshana Zuboff’s and Deleuze and Guattari’s respective terms. In instrumentarianism, ‘users’ have become ‘a means to profits in new behavioral futures markets’ where they are neither the producer nor the product but ‘the human natural source’ (Zuboff 2019, 13; emphasis original). For Deleuze and Guattari, chaos is not creative turbulence; it is the ‘infinite speed’ with which forms and objects ‘appear fleetingly and disappear immediately, without consistency or reference, without consequence’ (Deleuze and Guattari 1994, 118). This is due to many factors: regimes of machinic optimisation that desecrate existential terrainsFootnote 5 and destabilise deeply ingrained ideas about durability and ephemerality,Footnote 6 ultimately leading to a necropolitical brand of instrumentality.Footnote 7 Yet over-determination is also accompanied by unprecedented complexity, embroiled scales of magnitude, and, for humans, impossible-to-grasp temporalities, such as the vertiginous speed produced in high frequency trading. We could say, in general terms, that the notion of ‘alien’ thought is formulated in relief against the human as a privileged agent of thought, a view that belongs to a specific (Western or global North) heritage; even a cursory glance at recent titles such as Eduardo Kohn’s How Forests Think—not to mention an entire history of Asian and Indigenous thought—shows how limited this view is. It is nonetheless important not to force the inherent instrumentality of many machinic/algorithmic operations that regulate health, education, justice, finance and economics, on the one hand, and ‘alien thought’, on the other, into a single theoretical armature.

2 The paradox of over-determination and indeterminacy

There is, of course, an undeniable relationship of computation to deterministic, rule-based, provable, and non-particular reasoning. It can be traced to Gottfried Wilhelm Leibniz’s mathesis universalis, a universal mode of reasoning, unencumbered by particularities, whether of a cultural or individual perceptual, sensorial or educational kind, as well as a mode of reasoning that is conclusively provable and applicable to any subject matter. Despite the fact that the idea of universal reasoning was not new when it first appeared—it was also propagated by René Descartes, in the form of ‘a general science which explains all the points that can be raised concerning order and measure irrespective of the subject matter’ (Descartes 1985, 19)—Leibniz’s contribution was far more significant than an orderly system of measurement. It consisted of what he referred to as characteristica universalis—an ‘alphabet’ of abstract symbols, representing the entirety of human knowledge in non-variable terms (Leibniz 1985). This process, which sought to create validity and continuity of thought beyond the human frame, human temporality and particularity, consisted of a series of sequence- and logic-locked steps.Footnote 8 Although the mathesis universalis was never completed, it undeniably influenced twentieth-century mathematicians and computer scientists, among them Alan Turing. For Turing, computing was equivalent to a rule-governed, sequential succession of finite steps based on deductive inference, which does not allow for internal change or variation.Footnote 9 Likewise, algorithms were, for Turing, a form of automated thought in the sense of a formalised, sequence-locked procedure, a closed system. For Leibniz, as for Turing, calculation and computation, which are abstract rather than embodied, environmentally embedded, or particular, were a form of universal reasoning precisely because they were axiomatic—based on invariable deductive procedures where the validity of the input automatically translates into the validity of the output. However, neither Leibniz nor Turing lived in an era in which such non-variable, ‘closed’ logics affected human lives in most, if not all, spheres, from education and finance to dating and health.

The chief problem with such abstract, axiomatic reasoning is, of course, value: existential, social, natural, or cultural. One does not have to be a moral philosopher to see that axiomatic procedures subsume not only particular value but also particularity as such under generic principles. They cannot not be instrumental. Having co-developed with industrial rationalisation, instrumental rationality is more problematic today than ever because of its scale and ubiquity. Determining ‘expectations’ of how ‘objects in the environment’ should behave (Weber 1978, 24), instrumental rationality equates these expectations with ‘givens’. As members of the Frankfurt School have argued, a process of reasoning or calculation devoid of any relation to embodiment, or to a thing or being’s embedded-ness in the environment, has totalitarian tendencies as well as disastrous effects (Horkheimer and Adorno 1972; Horkheimer 2012).Footnote 10 Despite the intervening half century, the crucial insight of the Frankfurt School—that the instrumentalising power of automation is simultaneously the automation of power—is by no means trivial, as can be seen from the widespread use of predatory machinic procedures that automate difference control and anomaly detection, perpetuating racism, sexism, and classism in crime prediction and medical diagnostics, among many other examples (Panagia 2017; Eubanks 2018). The ‘universal method of reasoning’, based on sequence- and logic-locked procedure, here amounts to pre-emption from procedure, or ‘future from structure’, reducing ethical questions to technical management and continuing the mantra of industrial rationality—progress, productivity, efficiency—in a far worse, because automated, way.Footnote 11

When speaking of human intelligence, the shortest definition of stupidity is perhaps: the use of the same, simplistic, ‘shortcutting’ formulas in vastly different situations, without any regard for the situated-ness of the situation. The shortest definition of intelligence, by contrast, might be: a subtle dance with novel complexity. Dancing here does not refer to litheness alone. It refers to sensitivity, responsiveness, and, by implication, also, to response-ability, in addition to the ability (affordance or willingness) to pursue a longer, more complicated path without immediate results (or perhaps no results at all) to allow the suchness of the dancing partner/s—their quiddity and specificity—to come to the fore. By contrast, ‘future from structure’ manipulates possibility into probability, and probability into necessity, reducing relationships of relevance to those of causation. As Franco Berardi has extensively argued, automation is ‘the submission of the cognitive activity to logical and technological chains’, ‘a form of engendered determinism’, and, as such, the ‘fundamental act of power’ (Berardi 2020, 42). Equations like ‘if you don’t pay your car insurance you’ll be automatically locked out of your car’ are inscribed in the machine as logical necessities because they ‘convert real events into activators of mathematical functions’ (43). They are, of course, not logical necessities but instrumentalist, shortcutting operations that benefit particular parties: corporations and governments. However, artificial stupidity, like its human counterpart, does not mean that AI should be viewed solely as an extension of instrumental rationality.Footnote 12 Much as classical physics continues to exist alongside quantum theory, in scientific work and in curricula, despite the fact that quantum theory negates most, if not all, postulates of classical physics, indeterminacy, in the broad field of AI, co-exists alongside over-determination. Dance is not altogether absent, as automated procedures did not develop on their own.
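
Berardi’s car-insurance equation can be written out as code. The sketch is hypothetical (no actual telematics API is implied); its point is that a contingent, negotiable relation, once inscribed as a conditional, executes as if it were a logical necessity.

```python
from datetime import date

# Hypothetical 'future from structure' rule (no actual telematics system is
# implied): a contingent, negotiable relation, written as a conditional,
# executes as if it were a logical necessity.
def car_may_start(insurance_paid_until: date, today: date) -> bool:
    # the machine weighs no relevance, context, or situation; it converts
    # a real event into the activator of a boolean function
    return today <= insurance_paid_until

print(car_may_start(date(2021, 6, 30), date(2021, 7, 1)))   # False: 'automatic' lockout
```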

Plasticity played a key role in the mid-twentieth-century co-development of computers and neurosciences. Discussing the indeterminate element present in Turing’s thinking machine, which, it should not be forgotten, developed amidst theories of undecidability and the halting problem,Footnote 13 David Bates and Nima Bassiri refer to Donald Hebb’s famous phrase ‘neurons that fire together wire together’ to establish a connection between plasticity and deviance from set routes and routines (Hebb quoted in Bates and Bassiri 2015, 195). Pointing to the fact that contingency exists in human and computer synapses alike, they go on to suggest that at the time when the first computer was being conceptualised, the digital was not yet fully aligned with automaticity (195). The co-development of AI and experimental neuroscience meant that the plastic brain offered an insight into unpredictable leaps in human behaviour and thinking, related to hidden capacities that go beyond habitual behaviour. In machines, this referred to unpredictable leaps in functional mechanisms, which were often treated as errors, but which were not errors, merely different developments. Neuropsychological discourses focused on the disorders of the injured brain and its ability to recover functioning after injury, indicating that the brain ‘was at once a site of openness and a space of artificial mechanisms’ (200). Quoting William James, Bates and Bassiri conclude that ‘[p]lasticity means the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at once’ (James quoted in Bates and Bassiri 2015, 202). This further means sensitivity and responsiveness to the environment as well as to change. Errance—wandering or movement away from the established or programmed path or course—is inherent to thought. To emphasise the point about machinic plasticity, the authors reiterate Gilbert Simondon’s famous remark: ‘the true perfection of machines does not correspond to an increase in automation, but on the contrary to the fact that the functioning of a machine harbours a certain margin of indetermination’ (Simondon quoted in Bates and Bassiri 2015, 214).
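
A minimal sketch of the Hebbian rule Bates and Bassiri invoke may be useful here, with a drift term added as an illustrative stand-in for Simondon’s margin of indetermination; the learning rate and noise scale are invented.

```python
import random

# A minimal Hebbian update: 'neurons that fire together wire together'.
# The drift term is an illustrative stand-in for Simondon's margin of
# indetermination; the learning rate and noise scale are invented.
random.seed(1)
w = 0.0        # synaptic weight
eta = 0.1      # weak enough to yield to influence, strong enough not to yield at once

for _ in range(100):
    pre = random.choice([0.0, 1.0])      # presynaptic activity
    post = random.choice([0.0, 1.0])     # postsynaptic activity
    w += eta * pre * post                # co-activation strengthens the connection
    w += random.gauss(0, 0.01)           # errance: indeterminate drift from the set route

print(round(w, 3))
```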

Seven decades later, we know, as Katherine Hayles has noted in relation to Norbert Wiener’s cybernetic paradigm of circular feedback, and as Yuk Hui has extensively argued, that in machinic and algorithmic operations, feedback is recursive and spiral, rather than circular (Hayles 2005, 241; Hui 2019). This means that feedback does not reinforce self-same operations but rather creates an internal dynamic that opens onto the ‘undecidable and the unknowable’ (Hayles 2017, 202). Furthermore, contemporary machine learning uses backpropagation to train multilayer architectures, which makes feedback less relevant than aggregation, de-aggregation, and re-aggregation, all of which create internal change and, therefore, emergent behaviours and alien thought.Footnote 14 In a (still) largely anthropocentric tradition, derived from Euro-centric metaphysical and scientific traditions with a global reach,Footnote 15 there are bound to be disagreements about what constitutes other-than-human thought.
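
The difference between circular and spiral (recursive) feedback can be sketched numerically; the gains and step counts below are invented and stand for no particular architecture.

```python
# Circular feedback returns to the self-same operation; recursive feedback
# folds its own output back into its rule, so the operation itself changes.
# Gains and step counts are invented for illustration.

state, history = 1.0, []
for _ in range(6):
    state = -state                      # circular: the rule never changes
    history.append(state)
print(history)                          # [-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]: a closed loop

state, gain, history = 1.0, 1.0, []
for _ in range(6):
    gain += 0.1 * state                 # the rule is rewritten by its own output
    state = -gain * state               # so each pass differs from the last
    history.append(round(state, 3))
print(history)                          # a spiral: no state recurs exactly
```

Where exactly such internal dynamics shade into ‘thought’ is precisely where the disagreements begin.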

For example, for Hayles, cognition, and particularly non-conscious cognition, is different from thinking. Thinking, for her, includes ‘reasoning abstractly, creating and using verbal languages, constructing mathematical theorems [and] composing music’ (Hayles 2017, 14). Many of Hayles’s examples rely on a specific faculty and have a teleological dimension, such as constructing mathematical theorems or composing music. I do not think that a teleological approach—completing a work (a whole), such as (traditional) musical composition—should be considered (higher-order) thinking while rhizomatic, indeterminate, non-teleological thought should not be. First, anything that is ‘complete’ or complex is so from the human point of view. Second, the division into conscious and unconscious operations, both in humans and as a separation of humans from other-than-humans, belongs, once again, to the Western/global North heritage, which is only one among many schools of thought. Although a detailed discussion of consciousness and non-consciousness is beyond the scope of this essay, suffice it to mention that Eastern mind-body theories, such as those of Shigenori Nagatomo, do not differentiate between ‘conscious’ and ‘unconscious’ behaviour in humans or non-humans (Nagatomo 1992). Rather, there is a spectrum of hazy-to-clear-consciousness movements, directions and dispositions. Interoception—the internal operation of an organism, a notion that can be applied to machines, too—is, on this account, not not conscious. It simply occupies a different place on the spectrum, which has no final point, no ‘clearest of all forms of clear consciousness’. Different animals have different interpenetrations of different hues or intensities of hazy consciousness.Footnote 16 Other conceptions of the relationship of sentience to sapience eschew consciousness altogether. For example, for Eduardo Viveiros de Castro, nature is not the universal ground of multiple cultures. On the contrary, a commonly shared culture is the ground for pan-sentient multi-naturalism (de Castro 2016). In similar fashion, in Chinese philosophy, where processual, in situ creation entails the interpenetration of a vast quantity of existents, the emphasis is on dynamic co-articulation, represented by the principle of the one and the manyFootnote 17—or temporary unity in disparity—comparable to Whitehead’s principle of the one and the many (Wen 2010), and to his process philosophy more generally.Footnote 18

Following the pragmatist,Footnote 19 more specifically John Dewey’s, notion of thought (Dewey 1976)—in which, similarly to Nagatomo and Wen, and to more recent commentators on the relationship of pragmatist to Asian notions of thinking, such as Richard Shusterman,Footnote 20 thinking is considered actional—I use the word ‘thought’ to refer to temporal, spatial, material and immaterial emergence and dynamics. To return to one of Hayles’s examples: the difference here is between composing a (traditional) piece of music, where the composer steers the process in a particular direction, and experimental composition, which is neither teleological nor the work of a single agent, but rather a form of distributed thinking and co-evolution through relationality.

In a nutshell, my argument is that understanding ‘alien’ thought requires an understanding of (a) thinking by doing; (b) distributed other-than-human thinking (and, implicitly, agency); and (c) the ontological indeterminacy of internal dynamics, grasped qualitatively rather than formulaically. Understanding these modalities of thought will, in turn, enable us to understand the indeterminacy of incomputability and n+ dimensions; of temporal swarming; and of inscriptive-significational errance. All are present in machinic procedures as well as in the work of Marcel Duchamp, John Cage, and Xu Bing. As we know from Howard Gardner’s theory of multiple intelligences (Gardner 2011), there are many non-abstract human modalities of thought. A tennis player thinks by doing: in and through movement, a complex interplay of rhythm, action, reaction, and tactics that includes the environment, the various game elements, and other players, and that is kinaesthetic as well as interactional. Additionally, indeterminate artistic procedures delegate what, in the above example, is the synthesising agency of the player to the environment, time, rhythm, procedure, and material and immaterial objects. They think by doing in a distributed manner where distributed agency does not pre-exist action but, rather, forms and transforms within the action itself (Ingold 2018). Lastly, indeterminate artistic procedures rely on deterministic structures, which is why they can be compared to machinic/algorithmic procedures. They are not arbitrary, random, or disordered. Rather, they use both deterministic parameters and randomness to cue ontological indeterminacy, which is why they can be said to operate as diagrammatic machines.

3 Duchamp, incomputability and n+ dimensions

In a single sentence, the work of the ‘father of conceptual art’, Marcel Duchamp, can be described as the study of the ways in which diagrammatic machines produce and reproduce forms, objects, and ideas. The time Duchamp spent as a librarian at the Bibliothèque Sainte-Geneviève, Paris, in 1913, led to his life-long interest in the dimensional parameters of thinking by doing. Much of his work with indeterminacy was indebted to the scientific work of the time, such as Henri Poincaré’s investigations of non-Euclidian geometry, which refuted the invariant nature of geometrical theorems.Footnote 21 Duchamp’s 1914–16 Three Standard Stoppages is a direct response to Poincaré. Using a string metre, a reference to the platinum standard metre conserved in Paris but, in this case, made of a purposefully malleable material, Duchamp dropped three strings of one metre in length from a height of one metre onto a black canvas. The obtained shapes, full of twists and curves, were used as templates to reproduce the three metre-lengths in wood, which he subsequently encased and entitled Three Standard Stoppages.Footnote 22
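
A numerical analogue of this procedure can be offered in the spirit of the aesthetic analogy discussed above, rather than as a model of the physics of falling thread: the arc length of the ‘metre’ is held fixed (the deterministic parameter) while its heading wanders (the indeterminate element). The segment count and wander scale are invented.

```python
import numpy as np

# A numerical analogue of the dropped string: the arc length of the 'metre'
# is fixed (the deterministic parameter) while its heading wanders (the
# indeterminate element). Segment count and wander scale are invented.
rng = np.random.default_rng()

def drop_string(segments=100, wander=0.3):
    headings = np.cumsum(rng.normal(0, wander, segments))   # direction drifts step by step
    step = 1.0 / segments                                   # total arc length: exactly one metre
    xy = np.cumsum(np.column_stack([np.cos(headings), np.sin(headings)]) * step, axis=0)
    return np.linalg.norm(xy[-1])                           # straight-line span of the curved metre

for i in range(3):   # Duchamp encased three; the procedure admits indefinitely many
    print(f"stoppage {i + 1}: span = {drop_string():.3f} m")
```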

Duchamp encased only three metres; however, he repeated the experiment many times. The string metre changed every time it was dropped, demonstrating that there is no such thing as a universal standard metre, in terms of both shape and length. More significantly, Duchamp delegated agency to string, gravity, air, momentum, time, and the consistency of the surface onto which the string pieces fell: canvas. He also used computational methods to record different time-steps on a canvas, a sampling process that monitored occurrences in time while relying on (nascent) mapping and visualisation techniques. Apart from using deterministic parameters to demonstrate indeterminacy, Duchamp also showed two aspects of incomputability. Firstly, the precise position and shape of the string metre is unknowable. And, secondly, the medial situated-ness of the process is inseparable from that process. For Cornelia Vismann, all media, understood both as mediatic relations and as gadgets, objects, programmes, and protocols, engage in auto-praxis [Eigenpraxis] (Vismann 2013, 84). Moreover, no thing or process is ever independent of its conditions of coming into being (space, time, and environmental forces), which is why, for Vismann, the agent-thing iteratively steers emergent processes in new, and, for humans, often unpredictable and imperceptible directions (84). Combining, on the one hand, Turing’s question of the limit of computability and, on the other, Claude Shannon’s information theory, where information does not apply to the individual message but to the signal crafted from noise (Shannon 1949), Gregory Chaitin suggests that computation, too, consists of unknowable probabilities (Chaitin 2005). Data entropy leads to algorithmic randomness resembling an infinite series of coin tosses where the outcome of each toss is unrelated to the previous one. Chaitin’s name for this process is Omega—an infinitely long number whose digits, like Duchamp’s string metre (imagined as a kilometre, a hundred thousand or a million kilometres), have no repeatable pattern.
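
Omega has a compact standard definition, worth setting down because it makes the analogy with the string metre precise. For a prefix-free universal Turing machine U, the halting probability is:

```latex
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

Each halting programme p of length |p| contributes 2^(−|p|), as if its bits had been fixed by |p| fair coin tosses; the prefix-free condition guarantees, via the Kraft inequality, that the sum converges. Knowing the first n digits of Omega would settle the halting problem for every programme of up to n bits, which is why the number can, as Bennett puts it, be known of but not known.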

Related to the halting problem—the question of whether or not a programme will halt after a thousand, million or billion years—Omega is ‘the concentrated distillation of all conceivable halting problems’ (Calude quoted in Chown 2007, 328). As such, it is ‘a cabalistic number’ which ‘can be known of, but not known’ through human reason (Bennett quoted in Chown 2007; emphasis mine). As a sequentially ordered computational processing of zeros and ones, Omega shows that there is an internal, intrinsic dynamic at work in every computation process, negating the teleological, logic- and sequence-locked view of computation where randomness is seen as an error in the computation’s formal logic. This further means that, like Duchamp’s Three Standard Stoppages, incomputability is an ontological possibility—a possibility of an indeterminate coming-into-being—within and inseparable from computation. In Three Standard Stoppages, we see this in a schematic, step-by-step (or metre-by-metre) way; the example is useful precisely because it’s simple. Duchamp’s Bride Stripped Bare by her Bachelors, Even or The Large Glass expands incomputability into n+ dimensions. In this (famously opaque) work, which took Duchamp eight years to make, from 1915 to 1923, small cause-effect relationships are displaced within higher-order principles inherent in the system, causing (what looks like) incongruity (Fig. 1).Footnote 23

Fig. 1 Marcel Duchamp, The Bride Stripped Bare By Her Bachelors, Even (The Large Glass), 1915–1923. Philadelphia Museum of Art, Bequest of Katherine S. Dreier, 1952, 1952-98-1. © Artists Rights Society (ARS), New York / © Association Marcel Duchamp / ADAGP, Paris and DACS, London 2021. Courtesy of Philadelphia Museum of Art, ADAGP and DACS

As Duchamp repeated many times in his career, ‘three dimensions can only be the beginning of a fourth, fifth and sixth dimension, if you know how to get there’ (Duchamp in Tomkins 2013, 93). Referring to the fact that Albert Einstein called the fourth dimension the ‘fourth coordinate’—not the fourth dimension—and that time exists even ‘in a thin line’ (93), Duchamp insisted that all objects have n+ dimensions, only that humans lack the sense(s) to perceive them.Footnote 24 In his experimentations with these dimensions, Duchamp turned to the seventeenth-century mathematician Girard Desargues and his analysis of the conic geometry of Renaissance perspective. Desargues argued that the lines emanating from a centre of perspective create cones through which planes can be intersected at various angles, and ‘images’ ‘placed’ in a painting. The theorem he derived from this was: two triangles are ‘in perspective axially’ if they are ‘in perspective centrally’Footnote 25 (Kodokostas 2014; emphasis mine). The instability as well as interconnectedness of the various perspectival elements is precisely what we see in The Large Glass. The only difference is that The Large Glass is not a Renaissance perspective painting but oscillates between installation, sculpture, and performance, demonstrating the contingency of what are taken to be the immutable elements of geometry as well as of art-making: horizon lines, centres of perspective, objects, forms, axes, sign systems, and, in the case of art-making, the agency of the artist. Building on his existing practices of deterministically ‘staged’ indeterminacy, Duchamp treated all those elements as variable. The variability of every single component of the work is also the reason why The Large Glass bears no relation whatsoever to received ideas about aesthetics, form, or content, nor does it have a signature artistic ‘style’.

Rather, the work stages inter-dimensional conversations between space-time, movement, geometry, and process, explored through a sequence of micro-logical, cause-and-effect steps. The only macro-logic can be found in the title, much like Chaitin’s ‘Omega’ serves as a linguistic delineator for a sphere of not-fully-defined meaning. Bride Stripped Bare by her Bachelors, Even brings together the mechanics of diverse erotic forces, which Duchamp studied through a series of arbitrarily determined procedures, interconnected via language games and chance operations. An example of a language game is the bride, whose ‘bodily envelope’, to use Duchamp’s words, is auto-mobilistic in natureFootnote 26 (Duchamp 1994, 62). Glass was used to durably capture ephemeral performative actions. In one part of the work, the nine bachelors’ desire is materially embedded with the aid of nine malic molds, from which matches dipped in colour were fired, using a children’s toy gun, at a photograph of white gauze creased by the wind. The chance operations of the captured wind were amplified by the chance operations of the firing mechanism and transposed onto the glass, in which holes were drilled. In other words, The Large Glass is a residue of procedurally determined, micro-causally related actions whose architecture relied on a ‘three-beat rhythm’, as Duchamp suggests in reference to Desargues’s theorem (Duchamp quoted in Schwarz 1974, 157).

Throughout his career, Duchamp continued to practise distributed thinking. He considered works as diverse as The Large Glass and his 1946–66 Étant donnés a series, despite the fact that, to the human perceiver, they look nothing like a series. The seriality of these works is, for Duchamp, anchored in imperceptible dimensions, comparable to the ‘movement’ needed to fit one glove inside the other. In a pair of gloves, the right-hand glove does not fit inside the left-hand glove, because the ‘thumbs’ are on opposite sides. However, if one glove is pulled inside out, the two gloves can be superposed, one inside the other. Similarly, vastly visually different works can be seen as a series via n+ dimensionality—an actual-virtual dynamic that does not change one or more (discrete) elements but, rather, their inter-relations. Duchamp’s work delineates the ontological dimension of indeterminacy, which can never be fully instantiated or exhausted. Instead, it forms part of an infinite virtual repository, understood not as an enclosure but as a dynamic in operation, just as infinity—to which Duchamp gave much thoughtFootnote 27—is a complex dynamic that, in various traditions, manifests as B’reishit or as dancing angels. That is, Duchamp’s work does not merely illustrate incomputability as non-computability or unpredictability. It affords insights into other-dimensional developments within determined, systemic parameters, instantiating new possibilities while, at the same time, relativising the system within which it operates. In a similar vein, the work of John Cage questions the system within which it operates, probing ontological indeterminacy in the temporal register and enabling an understanding of temporal swarming, which, in contemporary technical operations, creates temporalities that are ungraspable and inexperienceable by humans but that, nonetheless, modulate human perception and affect.

4 Cage and temporal swarming

As a proponent of non-anthropocentric perspectives on art and life,Footnote 28 Cage was deeply influenced by Duchamp’s work as well as by the post-WWII research into machinic perceivers and data generation, which co-developed with the above-mentioned plasticity-inflected research into machine learning and neuroscience. Examples include the work of designer György Kepes, who mined the ‘invisible world’ with radars and X-rays, transforming extensive quantities into intensive qualities (Kepes 1956), and the work of architect Richard Buckminster Fuller, who delivered ten-hour lectures with the specific purpose of liberating the transversal working of perception and cuing new synaptic connections. Similarly, Cage focused on temporal and medial multi-dimensionality, and on procedures that treat sound as an environment rather than a discrete temporal object. To this end, and in the spirit of ‘purposeful purposelessness’ (Cage 1961), he created multi-modal milieus where diverse flows of heterogeneous information, both recorded and performed, amplified indeterminacy through, among other procedures, the use of electronic musical technologies and composition techniques.

Cage used musical duration to foreground the (qualitatively different) temporalities inherent in every existent, however small, and to show that human perception captures an infinitesimal part of the tapestry of incessant worldly becomings. Like Duchamp, with his use of scientific discourse and practice, Cage ‘imported’ another field of knowledge into artistic work. This field—the philosophical teachings of Zen,Footnote 29 which are the antithesis of teleological traditions—fundamentally transformed music and composition, as well as (notions of) temporality. Cage’s 1951 Music of Changes famously used procedures derived from the ancient Chinese divination text I ChingFootnote 30 to determine duration, dynamics, rhythm, pitch, and the ordering of events within the composition. Similarly, his layering technique, used in the 1958 Fontana Mix, consisted of transparent sheets with dots, circles and lines randomly placed over opaque sheets with dots and lines, which the performer read and played exactly as they would read a ‘traditional’ score in which notations are indicators of musical actions (Fig. 2).

Fig. 2 John Cage, Fontana Mix, 1958. © John Cage. Courtesy of John Cage Trust

Instead of determining the relationship of notation to the production of sound, as is customarily the case, Cage determined the rules by which the performer may read the configurations that regulate sound production. The relation of the score to the quantity and quality of possible interpretations was here purposefully indeterminate: one-to-many or one-to-infinity. Cage’s 1952 Untitled Event (though there are many other examples, like Variations VII) is perhaps the best illustration of this relation, which manifests as temporal swarming and is, in this sense, an emergent diagrammatic machine.

Like biological swarming, which relies on distributed, heterogeneous agency, temporal swarming is based on perpetual intra-action. For Barad, ‘intra-action’ is different from interaction in that it does not depart from discrete objects and entities—entities that exist as discrete objects first and then enter into interactions with other discrete objects. Rather, in intra-action, relationality pre-exists relata—unbounded objects and entities (Barad 2007)—an idea that is perhaps easier to understand in sound than in many other areas. By definition, temporal swarming unfolds in a realm beyond human perception and separates actual temporal movement from the experienced one. Untitled Event, a collaboration with Merce Cunningham and Robert Rauschenberg, likewise consisted of a number of micro-events and their chance-based dramaturgiesFootnote 31: Cage reading a text on the relation of music to Zen Buddhism and performing a composition with a radio; Rauschenberg playing old records on a hand-wound gramophone and flashing ‘abstract’ slides; David Tudor playing a prepared pianoFootnote 32; Cunningham dancing, chased by a dog; film clips of the school cook and the setting sun being projected onto the ceiling; various participants, who were given a particular duration, such as 2′33″, performing musical or choreographic partitions and improvisations (Goldberg 1993, 126–127).
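
The logic of chance-determined durations can be sketched as follows. The coin-toss mechanism gestures towards the I Ching operations Cage used in Music of Changes; the stream names and numbers are invented for illustration.

```python
import random

# A sketch of chance-determined 'time brackets': simultaneous streams whose
# starts and durations are set by coin tosses. The toss mechanism gestures
# towards the I Ching operations Cage used; streams and numbers are invented.
streams = ["lecture", "gramophone", "prepared piano", "dance", "film", "radio"]

events = []
for stream in streams:
    start = sum(random.choice([0, 1]) for _ in range(6)) * 10        # six tosses fix the start (s)
    length = sum(random.choice([1, 2, 3]) for _ in range(3)) * 15    # three tosses fix the duration (s)
    events.append((start, start + length, stream))

for start, end, stream in sorted(events):
    print(f"{start:3d}s-{end:3d}s  {stream}")
# At any instant several streams sound at once; none 'develops' another,
# and the total shape of the event differs with every cast.
```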

As the ‘content’ of the piece was determined solely by the formal aspect of duration, it afforded one-to-many or one-to-infinity possibilities of reception. Unsurprisingly, some audience members heard a lecture on Zen while others heard a lecture on silence, Henry David Thoreau, or no lecture at all. Some thought that Untitled Event went on for precisely forty-five minutes, others for close to four hours, among many other perceptual ‘disagreements’ (Duberman 1972, 11–18). More significantly, the entwinement of the different tempi, timbres, durations, pitches, and visual, verbal, kinaesthetic and environmental information created different pulsations, which further triggered divergent temporal directions and affect modulations. Depending on the particular ‘micro-movement’ they were attuned to, the audience members navigated the environment following the integral temporal operation of the piece’s multiple micro-events. Traditional (Western) music is teleological because it is linear as well as tonal. Linearity means that musical determination is based on ‘implications that arise from earlier events in the piece’ (Kramer 1988, 20) while tonality marks ‘hierarchic relationships between tones’ (25). The combination of the two makes music ‘inescapably goal-directed’ (25). In contrast, Untitled Event relied on (a multiplicity of) integral time(s), which made audible ‘unique organizations of time’ intrinsic to an individual source or partition, with no over-arching development (Epstein 1985, 58). The interpenetration of the different musical, visual, literary and kinaesthetic events created temporal swarming, whose micro-pulsations changed, at every point, the direction of the piece, ‘infinitising’ Untitled Event’s tempi in both quantitative and qualitative terms. For Cage, such temporal swarming, derived from distributed thinking-by-doing, is a sensorial articulation of the Zen notion of unimpeded interpenetration. Referring to D.T. Suzuki,Footnote 33 whose classes at Columbia University, New York, Cage attended from 1949 to 1951, Cage writes:

unimpededness is seeing that in all of time each thing and each human being is at the center…. Interpenetration means that each one of these [things or beings] is moving out in all directions penetrating and being penetrated by every other [thing or being]… no matter what the time and the space. So that when one says that there is no cause and effect, what this means is that there is an incalculable infinity of causes and effects and that in fact each and every thing in all of time and space is related to each and every thing in all of time and space (Cage 1968, 46).

This statement, which resonates with quantum theory,Footnote 34 indicates the vastness of what is, for humans, the perceptual ‘unworld’, where the temporal, and necessarily dramaturgical, weaving of stimuli occurs below the threshold of perception. The emergent agency of the different swarming activators is very similar in technical environments. As is well known, there is a temporal gap between human and technical perception, which creates a realm of technical autonomy (Hayles 2017, 142). The most frequently used examples come from high frequency trading where, as Donald MacKenzie has argued, behaviours like ‘queuing’—where existing bids are altered on the basis of temporal advantage, according to the ‘first come, first served’ rule (MacKenzie 2019, 50)—and ‘spoofing’—the placement and cancellation of orders based on millisecond temporal advantage and the price drops caused by cancellations (48–49)—are produced. High frequency trading is, of course, a specific domain of human-machinic endeavour.

However, the reason why these behaviours are relevant is that they show, in qualitative terms, that informational-algorithmic ecologies do not consist of pre-formed, immutable interfaces. Algorithmic interaction is, instead, full of complex ‘swarm behaviours’ (Lange 2016) predicated on temporal processes that brim under the surface of machinic operations, for example, accelerated pattern recognition and syntheses of diverse inputs. Despite expressions like ‘webpages’, the internet does not (or no longer does) consist of pages but is, instead, an interpenetration of multiple ‘temporal latencies’ (Dieter and Gauthier 2019, 63). In asynchronous scripts, such as JavaScript and XML, applications ‘continually respond to input and work through interrelated scripts, style-sheets and mark-up’ (63). Their ‘geographically dispersed operations’ do not ‘resolve into a uniform, mechanical rhythm’; on the contrary, they propagate a fluctuating momentum based on highly dispersed ‘data-pours’ (Helmond quoted in Dieter and Gauthier 2019, 63; emphasis mine). The dynamism of these processes and their micro-pulsations produces affective modulations, some of which are used to instrumental ends. However, although the various authors focus on the political effects of chrono-design, they also acknowledge that anything that can be called chrono-design responds to, rather than precedes, the temporal swarming inherent in micro-temporal machine processing. By definition, information is never first ‘composed’ and then presented. It is always already operationally active (Shannon 1949; Hansen 2015), which is to say that it is changing all the time. Michael Dieter and David Gauthier call this medium-inherent process tertium quid or ‘third something’, a form of subterranean interpenetration and communication—in Shannon’s sense of the word—through the intersection and binding of ‘signals into reiterative sequences of action’, and the ‘production of divergent temporal processes’ from what they call the ‘milieu intérieur of machines’ (Dieter and Gauthier 2019, 66). Like integral time in Untitled Event (of a dog barking, poetry being recited, or images of the college cook flashing), interior machinic developments manifest as an operational split between human perception and technical operations. For Mark Hansen, this highlights the production of two different registers, ‘the experiential and the operational’ (Hansen 2015, 71), establishing ‘the experiential duration of consciousness versus the operational micro-temporality of the apparatus’ (37). As in a temporally swarming composition, micro-sensors, computational processors and algorithmic operations environmentally transform the very possibilities for perception (Dieter and Gauthier 2019, 67).
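
A minimal sketch of such temporal latencies follows; the task names and delay ranges are invented. Several asynchronous tasks, standing in for dispersed data-pours, are launched together but resolve in an order that is neither the order of launch nor a uniform mechanical rhythm.

```python
import asyncio
import random
import time

# A sketch of 'temporal latencies': tasks standing in for dispersed data-pours
# are launched together but complete in an order that is neither the order of
# launch nor a uniform rhythm. Names and delay ranges are invented.
async def data_pour(name: str, t0: float) -> None:
    await asyncio.sleep(random.uniform(0.01, 0.2))   # fluctuating, dispersed latency
    print(f"{time.perf_counter() - t0:.3f}s  {name} arrives")

async def main() -> None:
    t0 = time.perf_counter()
    await asyncio.gather(*(data_pour(n, t0) for n in
                           ["script", "style-sheet", "mark-up", "feed"]))

asyncio.run(main())
```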

The crucial point in both Cage et al.’s composition and the technical environments analysed by Lange, Hayles, Dieter and Gauthier is that temporal swarming, which operates through aggregation, de- and re-aggregation, has a performative effect. It unfurls new temporal directions and triggers novel behaviours, which further cue action-reaction sequences. Yet, because swarming is an emergent diagrammatic machine, these sequences do not follow, or conform to, easily comprehensible paths. Rather, they set things in motion through elastic connections, formations and transformations in and of different registers, opening onto an infinity that is both changing all the time and ‘inscribed’ in generative machinic processes. The dichotomy of inscription and unforeseeable development, perceived as randomness, stands in the way of understanding another aspect of alien thought: the fact that indeterminacy is not independent of, but forms part of, operative rule architectures. Working with Chinese characters in a non-digital realm, Xu Bing sheds light on the co-emergence and, more importantly, the co-constitutivity of these processes.

5 Xu and inscriptive-significational errance

To err means to wander freely or stray from a set or programmed (inscribed and prescribed) path. Over the past three decades, the Chinese avant-garde artist Xu BingFootnote 35 has developed a particular diagrammatic machine in the field of inscriptive praxis and culture. This diagrammatic machine makes it possible to understand the realm beyond the programming-randomness dichotomy while, at the same time, hinting at the mutation of representation itself. Opening the underlying indeterminacy of all writing and inscription to scrutiny by way of multi-directional errance in significational and graphic ecologies, Xu’s work articulates the non-abstract, non-universal nature of inscription through the embodied and contextually embedded practice of calligraphy. Unlike the English alphabet, where B or G are invariant abstract symbols despite the fact that their combinations with other letters form different words, Chinese characters are condensed images. They are also logographic. Each character stands for one morpheme (instead of an individual phoneme of the spoken language) and is, in the majority of cases, composed of a semantic radical and a phonetic component (Chen et al. 1996). Although most characters that share the same radical fall into the same semantic category and have a similar shape, there are also groups of radical-opaque characters, which share the same semantic radicals but have no semantic relation whatsoever (T’sou 1981). This non-representational complexity has been hypothesised to correlate with the use of (human) neural networks that are not usually activated in the processing of Latin or English alphabets—in other words, in the processing of invariant abstract symbols (Tan et al. 2001).
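
A few textbook examples of this compositional logic can be tabulated in code. The characters and glosses below are standard examples of the 马 phonetic series; the plain-dict representation is ours, for illustration only.

```python
# Textbook examples of semantic-radical + phonetic-component composition
# (the 马 phonetic series); the characters and glosses are standard, the
# plain dict representation is ours, for illustration only.
characters = {
    "妈": ("女 (woman: semantic radical)", "马 (ma: phonetic)", "mā, mother"),
    "蚂": ("虫 (insect: semantic radical)", "马 (ma: phonetic)", "mǎ, ant"),
    "骂": ("吅 (two mouths: semantic element)", "马 (ma: phonetic)", "mà, to scold"),
}

for char, (semantic, phonetic, gloss) in characters.items():
    print(f"{char} = {semantic} + {phonetic} -> {gloss}")
```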

The fact that Chinese characters encompass time, event-hood and praxis, as well as retain their visual genealogy and mutability, is also the reason why calligraphy has, for millennia, played such an important role in natural-cultural inscription.Footnote 36 Xu’s 1987 A Book from the Sky, also known as A Mirror to Analyse the World, consists of several 500-foot hand scrolls on which thousands of characters are printed in ink-painting style. Set in front of the scrolls are boxes of books bound in blue paper, made to resemble traditional Chinese books (Fig. 3).

Fig. 3 Xu Bing, Book From the Sky, 1987–1991. Mixed media installation / hand-printed books and scrolls printed from blocks inscribed with “false” Chinese characters. Installation view at Ullens Center for Contemporary Art, Beijing, China, 2018. © Xu Bing Studio. Courtesy of Xu Bing Studio

The characters, which Xu spent years carving, are composed in the same way as regular Chinese characters; they also look like real Chinese characters from afar. Yet, upon closer examination, these characters form deviant or non-existent words. In Chinese calligraphy, where landscape is a pictographic calligraphic formation, the calligraphic stroke is the embodiment of nature’s forces, as can be seen in Xu’s 1999–2000 Landscript Sketchbooks, which—literally—render landscape as calligraphic writing. In the Book from the Sky, the cosmological setup is a configuration that manifests elements generative of a world. Xu’s in-between characters emerged from a recombination of the existing characters’ parts and fragments, which are modular, since, in Chinese writing, simpler units recur in more complex cosmo-graphics. The reason why the characters oscillate between intelligibility and unintelligibility is that they follow a pattern of formal repetition, a deterministic procedure comparable to those of Duchamp and Cage. As Jean François Billeter notes in his treatise on the Chinese art of writing, a well-formed character should possess an ‘organic autonomy’ equivalent to that of a living being; it should be fitted in an imaginary square, with a centre of gravity and a ‘silhouette’—as clean an outline as possible (Billeter 1990, 32–34). In complex characters, the different parts should be adjusted in terms of size and density to give the character a ‘body sense’ (34). This last point is crucially important, as a character of appropriate size, density and ‘body sense’ acts as an ‘empirical paradigm for synthesis’ (36; emphasis mine). Acting as an empirical paradigm for synthesis further reiterates that characters do not represent a signified. Rather, they afford a non-abstract synthesis, dependent on execution and the particularity of that execution. This further means that change and mutation are part and parcel of the rule structure, not separate from it.
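
A schematic analogue of Xu’s procedure, under stated simplifications: modular parts are recombined under a deterministic rule of well-formedness (one radical slot plus one component slot per graph). The inventory is a tiny, illustrative subset, and simple juxtaposition stands in for the graphic composition Xu carves into blocks.

```python
import random

# A schematic analogue of Xu's procedure: recombine modular parts under a
# deterministic rule of well-formedness (one radical slot plus one component
# slot per graph). The inventory is a tiny illustrative subset, and simple
# juxtaposition stands in for the graphic composition Xu carves into blocks.
radicals = ["氵", "木", "亻", "口", "纟"]
components = ["马", "青", "寺", "昔", "井"]

random.seed(1987)  # the year Xu began A Book from the Sky; any seed will do
graphs = [random.choice(radicals) + random.choice(components) for _ in range(8)]
print("  ".join(graphs))
# Some pairings exist in Chinese (e.g. 氵+ 青 = 清, 口 + 马 = 吗), many do not:
# the output oscillates between intelligibility and unintelligibility.
```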

The diagrammatic nature of Chinese characters comes from marginally varied emphases and combinations of different modular parts. It’s highly unlikely that Xu had Jacques Lacan in mind while working on The Book from the Sky; however, Lacan provides a pertinent parable when, referring to his Écrits, he says: ‘they should be placed in water … in order to unfold’ (Lacan 2013, 27). Unfolding is a spatial and temporal movement, which, when occurring in a textual or calligraphic ecology, is iterative and intra-actional. Xu’s work does not radically change Chinese characters. Instead, it relies on the movement of recursive repetition, which triggers and co-constitutes emergent paths in a microscopically errant manner, through marginally varied emphasis. Given its cosmological, environmental, and embodied nature, Chinese writing is inseparable from cultivation or culture.Footnote 37 To understand the process of algorithmic self-creation, usually described as opaque, unknowable, even dangerous—as in Ullman’s much-quoted remark: ‘when code passes into algorithms and algorithms begin to create new algorithms there is no knowing what will happen!’ (Ullman quoted in McCorduck 2019, 253)—it’s important to understand the medial aspect of inscription, which is anything but invariant. As Vismann has argued, inscription is inseparable from self-praxis and cultivation (Vismann 2013).

Indeed, there is a clear point of comparison between culture—as the mutual co-constitutivity of agents and environments—and neural networks, where connections are modulated through a (re)distribution of weights, which contribute to the tendency of neurons ‘to fire through a function of the strength of the connection’ (Wyse 2020, 96). Neural networks create media based on the mechanisms configured during training on input data. However, what is also transparent in neural networks, in both supervised and reinforcement learning, is the consecutively monitored and modified model of emergence, which has both an empirical and a significational relevance. As Paul Bodily and Dan Ventura, whose work focuses on the mediality of the neural networks’ modulating affordances (rather than on output), suggest, an auto-productive, autonomous developmental logic, evident in unsupervised learning, is fundamental to neural networks (Bodily and Ventura 2018). ‘Auto-productive’ here suggests both self-movement and repetition. And, as Catherine Malabou has argued, in all machinic operations, any notion of invariant repetition is accompanied by spontaneous movement, given that the ‘automatic’ in auto-production—on which autonomy depends—comes from the double valence of automatism: as involuntary repetition and spontaneous movement, as both ‘constraint and freedom’ (Malabou 2019, 100).

Much as Xu explicitly works with traditional forms to produce novelty, interactive algorithmic ecologies are constantly changing, not in a consecutive way, as in supervised or reinforcement machine learning, but as perpetual oscillations between intelligibility and unintelligibility. In deep learning network architectures, neurons are connected through synaptic weights to neurons in deeper layers. Typically, the adjustment of weights is seen as part of processual programming, where human intervention alternates, in a consecutive manner, with the generative aspect of the networks. However, the difference between these operations and ‘algorithmic contagions’, as Neil Johnson and Ullman call them (Johnson 2019; Ullman 2019), is that the former are knowable, or semi-knowable, while the latter are unknowable. And this is precisely where the piloting role of empirical and significational indeterminacy comes in. Xu’s work does not foreground a change of agencies (human intervention alternating with material and machinic agents) but, rather, a semi-miasmic operation in a state of almost-equilibrium. The dynamics of this operation, which are modulatory, are inseparable from iterative change in reading as well as in writing.

Xu’s Introduction to New English Calligraphy, first shown in 1996 in Finland, demonstrates this process in an even more striking way. As in The Book from the Sky, in this work, graphs assembled from the basic strokes of brush-written calligraphy appear to be Chinese characters but are, in fact, a mixture of Chinese-looking Roman letters spelling the words of nursery rhymes and sayings from Chairman Mao, as well as non-existent characters that emphasise, even more than the existing words, the contagion created by the transposition of the former into the latter. The installation is staged as a workshop where viewers practise this in-between calligraphy, generating different characters—as well as different meanings and conversations—through iterative modulation. As Xu reports, many visitors wrote to him in this in-between language that they had learned and perfected (Xu et al. 1999). The specific brand of distributed-thinking-by-doing Xu employs in this work relies on shape, emphasis, and interaction, but also on the mutually co-constitutive relationship of rules to praxis and execution. In addition to showing the underlying graphic, significational and cultural disequilibrium of both inscription and action, Xu’s work shows that navigating the disequilibrium requires a sensitive distribution of agential moments across objects and entities—in other words, actional multi-agent thinking.

6 Conclusion

For Malabou, thinking, human or artificial, is a method that continually re-combines and re-articulates, creating an always-changing-almost-equilibrium. This action- and operation-orientated notion of intelligence—not as an innate disposition or programmed ability, but as a process that ‘unfurls continuously’—hinges on Jean Piaget’s interpretation of equilibrium as a ‘mobile point of stability’ (Piaget quoted in Malabou 2019, 10) as well as on Dewey’s ‘method’—a sequence of changes and transitions, a ‘constant adaptation in time’ (12). Such a dynamic reveals the Möbius-strip-like relationship of deterministic parameters to indeterminacy, of design to chance, rule to iteration, and otherness to identity, which is transversal and co-constitutive. As a method and process, distributed-thinking-by-doing emerges from de- and re-composition, dis- and re-location, errance and plasticity, constantly modulating its operational coherence and creating new unknowns. Such thinking cannot be aligned—either in the discussed artistic or in machinic/algorithmic procedures—with sequence-locked axioms or with randomness. Rather, distributed-thinking-by-doing is a-rational. In the experimental tradition, a-rationality is neither rational nor irrational; neither entirely determined by a sequence- and logic-locked procedure (as the Latin root of the word, ratio from reor—to count, calculate—would suggest), nor procedure-less. As the above artists’ work shows, it is always medium-specific, where the medium, in Vismann’s sense of the word, stands for an entire host of (oscillating) relations.

This in-between, both-and, neither-nor dynamic is important for two reasons: every mode of rationality, when taken to the extreme, is irrational; it produces effects antithetical to reason. Likewise, every form of irrationality, when systematised, appears rational and is (or at least can be) performatively efficacious, which is to say that, in operational terms, it acts as if it were rational. Moving away from this impasse requires engagement with the manner in which alien thought opens onto infinity and complexity in and through incomputability, virtual dimensions, nascent temporalities, and indeterminate relations of signification to operation. Much as there is material agency and distributed thinking emerging from sentient ecologies, immaterial agency can and does emerge from non-sentient ecologies. As Tim Ingold has extensively argued, the thinking-acting of hunter-gatherers and their environments occurs in a sentient ecology where everything co-thinks in a continually evolving, non-matrixed way (Ingold 2011). Such a form of intra-action can also be called dance. Given our increasingly ‘synthetic’, digital-natural environment (Bratton 2015), it’s crucially important to acknowledge the necropolitical effects of instrumental algorithmic operations without falling into the instrumentalist trap of attempting to harness—instrumentalise—alien thought. Rather, the capacity of non-programmed machinic and algorithmic operations to instantiate infinity and complexity should, like the discussed artists’ (malleable) diagrammatic machines, be approached in a multi-perspectival, infinitely interpretable way—like B’reishit, or the question ‘how many angels can dance on the head of a pin?’, both of which are primarily performative: they push the boundaries of human comprehension-perception by virtue of their very existence.