Art and artifice
Coined in 1956 by John McCarthy, the term ‘artificial intelligence’ had its critics among those who attended the Dartmouth Conference (which famously established the field of AI); Arthur Samuel argued that ‘the word artificial makes you think there’s something kind of phony about this, […] or else it sounds like it’s all artificial and there’s nothing real about this work at all’ (in: McCorduck 2004, p. 115). The historian of AI Pamela McCorduck notes that while other terms, such as ‘complex information processing,’ were also proposed, it was ‘artificial intelligence’ that stood the test of time. According to her, it is ‘a wonderfully appropriate name, connoting a link between art and science that as a field AI indeed represents’ (p. 115). She is referring here, indirectly, to the origins of the word ‘artificial’: in Latin, artificialis means ‘of or belonging to art,’ while artificium is simply a work of art, but also a skill, theory, or system.
When the philosopher Vilém Flusser traced the etymology of the word ‘design’ in his The Shape of Things: A Philosophy of Design (1999), he referred to this relationship between art and artifice to argue that all human production, all culture, can be defined as a form of trickery. Flusser rejects the distinction between art and technology and goes back to these ancient roots (the Greek for ‘trap’ is mechos, whence mechanics and machine; the Greek techne corresponds to the Latin ars; an artifex is a craftsman or artist, but also a schemer or trickster) to demonstrate that in their essence all forms of making are meant to help us ‘elude our circumstances,’ to cheat our own nature. Culture itself becomes a delusion brought about by means of design—a form of self-deception that makes us believe we can free ourselves from natural restrictions by producing a world of artifice. From doors to rockets, from tents to computer screens, from pencils to mechanized intelligences, Flusser selects his examples to show that, ultimately, any involvement with culture is based on deception: sometimes ‘this machine, this design, this art, this technology is intended to cheat gravity, to fool the laws of nature’ (ch.1, n. pag.)—and sometimes to trick ourselves into thinking we control both gravity and the laws of nature. In that sense, art and technology represent the same worldview, one in which cultural production must be deceptive/artful enough to enable humans to go beyond the limits of what is (humanly) possible.
Flusser refers to the act of weaving to explain the ‘conspiratorial, even deceitful’ (ch.18) character of design. In the process of carpet production, he points out, knotting is meant to deny its own warp, to hide the threads behind a pattern, so that anyone stepping on the finished rug perceives it as a uniform surface, according to the designer’s plan. He offers weaving as one of the primordial forms of cultural production that embody trickery, but the same holds true for any form of design. The trick is always based on misdirection, shifting the end user’s attention from the material to the application, from the current state of things to emerging possibilities and new futures. Designing is a methodical way of crafting alternative realities out of existing materials—a process of casting the intended shape onto the chosen fabric so as to create a new possibility. The material used in that process must, so to speak, dematerialize: it has to disappear from view and give way to the new object—to abstract the end result from the point of origin and the labor process. By obfuscating some components while exhibiting others, ‘ideal’ design enables the end user’s cognitive efficiency.
Patterns, layers, and repetitions
For Flusser, any product of human making is both an object and an obstacle—Latin objectum, Greek problema—or, more specifically, any object is also an ‘obstacle that is used for removal of obstacles’ (ch.9). To move forward, we solve problems that lie ahead and obstruct our way; we produce objects that help us remove these obstacles; but the same objects turn into obstacles for those who come after us. In other words, since the results of human problem-solving are stored in objects, progress involves obfuscation and forgetting. We come up with a solution and, with time, this singular idea turns into a pattern; others use the already established template to produce new, more complex structures, and these structures turn into new patterns, covering up previous layers of design with new design. To expedite the process of production, to advance, to move faster, the designer turns to these conventions and templates, choosing from a menu of preprogrammed options—or abstracting new rules based on previous patterns. And as the complexity of the production process increases, the reliance on patterns grows too. New design always depends on previous design, and this ultimate dependence on patterns and abstractions makes it difficult to understand the process in its totality.
In the age of ubiquitous computing, speaking of obfuscation by design becomes of particular importance. In 2015, Benjamin Bratton called his model of the new kind of layering brought about by planetary-scale computation ‘the Stack’:
‘an accidental megastructure, one that we are building both deliberately and unwittingly and is in turn building us in its own image’ (p.5).
New technologies ‘align, layer by layer, into something like a vast, if also incomplete, pervasive if also irregular, software and hardware Stack’ (p.5). This makes it hard to perceive the Stack’s overarching structure; indeed, to see it as design at all, however incidental. Today, we produce new technologies, new objects, in order to see, know, and feel more, to register what is normally hidden from our view, while at the same time creating complex systems based on multiple, invisible layers and algorithmic operations whose effects are not always comprehensible even to the designers themselves.
Automations and automatisms
In her comprehensive account of what she calls ‘surveillance capitalism,’ Shoshana Zuboff points out the dangers of technological illusion—‘an enduring theme of social thought, as old as the Trojan horse’ (p.16)—that serves the new economic project in rendering its influence invisible. Surveillance capitalism claims ‘human experience as free raw material for translation into behavioral data,’ and turns that data into ‘prediction products’ that can later be sold to advertisers (p.8). Echoing the work of philosophers such as Bernard Stiegler (2014, 2015) and Antoinette Rouvroy (2016), Zuboff argues that the ultimate goal of this new form of capitalism is ‘to automate us’ by reprogramming our behavior and desires. The internet platforms that dominate the market prompt us to action and influence our decision-making, relying on big-data analyses of our preference trends online. Automated systems create statistical models to profile users, tracing any emerging patterns in countless interactions with digital products; these patterns turn into further abstractions, new models that are later reflected in new products and solutions, which end up ‘automating’ us, guiding our decision-making without our knowledge.
But is this process specific to AI-enhanced personalization under surveillance capitalism? Bratton has recently argued that what ‘at first glance looks autonomous (self-governing, set apart, able to decide on its own) is, upon closer inspection, always also decided in advance by remote ancestral agents and relays, and is thus automated as well’ (2019, loc.345, n. pag.). Any decision taken now relies on multiple decisions taken in the past; new design depends on previous design; a new object coalesces from an aggregation of old solutions. Culture is an amalgamation of such objects—objects that, ironically, become obstacles because they are meant to enable our cognitive efficiency. A tool becomes an obstacle because the results of our problem-solving and labor are already stored within it; a tool must never be seen as a tool, as its use must be intuitive—it must remain imperceptible; any new tool meant to advance the process is made with existing tools, and so the emerging layering of design in the Anthropocene makes it harder to distinguish between tool and fabric. Extended to the ongoing automation of cognitive tasks in the age of ubiquitous computing, this phenomenon takes on a new scale.
This is why the emerging need for transparency refers not so much to corporate politics of disinformation or algorithmic black boxes as to the very essence of our culture as a process of knowledge production, pattern formation, and concealment. Particular problems caused by the widespread adoption of automated decision-making systems, such as algorithmic bias, can have specific, targeted solutions in the form of new policy, engineering standards, or better education. But a shift of focus from the particular to the total is more than an exercise in theory—it makes us realize that transparency has never been at the heart of our making, that design has always been a form of blackboxing. There is, in that sense, something deeply anti-cultural about transparency. Or, to put it differently, there is nothing natural about transparency by design: we have been programmed to cover up as we make, not the opposite.
The ongoing transformation of lived experiences into data is a new analytical paradigm that demands our intervention and truly calls for an ‘unboxing,’ an excavation of processes and data trails. But the opening of the algorithmic black box cannot be viewed only as a technical issue—precisely because any solution is, first and foremost, a result of cultural blackboxing. While contemporary debates on AI focus on transparency as a direct response to the opacity of algorithms, what we need are approaches that aim to ‘unbox’ new technologies as objects–obstacles, as solutions that aim towards cognitive automation, as products that store the results of problem-solving performed ‘by remote ancestral agents,’ and that can thus perpetuate injustices via automatically accepted patterns and norms.