Cryptographic technologies have become an essential and even ubiquitous component of online infrastructure, yet they are rarely acknowledged outside of technical literatures. This chapter contributes questions about the impact of these technologies on daily life, and suggests five threads for future analysis: encryption, obfuscation and hiding, code, language, and epistemology. While each has a technical side, in this chapter they are presented at the intersections with society, politics, and culture. In many cases, revitalizing these intersections requires looking for historical examples, when for instance, prior to the twentieth century, the technologies and our understanding of them were more fluid and integrated with cultural and scientific pursuits. The common dimension running through this history and its revitalization is a conception of cryptography and its cognate technologies as “media”.
Keywords: Cryptography, Code, Encryption, Media theory, Internet infrastructure, Digital culture
By most measures, we have entered an online world of ubiquitous cryptographic media. Although it is rarely acknowledged, the tide began to turn in the late 2000s (continuing the “crypto wars” of the early 1990s), when websites like Facebook and Google enabled fully secure web browsing – encrypting all data transfers, not just password exchanges, as had been typical previously. Other websites soon joined (Yahoo!, YouTube, Wikipedia, and so on), sometimes encrypting transmission by default, without explicit user knowledge or consent. Around the same time, many other Internet services began using cryptography for data transmission and storage, and end-to-end encryption soon became the gold standard. BitTorrent was at this time a major driver of encrypted Internet traffic, and novel applications of cryptography (such as Bitcoin) soon joined the ranks, born out of the same sociopolitical milieu. In 2013, immediately following Edward Snowden’s disclosures of mass surveillance and global signals intelligence by state actors, the rate of deployment of cryptographic technologies skyrocketed. Contrary to popular belief, Snowden’s disclosures demonstrated that properly implemented cryptography is effective against the best surveillance and cryptanalysis techniques (with important caveats). Driven by privacy fears, consumer demand, and supply-side ease of implementation (commercial and free off-the-shelf products became widely available, such as Let’s Encrypt), rates of encrypted Internet traffic increased by double-digit percentages in the intervening years. 2016, for example, was a banner year: the largest increase in encryption use over the last 11 years (Korolov 2016), with over half of all web browsing encrypted and secure (Gebhart 2017) and with predictions that 70–75% of global Internet traffic would be encrypted between 2016 and 2019 (NSS Labs 2016; Sandvine 2016).
Despite the phenomenal rise in the use of cryptography, the emergence of a trillion-dollar computer security industry, unprecedented government interest and investment, and daily news stories describing the horrors of an insecure or overly secure Internet, academic work on cryptographic media has tended to focus on a few important but limited areas of investigation. Researchers in Internet Studies, Security Studies, and Science and Technology Studies have developed values and frameworks to analyze issues relating to policy change, privacy, and the ethical use of technology. Similarly, literature on the use of cryptography for privacy-enhancing technologies (PETs) – nearly a subfield unto itself – is vast. Technical domains of research, principally Computer Science, Engineering, and the mathematically oriented specialist field of Cryptography itself, are massive and well-funded (being, too often, cozy with corporate and government sponsors). Given the state of research and development, one might be mistaken in believing that cryptographic media are well understood. Far from it.
We might wonder, then, why have important questions not yet been asked? For instance, what is cryptography? Technologists, mathematicians, and engineers have answers, but they are not very satisfying – either doing too little or too much (the common plea that cryptography is just math is so broad that it risks explaining everything and nothing). Either way, these answers lack social and human richness. The questions motivating this chapter cut across many domains: why, given its ubiquity, is encryption not considered one of the fundamental media technologies of the twentieth century (alongside radio, telephone, and television), and how do we explain its emergence and its future?
Asking a basic question about the social and human dimensions of cryptographic media brings into view a host of practices often taken for granted and highlights the ways that social activities underpin existing domains of analysis. Who uses cryptography, how, and for what reasons is still poorly understood. We still lack a sufficient theoretical framework for cryptography, which would help inform existing discussions of social and human issues broadly, such as policy and ethics. Perhaps cryptographic media are not studied because we mistakenly believe that encrypted communication changes nothing, since, after all, encrypted communication is usually decrypted at its terminal location, seemingly returned to its original form. But, by ignoring the subtle mediatic changes of encryption and decryption, we may have missed something, and possibly something important, since nearly all digital communication is underpinned by ubiquitous cryptography. As we learned long ago from media theory investigating both old and new media alike, taking stock of media effects and understanding their processes and dynamics is important analytical work, even when the effects are very subtle and nearly invisible.
What resources might we draw on for this cryptographic media theory? Comparisons to mimetic (and largely pre-digital) media, such as radio, telephone, and television, seem to result in an analytical dead end. Instead, we might look to comparisons with writing media. Epistolary practices, in particular, might be updated from paper and pen to encrypted emails and social media messaging and offer ready comparison to the coded letters of the past. The media effect is not limited to material transformations either – cryptography vacillates between the temporality of microseconds needed for encryption and trillions of years needed for cryptanalysis (Brunton 2014). And, if we were to draw comparison to mimetic media, we would at least need to redraw the frame: no longer analog TV but now Netflix (encrypted with digital rights management) or peer-to-peer media transmission through the cryptographically hashed and tunneled BitTorrent network and “plain old telephone system” (POTS) replaced by secure voice-over-IP telecommunication. Deeper comparisons might look to new cryptographic applications, such as the emerging and exotic forms of homomorphic encryption, quantum cryptography, or blockchain and cryptographically enabled social consensus technologies.
Before moving on, it is worth pausing on the work of Friedrich Kittler, who can be read as a media theorist deeply interested in cryptography and therefore might be seen as an occult figure for the field. Specifically, Kittler drew comparisons between the media of alphabets, writing, and cryptology. These media are connected through discrete and combinatory analytics and common media effects. Famously, Kittler saw how the typewriter-cum-encryptor eclipsed the hand as a technical medium (Kittler 1999). From this history, Kittler concluded that our age of “high technology… ends in cryptograms that defy interpretation and only permit interception” (Kittler 1999). Thus, the media teleology of writing technology is one that advances from hand to typewriter to cryptography.
Unfortunately, Kittler approached cryptography in much the same way as other scholars of military signals intelligence and computer security, focusing on technological application and secret state machinations. Among other issues, in focusing on this traditional view, Kittler failed to see how cryptography would be applied to consumer and other uses. Kittler’s analysis thus missed the transition implied by his own media theory, from its multiple and varied roots (including military and state actions) to its emergence as the dominant mode of media today. Nonetheless, the discourse network 2000, to follow Kittler’s (1990) analytic, is encrypted.
Despite being a multibillion-dollar industry, with a massive, deep, and vibrant research community, very little attention has been paid to what the domain of cryptography covers and what its constituent parts are. This includes engineering and technical fields as well as social sciences and humanistic disciplines (even the groundbreaking Software Studies (2008) collection fails to make reference to cryptographic media). Basic categorical questions remain largely unasked, let alone answered. What is encryption and decryption, and what is the relationship to cryptanalysis (code-breaking) and hashing (i.e., the process of creating a message digest or digital fingerprint)? Do metaphors of hiding or obfuscation have useful meaning for studying cryptology? What is the relationship between cryptography and code? Cryptography and language? How do we know cryptographic media, and what does (or doesn’t) it communicate?
It is puzzling that these questions remain unanswered. Cryptographic media pervade all digital technology and have done so for quite a while. Half a century ago, Alfred Dodd remarked that we are in “a cipher age” (Ellison and Kim 2018), and yet, while we have gotten much better at making cryptographic media, we have not significantly improved our understanding of it.
This chapter probes these questions and offers a description of the stakes implied in potential answers. This exploration builds on a multidisciplinary pastiche of cognate problematics and explorations. DuPont (2018b, 2018a) offers an initial set of explorations toward a framework for conceptualizing cryptography technologies and encryption simpliciter. In addition to the analysis of encryption, these cognate problematics and their authors include obfuscation and hiding (Mateas and Montfort 2005; Brunton and Nissenbaum 2015), code (Drucker 1995; Liu 2004; Cramer 2005; Fuller 2008), language (Eco 1986; Raley 2003; Lennon 2010), and epistemology (Pesic 2000). In asking these questions and surveying existing literatures, this chapter seeks to highlight the need for future research and helps to sketch the conceptual terrain that this research might take.
Encryption
Commonly, cryptography is believed to have arisen out of statecraft and military research and development, especially during the last century. This narrative has led to the belief that cryptology is intimately associated with mathematics and physical sciences research, which proved decisive for the field’s instrumentalization and growth. DuPont (2018b, 2018a) problematizes the analytical assumptions implied in this account, while others have sought to draw much longer and richer histories (e.g., Potter 1989; Rosenheim 1997; Pesic 2000; Sherman 2010; Ellison and Kim 2018).
Cryptology comprises a set of constitutive processes, which includes encryption, decryption, cryptanalysis (code-breaking), codes, and hashing (a message digest or digital fingerprint). Encryption is the principal technique. Historical accounts differ in their typologies, but most accept these basic categories. For example, according to David Kahn (1967), the field of cryptology covers cryptography (encryption and decryption), cryptanalysis, and “codes” (he calls the special kind of substitution codes prevalent in medieval and late modern cryptology “nomenclatures”). The media theorist Siegfried Zielinski (2008) implicitly adapted Kahn’s typology and included “secret” or “hidden” writing, called steganography (traditionally, these methods included disappearing ink and the like but are now typically digital; see also Rosenheim 1997). The challenge facing both Kahn’s and Zielinski’s accounts is that the constituent parts are analytically muddled, held together in a historical assemblage but for no clear reason. This analytical muddiness mirrors practice, which tends to blend techniques and technologies together.
There are two dominant narratives used to explain and differentiate encryption: technology and mathematics. In the first, cryptological technologies are delineated by their algorithms (e.g., DES, RSA, or primitives such as s-boxes). Oftentimes, these technologies are measured by their perceived “strength” (usually a measure of computational “effort” or time needed to crack the encryption). This measure, however, turns out to be highly idiosyncratic and context-specific. Technical measures are themselves contingent on known mathematical properties. During and immediately following World War II, Claude Shannon applied emerging mathematical approaches to the study of cryptography. Principally, this included Nyquist’s (1924) and Hartley’s (1928) work on telegraph transmission, as well as other minor developments (Thomsen 2009). In so doing, Shannon (1945, 1949) elevated the study of cryptology to a science, but a deeply mathematical one.
Another key moment in the mathematization of cryptology occurred in the early 1970s. Public-key cryptography was first invented (and classified) between 1970 and 1974 by researchers working at GCHQ (Ellis 1970; Cocks 1973; Williamson 1974) and then reinvented publicly in 1976 by Whitfield Diffie and Martin Hellman (1976). Within the GCHQ, inspired by the Bell C43 analog vocoder encryption machine (a later variant of the more famous SIGSALY machine), James Ellis proposed the possibility of a digital encryptor comprised of lookup tables, calling his “existence theorem” a type of non-secret encryption (Ellis 1970). However, his idea initially lacked a practical method for making the needed “irreversible” lookups, which, within a few years, were supplied by two other GCHQ researchers, Clifford Cocks (1973) and Malcolm Williamson (1974). Both Cocks and Williamson used sophisticated number theory for their solutions (Cocks’ solution was essentially the RSA algorithm that was later developed, which today powers a large proportion of online encryption) (Ellis 1999). The GCHQ did not immediately exploit the invention, and it remained classified until December 1997. Around the same time as the original invention (although unaware of the GCHQ researchers’ work), Whitfield Diffie began exploring the possibility of encryption that would be suitable for emerging “online” applications, such as e-mail, and was soon joined by Martin Hellman. Like Cocks, Diffie and Hellman initially lacked a suitable method for irreversible lookup. The successful method, later called a “one-way function” or “trap-door knapsack,” would be supplied by Ralph Merkle, although other possible solutions were proposed (Diffie 1988). The result of this work was a reinvention of non-secret cryptography, then called a “public-key cryptosystem,” which was detailed in a groundbreaking publication by Diffie and Hellman (1976).
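The core idea of the Diffie-Hellman construction (that two parties can agree on a shared secret using only public messages, because the modular exponentiation involved is easy to compute but hard to reverse) can be sketched in a few lines. The toy exchange below uses deliberately tiny, illustrative numbers; real deployments use moduli thousands of bits long:

```python
# A toy Diffie-Hellman key exchange with deliberately tiny numbers
# (illustrative only; real moduli are thousands of bits long).
p, g = 23, 5          # public values: a small prime modulus and a generator

a = 6                 # Alice's secret exponent (never transmitted)
b = 15                # Bob's secret exponent (never transmitted)

A = pow(g, a, p)      # Alice sends A over the public channel
B = pow(g, b, p)      # Bob sends B over the public channel

# Each side combines the other's public value with its own secret.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob   # both arrive at the same shared key
```

An eavesdropper sees only `p`, `g`, `A`, and `B`; recovering the shared key from these requires reversing the "irreversible lookup" (the discrete logarithm), which is exactly the one-way property Ellis, Cocks, Diffie, and Hellman were searching for.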
The invention of public-key cryptography was considered a breakthrough made possible by the union of mathematics and technology and has set the conceptual narrative since.
In fact, the introduction of mathematical thinking to the field of cryptology is a recent addition. For most of the history of cryptology (several millennia), the domain of study was writing and empirical science (including occult science). In the past, cryptography has been called everything from “the art of reckoning” (Oxford English Dictionary 2014) to “brainwork” (see Ellison and Kim 2018). Given this longer timespan and wider-ranging analytic, it is clear that cryptographic media are not so much mathematical – although the processes may involve mathematical thinking or exploits. Rather, encryption is the process of ordering discrete (notational) tokens in such a way that it takes advantage of combinatory expansiveness (DuPont 2017c, b). Decryption is its deterministic reverse. Cryptanalysis or “code-breaking,” on the other hand, is ontologically and methodologically distinct from encryption and decryption (DuPont 2018a). Any technique can be considered a form of cryptanalysis, but typically cryptanalysis uses either intuition (guessing) or a statistical measure. In this way, cryptanalysis is ontologically (and historically) related to natural language but distinct from encryption and decryption (DuPont 2018a). A cryptographic hash has some ontological similarities to encryption and decryption (it orders discrete tokens), but since hashes are informationally destructive (i.e., permanently entropic), in ways that encryption and decryption are not, hashing is related to but distinct from encryption.
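The ontological distinction drawn above can be made concrete in code. The sketch below pairs an illustrative repeating-key XOR cipher (chosen only for brevity; it is trivially breakable) with a SHA-256 hash: encryption has a deterministic reverse, while the hash is informationally destructive, producing a fixed-length digest that cannot be run backward to recover the message:

```python
import hashlib

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """A minimal reversible cipher: XOR each byte with a repeating key.
    (Illustrative only; repeating-key XOR offers no real security.)"""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"attack at dawn"
key = b"k3y"

ciphertext = xor_encrypt(message, key)
# Decryption is the deterministic reverse of encryption.
assert xor_encrypt(ciphertext, key) == message

# Hashing, by contrast, destroys information: the 64-hex-character
# digest is the same length whether the input is one byte or a gigabyte.
digest = hashlib.sha256(message).hexdigest()
```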
Obfuscation and Hiding
Descriptions of cryptographic media invariably include reference to obfuscation, hiding, veils, covers, and secrets. These techniques are technologically mediated, social relations of information transfer. Obfuscation is “the deliberate addition of ambiguous, confusing, or misleading information” (Brunton and Nissenbaum 2015), while hiding is its converse, a social relation that excludes or limits access to information. One version of information hiding is steganography, which makes the very presence of information traces unknown (unlike encryption, which merely “hides” the “meaning” of the message) (Cole 2003). Many techniques of obfuscation and hiding are in fact related to, but – it must be stressed – distinct from encryption.
Obfuscation and hiding are metaphors that often stand in for the range of processes applicable to cryptographic media. However, it is far from obvious what these metaphors describe. For instance, what is a hidden or obscured message, and how does “secret transmission” work? (The latter is troublesomely oxymoronic.) Nonetheless, metaphors can possess explanatory power. There are two basic classes of motivation relevant to techniques of obfuscation and information hiding: aesthetics and control (and a correlate set of politics and counterpolitics).
The aesthetics of obfuscation have been most fully explored in the production of software code. Software source code typically strives for clarity, minimalism, and elegance, in contrast to obfuscated code, which is opaque, complex, and convoluted. Obfuscated code may be the result of inattention and lack of programming skill (so-called “spaghetti code”) or, more interestingly, purposefully playful and clever programming that challenges the user’s or observer’s wit. One way to produce obfuscated code is by using “weird” or “esoteric” programming languages such as Brainfuck or INTERCAL (Mateas 2008). Another way to produce obfuscated code is by writing software source code in a deliberately obscure manner, taking advantage of naming confusion, data/code/comment confusion, pointer confusion, pattern-matching obfuscation, and multiple semantic codings (Montfort 2008). The best examples of obfuscated code are aesthetic in some deliberate way, being ironical, satirical, playful, beautiful, or clever.
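A minimal illustration of these tricks (all names here are invented for the example): the two definitions below compute the same sum of squares, one written for clarity and one deliberately obfuscated through naming confusion and needless indirection, in the spirit of obfuscated-code contests:

```python
# Clear version: the sum of squares of the first n positive integers.
def sum_of_squares(n):
    return sum(i * i for i in range(1, n + 1))

# An obfuscated version of the same computation: a recursive lambda
# with an opaque name, an and/or conditional trick, and a wrapper
# whose name (O0) is easily confused with the lambda's (O).
O = lambda l: l and l[0] ** 2 + O(l[1:]) or 0
def O0(O_): return O(list(range(1, O_ + 1)))

assert sum_of_squares(5) == O0(5) == 55
```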
Malbolge is a weird programming language that, in particular, highlights the ways obfuscated code blurs the line between “human-readable” software code (a quasi-language) and ciphertext. In their exploration of Malbolge, Michael Mateas and Nick Montfort (2005) describe the efforts of Andrew Cooke to produce the first-ever running Malbolge program. Whereas most weird languages, such as Brainfuck or INTERCAL, are intended to create a kind of mental torment for the user or observer, Malbolge was specifically “designed to be incomprehensible” (Ben Olmstead quote in Mateas and Montfort 2005). Because of this incomprehensibility – deploying a huge range of obfuscatory tricks – it took Cooke two years before he was able to produce the first “hello world” program in Malbolge, and this was only possible with the aid of an artificial intelligence search technique. Cooke’s search technique took advantage of the previous effort by Lou Scheffer, who had discovered several “weaknesses” in the process of his “cryptanalysis” of the language. Interestingly, Scheffer characterized his efforts as “a cryptographer and not a programmer” (Scheffer quoted in Mateas and Montfort 2005). This extreme example pushes the meaning of “programming language” beyond other weird languages, which exploit the limits of human comprehension and intelligibility, and, instead, enters into the realm of cryptanalysis. Thus, when it comes to obfuscated code, there is a continuum between, say, human-readable Python code, machine-cryptanalyzed Malbolge, and (one can imagine) pure ciphertext created by Advanced Encryption Standard (AES) encryption.
Obfuscation is also used to control information. Brunton and Nissenbaum (2015) describe multiple politics and counterpolitics of obfuscation technologies, which are partial and fugitive when compared to the “total” technologies of encryption. Since, they argue, surveillance technologies are often aligned against individuals, counterpolitical obfuscation technologies can be used to “put some sand in the gears, to buy time, and to hide in the crowd of signals,” but obfuscation techniques, they remind us, are a tactic, not a strategy (Brunton and Nissenbaum 2015). That is, the counterpolitics of obfuscation are a temporary fix and are not intended to replace full-blooded governance or politics. Privacy is a primary goal for obfuscation, but not a limiting criterion.
There are many types and techniques of obfuscation and hiding used to control information. Techniques include adding information or messages (perhaps to an excess), rerouting or readdressing information, burying information within larger collections, blending or combining information, being vague, and creating fake information. Consider the example of the web browser plugin AdNauseam. AdNauseam is a tool to avoid commercial surveillance, typically from advertising networks. Rather than block ads (as traditional ad blockers do) or try to hide from them (as, in a way, virtual private networks do), AdNauseam overwhelms surveillance by virtually “clicking” on every ad. In so doing, AdNauseam obscures information traces left by a web browser. Obscuring information traces does not guarantee the user complete privacy – clever or dedicated analysis might be able to detect the authentic trace within the fog of fake clicks – but it does offer a ready-at-hand tactic when more robust solutions, such as encryption, are not available.
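The tactic of burying an authentic trace in a crowd of fakes can be sketched abstractly. The function below is a hypothetical illustration of the general technique, not AdNauseam's actual mechanism: one genuine query is hidden inside a shuffled batch of decoys, so an observer of the batch cannot immediately tell which item is real:

```python
import random

def obfuscated_batch(real_query: str, decoys: list, n_decoys: int = 5) -> list:
    """Hide one genuine query in a shuffled crowd of decoy queries.

    A hypothetical sketch of the obfuscation tactic: the observer sees
    n_decoys + 1 queries and must guess which one is authentic."""
    batch = [real_query] + random.sample(decoys, n_decoys)
    random.shuffle(batch)
    return batch
```

As Brunton and Nissenbaum note, this is a tactic rather than a strategy: dedicated analysis might still separate the authentic query from the noise.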
Code
Historically, codes were cryptographic. Newspapers published daily code puzzles and cryptograms, governments communicated through encrypted messages, and schoolboys and lovers alike sent messages encrypted with private schemes. Words like “code” were, and are, analogical and polysemic. Today, however, the term has largely lost this meaning. More often than not, our encounter with “code” is with digital systems, now studied by Computer Science and Engineering disciplines. Transmission and compression codes are vibrant areas of research, and cryptography has in turn taken on its own specialized and distinct form of study. However, the distinction between these many types of code is in fact fuzzy. For example, “Morse code” is a digital transmission code that was designed for transmission efficiency but also worked as a kind of weak cipher (and many variants of telegraph code were produced that utilized stronger forms of encryption). Or consider that, in 2012, when the participants of a “cracking challenge” attempted to cryptanalyze an encrypted e-poem written by William Gibson two decades prior (published in 1992 as part of the Agrippa art book), they found it more difficult to crack the ZIP compression than the weak RSA-type “encryption” (DuPont 2012, 2013). The lesson learned was that there is no clear, stable, or necessary differentiator between “encoding” and “encryption.”
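The fuzziness of the encoding/encryption boundary is easy to demonstrate: Morse code is a public substitution table designed for transmission efficiency, yet it uses exactly the same mechanical substitution of tokens that a cipher does; only the secrecy of the table differs. A fragment, as a sketch:

```python
# A fragment of International Morse code as a substitution table.
# The table is public, so this is "encoding"; were the table kept
# secret, the very same mechanism would be a (weak) cipher.
MORSE = {"a": ".-", "b": "-...", "c": "-.-.", "d": "-..",
         "e": ".", "o": "---", "s": "..."}

def encode(text: str) -> str:
    return " ".join(MORSE[ch] for ch in text.lower())

assert encode("sos") == "... --- ..."
```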
The critical study of code, in the older sense still related to cryptographic media, has been largely left to those outside of technical fields—mostly humanists, critical literature and culture scholars, and social scientists. Historians, in particular, have been slowly but increasingly engaging the relationships between code, cryptography, and code-like things, such as “perfect languages” and proto-computers (Potter 1989; Eco 1995; Stolzenberg 2001; Long 2004; Maat 2004; Jones 2016; Ellison and Kim 2018). Codes are also sometimes discussed in histories of the nineteenth and twentieth centuries, when the use of cryptography was no longer a craft activity and intellectual pastime (as it had been before); through the nineteenth and twentieth centuries, cryptography had become fully institutionalized within military and state apparatuses. Most of the literature on this era, however, is found in popular book surveys (e.g., Singh 2000; Churchhouse 2002) and the journal Cryptologia, written more often than not by retired engineers, and not professional historians. As one might expect, this literature has characteristic blind spots and often fails to provide rich scholarly critique. Exceptions include, for example, James Reeds, Whitfield Diffie, and J.V. Fields’ (2015) report on “Tunny,” but even this otherwise excellent work has an engineering slant – more often interested in cracking codes than contextualizing them.
The humanistic study of code and its intersections with cryptographic media has ranged widely. One area of exploration has included critical expositions of culture and politics. In his analysis of “cool” media, Alan Liu (2004) assessed the contradictory politics of cyberlibertarian organizations like the Electronic Frontier Foundation, who simultaneously call for “freedom” of information and unimpeded access to strong cryptography. Similarly, David Golumbia (2016) took Bitcoin evangelists to task for fusing right-wing economics to what might be called the “value” layer of cryptographic media. Sybille Krämer (2015) broadened the conceptual terrain of coded cash: what she characterized as paradigmatic “symbolic” and “media” machines. That is, Krämer reconfigured the transmission function of money exchange, arguing that electronic money (such as cryptocurrency) is decontextualized, dematerialized (virtual), and ultimately indifferent (what economists call the “fungibility” of money). These studies redefine our view of the “media” dimension of cryptographic transmission.
The study of cryptographic media can also be approached from the view of symbols, text, and diagrammatic writing. The alphabet – created once (some 3000 years ago) but constantly evolving – has spurred many historical and contemporary connections to cryptographic media. In the Renaissance and modern ages, many scholars saw direct connections between the powers of letters and their combination through encryption (this included Francis Bacon and Gottfried Wilhelm von Leibniz, among others). Johanna Drucker (1995) explored many of the visual and diagrammatic aspects of the connection between cryptography and the alphabet by examining the cryptographic handbooks of John Wilkins, Johannes Trithemius, and Giambattista della Porta. These works are rich in visual complexity and diagrammatic sophistication and reveal connections between the materialities of letters and their representations. Brian Lennon (2015) traced the essential concepts of security and authentication through the literary history of philology, which, he argued, have conflicting natures, as words and non-words (or “passwords” in the tradition of Baudrillard). Florian Cramer (2005) explored the ways that “executable” code existed long before computers. According to Cramer, algorithmic code cannot be separated from our cultural imagination, in particular, as it exists in science, magic, and occult codes. The “spell of this magic act,” writes Cramer, is the result of combinatory computation, which includes everything from the “cutup” prose of William Burroughs, Pythagorean musicology and logic, Hebrew Kabbalah, Ramon Lull’s volvelles (thinking machines), Renaissance word generators, and Jonathan Swift’s satirical combinatorial machine, to poetics ranging from the Oulipo to concrete poets to the net artist Jodi (Cramer 2005).
Beneath this massive range of techniques and technologies lies a cultural history of computing, grounded in code practices that turn words into materials or ways of becoming “flesh” (Cramer 2005).
Cryptographic media are not natural language (at least not in a straightforward way), but they can be profitably related to writing and writing technologies (DuPont 2018b). Katherine Ellison has thoroughly explored these connections, arguing that writing is “naturally cryptographic” (Ellison 2008). Through a reading of modern cryptography manuals, Ellison argues that materiality is an important and consequential input to cryptographic media. For instance, John Wilkins’ Mercury was available as a printed book, but, crucially, the materiality of cryptography demanded more than the available print technologies could provide. In this era, printing technologies did not replace written manuscripts (as is sometimes believed); the required manual additions to Mercury show how cryptographic media was “multimodal,” which, at least Wilkins believed, also led to increased security (Ellison 2011). For the “reader” of cryptographic media, too, the material hand and brain mattered. Cryptographic devices were craft materials, and while there is a formal sense that cryptography is abstract and pure (a “notation;” see DuPont 2017c), actually using cryptography requires “brainwork” and “handwork” and a corresponding set of technical skills (Ellison 2013). Critically, this meant that reading and writing cryptographic media had to be learned, a process that was nonlinear, multiple, and puzzle-like (Ellison 2014). The reader and writer of cryptographic media were thus locked in a constantly negotiated “contract,” revealing and covering truth and meaning (Ellison 2008). Although the cryptography manuals Ellison explored date from the seventeenth and eighteenth centuries, they are far from insignificant to the origins of contemporary cryptography. These documents show the limits of human communication, the emergence of scientific (and later, industrial) practices, and, crucially, the emergence of the concepts of “information” and “information processing” (Ellison 2017).
The contemporary critical study of cryptographic media might be best situated within the emerging fields of Software Studies and Critical Code Studies, yet to date this avenue remains largely unexplored. In the important edited volume Software Studies (Fuller 2008), cryptography receives only scant attention, retrospectively, through the inclusion of an essay on “code” by Kittler. As mentioned above, cognate topics are included, such as obfuscated code (Montfort 2008) and weird languages (Mateas 2008). Other works in the vein of Software Studies and Critical Code Studies make occasional mention of cryptography (Berry 2011; Cox et al. 2012), but rarely as a medium. When cryptography does get analytical treatment, it is usually in relation to science fiction topics (especially the novels of William Gibson and Neal Stephenson) (Hayles 1999; Liu 2004; Kirschenbaum 2008; Chun 2008; DuPont 2013) or discussions of privacy (Pasquale 2015) and hacking (Mackenzie 2006). Both Tung-Hui Hu (2015) and Siegfried Zielinski (2008) have tackled the topic somewhat more directly, but despite being otherwise fine studies, neither significantly reimagines the possibilities of what a critical study of cryptographic media might look like.
Language
Umberto Eco argues that cryptography is a special kind of code, distinct from natural language (Eco 1986). Using Hjelmslev’s sign model, Eco describes natural language as a system of “double articulation” that creates semiotic connections across elements on the expressive and content planes. The content plane attaches meaning to particular semantic unities (through conceptual or psychological apparatuses), while the expression plane is the material substance of the sign and is devoid of meaning. Unlike natural language, cryptography is a system of “single articulation,” devoid of meaning and working through the expression plane alone. “Ciphers,” or the processes of encryption, are “correlational” codes that operate by the arbitrary transposition or substitution of alphabetic letters (Eco 1986). According to Eco, correlational codes, such as cryptography, are not a mechanism of communication but instead merely a transformation mechanism between two systems. Cryptography, Eco summarizes, “substitutes every minimal element of the plaintext with the element of another set of expressions” (Eco 1986).
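Eco's "correlational" code can be shown in a few lines. A Caesar shift correlates every letter of the expression plane with another letter, entirely without reference to the content plane: the transformation neither knows nor cares what the message means:

```python
import string

# A Caesar shift: each letter is correlated with the letter three
# positions later, purely on the expression plane (no semantics involved).
SHIFT = 3
plain = string.ascii_lowercase
cipher = plain[SHIFT:] + plain[:SHIFT]

encrypt_table = str.maketrans(plain, cipher)
decrypt_table = str.maketrans(cipher, plain)

assert "attack".translate(encrypt_table) == "dwwdfn"
assert "dwwdfn".translate(decrypt_table) == "attack"
```

In Eco's terms, the table is a transformation mechanism between two expression systems, not a mechanism of communication: "attack" and a meaningless string like "zzzzzz" are treated identically.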
Despite the clarity of Eco’s semiotic analysis, the histories and genealogies of cryptography paint a different picture. Brian Lennon (2010) and Rita Raley (2003) have both argued that the post-World War II project of machine translation intersected with cryptography. Lennon described the intersection as a triumph of technology and science – motivated by advances in cryptography – which resulted in the artificial bifurcation of human language from code (Lennon 2010). Paradoxically, the advances in postwar cryptography enabled machine translation to become much more effective and ubiquitous, and thereby reconnected it to translation’s “hermeneutical” roots, in which, according to scholars like George Steiner, all discourses are translations (Steiner 1998; Lennon 2010). Raley (2003) makes a similar point, arguing that when Warren Weaver approached machine translation in the context of postwar cryptanalysis technology, he dismissed rhetorical nuances (as being too complicated for his vision of machine translation). In favoring a “universal signifier,” Weaver actualized the goals of “Basic English” (an effort to create a simplified version of English, useful for global communication), which contributed to instrumental and technocratic rationality. The result, Raley worried, was that machine-untranslatable knowledge would be abandoned (Raley 2003).
What these histories point to is a deeper connection between cryptography and language. Eco’s point still stands – encryption is a correlational code distinct from natural language – but the history of machine translation connects not to encryption but to cryptanalysis (DuPont 2018a). The techniques that Weaver adapted from military cryptography were those of cryptanalysis, which relies on statistical and intuitive techniques. The technocratic rationalities that worried Lennon and Raley were due to the application of statistical measurements that had long been part of cryptanalysis (DuPont 2018a). Recent developments in machine translation continue this trajectory, favoring “big data” and “machine learning” – learned from military cryptanalysis, refined by government surveillance and signals intelligence – at the expense of linguists, human translators, and hermeneutical sophistication.
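The statistical core of classical cryptanalysis begins with something as simple as counting symbols. A minimal sketch, assuming nothing beyond the Python standard library (the sample ciphertext is an invented Caesar-shifted English phrase, not an example from the chapter):

```python
from collections import Counter

def letter_frequencies(text):
    """Relative frequency of each letter -- the elementary statistic
    of classical cryptanalysis (and, later, of statistical MT)."""
    letters = [c for c in text.lower() if c.isalpha()]
    total = len(letters)
    return {c: n / total for c, n in Counter(letters).most_common()}

# Against a simple substitution cipher, the most frequent ciphertext
# symbol likely stands in for the language's most frequent letter
# ('e' in English); intuition then guides trial decryptions.
sample = "xli wigvix sj xli qiwweki"  # a Caesar-shifted English phrase
freqs = letter_frequencies(sample)
print(max(freqs, key=freqs.get))     # i  (standing in for plaintext 'e')
```

Statistical machine translation scales up the same move: measure distributions over symbols, then infer the most probable correspondence, with no recourse to meaning.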
For all the worries about “big data” and its largely unappreciated but necessary connection to cryptographic media, the techniques are nothing if not effective. As described, algorithmic and statistical techniques can be used for cryptanalysis and machine translation, as well as for the curious imaginaries of finance, but they are also useful for investigating the natural world. Scientific advancements today often make use of techniques drawn from the tool chest of postwar cryptography and signals analysis. Despite the ethical and political contentiousness of this issue, science has discovered a powerful new epistemology. (This same debate has already played out in the Digital Humanities, where critics have accused the field of ignoring its complicity in a post-Snowden surveillance state.) This epistemology, however, is far from new and connects in deep and significant ways to cryptography.
Peter Pesic (2000) has traced the epistemology of cryptographic media and its historical connections to scientific enterprises, finding a rich ontology and complex set of analytical tools. Francis Bacon, perhaps the patron saint of modern science, developed his tools of scientific analysis in relation to cryptography, even going so far as to develop his own (digital) cipher system (Pesic 2000). Bacon’s analysis was surprisingly prescient, even prefiguring “big data” approaches. For Bacon, nature’s raw “data” was an input to the empirical, inductive method, which could then be “computed” to yield insights into the secrets of nature. This vision of nature as secret and coded, and requiring a dynamic, combinatory epistemology, was actually somewhat common in Bacon’s day. For instance, Ramon Lull, a thirteenth-century Catalan polymath, developed a complex system of scientific analysis that utilized discrete symbol combination and permutation, which later influenced a young Leibniz (who would go on to lay the groundwork of our mathematics, logic, and computation). Today, this history is underappreciated, having fallen out of fashion due to its connections with occult symbolism, but it points to a deeper ontology of “executable” functions, discrete and analyzable natural substances, and the “unreasonable effectiveness of mathematics” (Wigner 1960; Kirby 2003).
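Bacon’s “biliteral” cipher shows concretely why his system can be called digital: each letter is written as a five-place sequence drawn from only two symbols, a and b – in effect a 5-bit binary code. A Python sketch of the encoding (using the modern 26-letter alphabet; Bacon’s original table merged i/j and u/v):

```python
import string

def bacon_encode(text):
    """Encode each letter as five a/b symbols (a 5-bit binary code).
    Non-letters are simply skipped."""
    groups = []
    for ch in text.lower():
        if ch in string.ascii_lowercase:
            n = string.ascii_lowercase.index(ch)
            bits = format(n, "05b")  # e.g. 'b' (index 1) -> '00001'
            groups.append(bits.replace("0", "a").replace("1", "b"))
    return " ".join(groups)

print(bacon_encode("bacon"))  # aaaab aaaaa aaaba abbba abbab
```

In Bacon’s own use the a/b choice was hidden steganographically, for instance in two subtly different typefaces of an innocuous covering text, so that a discrete, binary message travels invisibly inside ordinary writing.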
This chapter has described much more than the debates about privacy and surveillance that typically surround cryptographic media. Instead, it has pointed to a future mode of research that gives greater weight and significance to cryptographic media. Our scholarly forebears already understood the significance of cryptography – to communication, language, meaning, and science – and future research may one day rehabilitate these insights. Indeed, cryptographic media plays a large, important, and growing role in society today, although this remains widely unappreciated. Future research might question the ways that we are already committed to representation that is necessarily cryptographic. Such research may redirect the dire warnings raised in the critical literature about the opacity of algorithmic, coded, and artificially intelligent software systems, recognizing that the objects of such claims look a lot like cryptographic media. Politically, this research may point out the new modes of exclusion formed by cryptographic media. For example, when Google decides to encrypt e-mail transmission by default, how do our rhetoric and dialogue change when using these technologies? Here, in the name of “privacy,” Google blinds the prying eyes of government and bad actors, but, one wonders, what else changes? (Note: the corporate prerogative to serve ads inside encrypted tunnels does not change; it merely mainlines the attention economy into so many “eyeballs.”) How does cryptographic media change our relationship with expressions, media, and, above all, human and machine interaction?
- Brunton F (2014) Kleptography. Radical Philosophy:2–6
- Brunton F, Nissenbaum H (2015) Obfuscation: a user’s guide for privacy and protest. MIT Press, Cambridge, MA
- Chun WHK (2008) Control and freedom: power and paranoia in the age of fiber optics. MIT Press, Cambridge, MA
- Churchhouse RF (2002) Codes and ciphers: Julius Caesar, the Enigma, and the internet. Cambridge University Press, Cambridge
- Cocks CC (1973) A note on “non-secret encryption”. https://www.gchq.gov.uk/notenon-secret-encryption
- Cole E (2003) Hiding in plain sight: steganography and the art of covert communication. Wiley, Indianapolis
- Cox G, McLean A, Berardi F “Bifo” (2012) Speaking code: coding as aesthetic and political expression. MIT Press, Cambridge, MA
- Cramer F (2005) Words made flesh: code, culture, imagination. Piet Zwart Institute, Rotterdam
- Drucker J (1995) The alphabetic labyrinth: the letters in history and imagination. Thames and Hudson, New York
- DuPont Q (2018a) The cryptological origins of machine translation, from al-Kindi to Weaver. Amodern 8:1–20. http://amodern.net/article/cryptological-origins-machine-translation/
- DuPont Q (2018b) The printing press and cryptography: Alberti and the dawn of a notational epoch. In: Ellison K, Kim S (eds) A material history of medieval and early modern ciphers: cryptography and the history of literacy. Routledge, New York, pp 95–117
- DuPont Q (2017b) Blockchain identities: notational technologies for control and management of abstracted entities. Metaphilosophy. https://doi.org/10.1111/meta.12267
- DuPont Q (2017c) An archeology of cryptography: rewriting plaintext, encryption, and ciphertext. PhD dissertation, University of Toronto
- DuPont Q (2012) Cracking the Agrippa code. http://www.crackingagrippa.net/. Accessed 19 Oct 2012
- Eco U (1986) Semiotics and the philosophy of language. Indiana University Press, Bloomington
- Eco U (1995) The search for the perfect language. Blackwell, Cambridge, MA
- Ellis JH (1970) The possibility of secure non-secret digital encryption. https://www.gchq.gov.uk/possibility-secure-non-secret-encryption
- Ellison K (2014) ‘1144000727777607680000 wayes’: early modern cryptography as fashionable reading. Journal of the Northern Renaissance 6. http://www.northernrenaissance.org/1144000727777607680000-wayes-early-moderncryptography-as-fashionable-reading/
- Ellison K (2017) A cultural history of early modern English cryptography manuals. Routledge, New York
- Ellison K, Kim S (eds) (2018) A material history of medieval and early modern ciphers: cryptography and the history of literacy. Routledge, New York
- Fuller M (ed) (2008) Software studies: a lexicon. MIT Press, Cambridge, MA
- Gebhart G (2017) We’re halfway to encrypting the entire web. In: Electronic Frontier Foundation. https://www.eff.org/deeplinks/2017/02/were-halfway-encrypting-entire-web. Accessed 20 Jun 2017
- Golumbia D (2016) The politics of bitcoin: software as right-wing extremism. University of Minnesota Press, Minneapolis
- Jones ML (2016) Calculating devices and computers. In: Lightman BV (ed) A companion to the history of science. Wiley, Malden, pp 472–481
- Kahn D (1967) The codebreakers: the story of secret writing. Macmillan, New York
- Kirschenbaum M (2008) Mechanisms: new media and the forensic imagination. MIT Press, Cambridge, MA
- Kittler F (1999) Gramophone, film, typewriter, 1st edn. Stanford University Press, Stanford
- Kittler F (1990) Discourse networks 1800/1900. Stanford University Press, Stanford
- Korolov M (2016) Study: encryption use increase largest in 11 years. In: CSO Online. http://www.csoonline.com/article/3088916/data-protection/study-encryption-use-increase-largest-in-11-years.html. Accessed 20 Jun 2017
- Lennon B (2010) In Babel’s shadow: multilingual literatures, monolingual states. University of Minnesota Press, Minneapolis
- Lennon B (2015) Passwords: philology, security, authentication. Diacritics 43:82–104
- Long PO (2004) Openness, secrecy, authorship: technical arts and the culture of knowledge from antiquity to the Renaissance, paperback edn. Johns Hopkins University Press, Baltimore
- Mackenzie A (2006) Cutting code: software and sociality. Peter Lang, New York
- Mateas M (2008) Weird languages. In: Fuller M (ed) Software studies: a lexicon. MIT Press, Cambridge, MA
- Mateas M, Montfort N (2005) A box, darkly: obfuscation, weird languages, and code aesthetics. In: Proceedings of the 6th Digital Arts and Culture Conference, IT University of Copenhagen, pp 144–153
- Montfort N (2008) Obfuscated code. In: Fuller M (ed) Software studies: a lexicon. MIT Press, Cambridge, MA
- NSS Labs (2016) NSS Labs predicts 75% of web traffic will be encrypted by 2019. https://www.nsslabs.com/company/news/press-releases/nss-labs-predicts-75-of-web-traffic-will-be-encrypted-by-2019/. Accessed 20 Jun 2017
- Oxford English Dictionary (2014) algorithm, n. OED Online
- Pesic P (2000) Labyrinth: a search for the hidden meaning of science. MIT Press, Cambridge, MA
- Potter L (1989) Secret rites and secret writing: royalist literature, 1641–1660. Cambridge University Press, New York
- Reeds JA, Diffie W, Field JV (eds) (2015) Breaking teleprinter ciphers at Bletchley Park: an edition of I.J. Good, D. Michie and G. Timms: general report on tunny with emphasis on statistical methods (1945). Wiley-IEEE Press, Hoboken
- Rosenheim S (1997) The cryptographic imagination: secret writings from Edgar Allan Poe to the internet. The Johns Hopkins University Press, Baltimore
- Sandvine (2016) 70% of global internet traffic will be encrypted in 2016. https://www.sandvine.com/pr/2016/2/11/sandvine-70-of-global-internet-traffic-will-be-encrypted-in-2016.html. Accessed 20 Jun 2017
- Shannon C (1945) A mathematical theory of cryptography. Bell Labs, Murray Hill
- Sherman WH (2010) How to make anything signify anything. Cabinet
- Singh S (2000) The code book: the science of secrecy from ancient Egypt to quantum cryptography. Anchor Books, New York
- Steiner G (1998) After Babel: aspects of language and translation, 3rd edn. Oxford Paperbacks, New York
- Stolzenberg D (ed) (2001) The great art of knowing: the baroque encyclopedia of Athanasius Kircher. Stanford University Libraries, Florence
- Williamson MJ (1974) Non-secret encryption using a finite field. https://www.gchq.gov.uk/non-secret-encryption-using-finite-field
- Zielinski S (2008) Deep time of the media: toward an archaeology of hearing and seeing by technical means. MIT Press, Cambridge, MA