Escaping the Panopticon Over Time

The ‘right to be forgotten’ has been labelled censorship and called disastrous for the freedom of expression. In this paper, we explain that effecting the ‘right to be forgotten’ with regard to search results is indeed ‘censorship’ at the level of information retrieval. We claim, however, that it is the lightest yet most effective means of keeping censorship overall to a minimum, while enabling people to evolve beyond their past opinions. We argue that applying the ‘right to be forgotten’ to search results is not a question of just ‘censoring’ search engines; seen from a broader perspective, we—as society—will inevitably have to deal with developments in information technologies and choose between three types of ‘censorship’: (1) censorship of original sources, that is, on the level of information storage; (2) censorship on the level of the initial encoding of that information; or (3) censorship on the level of information retrieval. These three levels at which ‘censorship’ can take place are the three basic elements of the memory process, whether biological, technological or hybrid through the use of mnemonic technologies. Applying censorship as a means of ‘forgetting’ in the collective hybrid memory of the Web enables us to counter—at least partially—the functioning of the Web as a ‘Panopticon over Time’.


Introduction
When we consider Internet technology in its current state, there appears to be a public perception of the 'everlasting memory of the Web'. Indeed, the dramatic decrease in the cost of data storage (Komorowski 2014) has made possible the adoption of policies to 'never delete anything' (Hardy 2015). Such policies are now popular both on an informal basis amongst individuals and as formal policies amongst key players in the online economy. Although not actually everlasting, the easy and long-term accessibility of online information is cause for concern and can be problematic for individuals, as expressed by Rosen in The Web Means the End of Forgetting (Rosen 2010b).
One of the cases that shows the problems to which the collective online 'memory' of the Web can lead is the heavily debated 'Google Spain case'. In this case, a Spanish citizen, referred to in this paper as 'G',1 wanted the online search engine Google Search to remove or conceal links to newspaper articles from 1998 in the archive of the newspaper La Vanguardia that appeared as (top) search results when a search was performed on G's name in the search engine.
On May 13th, 2014, the Court of Justice of the European Union (CJEU) ruled in the Google Spain case2 that the search engine Google was responsible for the displayed search results and ordered Google to remove the results linking G's name to the articles from 1998 on La Vanguardia's website, thereby making the information less easy for the general public to find. The CJEU deemed that the passing of time may affect the accuracy, adequacy and relevancy of information in such a manner that the continuous processing of that information may become excessive in relation to the purposes for which it was initially collected.3
The Google Spain case is often discussed in the light of the 'Right To Be Forgotten'. This is a right that will be given shape in the upcoming European Union General Data Protection Regulation and gives individuals the right to have personal data erased under certain circumstances (European Union 2012). The case is often discussed in the context of this right because it illustrates quite clearly how the persistent and easy availability of relatively old information on the Web can cause problems for individuals.
The 'Right To Be Forgotten' aims to help individuals 'escape' from being framed by their outdated past and opinions, which are fixed in the online available collective memory. This is meant to help individuals develop themselves over time without having to fear systematic stigmatisation in the here and now on account of their past actions and opinions (cf. Andrade 2012). Ideally, it targets information that exists in the public domain, but that has become 'with the passing of time (. . . ) decontextualized, distorted, outdated, no longer truthful (but not necessarily false)' (Andrade 2012, p. 127).
What made the CJEU ruling especially interesting from a philosophy of technology perspective is that it is aimed at a technological intermediary and not at the original publisher of the information (which was the initial target of G's wish to invoke his 'Right To Be Forgotten'). The CJEU views the actions of a search engine as 'additional to that carried out by publishers of websites' [our emphasis].4 In the ruling, the CJEU points out that the aggregation of information published on the Web and indexed by Google Search has such a scope that when a user performs a search query on an individual's name, the search engine 'enables any internet user to obtain through the list of results a structured overview of the information relating to that individual that can be found on the internet—information which potentially concerns a vast number of aspects of his private life and which, without the search engine, could not have been interconnected or could have been only with great difficulty—and thereby to establish a more or less detailed profile of him'.5
However, in his opinion preceding the CJEU ruling, Advocate General Jääskinen took a different perspective on the case, and in particular on the role of search engines. Jääskinen portrays a search engine as setting up 'automated, technical and passive relationships to electronically stored or transmitted content'6 and, by doing so, the search engine 'only indicates'7 where on the Web a user can find already existing content that is made available by other parties.8 According to Jääskinen, the search results displayed by a search engine provide a truthful reflection of the relevant web pages to its users.9 Any censorship on the level of search engines would lead to a 'bowdlerised' version of this truthful display.10 Many share Jääskinen's vision. Freedom of speech advocates in particular look at the judgement in the Google Spain case—that under certain conditions a search engine may need to remove specific search results because they have lost their relevance and are a disproportionate burden for the individual—with horror and regard the Right To Be Forgotten as censorship (e.g. Hern 2014; Baker 2014) or even 'the biggest threat to free speech on the Internet in the coming decade' (Rosen 2012).
But are such claims just? Is a 'Right To Be Forgotten' really detrimental to the freedom of expression when it works on the level of a technological intermediary such as Google Search?
In this paper, we will look further into these accusations and will approach the subject from a perspective that draws heavily on the philosophy of technology. The central question we aim to answer is 'How does the censoring of an online search engine relate to freedom of expression?' In order to answer this question, we will explain how the Web—and in particular the search engine technology within it—can function as a surveillance architecture.
To do so, we will combine Foucault's analysis of surveillance architectures with the emphasis on the technologically driven retention of information over time that is present throughout Stiegler's philosophy. We will describe the Web as a Panopticon over time, a surveillance architecture steering towards the internalisation of power relations and self-censorship. We will do this by first pointing out the relation between time, (technologically aided) memory and forgetting. Here, we will describe how the Web can function as part of a collective hybrid memory process, and how in this process the 'Right To Be Forgotten' enforces 'forgetting'. After this, we will take a closer look at the meaning and practice of 'censorship', from its historical origins to its present uses. Here, we will examine different practices of censorship in relation to the different stages in the collective tertiary memory process. This will be followed by an analysis of the power relations established by the technologies at play in the tertiary memory, and especially the disciplinary effect of databases and search engines on the individual's freedom of expression.
Finally, we will argue that by applying censorship at the level of the retrieval of information by a technological intermediary, the 'Right To Be Forgotten' raises a buffer against self-censorship and can thus be beneficial for freedom of expression.

Mnemonic Technologies and Forgetting
In order to make sense of a 'Right To Be Forgotten', we must first examine what is meant by this 'forgetting'. To do so, let us first elaborate on the concept of memory and the human use of mnemonic technologies.11

Exteriorised Memory
First of all, let us be clear that we use the term 'memory' to denote a process, and as such it has a dynamic character. The notion of 'memory' in this paper should not be confused with the relatively static concept of 'memory' that is used, for example, in computer science. As a process—whether biological, technological or hybrid—any memory system necessarily consists of at least three steps: 'Any memory system—whether physical, electronic or human—requires three things, the capacity to encode, or enter information into the system, the capacity to store it and—subsequently—the capacity to retrieve it' (Baddeley et al. 2009, p. 5). These three elements interact and shape the memory process: the manner of encoding determines what and how something is stored, which will in turn determine what can be retrieved (Baddeley et al. 2009, p. 5). 'Memory' is a complex concept that has given rise to much thought in various academic fields (Sutton 2012). It is commonly used for very diverse systems, like biological (human) entities and computers (Oxford Dictionary 2008). Despite the (big) differences in application, the term 'memory' is considered applicable to such diverse systems because all memory systems necessarily comprise these three process elements.
To overcome biological limitations on information processing, humans have throughout the course of history sought after mnemonic technologies to aid the biological memory in processing and retaining more information, moulding it into external forms and thereby bringing about externalised memory processes. In the Phaedrus, Plato already framed 'writing' as a technological externalisation of human memory (cf. Plato 1955). Research on the human use of an externalised memory process as a group process (the so-called transactive memory process, cf. Wegner 1987) has shown that an analogy can be made between informational group processes and the individual memory process, in which the processes of encoding, storage and retrieval have 'both internal and external manifestations' (Wegner 1987, p. 188).
The exteriorised memory process can take shape in interactions between human agents, but can also be of a hybrid nature where humans interact with objects and the like. The use of technologies enhances the human ability to deal with complex problems and a magnitude of information by giving us tools to store, alter, combine and transform this information. Such complex processes are often time-consuming or even impossible for the biological organism to perform on its own without technological aid (Clark 2004, p. 78).
In turn, the human brain is very adept at learning how to optimally use environmental factors and instruments for its own thinking processes (Clark 2004, p. 26) and thereby seemingly minimises—or at least alters—its weaknesses (Clark 2004, p. 75). In doing so, technologies and human beings influence and co-constitute each other by being part of one hybrid information processing system (Heersmink 2012, pp. 122-123).
Technologies are not purely instrumental in this, nor are they neutral. They shape the memory process, just as we shape the technologies and choose to apply them in particular ways. Such use of information technologies has shaped and still shapes our current world; by externalising the information we want to remember in a certain shape or form, the information becomes perceptible to others—it becomes repeatable and transmissible. This eases the manner in which information can be passed on to others, both in the present and on to future generations (Stiegler 2010a).
Technologies which retain information outside ourselves can be labelled as tertiary memory, also called tertiary retention (Stiegler 2010b, p. 8). Stiegler derives the concept of tertiary memory from Husserl's primary and secondary memory, which take place in the individual's mind. The primary memory is the individual's experience of the present (Husserl 1991, p. 32).12 These experiences of the present can be stored in the individual's memory and can be recalled. This storing, which enables the ability to recall experience, is the secondary memory (Husserl 1991, p. 37). When a secondary memory is 'exteriorised', for example when an experience is written down on a piece of paper, it becomes a tertiary memory (Stiegler 2010b, p. 9).
Tertiary retained experiences have a unique aspect compared to primary and secondary ones, in that they have the potential to become collective. Such tertiary retentions have been experienced and selected by others and form an inherited past that can be collectively shared (Stiegler 2011, p. 112). These retentions are 'adopted' by the individual, a process described by Stiegler as follows: 'A newborn child arrives into a world in which tertiary retention both precedes and awaits it, and which, precisely, constitutes this world as world. And as the spatialization of individual time becoming thereby collective time, tertiary retention is an original exteriorization of the mind' (Stiegler 2010b, p. 9).
According to Stiegler, tertiary memory is essential for human beings (Stiegler 2009, p. 4). Tertiary memories are experiences that are materialised because they are encoded into information carriers. This encoding makes the experiences 'transmissible, inheritable and adoptable' (Stiegler 2011, p. 117), and thus these retained experiences constitute, to a very large degree, our cultural inheritance. The experiences that we store by means of technologies become relatively static over time (compared to the human mind) and can become part of a collective public memory. These stored experiences, though we mostly have not lived them ourselves, shape the normative and experiential backdrop of our own experiences. Tertiary memory thus shapes our cultural, societal and knowledge frameworks and accordingly shapes our expectations for the future (Stiegler 2011, p. 112).
The three process levels of encoding, storage and retrieval that make up the tertiary memory process are heavily shaped by the technologies that co-constitute this process (e.g. the storage of physical information carriers like books consumes space, thereby limiting the amount of information that can be stored in an analogue library). The same goes for the content of the information (e.g. a photographic image could only be retained as information once the technology to make photographs was available). The technologies that are used to 'carry' information thus co-shape the information. Technology as a medium moulds the information it presents to its users, and by doing so, it subsequently moulds the manner in which humans, both as individuals and as a society, perceive the world (Carr 2010, p. 3). Information technologies thus shape how we remember, and what we remember, by shaping the processes of encoding, storage and retrieval of information in the tertiary memory. Consequently, the actors who design and control these technologies play a fundamental role in the design and functioning of our tertiary memory and therefore heavily influence our society. Such power relations will be discussed further towards the end of this paper.

Forgetting
There seems to be a general fear that the presence of information in the tertiary memory is becoming more persistent with the rise of the Web (cf. Rosen 2010a). According to Dodge and Kitchin in their article The ethics of forgetting in an age of pervasive computing, an implementation of 'forgetting' is necessary to prevent the potential 'burdensome and pernicious disciplinary effects' (Dodge and Kitchin 2005, p. 14) of mnemonic technologies. We can find a similar idea in Galloway, who points out that both memory and forgetting are shaped by choices and power struggles—and that both are subject to abuse (Galloway 2006, p. 2). Galloway therefore calls for an ethical focus on the design of machines that support memory (cf. Galloway 2006).
The development of the Right To Be Forgotten as part of the upcoming European Union General Data Protection Regulation (European Union 2012) is living proof of the wish for an implementation of 'forgetting' in the context of ever more pervasive information technologies. But how should we understand a concept like 'forgetting' in this context? Generally, 'forgetting' is understood solely in relation to human beings and not artifacts. However, when 'memory' is regarded as a process of encoding, storage and retrieval of information in any entity—whether biological, technological or hybrid—the use of the term 'forgetting' in relation to mnemonic technologies is not so far-fetched.
'Forgetting' is defined as 'fail[ure] to remember' (Oxford Dictionary 2008). We can describe 'remembering' in a memory process as a successful retrieval of encoded and stored information. 'Forgetting' is then nothing other than a term to describe a glitch in this process, equally applicable to glitches in organic and in hybrid memory processes. 'Forgetting' can occur as a result of failures in any of the three elements of the memory process. In the case of hybrid memory processes, it can be the loss of acquired information (storage), the inability to retrieve stored information (retrieval) or the deterioration of correspondence between the acquired and the retrieved information (encoding) (Dudai 2004, p. 100). Many nuances are possible, like losing only part of the stored information, being able to retrieve information only partially, a temporary inability to retrieve certain information, or a combination of different failures. All these can be considered glitches in the memory process and thus a form of 'forgetting'—albeit an externalised one.

The Right To Be Forgotten
If we take the above into account, we can view the 'Right To Be Forgotten' as a juridical intervention by which a data subject can invoke an artificial form of 'forgetting' in the tertiary memory, either by having the stored information deleted or by hindering the retrieval process. An example of a deletion on the storage level would be a cease and desist order issued against a party hosting content, ordering them to remove said content. If the content is taken down, it is, at least in that location, no longer part of the tertiary memory.
Removing results from a search engine, as was ordered in the Google Spain case, is an example of an implementation of 'forgetting' by hindering the retrieval process. Search engines are a means for people to retrieve information from the Web. By removing the search results referring to a specific piece of information, that information is 'forgotten' in the online tertiary memory process when one engages with it from the angle of a particular search term (a person's name, such as G's name in the Google Spain case).

Censorship
Enforcing such an artificial form of 'forgetting', as the Right To Be Forgotten does, has on numerous occasions been labelled 'censorship' (e.g. Cunningham 2015). Censorship is often presented as the counterpart to the freedom of expression—as a method to remove those opinions that governments find unacceptable for the general public. There is still ongoing academic discussion on what exactly constitutes censorship in the modern sense of the word. In its modern version, the word as a verb is defined as 'examine (a book, film, etc.) officially and suppress unacceptable parts of it' (Oxford Dictionary 2008). This definition, however, does not adequately reveal the vast variety of connotations surrounding the word. Hence, we will hereunder go into detail on the etymology of the word and the practices that were historically associated with it.
The word has its origin in the title of the ancient Roman magistrates responsible for supervising public morals and taking the census of the population. The Latin word censere, the verb to which the title owes its name, has an interesting variety of translations, ranging from 'taxing', 'assessing' and 'counting' to 'judging' and 'decreeing' (Short and Lewis 1879). Contrary to the modern interpretation of the word, its ancient roots display an associative link between administrative functions and the act of repression. After the fall of the Roman Empire and the general decline of administrative functions in Western Europe, the word censor lost its meaning as an administrative title. Though sporadic instances of taking a census still appeared throughout medieval Europe (e.g. the Domesday Book ordered in 1086 by William the Conqueror), these were generally held by royal decree and executed by officials, none of whom bore the title censor. Similarly, though the practice of censorship was paramount in attempts by the Catholic Church to eradicate heresy, the term 'censor' did not reappear until after the invention of the printing press and the subsequent mass distribution of literature had given cause for preventive censorship. In 1515, during the Fifth Lateran Council and by decree of Pope Leo X, new rulings were set in place that ensured each publication was examined and authorised before printing. Such authorisation—the imprimatur—was to be given by the local inquisitor or bishop. Later in the sixteenth century, the Index Librorum Prohibitorum came to see the light of day, administrating a list of books that were banned throughout the Catholic world (Hilgers 1908; Godman and Bellarmino 2000; Waterworth 1848). Note that the discourse surrounding what we now call censorship was up to this point still largely phrased in terms of prohibition and authorisation and their Latin equivalents, whereas censura denoted merely the examination of literature (Godman 2000). The term censor as an official function did not reemerge until the institutionalisation of expert offices appointed by the bishop for the preventive examination of books, as required before printing (Vaticana 1983). These offices were—and still are—named Censor Librorum. These are the first instances of the use of the word censor in its modern everyday meaning as an authority preventing publication, stripped of its direct connotations of public administration.
Taking into account the fact that the tertiary memory affects the construction of culture in essential ways, the importance for governments and authorities of controlling what information becomes part of the tertiary memory is clear. From the examples given in the previous paragraph, we derive that censoring historically most often took place on the level of the encoding of information into the tertiary memory, through, for instance, the preventive examination of texts before they were published as books. As the mnemonic technologies constituting the tertiary memory (e.g. the printing press, index cards and the Internet) evolved over time, so did the technological means for imposing censorship. Censorship at the level of storage and retrieval is now much more prevalent.
Not only the technologies but also the content of the tertiary memory evolved over time. With it, the notions of what needed to be censored also changed. The goal of censorship was often to prevent deviant or 'improper' ideas from reaching the general public and affecting them. Censorship thus encompasses not only the prevention of information being encoded into the tertiary memory, but also the suppression of information that, with the passing of time and the changes in society, has come to be considered inappropriate for the general audience in the current time frame. Examples of this are the so-called 'Censored Eleven', a group of Looney Tunes and Merrie Melodies cartoons reflecting racist stereotypes (Smolko 2012) which at the time of their creation (1931-1944) were not acknowledged as problematic, but which have been censored since 1968. Conversely, books and movies that were censored in the past due to, for instance, their explicit pornographic nature can now be available uncensored in the tertiary memory (Boyd 2001). In conclusion, the nature of what we nowadays call censorship is intrinsically linked to the ever-evolving tertiary memory, both in technologies and in content.

Records and Self-Censorship
In this section, we aim to trace the link between public administration—in the everyday meaning of the word and in particular in the maintaining of databases—and those acts of repression that might be understood under the banner of the modern meaning of censorship.
As mentioned before, there is still ongoing academic discussion on what exactly constitutes censorship in the modern sense of the word. The use of the word 'censorship' in common language, however, seems to have expanded in its meaning. In the common use of language by the general public, non-institutional acts of suppressing free speech are also often labelled 'censorship'. A clear example of this is the act of so-called 'self-censorship', where an individual refrains from publishing her opinion out of fear of future consequences and/or repercussions, in one form or another, by others—institutional or non-institutional. In this light, the role of information about individuals, and the manner in which this information has been used, has evolved and has ended up having a disciplinary character over the last centuries, is of particular interest.
Therefore, we will now examine the use of databases and in particular one element that is central to all database systems: the record. Long confined to exclusive use by specialists like librarians and archivists, the term 'record' found its way into British popular discourse at the turn of the twentieth century after several police registers merged into the Criminal Record Office. Henceforth, the word 'record' became known to the general public as the sword of Damocles hanging over the heads of delinquents, ready to strike should they find themselves in need of a good reference or relapse into recidivism. The origin of the office and the criminal record system in the English-speaking parts of the world traces back to the Fielding brothers, who had been responsible for major reorganisations within the British police force during the eighteenth century. Their plans 'to create a coherent police administration centre and to develop a preventive strategy for crime management' (McMullan 1998, p. 139) involved creating registers in which details of all reported crimes were archived, including the names of both suspected and convicted offenders. This database avant la lettre was not only utilised to prevent recidivism by promising more severe sentencing for those previously associated with criminal behaviour; as more information was stored, including physical descriptions (and later biometrics), suspects became more easily identifiable, and the increased chance of getting caught formed another deterrent for the individual against indulging in criminal activity. By 1750, the archive found its use in Fielding's Universal Register Office, where employers could find background checks and character references for potential employees. Here, the function of the criminal record system expanded beyond its use within the judicial system and out into the public domain (Mustafa et al. 2013). Note that this possibility is still present nowadays in quasi-public form through the demand on job applicants to provide a disclosure certificate. The threat of a criminal record nowadays may even constitute a larger deterrent to crime than the actual sentence. In particular for minor crimes associated with civil disobedience, which generally result in mild sentencing, the threat of a lifelong public reminder of criminal activity is likely to pose more of a deterrent than the expected sentence, usually a fine or short-term incarceration. Unlike the criminal record, these one-time consequences of one's actions involved in the actual judicial sentencing can relatively easily be prepared for and weighed into the deliberation over one's willingness to engage in civil disobedience. Such sentences—fines or short-term incarceration—are temporary in nature, pass by over time and will not haunt the individual for the remainder of her life.
The criminal record system thus produces and maintains a power relation in which the repressive force is not exercised by the authority administering the system, but has been internalised by the individual, who adapts her behaviour accordingly. As (both suspected and convicted) criminals are transformed from an anonymous mob into individualised parcels, each with their own unique index card in the criminal record system, the medieval threat of being branded for life re-emerges. This mere threat constitutes sufficient reason for the individual to no longer feel safe to act freely in the guise of the mob. The individual will rather adapt her behaviour so as to conform to the normative regime that is keeping continuous track of her in every (mis)step that defies warranted behaviour. The system has evolved into a power that no longer needs to exercise its authority by means of physical force, but functions far more efficiently as the individuals correct themselves in the realisation that they will forever be reminded of every misstep they make. The knowledge of an existing archive that will recognise the individual over time and draw her out of the anonymising cloak of timeful forgetfulness internalises the repressive power of those administrating the criminal record system, adapting the individual's behaviour without the need for classical punishment.
The encoding, storage and accessibility of information as described above can thus function as a disciplinary pressure on the individual to behave in such a manner that she does not 'produce' long-term accessible information that can backfire on her in the future. This is not only the case for deviant behaviour, but also stretches across other realms, like self-expression. Once we know that certain information is 'on record' and/or will be long-term accessible, most of us will consider our words more carefully. The long-term accessibility of our expressions can thus harbour a chilling effect with regard to these expressions. In the next section, we will take a closer look at the disciplinary effects of the long-term availability of information that is related to a specific individual.

The Panopticon Over Time
Those familiar with the works of Foucault will recognise the similarities between the functioning of the criminal record system and Foucault's analysis of the 'Panopticon', roughly translated as 'all-seeing' from the Ancient Greek πᾶν (all) and ὀπτικόν (for sight). Despite Stiegler's excellent work on reframing the analysis of power from Foucauldian biopower into psychopower, we feel that an analysis in terms of Foucauldian panopticism is in order here. This has mainly to do with the way in which Stiegler places an emphasis on psychopower in relation to consumption. This emphasis provides a much needed addition to the Foucauldian analysis of power and is a valuable contribution to analysing power dynamics within the Internet economy. However, our research question relates to freedom of expression, a freedom towards production. We therefore see an analysis in terms of biopower—the steering of productive forces—as more appropriate for investigating the power dynamics surrounding freedom of expression. We will maintain Stiegler's emphasis on the psyche as we analyse these dynamics and feel this emphasis can be adequately maintained within the concept of biopower.
In Discipline and Punish, Foucault describes the workings of the infamous prison architecture originally developed by Jeremy Bentham (Foucault 1977, pp. 200-228). The panoptic prison architecture has a central tower from which 'one can observe [...], standing out precisely against the light, the small captive shadows in the cells of the periphery' (Foucault 1977, p. 200). The Panopticon is originally mostly described in the light of disciplining the individual body in space. In this paper, however, we are interested in the potential disciplinary architecture of the tertiary memory, which can discipline the individual in time. Information in itself is an entity that is inextricably linked with time: information becomes information by selection and retention. Now, it may seem rather far-fetched to compare a database system with a prison architecture, and indeed the two are far from equal, but the comparison nevertheless reveals certain interesting aspects.
Though the criminal record system obviously does not physically isolate individuals like the Panopticon does, the similarities in the power relations it produces are remarkable. Similar to the Panopticon, in the criminal record system the crowd 'is replaced by a multiplicity that can be numbered and supervised' (Foucault 1977, p. 201). Where the Panopticon induces 'a state of conscious and permanent visibility that assures the automatic functioning of power' by means of physical surveillance (Foucault 1977, p. 201), the criminal record system assures the same automatic functioning of power not through physical visibility, but through recording the individual's actions and placing them at hand ready for surveillance, thus stretching the visibility of those actions into permanency and forming an eternal reminder from which the individual can never dissociate herself. As such, the criminal record system functions like what we will call a 'Panopticon over time'. As the individual's actions are recorded without her seeing or knowing those who do the recording, the criminal record system, like the Panopticon, 'is a machinery that assures dissymmetry, disequilibrium, difference. Consequently, it does not matter who exercises power [... or] what motivates him' (Foucault 1977, p. 202). For when we consider the criminal record system in its technical architecture, it is reducible to a database keeping records of individuals' descriptions and actions. Again, like the Panopticon, the utility of such an architecture is not confined to policing, but functions as a laboratory, a 'privileged place for experiments on men' (Foucault 1977, p. 204). Indeed, there are now abundant examples of database systems instrumentalised for experiments on men, ranging from pedagogic evaluations based on students' records to marketing assessments based on online tracking. It is 'a figure of political technology that may and must be detached from any specific use' (Foucault 1977, p. 205), a blueprint for technologically driven power relations that are integrated throughout the entire productive apparatus, keeping records of both producers and consumers. While establishing a maximum effect in reach and intensity as it holds up for display permanent reminders of the individual's past, present and future actions, it minimises the cost of maintaining the power relation, as it places inside the individual herself the force that drives her to adapt to normative behaviour. Now, this is not to say that all databases storing personal records function like a 'Panopticon over time' in the sense that they are inherently repressive machines. For one, most databases storing personal records are very selective in what sort of actions they record, often in domains not particularly indicative of one's general behaviour. Nor are these databases generally publicly accessible. For instance, the database a shop may keep storing its clients' previous orders is not likely to stretch unwarranted behaviour by the client into permanent visibility, nor is it visible to anyone but the shopkeeper. In such a case, the power relation that is produced by the database merely provides the shopkeeper with a stronger commercial negotiating position and is unlikely to result in a strong adaptation of the individual's behaviour outside of the particular sphere of commerce in which the shop ventures.
In other cases, the information stored may be more relevant to one's behaviour, but the database may still not be publicly accessible. For instance, the personal files that employers keep on their employees tend to specifically keep track of their perceived missteps, but generally do not leave that specific company or institution. In these cases, the power relation produced by the database can be seen as a reinforcement of the existing power relation between employer and employee, that is, between the administrator of, and those with access to, the database and the individual whose record is kept. These power relations are likely to induce in the individual an adaptation of her behaviour towards what is deemed desirable by those in charge of the database.
The criminal record system itself falls into a grey area. As most civilised countries have acknowledged the downsides of being permanently reminded of one's past actions, access to criminal records is usually heavily regulated. The power relations thus established hold not solely between the individual and the police, but somewhat ambiguously stretch out to society at large. The inducement for the individual to adapt her behaviour in favour of non-criminal activity here stems not only from concerns regarding her reputation amongst the judicial apparatus, but also from how her recorded less-than-legal activities may be regarded by anyone who may or may not at some point in the future gain access to her record.

Censorship, the Right To Be Forgotten and the Tertiary Memory on the Web
With the rise of digital computer networks, a clearer example of a 'Panopticon over time' can be recognised in the Web functioning as tertiary memory. At the heart of this panoptic architecture of the internet stand the search engines, most notably Google Search. Taking the Web as a potential 'Panopticon over time', the three memory process elements are fundamental steps in this particular panoptic architecture of the tertiary memory. These three steps highly affect what the spectator can view in the tertiary memory, as well as the effort he needs to spend to actually 'see'. What is encoded and stored shapes our panoptic cell, in which a spectator can surveil our actions over time. However, what the spectator can actually 'see' about us highly depends on the architecture that enables the retrieval of information.

Our Cell
The encoded information in the tertiary memory shapes our cultural framework and our history, both individual and societal. It gives us a floor to stand on but is at the same time our barrier. On the individual level, this is especially the case for personal information. Personal information shapes the individual in the eyes of others; it helps others know what to expect from an individual and get an idea of her identity (cf. Goffman 1959).
However, seeing that human beings generally change over time, personal information can easily become outdated and lead to an inaccurate image of the individual's personality, character, interests, opinions, etc. in the current day and age.We are shaped by our history, but we are not our history.Being caught in a frame of outdated information is a risk when the Web functioning as tertiary memory starts to act on the individual level as a 'Panopticon over time'.
The rise of digital technologies radically affected the possible content of our informational cell. The 'digital turn' of information severely influenced information storage capacities and has, to a certain extent, countered the previously inevitable discarding of information by selection processes. When compared to traditional media, the Web is often applauded for allowing bidirectional communication. People use the Web to communicate with others and express their opinions about a vast array of topics. Yet many of these expressions posted online cast a far longer shadow into the future than they would if they were expressed in many other settings.
While analogue information storage spaces only allowed the storage of so much information (i.e. only so many books fit in a cupboard) and thus required a selection to retain only the information that was considered relevant and/or valuable in order to make room for new relevant information, this kind of selection became increasingly unnecessary with the digitalisation of information. Currently, seemingly trivial and detailed personal information can be available in the tertiary memory of the Web (like tweets on whether one bought grapefruits or oranges), without a need for selection and discarding due to a lack of storage space. However, the more personal information is retained in the tertiary memory, the stronger the cell of our informational past becomes. The Web enables the retention of large quantities of personal micro-information over time, which can provide an extremely detailed reflection of our past, up for scrutiny by whoever makes the effort to look.

The Tower and the Looking Glass
In order to take a look at an individual in her 'cell', a spectator needs to retrieve information from the Web. In the panoptic architecture that we here attribute to the tertiary memory of the Web, search engines play a key role.
Though technically different, searches in traditional databases using personal identifiers as an index and the use of a personal name in a fast (Web) search query are, for the scope of our analysis, sufficiently similar in their functionality. However, instead of having to access various databases for different kinds of information, technological developments now allow the user to perform a 'macro-search' by using a search engine on the Web and get a far more extensive picture of a certain individual by querying at once the aggregation of vast amounts of different online sources. The resulting hits of such a query promptly hold up for display a far more detailed picture of the individual than was previously possible. With our lives being recorded online to an ever increasing degree, Google Search provides the ability to assemble personal records detailing vast amounts of the individual's previous actions, all of which now have their visibility stretched out into both permanency, through being recorded, and universality, through being publicly available.
Additionally, drawing further on the analogy between the Panopticon in space and the Panopticon in time: in the Panopticon in space, movement to the left and right by the individual in the cell is visually quite clear to the spectator. However, movement to the front and the back of the cell, the distance of the individual in relation to the spectator, is harder to pinpoint from the spectator's point of view. In the 'Panopticon over time', a similar tendency can be observed: while the perceived information on events is relatively clear, the distance in time between the original event and us is less clear to our perception. Additional information and temporal context is often needed for a user to get a sense of the distance in time of the event portrayed. The (long or short) distance in time as such is difficult to perceive.
The search engine as a technological retrieval mechanism thus functions as a magnifying glass: it zooms in and enlarges, but does so at the expense of context. And what it enlarges depends on the design of the algorithm. With its easy-to-use design, the search engine can turn anyone using the Web into a surveilling spectator.

The Prisoner
Being surveilled, or at least the idea that someone is watching, can affect our behaviour (cf., e.g., Stoycheff 2016; Bateson et al. 2006; Keller and Pfattheicher 2011). The power relations this produces are, in the case of Google Search, both between the individual and Google and between the individual and everyone able to perform an internet search. For practical purposes, we will consider this last group to be representative of society at large. Hence, even if the individual naively believes Google's motto 'don't be evil' and perceives no desire from Google to adapt her behaviour, there is still society at large to consider. We therefore argue that the functionality Google Search offers is likely to induce in the individual a force driving her to adapt to normative behaviour, at least with regard to what she expects will be encoded, or encodes herself, into the online tertiary memory. The individual's current behaviour is thus adjusted to cope not only with the possibility of being surveilled now, but also with the same action being surveilled in a distant and maybe very different future.
It is important to note here the similarities between the functioning of Google and the ancient Roman Censor. The one function of the censor, taking the census, 'may be compared to the "crawlers" sent out by search engines to "harvest information"' (Tantner 2014, p. 123). The other function, supervising public morals, is implemented in the power relations that Google Search produces, resulting in individuals adapting towards normative behaviour.
Much of the online behaviour of an individual consists of communications and expressions of feelings and opinions. The Panopticon over time therefore does not work solely in the realm of disciplining physical behaviour (although it works there as well), as the spatial Panopticon does, but also disciplines the exteriorised behaviour of the psyche: the expression of our feelings and opinions.
Taken in the modern meaning of the word, the particular type of force that repels the individual from expressing herself in ways that would be deemed undesirable by society at large can be labelled self-censorship. If we know that the information we produce about ourselves gets stored for a long time and may be easily retrievable, the only thing we can do to dodge future consequences is to make sure certain potentially risky information is not encoded in the first place. Thus, we self-censor.
The panoptic gaze constituted by the tertiary memory on the Web, readily supported in its architecture by the search engine, seems to be slowly internalised by individuals in the current day and age. People seem to become more convinced that the 'Internet does not forget' (which is incorrect to a certain extent, because a lot of content does disappear over time) and that information is easily found and retrieved, and they adjust their behaviour accordingly (e.g. Schenone 2010; Quan 2011; Zachary 2016).
In the case of Google, we thus see a resemblance to the ancient censere, linking together once more the meanings of 'counting' or 'assessing'-in the creation and maintenance of databases-and 'judging' or 'decreeing'-in its repressive function in a 'Panopticon over time', culminating in an effect that we can describe in the modern meaning of the word as self-censorship.

Prisonbreak
We have discussed how the 'Panopticon over time' can lead to self-censoring. The idea that a once-uttered expression will be attached to one's persona for the rest of one's life may scare us into uttering only those feelings and opinions which we believe will not harm us in the long run, and even those might be risky to express, because what if we change our views over time? If we look at the panoptic effects on behaviour (cf., e.g., Caine et al. 2012; Brown 2013), we may conclude that with an increasing awareness of the long-term availability of online information, the inclination to express oneself online might be reduced. Distinct or controversial opinions and arguments might not be vocalised at all out of fear for the long term. Such self-censoring affects the encoding level of the tertiary memory. Self-censorship is not in itself fundamentally negative, but it may have a chilling effect on experimenting with oneself or expressing certain opinions, which leads to a loss of creativity and diversity in society and the societal debate. Such a loss is problematic for a society that values freedom of expression and in which the online information flow is an increasingly important medium for social and societal interactions.
If we do not want censorship at the level of encoding in the tertiary memory, or at least want to keep it at a minimum, we need to disrupt the panoptic informational architecture. This can be achieved by intentionally inflicting a glitch, an artificial form of 'forgetting', in the tertiary memory process at one of the other two levels: storage or retrieval. The 'Right To Be Forgotten' provides us with the tool to do just that.
There is still an ongoing dispute as to whether the 'Right To Be Forgotten' can be considered a form of censorship (e.g. Solon 2014; Lindsay 2012). In this debate, we can see the political instrumentalisation of the strong negative connotations that surround the word censorship. Such instrumentalisation corners those defending the 'Right To Be Forgotten' into denying accusations of censorship, leading to a peculiar sort of discussion that does not make assessments based on a shared notion of censorship, but instead focuses on addressing or defending against its negative connotations. Taking into account the ambiguity of the term, we refrain from making bold statements and leave it up to the reader to decide whether the 'Right To Be Forgotten' qualifies as censorship under their definition of the word. What we can and will do, for the sake of caution, is assume that within the scope of this paper the 'Right To Be Forgotten' should be considered a form of censorship. It is, after all, certainly an official suppression of unacceptable parts of the tertiary memory, and in assessing its merits we feel obliged to take into account the dangers that lie therein.
In the Google Spain case, G. first aimed to invoke a 'Right To Be Forgotten' by requesting that the newspaper remove the information from the archive, which would be censoring the tertiary memory on the storage level. However, the disputed information was legally retained in the online archive. Following that, G. attempted, and succeeded, to censor on the retrieval level by having the links to the information removed from Google Search.
From the point of view of the tertiary memory, this is an elegant solution: history is still retained and accessible, but the access is less easy. Google is not the only search engine, and the information is still retrievable once one has a rough idea of what one is looking for and is willing to spend a bit of effort. The 'Panopticon over time' is not dissolved with this, nor do we claim that it needs to be, but the spectator has changed in character from any random curious and/or bored bystander to specific spectators who want to invest effort to 'see', thereby greatly reducing the scope of the power inequalities at play.
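The difference between the three levels of the memory process, and why 'forgetting' at the retrieval level is the most easily reversible, can be sketched in a deliberately simplified toy model. All names and structures below are our own illustrative assumptions, not the internals of any actual search engine:

```python
# Toy model of the three levels at which 'forgetting' can be imposed on
# a tertiary memory: encoding, storage and retrieval. Purely illustrative.

class TertiaryMemory:
    def __init__(self):
        self.store = {}        # storage level: url -> document text
        self.delisted = set()  # retrieval level: urls hidden from search

    def encode(self, url, text):
        """Encoding level: if 'censorship' happens here (self-censorship),
        the information never enters the memory at all."""
        self.store[url] = text

    def erase(self, url):
        """Storage level: deletion of the source, hard to reverse."""
        self.store.pop(url, None)

    def delist(self, url):
        """Retrieval level: the document survives in storage,
        it just no longer surfaces in search results."""
        self.delisted.add(url)

    def relist(self, url):
        """Reversal ('de-censoring') is trivial at the retrieval level."""
        self.delisted.discard(url)

    def search(self, term):
        """The spectator's easy route: an aggregated query."""
        return [u for u, t in self.store.items()
                if term in t and u not in self.delisted]

    def fetch(self, url):
        """The effortful route: direct access to a known source still
        works, as in the outcome of the Google Spain case."""
        return self.store.get(url)


memory = TertiaryMemory()
memory.encode("news/1998/auction", "real-estate auction for debt recovery")
memory.delist("news/1998/auction")

print(memory.search("auction"))          # empty: not retrievable by search
print(memory.fetch("news/1998/auction")) # the source itself is still stored
```

The sketch mirrors the argument above: after `delist`, the searching spectator sees nothing, while a spectator who already knows where to look can still `fetch` the source; a single `relist` call undoes the 'forgetting', whereas an `erase` at the storage level could not be undone.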
The 'Right To Be Forgotten' can thus be labelled as a tool to censor, but in order to safeguard freedom of expression for individuals, we need to keep the option open to censor the agents that may induce self-censorship.

Conclusion
In this paper, we have shown how search engines play a crucial role in the formation of a panoptic architecture in the Web as tertiary memory. By gathering relatively old information at hand, ready for inspection in the here and now, search engines can heavily reduce the 'forgetting' that generally takes place with the passing of time. In doing so, search engines can link old and outdated opinions and expressions to an individual. As a result, individuals might never successfully distance themselves from their past views and expressions in the eyes of others who use the search engine. Consciousness of this artificially imposed permanence can place a heavy burden on the freedom to express oneself. It is therefore inevitable that we address the panoptic architecture of the Web serving as our tertiary memory, lest we find ourselves captive behind bars formed by database records, thickening with every bit of data we produce, with every expression posted online.
We can address this panoptic architecture of the tertiary memory process by applying censorship on any of the three levels of the process: on the level of encoding, on the level of storage and on the level of retrieval. The 'Right To Be Forgotten' can be regarded as a tool to invoke a form of censorship on the levels of storage as well as retrieval. By being available as a tool for individuals to invoke a certain censoring of their outdated past, it may counter the disciplinary effects the architecture of the Web holds on individual freedom of expression and prevent censorship on the level of encoding, which individuals would otherwise perform through self-censorship in the here and now. This is not to say that we have no responsibility with regard to our actions and what we encode about ourselves or others. The option of escaping the 'Panopticon over time', of not being forever tied to our outdated past, is by no means a carte blanche to disregard considerations about what we say and do in the here and now, and how this affects others and ourselves. However, the sheer fear of the meaning and value of our current actions for our future self should not, as such, have a chilling effect on our choice to express ourselves or experiment with ourselves in the public debate.
As we delegate more and more of our culture and public debate to the Web, the possible chilling effect of easy future accessibility and a lack of temporal context can heavily affect, and may even be detrimental to, addressing challenges and making changes in society. If we cannot express ourselves freely in the public debate due to an inducement to conform to normative standards for fear of being forever reminded of our expressions, the debate will be deprived of exactly those voices that stand out against the norm, exactly those voices that have the potential of bearing novel and critical ideas. This would deprive society of a possibility to acknowledge, address and confront the feelings and issues that live in the deeper structure of society.
Furthermore, since the tertiary retention of information constitutes a major part of the world that we live in and are born into, such a flattened and uncritical reflection of our current society reproduces itself, affecting both our lives and those of future generations.
Lacking the means to fundamentally address the panoptic architecture of the online tertiary memory, we thus consider censoring at the level of retrieval, under certain circumstances, a necessary and even elegant solution. While providing means for the individual to escape the 'Panopticon over time', at least in its most dramatic manifestation of being held in the sway of an outdated past, it allows us to 'de-censor' the information with relative ease in case of societal changes over time. Compared to censoring on the level of encoding or storage, the imposed hindering of the retrieval process is easily reversible. This is especially poignant with regard to self-censorship, in which case the information does not get encoded at all. The 'forgetting', the censoring, invoked by the Right To Be Forgotten may in time challenge, for instance, historians seeking to retrieve information to get an accurate view of past societies, but this difficulty is not automatically an impossibility, which it would be if the information were thoroughly deleted on the storage level or not encoded at all.