
1 Introduction

“Forget about privacy” is a message that has grown in popularity over the past two decades (for its origins, see [2], referring to the words of Sun Microsystems co-founder, Scott McNealy). This book chapter broadly investigates the changes that have taken place in the concept of privacy as a result of both societal and technological advances. These topics were raised in a workshop, held at the 2017 International Federation for Information Processing (IFIP) summer school, through the prism of disciplines such as ethics, philosophy, and education. The content of the workshop is described in sequence throughout this book chapter.

The goal of the workshop was to stimulate reflection and discussion among attendees by assessing a broad set of questions. Examples follow: Is privacy still a human right? What are the implications of data collection and the cloud for privacy? Has techno-determinism already conquered the younger generation’s mindshare through its ‘forget about privacy’ mantra? What kinds of consequences will these different attitudes to privacy have on the design of information systems? What will become of the concept of privacy in the future, e.g., by 2030? What kinds of education will be needed in the future to inform children and young people, in particular, about notions of privacy and future technological developments? Thus, the focus of these questions was on the well-known debate around the possible obsolescence of the concept of privacy in a fully interconnected society obsessed with information-sharing.

Each of the four sections of the book chapter that follow is dedicated to a specific question or questions: the issues covered have been adapted and fine-tuned in response to the workshop attendees’ comments and criticisms. The main points of the workshop are drawn together in a brief conclusion, together with some reflections that have emerged in the months since the workshop was held.

2 With the Growth in Data Collection, Is Privacy Still a Human Right?

Rapid technological advances are being made: the design and use of information and communication technologies (ICT) are evolving at ever-increasing speed. The information stored, analysed, and visualised amounts to a ‘tsunami’ of bits, comprising data related to huge numbers of human beings. The typical 18+ year-old user in the United States of America (USA) spends more than three hours a day online, using a combination of mobile devices and apps [3]. The amount of data (which can be called the “digital universe”) is doubling every two years; by 2020, it is estimated that it will have reached 44 zettabytes (44 × 10²¹ bytes) [4]. This information deluge contains not only data produced by sensors, but also the digital traces left by human beings – the logs of their digital lives.
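As a rough sanity check of this growth claim (a minimal sketch, assuming the commonly cited IDC baseline of roughly 4.4 zettabytes in 2013), a doubling period of two years implies

\[ D(t) = D_{2013} \cdot 2^{(t-2013)/2}, \qquad D(2020) \approx 4.4 \times 2^{3.5} \approx 50 \text{ ZB}, \]

which is of the same order of magnitude as the 44 ZB estimate cited above.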

Human lives are becoming transparent: easy to see, perceive, or detect. In the most extreme scenario, every human gesture is logged somewhere, and there is someone (or some authority or corporation) who may potentially have access to all of these movements or behaviours. Some authors have described this new era – of an environment totally populated by information – as one characterised by the end of privacy [5, 6]. One of the most well-known philosophical explorers of this new kind of infosphere is Floridi [7, 8].

A somewhat ‘poetic’ view of the end of privacy was presented by director, Peter Weir, in the 1998 movie, The Truman Show [9]. Set at the end of the 20th century, the film highlights the imagined degree of intrusion possible into people’s lives. In this film, a television channel broadcasts the whole of an unwitting person’s life as regular viewing to an audience; the cast of television actors colludes in hiding this lie from the show’s protagonist.

A decade later, around 2009–2010, data was increasingly being perceived either purely as an object of commerce or at least as a trade of which to be wary [10, 11]. Information – once viewed as the source of knowledge and wisdom – was becoming a commodity to trade. As Peter Sondergaard famously said, “information is the oil of the 21st century” [10], echoing the earlier, somewhat more critical, speech of European Commissioner, Meglena Kuneva, in which she stated that, “Personal data is the new oil of the internet and the new currency of the digital world” [11].

By 2010, corporations that are sometimes referred to as the Titans of the Web (among them, such well-known examples as Amazon, Apple, Facebook, Google, and Microsoft) were among the top ten companies in the world in terms of market value [12]: this economic positioning was probably due, in part, to the immense storage and processing capacity of their data centres and to their ability to collect large data sets. They are among the few organisations on the planet capable of mining and distilling big data (the level of data: a level reserved to machines).

This collection/collation of data (‘big data’) enables the uncovering of interesting facts among the data bits through smart visualisations or images (available at the level of information). These, in turn, feed a third level, which is reserved for human beings only (the level of knowledge). This ‘lift’ or hierarchy (which provides the opportunity to move from level-to-level or stage-to-stage) has been much adopted in the world of information science and computing science since it was adapted from its origins in the work of poet and playwright, Eliot [13]. What is most conspicuous today is the increasing visual representation of the data or information involved, used in fields that range from research to commerce.

Vision and sound can be combined in what are basically surveillance techniques, occasionally made more palatable by the fact that they are described as attempts to provide efficient or sophisticated services. The wording of 2015 guidelines designed for the viewers of a smart television can be compared all too closely with an imaginary, fictional text published some 65 years earlier [14, 16]. Although the passage was later removed [14], Samsung’s smart TV privacy guidelines were originally reputed to state these instructions [15]: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of voice recognition”. A similar text in the first chapter of George Orwell’s 1949-published novel, Nineteen Eighty-Four [16] reads: “Any sound that Winston made, above the level of a very low whisper, would be picked up by it, moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment”. The similarities between these two passages, written some 65 years apart, were shocking to both direct consumers and general commentators.

The risk underpinning these various privacy-related scenarios is that humans may shift steadily, as entities, from being organic, living beings to becoming sensors that simply produce data streams. These streams of data could be stored and analysed in order to sell goods and services: out of bits, human beings would therefore merely produce visualisations offering useful hints for future technological, scientific, or commercial developments. This data could be intimately related to information stored in people’s brains and bodies. Ultimately, therefore, while privacy has long been considered a human right [e.g., 17], it could appear that this perceived right is being worn down through leaps in available technologies, the attitudes of commercial companies and of designers, the variety of stances on privacy globally, and generational behaviours and attitudes.

3 With the Expansion of the Cloud, What Kinds of Systems Will Be Designed?

The technological infrastructure that enables the data collection (‘big data’) scenarios described in the previous section of this chapter is that of cloud computing. Users connect their mobile devices to information services where the data and the processing power are located in the cloud, an expression developed some 20 years ago [18]. The cloud provides a global infrastructure with a number of characteristics: it is accessed over broadband networks; its computing servers act as shared platforms; it is typified by resource pooling and multi-tenancy, rapid scalability and elasticity, and – for billing purposes – measured or metered services; and it is available on-demand and is therefore ‘self-service’ in orientation [19].
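To make two of these characteristics concrete – rapid elasticity and measured (metered) service – the following is a minimal, illustrative sketch in Python; it models no real provider’s API, and all names in it are hypothetical.

```python
# Toy model of two cloud characteristics: rapid elasticity (capacity
# follows demand up and down on demand) and measured service (every
# unit-hour consumed is metered and billed). Purely illustrative.

class MeteredCloudService:
    def __init__(self, price_per_unit_hour: float):
        self.price_per_unit_hour = price_per_unit_hour
        self.capacity_units = 0        # currently provisioned capacity
        self.metered_unit_hours = 0.0  # cumulative metered usage

    def scale_to(self, demand_units: int) -> None:
        # Rapid elasticity: provisioned capacity tracks demand.
        self.capacity_units = demand_units

    def run_for(self, hours: float) -> None:
        # Measured service: meter every unit-hour actually consumed.
        self.metered_unit_hours += self.capacity_units * hours

    def bill(self) -> float:
        return self.metered_unit_hours * self.price_per_unit_hour


service = MeteredCloudService(price_per_unit_hour=0.05)
service.scale_to(10)   # scale up for peak load
service.run_for(2.0)
service.scale_to(2)    # scale back down when demand falls
service.run_for(10.0)
print(f"billed: ${service.bill():.2f}")  # (10*2 + 2*10) unit-hours -> $2.00
```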

Society is now entering the cloud computing era. Today, for many users – such as those who have adopted computing technology most recently, or members of younger generations – it can seem absolutely natural to hold just a touchscreen in their hands: everything else, such as storage space and computing power, can be based ‘in the cloud’.

In terms of computing architecture, there is a shift that can be termed ‘back to the future’, which can be envisioned as a repeat of an earlier era of centralised computing infrastructure. The term is adapted from the 1985 Californian adventure movie in which a young protagonist travels backwards in time, with consequences that will alter his family’s future [20]. It is argued that people will lose the computing freedom enjoyed by members of recent or past generations, a freedom available in the phase when personal computing was first introduced [19, 21]. Thus, there is a move from the phase of the autonomy of personal computers – when input, storage, processing, output, and networking were all in the hands of the end-users – to the heteronomy of the cloud: this forms a leap back in time to the pre-personal-computer era of ‘dumb terminals’ [19, 21], which possessed no processing capabilities. These two aspects – independence and self-control as opposed to external control – can be directly contrasted with each other.

What impact will the revival of such a sequence of events have on society? What kind of relationship is there, in reality, between technology and society? According to Deborah Johnson [22], it is co-shaping that typifies this relationship: “The belief that technology develops independently from society is wrong; social factors steer engineers in certain directions and influence the design of technological devices and systems; on the other direction, technology shapes society, society and technology shape each other (co-shaping); adoption of a particular technology means adoption of a particular social order; systems are infused with social and moral values”. Cloud computing can be viewed as simply one example of a socio-technical system that involves co-shaping.

With this vision of co-shaping, it would be wise to scrutinise other upcoming generations of technologies and ICT systems that will be designed by 2030, including robotics and artificial intelligence and their successors such as quantum computing. As a result of developments in the cloud and other future technologies, further questions will arise, such as: What kind of society will be shaped by these new directions in ICT? Are people losing the status of digital citizens so that they become simply digital consumers? Will human beings lose even the status of ‘human’ beings?

It is for these reasons, among others, that the workshop explored various ways and means of dealing with an obsessive onlife environment (see the next section of this chapter), the historic and developing notions of privacy in society, and the kinds of educational developments that might help to alleviate the predominance of technology use, as well as how to handle data privacy and data protection specifically.

4 Onlife, and What Can One Do to Get Back to Real Life?

By 2018, people were entering an onlife age [23], typified by what has been called the “persistent, visible, searchable, and spreadable nature of online social environments” [24]: more and more aspects of people’s existences are becoming digital and are being reshaped through increasingly relentless online interactions. The real and virtual dimensions of onlife are becoming intertwined. Attention and focus have shifted from entities (such as organisations or machines and devices) to connections.

Compulsive applications (‘apps’) are deliberately designed to keep users tethered – i.e., tied or restricted – to their mobile devices; as a result, users are constantly prompted to expend their own intellect, time, and attention while generating floods of personal trails that feed various online business models [25]. Data has increasingly been monetised, and people are encouraged either not to give data-sharing any thought whatsoever or simply to think about the release of data as a transactional (commercial) procedure. Profound transformations are altering people’s relationships with themselves and with others, and how they experience the world around them: older values are becoming obsolete.

As users instant-message, e-mail, text, and tweet, they develop new ‘alone together’ behaviours that show a reliance on, and preference for, technology rather than real social relationships, first documented in 2011 [26]. In 2015, in Reclaiming Conversation, scholar Sherry Turkle [27] reported on the electronic erosion of conversational attention at both work and home. At many face-to-face encounters or meetings, although people are physically in the same space, they cannot refrain from turning away from each other to their phones/online connections [28]. These common societal trends, in which technology overuse increasingly diminishes human relationships, were illustrated visually in 2014, in a highly symbolic way, by street artist, Banksy [29]: called Mobile Lovers, the mural features a pair of lovers in the dark, busily checking their smartphones for new messages rather than kissing.

There is the potential that this predominance of technology over meaningful relationships and conversations will lead to an unlearning of human values and an impoverishment of human capacities such as empathy, self-reflection, creativity, and productivity. Hence, Turkle [27] called for a re-taking of control in response to a disenchantment with technology. She suggested a series of first steps towards the self-regulation of one’s personal onlife world, and a set of disengagement strategies so that people might learn to start and, most importantly, to close or end their digital interactions. The Center for Humane Technology – founded by early members of a number of high-tech firms, and growing out of a movement begun in 2013 – has focused on forms of humane technology design, ways of re-focusing attention, and tips for a more self-controlled use of mobile phones [29, 30]. More and more people are attempting to escape from technology by looking towards completely different ways of living [31, 32]. Conversely, there are also currently shifts taking place towards the sharing of data for more publicly altruistic purposes [33]. One interpretation of good ICT [34] might also be that it should include ‘privacy for good’ or good forms of privacy.

In addition to these suggestions, the next section of the chapter explores other concrete interventions that have a more direct linkage to the notion of data privacy and data protection, particularly in the fields of education and training.

5 Privacy Past, Present, and Future: What Are the Educational Trends?

Part of this workshop looked at privacy past, present, and future. The attendees explored past meanings of the term ‘privacy’, what changing views of privacy mean in current terms with regard to the development and implementation of the General Data Protection Regulation [35], and where privacy may head in the future – particularly in terms of the trading of data and/or the use of data to assist with commitments to the public good, e.g., in relation to health, well-being, or sustainability [33].

When exploring the meanings of the term privacy, many different views of the same term, developed over more than a century, can be identified. Contributors to an event like this summer school are among the most eminent and informed of researchers and practitioners in the privacy field and are at the leading edge of developments in this domain. Therefore, in this case, simply five of the most well-known perspectives on privacy, past and present, were cited. They ranged from the historical “right to be let alone” [36], which dates back to 1890, to the “right to control the use that others make of information about myself” [37], to the more recent “protection of life choices against any form of public control and social stigma” [38], to the noteworthy definition of Stefano Rodotà: the “right not to know, right to keep control of our information and determine the modality of construction of our private sphere” [39], in which the lawyer proclaimed the shift from the legal term of habeas corpus to habeas data. This latter right is indeed available in several countries around the globe, including a number in Latin America. Probably the most concise statement about privacy is associated with the utterance attributed to Hollywood actor, Greta Garbo (1905–1990): “I never said ‘I want to be alone!’ I only said, ‘I want to be let alone.’” [40].

Ultimately, in the workshop, the date of 2030 was selected as a specific point in time for reflection, in order to cover both millennials born in the early 1980s (who will be aged around 50 years old at that point) and those born in the early years of the 21st century (who will by then be adults shifting from one stage of maturity in their lives to another). However, the workshop attendees did not have the time or opportunity to explore this futures-related thinking in detail. Instead, the focus was more on the present and on May 2018.

In May 2018, the General Data Protection Regulation will come into force [35]: this new over-arching regulation has tremendous importance for the meaning of data protection and data privacy. As the introduction to this book (see: ‘The Smart World Revolution’) points out, there can be fundamental contradictions between the pressures (even if simply implied or perceived) to share all forms of personal data, and the individual rights of citizens to privacy and security.

While the introduction of regulations is crucial, the need for education and training about what such legislation means for ordinary human beings of all ages, but especially for young and ever younger children, is equally important. Teachers have themselves been highly critical of the lack of thinking and planning given to privacy and security while they are introducing children to digital technologies and managing school resources. Some current data privacy, data protection, and cybersecurity education initiatives have focused on coping and/or resilience narratives that attempt to counter threats rather than offer mechanisms for positive self- or community empowerment [41]. Hence, there is an urgent requirement for data protection governance in educational settings, combined with robust teacher training. Teaching aids in cybersecurity, online safety, data protection awareness, data literacy, and skills development should be brought to the attention of teachers, educators, and parents as ‘ready-for-use’ resources.

This is nevertheless a favourable time-period, in the sense that a number of positive actions have been taken in recent years to fill digital skills gaps, especially in the fields of data protection and data privacy. Several materials are worth citing. One is a handbook published by researchers from the Vrije Universiteit Brussel [42]: it is a compilation of work undertaken by three eastern European data protection authorities that identifies leading examples of schools-based education about data privacy and data protection. Another has a more international perspective: it is a training framework on data protection intended for young people at school [43]. Designed for educators by a wider set of data protection authorities, it outlines nine basic principles, each of which is enhanced by a description of the competences needed in this field. A third example, for children and adults interacting together, is Happy Onlife, a quiz or game that can be used to build Internet safety and security awareness [44]. More generically, research sponsored by the European Commission’s Joint Research Centre, focusing on research and skills for the digital era, has led to the production of a fundamental set of needed digital competences [45]. Five competence areas are supported by eight levels of proficiency that can be taught and assessed – these are often more pertinent to adults than children, however. Examples of how these proficiencies can be observed and used in both employment and education settings are described. Of most interest in the context of this summer school is likely to be the field of safety, which is taken in this booklet [45] to relate also to privacy and security.

As a result of contemporary developments, an opportunity arises to ride on the back of the need for awareness of data privacy, data protection, and cybersecurity, and to shift to other awareness-raising approaches. In the education field, training about data protection and data privacy could be bundled together with education about digital competences more generally. As legislation, regulation, and technologies change, such materials need to be designed for both child and adult populations; they need to be refined and upgraded continuously; they need to be valid for international contexts as well as European settings; and they need to be relevant to a range of technologies, not limited simply, e.g., to the use of mobile phones.

6 A Discussion that Leads Towards a Conclusion

It is feasible to merge the opinions expressed at the end of this summer school workshop with those of the authors themselves.

As often in times past, a growth in emerging technologies poses ethical and societal challenges that may be old as well as new, revived, or revised. Among these technological developments are those related in particular to big data and cloud computing. Today, the use of technology is becoming ultra-pervasive: technologies have entered so many different aspects of the lives of human beings that they are encroaching on spheres of great intimacy. They are no longer present solely in places of isolation such as outer space or theatres of war, but in people’s places of employment, communities, and residences, and increasingly near to their bodies, brains, and minds.

Society changes as well as technology. As the series of IFIP summer schools shows, over the past decade and more, the notion of data privacy has also been changing and developing. Many aspects of privacy are being modified: through the attitudes of commercial companies and designers; the resulting policies and legislation; and generational and inter-generational behaviours and attitudes. These changes are paralleled by the posing of many challenging and provocative questions about the past, the present, and the future.

Taking a view that merges the social and the technical, through processes like co-shaping, encourages people to group together to consider, on the one hand, what kinds of technologies they wish to see designed and advanced and, on the other hand, not only what kinds of data privacy they desire for themselves, but also what forms of data they wish to share with others (including commercial companies and services) and how this data-sharing can be used to help wider communities of people. These discussions form part of wider questioning and debates about social responsibility and societal accountability. They also enable a re-thinking of educational and training needs. Both sets of challenges, and suggestions of solutions to them, formed part of this summer school workshop’s discussions.

Developments like the introduction of the General Data Protection Regulation [35] can act as enablers of opportunities to re-explore certain approaches to education and training. Ultimately, events such as this summer school can benefit from increasing coverage of the educational and training needs required to prepare people at large, and young people in particular, for new ways of handling data protection and data privacy. Options can be taken up that encourage the practical application and assessment of these challenges.

Considering developments that have taken place since the summer school itself was held, it is unlikely – given contemporary socio-political circumstances – that people at large will forget about data privacy and data protection. They are much more likely to desire to reinforce their own individual competences in this field, and also to demand that organisations and institutions take greater responsibility for their use of data and their actions pertaining to data-sharing. People may begin to consider not only what they themselves can gain or obtain from data-sharing, but also what they can do to benefit themselves, their families, communities, and societies, e.g., by way of ‘data donation’. It may ultimately be that a much more international perspective is taken on these challenges than a purely European approach.