
About the E-Privacy Directive: Towards a Third Generation of Data Protection Legislation?

A chapter in Data Protection in a Profiled World

Abstract

The main purpose of this contribution is not to analyse, provision by provision, the E-Privacy Directive presently under revision, but to describe the emergence of new principles which, in our view, might be considered as going far beyond the traditional principles enshrined in Council of Europe Convention 108 and already translated in EU Directive 95/46/EC. These new principles fully take into account the new privacy threats incurred by individuals due to the characteristics of modern and future information systems on an increasingly global, interactive and convergent Internet.


Notes

  1. Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector.

  2. See the proposal, still under discussion, for a Directive of the EU Parliament and of the Council amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No. 2006/2004 on consumer protection cooperation, 2007/0248. On May 6, 2009 the European Parliament adopted the second-reading amendments on the review of the EU 2002 regulatory framework for electronic communications, including the review of the E-Privacy Directive (the famous “Telecom Package”). The amendments under discussion in the course of the Directive’s review reflect the compromise reached between Parliament and Council, except for the refusal by the EU Parliament to endorse a provision legitimizing the French HADOPI system for the fight against illegal copying. A vote by the Telecommunications Council after this second reading by the EU Parliament is expected in September 2009, and a third reading by the EU Parliament in mid-December 2009.

  3. See A. Rouvroy and Y. Poullet, “The Right to Informational Self-determination and the Value of Self-development: Reassessing the Importance of Privacy for Democracy”, in Reinventing Data Protection, S. Gutwirth et al. (eds.) (Dordrecht: Springer, 2009), 65 ff. We have developed this idea of a third generation extensively in Y. Poullet, « Pour une troisième génération de réglementation de protection des données », in Défis du droit à la protection de la vie privée, coll. Cahiers du Centre de Recherches Informatique et Droit, 31 (Bruxelles: Bruylant, 2008), 25–70.

  4. BVerfG (Karlsruhe), December 15, 1983, EuGRZ (1983): 171 ff. See E.H. Riedl, “New Bearings in German Data Protection”, Human Rights Law Journal 5 (1) (1984): 67 ff.; H. Burkert, « Le jugement du Tribunal Constitutionnel fédéral allemand sur le recensement démographique et ses conséquences », Dr. Inf. (1985): 8 ff. See also E. Brouwer, “Digital Borders and Real Rights” (Nijmegen: Wolf Legal Pub, 2007), 501.

  5. The Charter of Fundamental Rights of the European Union (2000/C 364/01) in its Article 7, § 1 reasserts the existence of the right to private and family life, home and communication, whereas Article 8 of the same Charter raises the protection of personal data to the status of a fundamental right.

  6. Among others, see R. Brownsword, “Rights, Regulation and the Technological Revolution” (Oxford: Oxford Univ. Press, 2008) and the suggestive article written by J. Cohen, “Examined Lives: Informational Privacy and the Subject as Object”, Stanford Law Review 52 (2000): 1373 ff.

  7. About these different aspects, read our article: Y. Poullet, “Data Protection Legislation: What’s at Stake for Our Society and Our Democracy?”, Computer Law & Security Report 25 (2009): 211 ff.

  8. L. Introna and H. Nissenbaum, “Shaping the Web: Why the Politics of Search Engines Matters”, The Information Society 16 (3) (2000): 169–86.

  9. P. De Hert and S. Gutwirth, “Privacy, Data Protection and Law Enforcement. Opacity of the Individuals and Transparency of the Power”, in Privacy and the Criminal Law, eds. E. Claes, A. Duff, and S. Gutwirth (Antwerpen: Intersentia, 2006), 74, express it quite correctly: “Never does an individual have an absolute control over an aspect of his or her privacy. If individuals do have freedom to organise life as they please, this will only remain self-evident up to the point that it causes social or inter-subjective friction. At that stage, the rights, freedoms and interests of others, as well as the prerogatives of the authorities come into play. The friction, tension areas and conflicts create the need for a careful balancing of the rights and interests that give privacy its meaning and relevance. That shows clearly that, although quintessential for a democratic constitutional state because it refers to liberty, privacy is a relational, contextual and per se social notion which only acquires substance when it clashes with other private or public interests”. The link between protection of privacy and the defence of our democracy is asserted by many authors. See notably, J. Habermas, “Between Facts and Norms” (Cambridge, MA: MIT Press, 1996); P.M. Schwartz and W.M. Treanor, “The New Privacy”, Michigan Law Review 101 (2003): 216; James E. Flemming, “Securing Deliberative Autonomy”, Stanford Law Review 48 (1) (1995): 1–71, arguing that the bedrock structure of deliberative autonomy secures basic liberties that are significant preconditions for persons’ ability to deliberate about and make certain fundamental decisions affecting their destiny, identity, or way of life. On deliberative democracy, see James E. Flemming, “Securing Deliberative Democracy”, Fordham Law Review 72 (2004): 1435.

  10. The contrasts between the visions of Orwell and Kafka are very well described in the book of D.J. Solove, “The Digital Person: Technology and Privacy in the Information Age” (New York: New York Univ. Press, 2004), 7 ff.: “The dominant metaphor for modern invasions of Privacy is Big Brother… Big Brother oppresses its citizens, purges dissenters, and spies on everyone in their homes. The result is a cold, drab grey world with hardly any space for love, joy, original thinking, spontaneity or creativity. It is a society under total control. Although the metaphor has proven quite useful for a number of privacy problems, it only partially captures the problems of digital dossiers. Big Brother envisions a centralized authoritarian power that aims for absolute control, but the digital dossiers constructed by business aren’t controlled by a central power, and their goal is not to oppress us but to get us to buy new products and services”. “The Trial captures an individual’s sense of helplessness, frustration and vulnerability when a large bureaucratic organization has control over a vast dossier of details about one’s life… The problem is not simply a loss of control over personal information, nor is there a diabolical motive or plan for domination as described by the Big Brother metaphor… The problem is a bureaucratic process that is uncontrolled”.

  11. D.J. Solove, “Privacy and Power: Computer Data Bases and Metaphors for Information Privacy”, Stanford Law Review 53 (6) (2001): 1393 ff.

  12. The importance of respect for the context, that is to say the zone of confidence within which personal data are transmitted by the individual concerned, has been remarkably highlighted by H. Nissenbaum (“Privacy as Contextual Integrity”, Washington Law Review 79 (2004): 150 ff.). The author asserts: “the freedom from scrutiny and zones of ‘relative insularity’ are necessary conditions for formulating goals, values, conceptions of self, and principles of action because they provide venues in which people are free to experiment, act and decide without giving account to others or being fearful of retribution”.

  13. The dangers that a lack of transparency in our information societies poses to citizens, who cannot know exactly how information systems work, which data are collected, where data processing takes place and by whom, were underlined in 1983 by the well-known constitutional judgment in the census case (Bundesverfassungsgericht, December 15, 1983, EuGRZ (1983): 171 ff.). The temptation for citizens is then to behave in the way they think society expects and not to dare to express themselves freely, which is harmful to our democracies: “The possibilities of inspection and of gaining influence have increased to a degree hitherto unknown, and may influence the individuals’ behaviour by the psychological pressure exerted by public interests. Even under the conditions of modern information processing technology, individual self-determination presupposes that individuals are left with the freedom of decision about actions to be taken or to be omitted, including the possibility to actually follow that decision in practice. If someone cannot predict with sufficient certainty which information about himself in certain areas is known to his social milieu, and cannot estimate sufficiently the knowledge of parties to whom communication may possibly be made, he is crucially inhibited in his freedom to plan or to decide freely and without being subject to any pressure or influence. If someone is uncertain whether deviant behaviour is noted down and permanently stored as information, or is applied or passed on, he will try not to attract attention by such behaviour. If he reckons that participation in an assembly or a citizens’ initiative will be registered officially and that personal risks might result from it, he may possibly renounce the exercise of his respective rights. This would not only impair his chances of development but would also impact the common good (“Gemeinwohl”), because self-determination is an elementary functional condition of a free democratic society based on its citizens’ capacity to act and to cooperate.”

  14. The danger of ‘reductionism’ is analysed by J. Rosen, “The Unwanted Gaze: The Destruction of Privacy in America” (2000), quoted by D.J. Solove, ibidem, 424: “Privacy protects us from being misdefined and judged out of context in a world of short attention spans, a world in which information can easily be confused with knowledge”.

  15. On this classic distinction and its radical calling into question, see J.A. Eichbaum, “Towards an Autonomy Based Theory of Constitutional Privacy: Beyond the Ideology of Familial Privacy”, Harvard Civil Rights—Civil Liberties Review 14 (1979): 361–84. On this point, read also D.J. Solove, “Conceptualizing Privacy”, California Law Review 90 (2002): especially 1138–39.

  16. See the Durant case in the UK, where a plate identification number was not considered to be personal data.

  17. Two recent decisions of the French highest court (Cour de cassation, May 1, 2007 and April 27, 2007) have clearly asserted that IP addresses were not personal data: “this series of numbers does not in any way constitute data indirectly identifying the person, insofar as it relates only to a machine and not to the individual who uses that machine” and “the IP address did not make it possible to identify the person or persons who used this computer, since only the authority with legitimate power to conduct the investigation (police or gendarmerie) may obtain the user’s identity from the access provider”. More cautious was the decision issued by the same court on January 13, 2009. About these discussions, see Y. Détraigne and A.M. Escoffier, « La vie privée à l’heure des mémoires numériques », Rapport d’information, Commission des lois 441 (2008–2009), available at www.senat.fr.

  18. See Articles 6 and 9 about the severe limits imposed on public communications services as regards the processing of these data, and the EU Directive on data retention, which strictly regulates the retention of these data (which data, for which duration, …).

  19. “The reference to the definitions provided by the Data Protection Directive is logical in light of art. 1(1), since the provisions of the Directive are intended to complement and particularize the provisions of the Data Protection Directive. The definitions provided in art. 2 of the Data Protection Directive relate to key concepts of the application of data protection legislation such as ‘personal data’, ‘processing of personal data’, ‘controller’ or ‘processor’. However, the Directive makes rather limited use of these key concepts and more generally relies on specific concepts of its own that are not based on these definitions. For instance, the Directive uses the terms ‘traffic data’ in arts. 6 and 9, ‘location data’ in art. 9 or ‘information’ in art. 5(3), which data or information are not necessarily ‘personal data’ per se”.

  20. On that point, M. Rundle, “International Personal Data Protection and Digital Identity Management Tools” (paper presented at the Identity Mashup Conference, Harvard Law School, June 20, 2006), available in the SSRN paper collection: http://papers.ssrn.com/abstract or at the Berkman Center for Internet and Society Research Publication Series website (Research Publication No. 2006-06, June 2006), available at: http://cyber.law.harvard.edu/publications. From the same author, M.C. Rundle and P. Trevithick, “Interoperability in the New Digital Identity Infrastructure” (paper published at Social Science Research Network, February 13, 2007), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=962701.

  21. On that point, read J.M. Dinant, “The Concepts of Identity and Identifiability: Both a Legal and Technical Deadlock for Protecting Human Beings in the Information Society?”, in Reinventing Data Protection, eds. S. Gutwirth et al. (Dordrecht: Springer, 2009).

  22. A. Ceyhan, “Technologization of Security: Management of Uncertainty and Risk in the Age of Biometrics”, Surveillance and Society 5 (2) (2008): 102–23.

  23. About the very specific peculiarity of biometric data and the risks linked with their use, read C. Prins, “Biometric Technology Law. Making Our Body Identify for Us: Legal Implications of Biometric Technologies”, Computer Law & Security Report 14 (1998): 159 ff.; A. Cavoukian and A. Stoianov, “Biometric Encryption: A Positive-Sum Technology that Achieves Strong Authentication, Security and Privacy” (Information and Privacy Commissioner/Ontario, March 2007).

  24. Article 6 of the E-Privacy Directive provides as regards traffic data:

    1. Traffic data relating to subscribers and users processed and stored by the provider of a public communications network or publicly available electronic communications service must be erased or made anonymous when it is no longer needed for the purpose of the transmission of a communication without prejudice to paragraphs 2, 3 and 5 of this Article and Article 15(1).

    2. Traffic data necessary for the purposes of subscriber billing and interconnection payments may be processed. Such processing is permissible only up to the end of the period during which the bill may lawfully be challenged or payment pursued.

    3. For the purpose of marketing electronic communications services or for the provision of value added services, the provider of a publicly available electronic communications service may process the data referred to in paragraph 1 to the extent and for the duration necessary for such services or marketing, if the subscriber or user to whom the data relate has given his/her consent. Users or subscribers shall be given the possibility to withdraw their consent for the processing of traffic data at any time.

    Article 9 provides as regards location data: “Where location data other than traffic data, relating to users or subscribers of public communications networks or publicly available electronic communications services, can be processed, such data may only be processed when they are made anonymous, or with the consent of the users or subscribers to the extent and for the duration necessary for the provision of a value added service.”

    In the new version of the Directive presently under debate, the drafters have provided for an additional purpose legitimizing the processing of location or traffic data: the public communications service operator might also process the data for its own internal security needs.

    See also Directive 2006/24/EC on Data Retention, which explicitly regulates the storage of traffic and location data.
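    The mechanics of these rules are easy to picture in code. The following is a minimal sketch of the Article 6 logic only: the record fields, the length of the billing-challenge period and the consent flag are all assumptions made for illustration, not taken from the Directive or from any operator system.

      from dataclasses import dataclass
      from datetime import datetime, timedelta

      BILLING_CHALLENGE_PERIOD = timedelta(days=365)  # assumed national limitation period

      @dataclass
      class TrafficRecord:
          subscriber_id: str | None
          called_number: str
          started_at: datetime
          needed_for_transmission: bool = False
          marketing_consent: bool = False

      def apply_article6(records: list[TrafficRecord], now: datetime) -> list[TrafficRecord]:
          """Erase or anonymise traffic data no longer needed (Art. 6(1)),
          keep billing data only while the bill may be challenged (Art. 6(2)),
          allow marketing/value-added use only with consent (Art. 6(3))."""
          for r in records:
              if r.needed_for_transmission:
                  continue                            # still needed to convey the communication
              if now - r.started_at <= BILLING_CHALLENGE_PERIOD:
                  continue                            # retained for billing/interconnection payments
              if r.marketing_consent:
                  continue                            # consented value-added/marketing processing
              r.subscriber_id = None                  # otherwise: sever the link to the subscriber
              r.called_number = "anonymised"
          return records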

  25. The Article 29 Working Party echoes these doubts present in the E-Privacy Directive revision by repeating that, in its opinion, IP addresses must be considered as personal data. The Working Party recalls that, in most cases—including cases with dynamic IP address allocation—the necessary data will be available to identify the user(s) of the IP address. The Working Party noted in its WP 136 that “… unless the Internet Service Provider is in a position to distinguish with absolute certainty that the data correspond to users that cannot be identified, it will have to treat all IP information as personal data, to be on the safe side…”. These considerations apply equally to search engine operators (WP 148). See again the strong emphasis put on this point in WP 159, dated February 10, 2009, on the last version proposed by the EU Council of Ministers.
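    To see why even dynamically allocated IP addresses are treated as identifying, consider the following sketch. The DHCP-style lease log and its layout are invented for illustration; the point is only that, on the ISP side, an IP address plus a timestamp resolves to a single subscriber.

      from datetime import datetime

      # (ip, lease_start, lease_end, subscriber): an assumed lease-log layout
      LEASE_LOG = [
          ("203.0.113.7", datetime(2009, 2, 1, 8, 0), datetime(2009, 2, 1, 20, 0), "subscriber-1042"),
          ("203.0.113.7", datetime(2009, 2, 1, 20, 0), datetime(2009, 2, 2, 8, 0), "subscriber-0377"),
      ]

      def who_used(ip: str, at: datetime) -> str | None:
          """Resolve a dynamic IP observed at a given moment to the lease holder,
          the linkage that makes IP data 'personal data' for the ISP."""
          for lease_ip, start, end, subscriber in LEASE_LOG:
              if lease_ip == ip and start <= at < end:
                  return subscriber
          return None

      print(who_used("203.0.113.7", datetime(2009, 2, 1, 22, 30)))  # -> subscriber-0377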

  26. “Opinion 4/2007 on the Concept of Personal Data, WP 136” (June 20, 2007).

  27. Concerning the use of location data and all the possibilities opened by combining location data with data mining methods, read F. Giannotti and D. Pedreschi, eds., “Mobility, Data Mining and Privacy” (Dordrecht: Springer, 2007).

  28. On that point, read J.M. Dinant, “The Concepts of Identity and Identifiability: Both a Legal and Technical Deadlock for Protecting Human Beings in the Information Society?”, in Reinventing Data Protection, eds. S. Gutwirth et al. (Dordrecht: Springer, 2009).

  29. Article 8 asserts the three major elements of any data protection legislation based on the European model:

    1. Everyone has the right to the protection of personal data concerning him or her.

    2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.

    3. Compliance with these rules shall be subject to control by an independent authority.

  30. “Humans will, in an Ambient Intelligent Environment, be surrounded by intelligent interfaces supported by computing and networking technology that is embedded in everyday objects such as furniture, clothes, vehicles, roads and smart materials, even particles of decorative substances like paint. AmI implies a seamless environment of computing, advanced networking technology and specific interfaces. This environment should be aware of the specific characteristics of human presence and personalities; adapt to the needs of the user; be capable of responding intelligently to spoken or gestured indications of desire; and even result in systems that are capable of engaging in intelligent dialog. AmI should be relaxing and enjoyable for the citizen, and not involve a steep learning curve” (IST Advisory Group’s Report, “Ambient Intelligence: From Vision to Reality. For Participation in Society and Business” (2003), ftp://ftp.cordis.europa.eu/pub/ist/docs/istag).

  31. This collection may rely in particular on the processing of traffic data and user queries on the Internet, the recording of consumer purchasing habits and activity, the processing of geo-location data concerning mobile telephone users, the data collected by video surveillance cameras and by RFID systems, foreshadowing the “Internet of things”, and finally biometric systems.

  32. About these profiling practices and the need to regulate them, see J.M. Dinant et al., “Profiling and Data Protection”, Report Addressed to the Convention 108 Consultative Committee (September 2008), available on the Council of Europe website.

  33. On these risks, read M. Hildebrandt and S. Gutwirth, eds., “Profiling the European Citizen” (Dordrecht: Springer, 2008). See also, A. Rouvroy, “Privacy, Data Protection, and the Unprecedented Challenges of Ambient Intelligence”, Studies in Law, Ethics and Technology (2008), available at http://works.bepress.com/antoinette-rouvroy//2.

  34. D. Hellman, “Classification and Fair Treatment: An Essay on the Moral and Legal Permissibility of Profiling” (Univ. of Maryland School of Law, Working Research Paper No. 2003–04), available at http://www.ssrn.com/abstract=456460.

  35. In the context of the works done by the Council of Europe about profiling and privacy, see J.M. Dinant et al., “Profiling and Data Protection”, Report Addressed to the Convention 108 Consultative Committee (September 2008), available on the Council of Europe website. See in the same sense, M. Hildebrandt, “Profiling and the Identity of the European Citizen”, in Profiling the European Citizen, Hildebrandt and Gutwirth, eds. (Dordrecht: Springer, 2008), 303–393.

  36. This article needs to be interpreted in the light of Article 12 on the right of access, which provides that the data subject has the right to obtain from the data controller knowledge of the logic involved in any automatic processing referred to in Article 15(1).
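    What disclosing “the logic involved” can mean in practice is easy to sketch. The toy scoring rules below are invented; the sketch shows only that a controller running rule-based profiling can keep, for each automated decision, a human-readable trace that an access request under Article 12 could surface.

      RULES = [                                        # invented illustrative rules
          ("income below EUR 20,000", lambda p: p["income_eur"] < 20_000, -30),
          ("more than 2 late payments", lambda p: p["late_payments"] > 2, -40),
          ("customer for over 5 years", lambda p: p["tenure_years"] > 5, +20),
      ]

      def score(profile: dict) -> tuple[int, list[str]]:
          """Apply the rules and keep the trace a data subject could request."""
          total, trace = 50, []
          for label, test, weight in RULES:
              if test(profile):
                  total += weight
                  trace.append(f"{label}: {weight:+d}")
          return total, trace

      points, logic = score({"income_eur": 18_000, "late_payments": 1, "tenure_years": 6})
      print(points)  # the automated decision's score
      print(logic)   # the disclosable "logic involved" behind it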

  37. See the Recommendations in preparation by the Council of Europe on that issue.

  38. A. Rouvroy, « Réinventer l’art d’oublier et de se faire oublier dans la société de l’information? », in La sécurité de l’individu numérisé: Réflexions prospectives et internationales (Paris: L’Harmattan, 2008), 240–78.

  39. The notion of “terminal equipment” is quite broad. The term “telecommunications terminal equipment” (in this paper referred to as “terminal equipment”) is defined in the European Directive on telecommunications terminal equipment (European Parliament and Council Directive 1999/5/EC of March 9, 1999, Official Journal L091 (April 7, 1999): 10 ff.) as “a product enabling communication or a relevant component thereof which is intended to be connected directly or indirectly by any means whatsoever to interfaces of public telecommunications networks (that is to say, telecommunications networks used wholly or partly for the provision of publicly available telecommunications services)”. This broad definition includes not only personal computers or other typical user terminals such as telephones (mobile or fixed) and faxes, but equally RFID tags, chip cards and, tomorrow, “intelligent molecules” implanted in people themselves.

  40. Decision as last amended by the 1994 Act of Accession, Official Journal L36/31 (February 7, 1987).

  41. “Transclusion is the inclusion of part of a document into another document by reference. For example, an article about a country might include a chart or a paragraph describing that country’s agricultural exports from a different article about agriculture. Rather than copying the included data and storing it in two places, a transclusion embodies modular design, by allowing it to be stored only once (and perhaps corrected and updated if the link type supported that) and viewed in different contexts. The reference also serves to link both articles.” (definition taken from the Wikipedia encyclopaedia (http://www.en.wikipedia.org/)).
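    A few lines of code make the storage model behind transclusion concrete. The {{include:...}} syntax and the in-memory stores are invented for the example; what matters is that the fragment exists once and every referencing document renders the same, single copy.

      import re

      FRAGMENTS = {  # each fragment is stored exactly once
          "exports-chart": "Wheat 40%, dairy 25%, wine 15% of agricultural exports.",
      }

      DOCUMENTS = {  # documents hold references, not copies
          "country-article": "Economy overview: {{include:exports-chart}}",
          "agriculture-article": "Trade section: {{include:exports-chart}}",
      }

      def render(doc_id: str) -> str:
          """Resolve each reference against the single stored fragment."""
          return re.sub(r"\{\{include:([\w-]+)\}\}",
                        lambda m: FRAGMENTS[m.group(1)],
                        DOCUMENTS[doc_id])

      print(render("country-article"))      # both articles show the same copy;
      print(render("agriculture-article"))  # correcting FRAGMENTS updates both views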

  42. Directive 1999/5/EC of March 9, 1999 on radio equipment and telecommunications terminal equipment and the mutual recognition of their conformity, Official Journal L91 (April 7, 1999): 10–28.

  43. Directorate of Public Sector Innovation and Information Policy (DIIOS), “Privacy-Enhancing Technologies: White Paper for Decision-Makers”, written for the Dutch Ministry of the Interior and Kingdom Relations (2004). This paper distinguishes different kinds of PETs: “From a functional perspective, it is not difficult to implement PET. With the aid of PET, it is possible to protect information about a person, such as identity and personal details. PET comprises all the technological controls for guaranteeing privacy. For instance, PET can be used to detach identification details from the other data stored about the person. The link between the identification details and the other personal details can only be restored with the use of specific tooling. Another option offered by PET is to prevent the registration of personal details altogether, for instance, once the identity has been verified. Software can also be used to enforce the condition that personal data are always disclosed to third parties in compliance with the prevailing privacy policies.” See also, J.J. Borking and C. Raab, “Laws, PETS and Other Technologies for Privacy Protection”, Journal of Information, Law and Technology 1 (February 2001), available at: http://elj.warwick.ac.uk/jilt:01-1:borking.html.
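    The White Paper’s first example, detaching identification details from the rest of the record, can be sketched as follows. The keyed pseudonym, the two stores and all names are assumptions made for illustration, not a prescription from the paper.

      import hashlib, hmac, secrets

      SECRET_KEY = secrets.token_bytes(32)  # held only by the re-identification "tooling"

      def pseudonym(identity: str) -> str:
          """Derive a stable pseudonym; without SECRET_KEY the link cannot be rebuilt."""
          return hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()[:16]

      working_db = {}   # pseudonym -> substantive data, visible to ordinary users
      link_table = {}   # pseudonym -> identity, kept apart under the key holder's control

      def store(identity: str, data: dict) -> None:
          p = pseudonym(identity)
          working_db[p] = data      # identification details are detached from the data
          link_table[p] = identity  # the link is restorable only via this separate table

      store("Alice Example", {"postcode": "B-5000", "visits": 3})
      print(working_db)             # analysts see records with no direct identifiers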

  44. DIIOS, White Paper.

  45. On that point, see also the Working Party 29’s Opinion 2/2008 on the E-Privacy Directive issued May 15, 2008 (No. 150): “The Working Party 29 advocates the application of the principle of data minimisation and the deployment of Privacy Enhancing Technologies by data controllers. The Working Party calls upon European legislators to make provision for a reinforcement of said principle, by reiterating Recitals 9 and 30 of the ePrivacy Directive in a new paragraph in Article 1 of this Directive.”

  46. See, as regards this concern, the Article 29 Working Party “Opinion 1/2002 on the CEN/ISSS Report on Privacy Standardisation in Europe”, WP 57 (May 30, 2002). See also the resolution in which the International Conference of Data Protection Commissioners expresses its strong support for the development of an effective and universally accepted international privacy technology standard and makes its expertise available to ISO for the development of such a standard: Final Resolution of the 26th International Conference on Privacy and Personal Data Protection (Wroclaw, September 14, 2004), Resolution on a draft ISO privacy standard. Read also the CEN/ISSS Secretariat Final Report, “Initiatives on Privacy Standardisation in Europe” (February 13, 2002), Brussels, available at http://ec.europa.eu/enterprise/ict/policy/standards/ipse_finalreport.pdf.

    About the importance of these private standardisation bodies, such as W3C or IETF, read P. Trudel et al., “Droit du cyberespace” (Montréal: Themis, 1997), Book 3, and the critiques addressed to that privatisation, M.A. Froomkin, “Habermas@discourse.net: Towards a Critical Theory of Cyberspace”, Harvard Law Review 116 (2003): 800 ff.

  47. C (2009) 3200 final. It must also be underlined that the draft E-Privacy Directive still under discussion provides that the Directive’s provisions (Article 3: Services concerned) are applicable to devices like RFID when they are connected to publicly available communications networks or make use of electronic communications services as a basic infrastructure: “This Directive shall apply to the processing of personal data in connection with the provision of publicly available electronic communications services in public communications networks in the Community, including public communications networks supporting data collection and identification devices”.

  48. “Working Paper on the Questions of Data Protection Posed by RFID Technology, WP 105” (January 19, 2005), available on the European Commission website: http://www.ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2005/wp105_fr.pdf.

  49. The Recommendation refers to a sign to be developed by the European standardisation body (CEN).

  50. As asserted by Ann Cavoukian, Information and Privacy Commissioner of Ontario (Canada), in her introductory remarks to the Privacy Guidelines for RFID Information Systems, available on the website http://www.ipc.on.ca: “Privacy and Security must be built in from the Outset: At the Design Stage”. Examples of privacy by design include the pay-per-use road payment system proposed in De Jonge and Jacobs, “Privacy-Friendly Electronic Traffic Pricing via Commits”, in Proceedings of the Workshop of Formal Aspects of Security and Trust (FAST 2008), Lecture Notes in Computer Science, vol. 5491 (Berlin: Springer, 2008), in which car journeys are not sent to a central server for fee computation but kept on the on-board computer (and still auditable in case of dispute). Another illustration of the approach is the ambient intelligence architecture put forward in Le Métayer, “A Formal Privacy Management Framework”, in Proceedings of the Workshop of Formal Aspects of Security and Trust (FAST 2008), Lecture Notes in Computer Science, vol. 5491 (Berlin: Springer, 2008), 162–76, which involves “privacy agents” in charge of managing and protecting personal data.
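    The shape of the De Jonge and Jacobs design, as summarised in this note, can be sketched in a few lines. The tariffs, the log format and the use of a plain hash as commitment are simplifying assumptions; the published scheme uses proper cryptographic commitments.

      import hashlib, json

      TARIFF_PER_KM = {"motorway": 0.05, "city": 0.11}   # invented pricing scheme

      journeys = [                                        # stays on the on-board computer
          {"road": "motorway", "km": 120.0},
          {"road": "city", "km": 14.5},
      ]

      def monthly_report(log: list[dict]) -> dict:
          """Compute the fee locally; report only the total and a commitment."""
          fee = sum(TARIFF_PER_KM[j["road"]] * j["km"] for j in log)
          digest = hashlib.sha256(json.dumps(log, sort_keys=True).encode()).hexdigest()
          return {"fee_eur": round(fee, 2), "commitment": digest}

      report = monthly_report(journeys)  # only this summary leaves the vehicle
      print(report)

      # In a dispute the driver reveals the log; checking it against the stored
      # commitment makes the local fee computation auditable after the fact.
      revealed = hashlib.sha256(json.dumps(journeys, sort_keys=True).encode()).hexdigest()
      assert revealed == report["commitment"]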

  51. The ‘operator’ is defined by the Commission Recommendation as “the natural or legal person, public authority, agency, or any other body, which, alone or jointly with others, determines the purposes and means of operating an application, including controllers of personal data using an RFID application”. It must be underlined again that this concept designates a category of persons broader than ‘data controllers’ and might definitely cover RFID information system or RFID terminal producers.

  52. Emphasis added. See our reflections in the conclusions.

  53. On “Privacy Impact Assessment”, see R. Clarke, “Privacy Impact Assessment: Its Origins and Development”, Computer Law & Security Review 25 (2009): 123 ff. This article provides in two appendices a list of exemplars of PIA documents and references to guidelines describing different PIA methodologies.

  54. On that point, we would like to pinpoint R. Brownsword’s approach: “Potentially, however, even this degree of plurality could be destabilizing. It is critical, therefore, that members of a community of rights not only agree on the general shape of their ethical commitments but also agree upon the processes that will be employed to resolve their disagreements. In other words, the community of rights needs to develop a political-legal framework, orientated towards the community’s basic ethics, that facilitates the provisional settlement of the community’s differences.” (in Rights, Regulation and the Technological Revolution (New York: Oxford Univ. Press, 2008), 292).

  55. About intrusive software, read: http://www.clubic.com/actualite-21463-phishing-et-spyware-les-menaces-pesantes-de-2005.html.

  56. See the amendments proposed to Article 5.3, commented as follows: “Software that surreptitiously monitors actions of the user and/or subverts operations of the user’s terminal equipment for the benefit of a third party poses a serious threat to users’ privacy. A high and equal level of protection of the private sphere of users needs to be ensured, regardless of whether unwanted spying programmes are inadvertently downloaded via electronic communications or are delivered and installed in software distributed on other external data storage media, such as CDs, CD-Roms or USB keys.”

  57. In our opinion, it is far from legally established that cookies have to be considered per se as ‘personal data’, since that information refers to equipment and not to an individual, except through other data linked to the cookies. It is quite interesting to underline that, speaking about cookies and other spyware, Article 5.3 of Directive 2002/58 speaks about ‘information’ (a broader concept) and not about ‘personal data’. See already the comments made about the qualification of IP addresses as ‘personal data’, supra No. 11.
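    The distinction drawn here between ‘information’ and ‘personal data’ can be illustrated with a toy sketch; the cookie value and the stores are invented. Alone, the cookie designates a browser on a machine; it starts pointing at a person only once other data, such as a login event, links it to one.

      cookie_store = {"cookie-9f3a": {"device": "browser-on-laptop", "visits": 17}}
      login_events = {}   # cookie value -> account, filled only if the user logs in

      def is_linkable_to_person(cookie_id: str) -> bool:
          """The cookie alone refers to equipment; identification needs other data."""
          return cookie_id in login_events

      print(is_linkable_to_person("cookie-9f3a"))       # False: information about a machine
      login_events["cookie-9f3a"] = "user@example.org"  # linkage via other data
      print(is_linkable_to_person("cookie-9f3a"))       # True: now indirectly identifying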

  58. BVerfG, 1 BvR 370/07 vom February 27, 2008, Absatz-Nr. (2008): 1–267, http://www.bverfg.de/entscheidungen/rs20080227_1bvr037007.html. About this decision, see G. Hornung, “Ein neues Grundrecht”, Computer und Recht (2008): 299 ff. and G. Hornung and C. Schnabel, “Data Protection in Germany II: Recent Decisions on On-line Searching of Computers, Automatic Number Plate Recognition and Data Retention”, Computer Law & Security Review 25 (2009): 114 ff. See also on that decision, in the present book, the contribution signed by R. Bendrath, G. Hornung, and A. Pfitzmann, “Surveillance in Germany: Strategies and Counterstrategies”.

  59. About that judgement, read P. De Hert, K. de Vries, and S. Gutwirth, “La limitation des « perquisitions en ligne » par un renouvellement des droits fondamentaux, note sous Cour constitutionnelle allemande 27 février 2008”, Revue du Droit des Technologies de l’Informatique 34 (2009): 87–93.

  60. The Court enumerates different reasons why technical protection against these intrusions is not sufficient to protect citizens: “Information technology systems have now reached such a degree of complexity that effective social or technical self-protection leads to considerable difficulties and may be beyond the ability of at least the average user. Technical self-protection may also entail considerable effort or result in the loss of the functionality of the protected system. Many possibilities of self-protection—such as encryption or the concealment of sensitive data—are also largely ineffective if third parties have been able to infiltrate the system on which the data has been stored. Finally, it is not possible in view of the speed of the development of information technology to reliably forecast the technical means which users may have to protect themselves in future”.

  61. “The encroachment may take place regardless of location, so that space-oriented protection is unable to avert the specific endangerment of the information technology system. Insofar as the infiltration uses the connection of the computer concerned to form a computer network, it leaves spatial privacy provided by delimitation of the dwelling unaffected. The location of the system is in many cases of no interest for the investigation measure, and frequently will not be recognisable even for the authority. This applies in particular to mobile information technology systems such as laptops, Personal Digital Assistants (PDAs) or mobile telephones.”

  62. M.R. Konvitz, “Privacy and the Law: A Philosophical Prelude”, Law and Contemporary Problems 31 (1966): 272, 279–80.

  63. Copland v. U.K., 62617/00 (ECHR, April 3, 2007).

  64. M. Cornelis et al., “Miauce, Deliverable D5.1.2. Ethical, Legal and Social Issues”, available online on the Miauce website, www.miauce.org.

  65. On the comparison between legal and technological normativities, read M. Hildebrandt, “Legal and Technological Normativities: More (and Less) Than Twin Sisters”, Techné 12 (3) (2008): 169 ff.

  66. The same problematic is explored in V. Mayer-Schönberger, “Demystifying Lessig”, Wisconsin Law Review 4 (2009): 714 ff.

  67. Certain arguments have been proposed in Y. Poullet and A. Rouvroy, “General Introductory Report”, in Ethical Aspects of the Information Society, Conference Organized Jointly by UNESCO and the Council of Europe, Strasbourg (November 15, 2007), text available on the UNESCO website.

  68. On that point, see precisely the European Parliament resolution condemning the French HADOPI system, which introduced the possibility of blocking an Internet user’s access in case of repeated illegal copying.

Author information

Correspondence to Yves Poullet.


Additional information

Thanks to Karen Rosier, lawyer and researcher at the CRID, for the fruitful discussions and her precious advice.


Copyright information

© 2010 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Poullet, Y. (2010). About the E-Privacy Directive: Towards a Third Generation of Data Protection Legislation?. In: Gutwirth, S., Poullet, Y., De Hert, P. (eds) Data Protection in a Profiled World. Springer, Dordrecht. https://doi.org/10.1007/978-90-481-8865-9_1
