
Biometric Data, Data Protection and the Right to Privacy

Chapter in Privacy and Data Protection Issues of Biometric Applications

Part of the book series: Law, Governance and Technology Series (LGTS, volume 12)

Abstract

Are biometric data always personal data? To this day, this question is raised repeatedly, as is the question whether biometric data are ‘sensitive’ data. The chapter provides answers based on a detailed analysis of the concepts from a national and European law perspective. The guidance of the Article 29 Working Party on these questions is critically reviewed, also taking into account the opinions WP192 and WP193 of 2012. The author also suggests a definition of biometric data, indispensable for framing later proposals. Biometric data are further compared with biological material and DNA, for which more detailed regulation exists. Many expect that DNA analysis will soon be possible in ‘real time’ as well. The author concludes by sketching the privacy and data protection framework, both now proclaimed as fundamental rights in the EU Charter, into which biometric data processing must fit. Varying national approaches surface as an omen of diverging interpretations and guidance for processing biometric data.


Notes

  1.

    Data protection legislation in this work refers in general to the legislation that has emerged since 1995 in EU Member States implementing Directive 95/46/EC and regulating the processing of personal information.

  2.

    Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, O.J. L 281, 23.11.1995, pp. 31–50, also available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:1995:281:0031:0050:EN:PDF. About the Proposals for Reform, see § 396 below.

  3.

    It should be noted that in some EU countries, data protection principles and legislation already existed long before these Directives. See, for example, the data protection legislation enacted in France in 1978. Other examples of ‘early’ data protection laws are the legislation in the German state of Hesse (1970, the world’s first ‘modern’ data protection legislation), Sweden (1973) and the federal data protection legislation in Germany (1977). Such legislation was later modified where needed to implement the Directive. For an overview of the implementation of the Directive in the 27 Member States, see European Commission, Status of implementation of Directive 95/46/EC on the Protection of Individuals with regard to the Processing of Personal Data, previously available at http://ec.europa.eu/justice_home/fsj/privacy/law/implementation_en.htm

  4.

    Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, O. J. L 201, 31.07.2002, pp. 37–47. Directive 2002/58/EC replaced Directive 97/66/EC of the European Parliament and of the Council of 15 December 1997 concerning the processing of personal data and the protection of privacy in the telecommunications sector, O. J. L 24, 30.01.1998, pp. 1–8, and was amended by Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 (O. J. L 337, 18.12.2009, pp. 11–36), introducing inter alia the obligation to notify personal data breaches (see Article 2). Article 3 (as amended) on the services concerned states: ‘This Directive shall apply to the processing of personal data in connection with the provision of publicly available electronic communications services in public communications networks in the Community, including public communications networks supporting data collection and identification services’. Biometric data could be included in such publicly available electronic communications services, for example in an identity management service available to the public for secure logging into particular online services (compare with an application such as, e.g., OpenID). The general data protection principles, however, will in such a case remain applicable as well. As for Directive 95/46/EC, we will not analyze in further detail all the obligations for publicly available electronic communications services under Directive 2002/58/EC.

  5.

    See, e.g., the development of data protection principles in Ibero-America during the last decade, with strong involvement of the Spanish Data Protection Agency. See also the APEC Privacy Principles, developed in the Asia-Pacific region, discussed in § 190 below. The fact that Directive 95/46/EC requires an ‘adequate level of protection’ for the transfer of personal data to third countries has undoubtedly had the effect that some third countries have adopted data protection legislation similar to that of the European Union in order to facilitate transborder personal data transfers (see, e.g., Switzerland).

  6.

    For treatises explaining the obligations under the Directive 95/46/EC, see, e.g., Ch. Kuner, European Data Protection Law. Corporate Compliance and Regulation, Oxford, Oxford University Press, 2007, 552 p. (‘Kuner, European Data Protection Law, 2007’); R. Jay, Data protection: law and practice, London, Sweet and Maxwell, 2007, 1025 p. (‘Jay, Data Protection, 2007’); see also L. Bygrave, Data Protection Law. Approaching its rationale, logic and limits, The Hague, Kluwer Law International, 2002, 426 p. (‘Bygrave, Data Protection Law, 2002’). For treatises discussing the Belgian data protection obligations, see, e.g., D. De Bot, Verwerking van persoonsgegevens, Antwerpen, Kluwer, 2001, 403 p. (‘De Bot, Verwerking Persoonsgegevens, 2001’); B. Docquir, Le droit de la vie privée, Brussels, De Boeck-Larcier, 2008, 354 p. (‘Docquir, Vie Privée, 2008’) and Graux and Dumortier, Privacywetgeving in de praktijk, 2009.

  7.

    We refer to inter alia A. Albrecht, BioVision. Privacy Best Practices in Deployment of Biometric Systems, BioVision, 28 August 2003, 49 p.; Article 29 Data Protection Working Party, Working Document on Biometrics, WP80, 1 August 2003, 11 p., discussed below in §§ 193–204; see also E. Kindt and J. Dumortier, Summary of legal data protection requirements for the processing of biometric data, European Biometrics Portal, September 2005, 35 p., previously available at http://www.europeanbiometrics.info

  8.

    See Art. 1.1 of the Directive 95/46/EC.

  9.

    Reference is made to, e.g., biometric data in public commercially available research databases, such as fingerprint databases made available for competitions on the performance of algorithms. On this particular use of biometric data, see Part III.

  10.

    A token is, e.g., a smart card or a Universal Serial Bus (USB) storage token. A USB token has the advantage that it offers high storage capacity and that most PCs have USB interfaces, which eliminates the cost and availability issues of dedicated readers.

  11.

    The concept was not fully new. Various OECD Member countries had already introduced privacy protection laws. The German and the French data protection legislation, for example, had already introduced the concept, in respectively the German Federal Data Protection Act (‘Bundesdatenschutzgesetz’ or ‘BDSG’) of 1977 (for an English translation, see http://www.iuscomp.org/gla/statutes/BDSG.htm) and the French general data protection Act N° 78-17 of 6 January 1978 relating to informatics, files and liberties (‘Loi n° 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers et aux libertés’).

  12.

    Council of the OECD, Recommendation concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal data, 23 September 1980, available at http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html; for a retrospective, see Organisation For Economic Co-Operation And Development, Thirty Years after the OECD Privacy Guidelines, Paris, OECD, 2011, 111 p. (‘OECD, Thirty Years after 2011’).

  13.

    The United States was leading in data processing services and feared overly restrictive regulation of transborder data flows. See also E. Kindt, The escape of transborder data flow of a global legal framework. An analysis of the policies. A search for an efficient global legal framework, Athens (GA, USA), Law Library UGA, 1988, p. 79.

  14.

    See e.g., H. Lowry, ‘Transborder Data Flow: Public and Private International Law Aspects’, Houston Journal of International Law, 1984, (159–174), p. 166: ‘As the reader can see, very little of this information is about individuals. Most transborder data flows are by organizations and about their operations. Privacy plays a very minor part of the import and export of this type of information. Certainly some data, such as payroll or personnel files, should be protected. But often privacy is just a convenient club with which to beat to death the freedom to exchange information’ (emphasis added).

  15.

    Council of Europe, ETS No. 108, Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, 28 January 1981, available at http://conventions.coe.int/Treaty/EN/Treaties/HTML/108.htm; the Convention No. 108 was adopted in the same month as the 1980 OECD Guidelines, but was not opened for ratification until 1981; see also OECD, Thirty Years after 2011, p. 20.

  16.

    See P. Miller, ‘Teleinformatics, Transborder Data Flows and the Emerging Struggle for Information: An Introduction to the Arrival of the New Information Age’, Columbia Journal of Law and Social Problems, 1986, (89-14), p. 120.

  17.

    Article 1 (b) of the Annex.

  18.

    Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data [CETS No. 108] (T-PD), Progress report on the application of the principles of convention 108 to the collection and processing of biometric data, Strasbourg, Council of Europe, CM(2005)43, March 2005 (‘Council of Europe, Progress report, 2005’), available at https://wcd.coe.int/ViewDoc.jsp?Ref=CM(2005)43&Language=lanEnglish&Site=COE&BackColorInternet=DBDCF2&BackColorIntranet=FDC864&BackColorLogged=

  19.

    Council of Europe, The need for a global consideration of the human rights implications of biometrics, Doc. 12522, Parliamentary Assembly, 16.02.2011, 15 p., available at http://assembly.coe.int/nw/xml/XRef/Xref-XML2HTML-en.asp?fileid=13103&lang=en (‘Council of Europe, The need for a global consideration of the human rights implications of biometrics, 2011’).

  20.

    Regulation (EC) No 45/2001 of the European Parliament and of the Council of 18 December 2000 on the protection of individuals with regard to the processing of personal data by the Community institutions and bodies and on the free movement of such data, O.J. L8, 12.01.2001, pp. 1–22. This Regulation applies to all biometric data processing by the institutions (e.g., the Commission or the European Parliament) and bodies of the Union. Because the Regulation is similar to Directive 95/46/EC, it will not be discussed separately in this work.

  21.

    We will refer later to some opinions of the EDPS in the domain of biometric data processing.

  22.

    In the original proposal, it was explained that as in Convention No. 108, a broad definition was adopted ‘in order to cover all information which may be linked to an individual’. See the commentary on Article 2 in the Commission’s original proposal at COM(90) 314 final, 13.9.1990, p. 19. This position, meeting Parliament’s wish, was also taken into account by the Council and was maintained throughout the legislative process (see COM (92) 422 final, 10.1992, p. 10 and Common Position (EC) No 1/95, adopted by the Council on 20 February 1995, O.J. C 93, 13.04.1995, p. 20 (‘The common position also takes on board Parliament’s idea of adopting a broad concept for implementing protection, (…)’).

  23.

    For an overview in some selected countries, see Kuner, European Data Protection Law, 2007, no 2.82; see also more recently, on diverging implementation of basic concepts, D. Korff, Comparative Study on Different Approaches to new privacy challenges, in particular in the light of technological developments, Working Paper N° 2: Data protection laws in the EU: the difficulties in meeting the challenges posed by global social and technical developments, 20 January 2010, Brussels, European Commission, 120 p., (‘Korff, New Challenges to Data Protection. Working Paper N° 2, 2010’), available at http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_working_paper_2_en.pdf

  24.

    See, for example, the interpretation of personal data by the English courts in the case of Durant v. Financial Services Authority. The Court of Appeal (Michael John Durant v. Financial Services Authority [2003] EWCA Civ 1746 (‘the Durant case’)) refused in its decision of 8.12.2003 the request of Mr. Durant for disclosure of unredacted computerized documents and manual records held by Barclays Bank in a dispute. This decision of the Court of Appeal included the argument that the information did not affect Durant’s privacy because the information (1) did not go beyond the recording of the individual’s involvement in a matter or an event that has no personal connotations and (2) did not have the individual as its focus, but rather some other person with whom he may have been involved or some transaction or event in which he may have figured or have had an interest (emphasis added). The Court therefore concluded that the information did not ‘relate to’ the individual concerned and did not qualify as ‘personal data’. Mr. Durant later brought his case before the European Court of Human Rights, alleging violations of inter alia Article 8 ECHR (see below). See and compare the Durant case with two other cases, Council of the European Union v. Hautala and Commission v. Bavarian Lager, decided in respectively 2001 and 2010 by the Court of Justice (see Part II, Chap. 5, § 229), which equally involved access to documents. In the latter case, the Court decided (reversing the decision of the General Court) that the refusal of the European Commission to disclose the names of five persons mentioned in documents generated by the Commission upon an access demand by Bavarian Lager was correct, since such disclosure would have been an ‘actual and certain threat to privacy’ for which the condition of necessity for transferring the data, justifying the interference, was not established (about the conditions for interference, see Part II).

  25.

    See, for example, about the interpretation of personal data in the banking sector, K. Van Raemdonck, ‘De invloed van de wet van 8 december 1992 ter bescherming van de persoonlijke levenssfeer t.o.v. de verwerking van persoonsgegevens op de banksector’, R.W. 1997, (897), p. 902. The author states – in our view erroneously – that the processing of the payment details of a banking card by a vendor is not a ‘processing of personal data’, since the vendor only knows the card number and cannot identify the user of the card because the vendor has no details of the user (unlike the bank which issued the card). In our opinion, however, the vendor does process personal data to the extent the card user is identifiable with reasonable means by a third party, i.e. the bank, and the information hence relates to a person who is at least indirectly identified (see also below).

  26.

    See, for example, UK Information Commissioner, The Durant Case and its impact on the interpretation of the Data Protection Act 1998, 27 February 2006. The UK Information Commissioner stated in this guidance document – based on the holdings in the Durant case – that a determination of whether information constitutes personal data depends on whether it affects an individual’s ‘privacy’ and on whether it might have an adverse impact on the individual (see p. 2). This position, however, was criticized and questioned, also by the European Commission.

  27.

    One of these debates is, e.g., whether the processing of Internet Protocol (IP) address data should be considered as the processing of personal data. The views on this issue diverge considerably.

  28.

    One of the (few) examples, is the Lindqvist case of the European Court of Justice (ECJ, C-101/01), decided on 6 November 2003. This case gave guidance on the interpretation of personal data processing in relation to textual information uploaded to the Internet and data concerning health (see also below § 255).

  29.

    See APEC, APEC Privacy Framework, 2005, 40 p., available at http://www.apec.org/Groups/Committee-on-Trade-andInvestment/~/media/Files/Groups/ECSG/05_ecsg_privacyframewk.ashx. APEC Member Economies include Australia, Canada, China, Japan and the United States.

  30.

    ‘Relevance’ in computer science, particularly searching, is a score assigned to a search result, representing how well the result matches the search query. See Merriam-Webster’s Online Dictionary, at http://www.merriam-webster.com/
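To make the idea of such a relevance score concrete, here is a deliberately naive sketch in Python. The scoring function and the sample documents are invented for illustration; real search engines use far more sophisticated measures (e.g., TF-IDF or BM25).

```python
# A naive illustration of a search "relevance" score: the fraction of
# query terms that appear in a document. This only shows the idea of a
# score representing how well a result matches a query.

def relevance(query: str, document: str) -> float:
    """Return the fraction of query terms found in the document (0.0 to 1.0)."""
    query_terms = set(query.lower().split())
    doc_terms = set(document.lower().split())
    if not query_terms:
        return 0.0
    return len(query_terms & doc_terms) / len(query_terms)

docs = [
    "biometric data and data protection law",
    "cooking recipes for beginners",
    "fingerprint templates in biometric systems",
]

# Rank documents by descending relevance to the query.
query = "biometric data protection"
ranked = sorted(docs, key=lambda d: relevance(query, d), reverse=True)
```

Here the first document matches all three query terms (score 1.0), the second matches none (score 0.0), so the ranking puts the first document on top.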

  31.

    R. Van Kralingen, C. Prins en J. Grijpink, Het lichaam als sleutel. Juridische beschouwingen over biometrie, Alphen aan den Rijn/Diegem, Samsom BedrijfsInformatie Bv, 1997, pp. 31–33 (‘Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997’); in the same sense, P. Kolkman and R. van Kralingen, ‘Privacy en nieuwe technologie’, in J. Berkvens and J. Prins (eds.), Privacyregulering in theorie en praktijk, Deventer, Kluwer, 2007, (395), p. 410. In Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, the authors stated in a footnote, that if such data would be considered personal data, one should wonder whether such off-line verification processing is a relevant processing for the application of the data protection legislation. They hereby probably intended to say that such processing may fall outside the scope of this legislation. This was later argued in a report published by the Dutch DPA (see § 231). With ‘off-line verification’, the authors refer to a comparison with the template stored on the chip card.

  32.

    See M. Rejman-Greene, ‘Privacy Issues in the Application of Biometrics: a European Perspective’, in J. Wayman, A. Jain, D. Maltoni, D. Maio (eds.), Biometric systems: Technology, Design, and Performance Evaluation, New York, Springer, 2005, (335), pp. 344–345 (‘Rejman-Greene, Privacy Issues, in Wayman, Jain, Maltoni, Maio, Biometric Systems, 2005’).

  33.

    The Article 29 Working Party was established according to Article 29 of the Directive 95/46/EC and may inter alia, at its own initiative, make recommendations on all matters relating to data protection. The Article 29 Working Party is composed of a representative of the national DPAs designated by each Member State, a representative of the European Data Protection Supervisor established for the Union institutions and bodies, and a representative of the Commission (see Art. 29.2 Directive 95/46/EC). The Article 29 Working Party is to act independently.

  34.

    One of the cases relating to the processing of biometric data which was decided in 2008 by the European Court of Human Rights is S. and Marper v. U.K., discussed below. Some other cases relating to biometric databases were initiated before national courts as well, in particular with regard to the central storage of fingerprint data for the ePassport (see Part III, Chap. 7). Some of these cases have been decided while others are pending. These cases may provide in their (final) decisions more clarification on this issue.

  35.

    See Article 29 Data Protection Working Party, Working Document on Biometrics, WP80, 1 August 2003, 11 p. (‘WP 29 Working Document on Biometrics 2003 (WP80)’); Article 29 Data Protection Working Party, Opinion 3/2012 on developments in biometric technologies, WP193, 27 April 2012, 34 p. (‘WP 29 Opinion on developments in biometric technologies 2012 (WP193)’) and Article 29 Data Protection Working Party, Opinion 02/2012 on facial recognition in online and mobile services, WP192, 22 March 2012, 9 p. (‘WP 29 Opinion on facial recognition 2012 (WP192)’). The WP 29 Opinion on facial recognition 2012 (WP192) will be further discussed in § 286 et seq. below and Part III, Chap. 7, §§ 157–162.

  36.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 5.

  37.

    The Art. 29 WP stated it as follows: ‘It appears that biometric data can always be considered as “information relating to a natural person” as it concerns data, which provides, by its very nature, information about a given person’. This phrase in exactly the same words is also mentioned in the At Face Value report (p. 35) published by the Dutch DPA in 1999. Other phrases of the At Face Value report have been used in the WP 29 Working Document on Biometrics 2003 as well (see and compare, e.g., on p. 5 and in footnote 12 of the WP 29 Working Document on Biometrics 2003 (WP80)).

  38.

    See, e.g., Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data [CETS No. 108] (T-PD), Progress report on the application of the principles of convention 108 to the collection and processing of biometric data, Strasbourg, Council of Europe, CM(2005)43, March 2005, no 51 and 52, (‘Council of Europe, Progress report of application of the Convention to biometric data, 2005’), available at https://wcd.coe.int/ViewDoc.jsp?Ref=CM(2005)43&Language=lanEnglish&Site=COE&BackColorInternet=DBDCF2&BackColorIntranet=FDC864&BackColorLogged=. In this report, the Committee stated that ‘as soon biometric data are collected with a view to automatic processing there is the possibility that these data can be related to an identified or identifiable individual’. It made, however, some other arguments as well, such as that the circumstances of the collection alone (i.e., time and place of collection) ‘always reveal information about the data subject being the source of the biometric data’.

  39.

    The Article 29 Working Party uses the terms ‘images’, ‘samples’ and ‘raw data’ in order to refer to the (captured) biometric samples in a system as compared to templates. As stated above, the term ‘raw data’ or ‘raw biometric samples’ should be avoided. We use the term (captured) (biometric) samples as explained in Chap. 2, § 98. The term ‘image’ refers in our view to the digital or analog representation of biometric characteristics, whether used in a biometric system or not.

  40.

    See S. Callens, Goed geregeld? Het gebruik van medische gegevens voor onderzoek, Antwerpen – Apeldoorn, Maklu, 1995, p. 32, no 14, where the author discussed the definition of ‘data’. Because of the advent of computers, this definition of data has gradually been adapted (‘Callens, Goed geregeld? 1995’). On the description of ‘data’, see also J. Dumortier, ‘Privacybescherming en gegevensverwerking’, Vlaams Jurist Vandaag 1993, pp. 6–7.

  41.

    Term 37.03.22 ISO Vocabulary for Biometrics 2012. See also above. ‘Template’ could be translated as ‘sjabloon’ (Dutch) or ‘gabarit’ (French). ‘Sjabloon’ is defined in the general dictionary Van Dale (1999) as ‘1. (…) 2. (in figurative sense) conventional model, standardized figure’.

  42.

    New algorithms are continuously being developed to achieve better comparison results. For example, in fingerprint feature extraction, research continues into whether pores of the skin could be useful in the feature extraction.

  43.

    See and compare also with the position held in 2000 by biometric experts in the United States on the qualification of templates: ‘The numerical features generally do not have a clear and direct mapping to any physiological characteristics and, consequently, contain no personal information’. J. Wayman, ‘A Definition of “Biometrics”’, in J. Wayman (ed.), National Biometric Test Center Collected Works 1997–2000, San Jose State University, 2000, (21), p. 23, available at http://www.engr.sjsu.edu/biometrics/nbtccw.pdf

  44.

    As described in Chap. 2, law enforcement authorities in European Union Member States are now starting to use automated systems as well. About this evolution towards automation, see also Wayman, Jain, Maltoni, Maio, Biometric systems, 2005, pp. 27–33. It is therefore possible that template comparison will increasingly be used.

  45.

    Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, p. 25. This argument, which at that time may have had some ground, no longer holds because of the increasing standardization and interoperability of biometric processing methods.

  46.

    Ibid., pp. 25–26.

  47.

    Ibid., p. 26. We will find in Part II that the use of templates, rather than samples, will be considered by the DPAs to be an important criterion for the proportionality of the processing of personal data.

  48.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 5.

  49.

    Ibid., p. 5, footnote 11 of the document. In this phrase of the footnote, the WP 29 Working Party seems to refer to the possibility for the controller to use reasonable means (‘(…) in a way that no reasonable means can be used (…)’).

  50.

    CBPL, Advies 17/2008 uit eigen beweging over het verwerken van biometrische gegevens in het raam van authenticatie van personen, Advies No 17/2008, 9.4.2008, p. 8 (‘CBPL, Opinion N° 17/2008 biometric data’). About this opinion in more length, see Part II, Chap. 5, § 381 et seq.

  51.

    For example, would it be sufficient that the comparison is ‘off-line’? See also above § 192.

  52.

    It is possible that this type of storage was discussed and considered by the Art. 29 Working Party. The Dutch DPA had previously published a report stating that if the biometric data were scanned and compared on a chip card held by the data subject, or if the template was stored in a decentralized way (on a chip card) and the sensor for the capture and the other equipment could be trusted, whereby only the comparison decision (yes or no) is communicated by the system, the biometric data processed and stored in such a way could be considered as processed for personal use and hence as falling outside the scope of Directive 95/46/EC. See also below.
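The decentralized (‘match-on-card’) scheme described in this report can be sketched as follows. The feature vectors, the distance measure and the threshold below are invented for illustration and stand in for a real biometric comparison algorithm; the point is only that the stored template never leaves the card and that only the yes/no decision is communicated.

```python
# A toy sketch of decentralized biometric verification: the enrolled
# template stays on the data subject's chip card; the outside system
# receives only the comparison decision (yes or no).
# The vectors and the threshold are invented for illustration.

def match(stored_template: list, fresh_sample: list, threshold: float = 0.5) -> bool:
    """Compare two feature vectors; return only the yes/no decision."""
    distance = sum((a - b) ** 2 for a, b in zip(stored_template, fresh_sample)) ** 0.5
    return distance <= threshold

# Template enrolled on the data subject's chip card.
on_card_template = [0.12, 0.80, 0.45]

# A fresh capture close to the enrolled template is accepted,
# while a distant one is rejected; in neither case does the
# template itself leave the card.
accepted = match(on_card_template, [0.10, 0.82, 0.44])   # True
rejected = match(on_card_template, [0.90, 0.10, 0.99])   # False
```

In such a setup the controller outside the card only ever sees the boolean decision, which is what the Dutch DPA's report relied on for its ‘personal use’ reasoning.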

  53.

    We will explain in Part III that the terminology of ‘anonymous’ biometric data and ‘untraceable biometrics’ is misleading and explain under which specific conditions biometric data could be used in an anonymous way for the service providers.

  54.

    This will also be discussed below in Part II, Chap. 4, Sect. 4.3.

  55.

    For example, by comparison of the templates with templates obtained from e.g., latent samples collected by law enforcement or samples from large-scale public databases.

  56.

    On this ‘availability’ as risk, see also below Part II, Chap. 4, Sect. 4.3.

  57.

    However, this may exist in the case of the use of research databases. See, about this particular type of use of biometric data, Part III. The argument of the Article 29 Working Party in its recent Opinion 02/2012 on the use of templates in a categorisation system to escape from the definition of personal data does not convince either, because of the many assumptions it requires. The Article 29 Working Party stated: ‘A template or set of distinctive features used only in a categorisation system would not, in general, contain sufficient information to identify an individual. (…) In this case it would not be personal data provided the template (or the result) is not associated with an individual’s record, profile or the original image (…)’ (emphasis added) (see p. 4). Moreover, the use in a categorisation system does not prevent linking the templates with individuals, which will in most cases remain the purpose of the processing.

  58.

    A. Cavoukian and M. Snijder, A Discussion of Biometrics for Authentication Purposes: The Relevance of Untraceable Biometrics and Biometric Encryption, July 2009, 7 p. (‘Cavoukian and Snijder, Untraceable biometrics 2009’).

  59.

    See Part III, Chap. 7, §§ 107–108.

  60.

    For example, because the keys are held by a so-called ‘trusted third party’ who will not release the keys. In that case, the use of the encrypted templates, if decryption is necessary for the comparison, will have to be ‘entrusted’ to the trusted third party. Such a trusted third party could in some schemes also be a certified electronic device.

  61.

    It is useful to note that a similar discussion about key-coded data has taken place in relation to the encoding of personal data in clinical trials. The bottom line that has emerged over the years is that trial data should be considered personal data for those who may have access to the key, whether the sponsor, the clinical research organizations or the investigator. But in some countries, e.g., the Netherlands, key-coded data may still not be considered personal data for the sponsor (see the NEFARMA code of conduct which expired in 2007). See on this subject K. Van Quathem, ‘Controlling personal data in clinical trials’, P&I 2009, pp. 74–79; see and compare also the opinion 4/97 of the Belgian DPA on the National Cancer Registry. The DPA concluded that the Registry was, for the organization, not subject to the data protection legislation because the data are processed in encoded form and the identity of the patients is only known to the doctor; see also Linklaters, Data protected, 2008. For this reason, the revision of the Data Protection Directive 95/46/EC (see also below) should in our opinion pay attention to and solve more explicitly this remaining discussion as to whether encoded personal data are personal data (or not), in order to achieve harmonization.
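The dependence of identifiability on who holds the key can be illustrated with a minimal sketch of key-coded data, assuming a scheme in which subject identifiers are pseudonymized with a keyed hash (HMAC). Whoever holds the key can re-derive a subject's code and thus link records; without the key, the codes alone do not reveal the identifier. The function names, identifiers and key are hypothetical, not from the chapter or any code of conduct.

```python
# A minimal sketch of key-coded data: identifiers are replaced by
# codes derived with a keyed hash. The key holder can reproduce and
# link codes by recomputation; others cannot invert them.
import hashlib
import hmac

def encode_subject(secret_key: bytes, subject_id: str) -> str:
    """Derive a stable pseudonymous code from a subject identifier."""
    return hmac.new(secret_key, subject_id.encode(), hashlib.sha256).hexdigest()

key = b"held-only-by-the-investigator"  # hypothetical key

# The key holder always derives the same code for the same subject,
# and can therefore link records: for them, coded data stay personal data.
code_1 = encode_subject(key, "patient-42")
code_2 = encode_subject(key, "patient-42")
assert code_1 == code_2

# A party without the key sees only opaque hexadecimal codes; it can
# neither recompute them nor invert the hash to recover "patient-42".
```

The sketch mirrors the legal bottom line described above: the same coded dataset is personal data for the key holder and, arguably, not (directly) identifying for a party with no access to the key.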

  62.

    The Opinion 03/2012 recommends such ‘protected templates’. On the concept of ‘protected templates’ and the recommendation of the Article 29 Working Party to use such templates, see below Part III.

  63.

    Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data, WP 136, 20 June 2007, 26 p. (‘WP 29 Opinion personal data 2007 (WP136)’ or ‘Opinion 4/2007’).

  64.

    Ibid., p. 6. This is to be distinguished from the obligation to process only accurate data.

  65.

    ‘Nature’ is a separate criterion used by the Article 29 Working Party. This criterion, however, could also have been included in the notion of ‘content’ of the information.

  66.

    Images of individuals captured by a video surveillance system are explicitly given as an example of personal data to the extent the individuals are recognizable. WP 29 Opinion personal data 2007 (WP136), pp. 6–7.

  67.

    WP 29 Opinion personal data 2007 (WP136), pp. 7–8. Note that in the past there has been some confusion as to the criteria for video images to qualify as personal data. See, e.g., the changing opinions of the Belgian DPA on this issue, mentioned below in § 288.

  68.

    Ibid., p. 8.

  69.

    About the risks of the use of biometric data as unique identifiers, see below.

  70.

    The Article 29 Data Protection Working Party makes a distinction between sources of information, such as a fingerprint or human tissue samples, and the information extracted from them. The reason is that the Article 29 Working Party does not consider, e.g., cell material or human tissue samples, to be personal data to which the Directive 95/46/EC applies. In the same sense, see also the summary of the Dutch DPA about its opinion in Registratiekamer, Ontwerpbesluit DNA-onderzoek in strafzaken, 17.02.2000 (‘Registratiekamer, DNA-onderzoek in strafzaken, 2000’), also available at http://www.cbpweb.nl/Pages/adv_z1999-1201.aspx

  71.

    The storing of blood samples, and the possibility that DNA information may in the future be obtained from these samples by automated means (see also below), also render the distinction ‘dangerous’.

  72.

    The Article 29 Data Protection Working Party considers the second building block ‘crucial’. In our view, however, the third factor is far more important, as it determines whether the information is personal or not. The reason why ‘relating to’ may have been considered important may well be the discussion in the Durant case, mentioned above, which was ongoing at the time the opinion was rendered.

  73.

    WP 29 Opinion personal data 2007 (WP136), p. 10.

  74.

    E.g., health related data in the file of a patient of a health care institution.

  75.

    WP 29 Opinion personal data (WP136), p. 10. The example is given of call logging data from a telephone at the premises of a company, used by employees but also by the cleaning staff to give notice when they are leaving the premises. The recording of the calls of this last category of persons relates to these persons because of the purpose and use of the recording.

  76.

    See also the example on impact given by the Article 29 Data Protection Working Party of a GPS monitoring system in taxis. The example illustrates not only that the information collected may have an impact, but in the first place that information is provided about where the taxi driver is driving and whether he or she respects the speed limits.

  77.

    The two additional criteria of ‘purpose’ and ‘result’ may also be useful to determine to whom information relates if the information relates to different persons. This is important in order to apply the provisions of the data protection legislation, such as the right of access.

  78.

    The qualification of profiles under data protection legislation is particularly interesting. See on this aspect W. Schreurs, M. Hildebrandt, E. Kindt and M. Vanfleteren, ‘Chapter 13. Cogitas, Ergo Sum. The Role of Data Protection Law and Non-discrimination Law in Group Profiling in the Private Sector’, in M. Hildebrandt and S. Gutwirth (eds.), Profiling the European Citizen. Cross-Disciplinary Perspectives, Springer, 2008, pp. 241–270 (‘Schreurs, Hildebrandt, Kindt and Vanfleteren, Cogitas, Ergo Sum 2008’). On the use of profiling and behavioral profiling, see further Part II, Chap. 4, Sect. 4.1 below.

  79.

    In fact, ‘identified’ and ‘identifiable’ are understood by the Article 29 Data Protection Working Party as referring to whether there are sufficient identifiers to single out a particular person.

  80.

    WP 29 Opinion personal data 2007 (WP136), p. 13. In the opinion, the Article 29 Data Protection Working Party gives the example that a name may not be sufficient to identify a person from the whole of a country’s population, while this may well be possible for a pupil in a class.

  81.

    If the data subject gives a false name (e.g., criminals often adopt various aliases), the individual may no longer be directly identified or identifiable.

  82.

    WP 29 Opinion personal data 2007 (WP136), p. 13. A reference to someone’s name could also directly identify a person, e.g., the abbreviations of names used in a work-related context.

  83.

    Art. 2 (a) Directive 95/46/EC.

  84.

    WP 29 Opinion personal data 2007 (WP136), p. 13.

  85.

    Other pieces of information could also include the presentation to the system of a newly submitted sample from the data subject suspected of being the identifiable person. If the system renders a positive decision, it is reasonable to say that (it is likely that) the previously stored biometric information, with which the newly submitted sample is compared, belongs to the same person.

  86.

    This is to some extent also the discussion going on about the use of ‘anonymous’ and ‘untraceable’ biometric data. About the debate and these terms, see Part III, Chap. 7, §§ 107–108.

  87.

    The phrase ‘all the means likely reasonably to be used to identify a person’ in fact contains two components: (i) all the means and (ii) likely reasonably to be used. As already mentioned, this phrase is now included in the definition itself in the Proposals for Reform. For the complete wording, see above § 198.

  88.

    Ibid., p. 15.

  89.

    The wording used in the Opinion may however be confusing. It is stated that the aforementioned factors ‘should all be taken into account’. We interpret this, however, as meaning that each of these factors may be taken into account, without all factors needing to be present at the same time. The alternative would not be logical, as, for example, organizational dysfunction may not occur while it remains possible to identify persons on the basis of the biometric data stored.

  90.

    On the architecture of biometric systems, see Part II Chap. 4, Sect. 4.2 and Part III Chap. 7, §§ 71–73.

  91.

    On Eurodac, see above Chap. 2, §§ 143–144.

  92.

    Article 4.6 Eurodac Regulation 2000 juncto Article 15 Dublin Convention, the latter being replaced by Article 21 Dublin II Regulation 2003. For Eurodac, only the fingerprint data (and some limited accompanying data, such as the date of taking the prints) are stored in a central database (Article 5 Eurodac Regulation 2000). Additional identifying information is kept by the Member States. We would therefore qualify Eurodac as a distributed system. The use of the central database containing the fingerprints in Eurodac and the access to the additional information is further organized by the aforementioned Regulations.

  93.

    See, e.g., the plans of the EU Commission to establish ‘smart borders’ at the EU frontiers. See also above § 160.

  94.

    See WP 29 Opinion personal data 2007 (WP136), p. 15. The Article 29 Working Party, however, seems to state that the data will become personal data only as of the moment the technology exists (‘However, if they are intended to be kept for 10 years, the controller should consider the possibility of identification that may occur also in the 9th year of their lifetime, and which may make them personal data at that moment’).

  95.

    See also Council of Europe, Progress report of application of the Convention to biometric data, 2005, p. 24, § 103.

  96.

    See also below footnotes 291–293 and Part II, Chap. 4, § 39.

  97.

    Also note that pictures in the profiles of social network users have been given a more prominent role, e.g., in the Facebook profile pages redesigned from the end of 2010 onwards. This was a few months before face recognition technology was made available to (European) users.

  98.

    Compare, e.g., with the discussion about the qualification of search engine terms as personal data.

  99.

    WP 29 Opinion personal data 2007 (WP136), p. 15. It is also stated that a person is not identifiable ‘in case that possibility does not exist or is negligible’. Ibid., p. 15.

  100.

    E.g., the high budgets of cities for the installation of intelligent camera surveillance systems, the 1 billion U.S. dollar project of the FBI for setting up a database with multiple biometric characteristics (see above § 164), the 1.2 billion U.S. dollar eID project of the Indian government (see above § 168), …

  101.

    E.g., the advantages of rendering face recognition technology widely available to social network users.

  102.

    The ‘processing’ of personal data has become a well-established concept in law. The term is very broad and was defined in the Directive 95/46/EC as ‘any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction’ (Article 2(b)).

  103.

    See WP 29 Opinion personal data 2007 (WP136), p. 15.

  104.

    In a general dictionary, ‘identification’ is described as ‘(1) to identify (2) (in particular) establishment of the identity of a person’ (Van Dale, 13th edition).

  105.

    In an IT context, identifiability is defined from an attacker’s perspective as meaning that an attacker can ‘sufficiently identify the subject within a set of subjects’. See A. Pfitzmann and M. Hansen, Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management – A Consolidated Proposal for Terminology (Version v.0.31, Febr. 15, 2008), term 13.2, p. 28 (‘Pfitzmann and Hansen, Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management’), available at http://dud.inf.tu-dresden.de/literatur/Anon_Terminology_v0.31.pdf. In the same context, entities are ‘identified’ by identifiers (see also the repeated use of ‘identifiers’ in this context, WP 29 Opinion personal data 2007 (WP136), p. 12). Biometric data are such an identifier.

  106.

    ‘Distinguished’ was also the term used in the explanation about biometric data as personal data in the WP 29 Working Document on Biometrics 2003 (WP80): ‘In the context of biometrical identification, the person is generally identifiable, since the biometric data are used for identification or authentication/verification at least in the sense that the data subject is distinguished from any other’ (p. 5).

  107.

    Identification is here understood as in the context of IT systems.

  108.

    This is in principle not the aim of an identity management system, except in very specific situations (e.g., request of judicial authorities for information about users of specific identifiers (such as an IP number)). Identification would then in principle refer to revealing the identifier used in the system.

  109.

    It could also reveal information about the capacity or the role of a person (e.g., being a customer of shop A, being an employee, being a friend, etc.). The group to which one belongs could also be such information.

  110.

    See WP 29 Opinion personal data 2007 (WP136), p. 14. See also the mentioning of ‘online identifier’ in the definition in the Proposals for Reform as mentioned above.

  111.

    It could be argued that this way of identification refers only to the identification of a partial identity or a particular role of a person. This however does not exclude that an individual is distinguished within a certain group of persons.

  112.

    For example, by controllers having obtained publicly available research databases for testing purposes.

  113.

    See, e.g., the experience of the granting of access to large-scale databases such as VIS to law enforcement authorities.

  114.

    The exact reason for this exemption in the footnote is, as far as we know, not well documented. One possible reason could be the discussion held in relation to biometric data stored exclusively on a personal token, without any reference to a name, held under the control of the data subject. We refer to this discussion below in § 231.

  115.

    Whether it is possible to use biometric data ‘anonymously’ is an issue of data minimization (rather than an issue whether the biometric data are in such case personal data). As already stated, this will be further analyzed and discussed below.

  116.

    Such techniques (e.g., encryption, protected templates, etc.) have been developed in the field of biometric data, arguably to exclude identification, but in fact they rather enhance privacy by limiting identification possibilities. For this reason, these techniques will be discussed in the context of the recommendations in Part III. As these techniques do not render biometric data non-personal data, protected templates will not be discussed here.

  117.

    Article 3.2 Directive 95/46/EC. Compare with Article 2.2 of the Proposal for General Data Protection Regulation (see below), in particular ‘national security’ instead of ‘public security’ (art. 2.2.a) and the slightly modified wording ‘without any gainful interest in the course of its own exclusively personal or household activity’ (emphasis added). This new wording in the latter part does not change our analysis. There is, however, a difference between national and public security (see Part II). We further do not mention here the processing of data otherwise than by automatic means which do not form part of a filing system, since for our research only biometric data processed by automated means are considered.

  118.

    See the decision of the Court of Justice in the PNR case, which explicitly acknowledged the limited field of the Directive: ECJ, C-317/04 and C-318/04, European Parliament v. Council of the European Union and Commission of the European Communities, 30.05.2006, ECR 2006, p. I-4721, § 59 (‘ECJ, PNR case 2006’); see also, e.g., H. Hijmans, ‘Recent developments in data protection at European Union level’, ERA Forum, 2010, (219) p. 224, also published online on 1.07.2010 at http://www.springerlink.com/content/55v28703k0401j06/ (‘Hijmans, Recent developments, 2010’); E. Brouwer, P. De Hert and R. Saelens, ‘Ontwerp-Kaderbesluit derde pijler holt bescherming persoonsgegevens uit’ in P&I 2007, p. 9 (‘Brouwer, De Hert and Saelens, Ontwerp-Kaderbesluit, 2007’).

  119.

    For a succinct overview of this integration and changes since 1995, see, e.g., K. Lenaerts and P. Van Nuffel, Europees recht, 2011, Antwerpen, Intersentia, pp. 217–240 (‘Lenaerts and Van Nuffel, Europees recht, 2011’).

  120.

    See P. De Hert and C. Riehle, ‘Data protection in the area of freedom, security and justice. A short introduction and many questions left unanswered’, ERA Forum, 2010, pp. 159–167, also published online on 13.07.2010 at http://www.springerlink.com/content/u6566750w5954262/ (‘De Hert and Riehle, Data Protection, 2010’).

  121.

    Public security should however not be confused with public safety, mentioned in Article 8, § 2 ECHR as a ground for interference with fundamental rights. See below and Part III.

  122.

    Council Framework decision 2008/977/JHA of 27 November 2008 on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters, O.J. L 350, 30.12.2008, pp. 60–71 (‘Framework decision 2008/977/JHA’). About this Framework decision, see also, e.g., De Hert and Riehle, Data Protection, 2010, pp. 162–164 and Brouwer, De Hert and Saelens, Ontwerp-Kaderbesluit, 2007. The authors list the many shortcomings of this Framework decision. At the same time, it does not exclude that some data protection rules and principles are included ad hoc in regulatory instruments, e.g., for the Prüm cooperation. About the Proposals for Reform 2012, see below footnote 570.

  123.

    The decision and the control over the means (choice of the system) and the purposes of the processing of the data, however, shall remain with the property owner. In case the system is connected with a central unit operated by security services, e.g., in a gated community, the decisions may no longer be taken by the property owner (alone) but (also) by third parties. We recommend a careful reading of the service contract in that case.

  124.

    For example, to allow the car to be started, or for customization purposes (e.g., of the seats).

  125.

    This conclusion is in our view acceptable. However, that the same should be concluded if the laptop or mobile phone is owned and exclusively used by one and the same professional for his or her professional activities may not be desirable. It would imply that, e.g., this professional one-person company owning and using this equipment shall respect the obligations (e.g., the notification obligation, …) under the data protection legislation. The revision of the Directive 95/46/EC may want to address this specific case where personal (or biometric) data are kept under the control and use of one and the same individual (about the control element, see also Part III).

  126.

    See, e.g., the face recognition software developed and demonstrated at the Mobile World Congress trade show in 2010 by two Swedish companies, Polar Rose and The Astonishing Tribe, which can be used on the Internet. Google has also considered making face recognition technology available, after the launch of a similar product by Face.com. See M. Palmer, ‘Face recognition software grows’, Financial Times, 21.5.2010, available at http://www.ft.com/cms/s/2/143bedaa-64fa-11df-b648-00144feab49a,stream=FTSynd.html. In June 2011, Facebook made face recognition technology available to all its users. On further developments and the consequences of the latter case, see also footnote 293 below and Part III.

  127.

    In case the pictures are stored in central or distributed systems, the individual most likely no longer has control over the data. The Polar Rose solution was targeted at ‘web-service providers, social networks, carriers and other companies with photo repositories’. The Swedish face recognition company was later acquired by Apple in 2010.

  128.

    Match-on-card technology (‘MOC’) refers to technology whereby the comparison process takes place within the card (or token), and whereby the biometric data do not have to leave the card and be sent to other (trusted or untrusted) biometric system components for the comparison process.
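    For readers with a technical background, the MOC principle can be sketched as follows. This is a toy model only, not actual smartcard code: the class name, the bit-vector templates and the Hamming-distance comparison with a fixed threshold are illustrative assumptions chosen for simplicity.

```python
# Illustrative sketch of the match-on-card principle: the enrolled
# reference template is kept inside the "card" object, and only the
# yes/no comparison decision is ever returned to the host system.

class Card:
    """Toy model of a card that never discloses its stored template."""

    def __init__(self, template, threshold=2):
        self._template = template    # never leaves the card
        self._threshold = threshold  # max tolerated bit differences

    def verify(self, sample):
        """Compare a freshly captured sample inside the card."""
        # Hamming distance between stored template and new sample.
        distance = sum(t != s for t, s in zip(self._template, sample))
        return distance <= self._threshold


# The host system only ever sees the boolean decision:
card = Card([1, 0, 1, 1, 0, 0, 1, 0])
print(card.verify([1, 0, 1, 1, 0, 0, 1, 1]))  # close sample: True
print(card.verify([0, 1, 0, 0, 1, 1, 0, 1]))  # very different sample: False
```

    The design point the footnote makes is visible in the code: the template is internal state of the card, and the surrounding (possibly untrusted) system components receive only the decision.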

  129.

    R. Hes, T. Hooghiemstra and J. Borking, At Face Value. On Biometrical Identification and Privacy, Achtergrond Studies en Verkenningen 15, The Hague, Registratiekamer, September 1999, pp. 36–37 (‘Hes, Hooghiemstra and Borking, At Face Value, 1999’), available at http://www.cbpweb.nl/Pages/av_15_At_face_value.aspx. In the report, reference was made (in a footnote) to the approach taken by some Dutch authors in case of the processing of biometric data in a similar setting, whereby these authors concluded that no personal data were processed. It was stated that it was preferable to conclude in such case that the biometric data processing was outside the scope of the Directive 95/46/EC.

  130.

    About the importance of the concept of control of the data subject over his or her personal data, see Part III.

  131.

    This concept, however, remains undefined.

  132.

    As stated above, if a natural person would decide to install a biometric system for access control to his or her private house to be used by him or her and family members and eventually house personnel, such processing remains ‘in the course of a purely personal or household activity’ and falls outside the scope of the Directive 95/46/EC.

  133.

    Hes, Hooghiemstra and Borking, At Face Value, pp. 52–53.

  134.

    For an identity or access management application, the service provider will need to know to whom access has been granted.

  135.

    A ‘zero-knowledge protocol’ is a cryptographic technique which does not leak information about the secret to be protected. One will hence not learn the secret itself, but one can learn something else, derived from the secret, that is unique for this secret and thus allows one to distinguish different secrets. About the advantages of the use of such methods, see also Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, p. 32.
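    The point made here can be illustrated in a strongly simplified form. A true zero-knowledge protocol involves interactive challenges and randomness; the hash-based derivation below is only a stand-in showing that a verifier can learn a value derived from the secret, unique to it and therefore usable to distinguish secrets, without learning the secret itself. The function name and the choice of SHA-256 are illustrative assumptions, not part of any protocol discussed in the text.

```python
# Simplified illustration (not a real zero-knowledge protocol): a one-way
# value derived from a secret distinguishes secrets without revealing them.
import hashlib

def derived_tag(secret: bytes) -> str:
    # SHA-256 is one-way: the tag cannot practically be inverted back to
    # the secret, but equal secrets always yield equal tags.
    return hashlib.sha256(secret).hexdigest()

# A verifier comparing tags can tell two secrets apart without ever
# seeing the secrets themselves:
print(derived_tag(b"template-A") == derived_tag(b"template-A"))  # True
print(derived_tag(b"template-A") == derived_tag(b"template-B"))  # False
```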

  136.

    What the authors may have intended to say could be that a certain level of protection is obtained because the biometric data are not communicated, and the individual therefore stays ‘anonymous’ to the extent that the biometric data cannot be used to identify him or her.

  137.

    The other data mentioned are personal data revealing ‘political opinions, religious or philosophical beliefs, trade-union membership,’ and ‘data concerning sex life’ (Article 8 Directive 95/46/EC). The term ‘sensitive data’ or ‘sensitive personal data’, however, is presently not used in the Directive 95/46/EC as such. Nevertheless, the concept is frequently deployed by legal authors in order to refer to all or some of the aforementioned kinds of data. See, e.g., De Bot, Verwerking Persoonsgegevens, 2001, pp. 136–139; M.-H. Boulanger, C. de Terwangne, Th. Léonard, S. Louveaux, D. Moreau and Y. Poullet, ‘La protection des données à caractère personnel en droit communautaire’, Journal des tribunaux droit européen 1997, (121), p. 148. Sometimes, the term is also used in national data protection legislation (see, e.g., Swedish Personal Data Act (‘SFS 1998:204’), 29 April 1998, Article 13 et seq., available at http://www.government.se/content/1/c6/01/55/42/b451922d.pdf).

  138.

    Article 8(5) of the Directive 95/46/EC.

  139.

    Since the research is restricted to the use of biometric data in the private sector, excluding an in-depth review of the processing of biometric data in the public sector and for law enforcement purposes, the use of biometric data, such as pictures taken at the time of detention of a convict, is not further part of the analysis.

  140.

    See, for example, J. Bing, ‘Classification of personal information, with respect to the sensitivity aspect’, in Data Banks and Society, Proceedings of the First International Oslo Symposium on Data Banks and Society, Oslo, Scandinavian University Books, 1972, pp. 98–150 (‘Bing, Classification of personal information, 1972’); see also e.g., K. Mc Cullagh, Data Sensitivity: resolving the conundrum, 22nd BILETA Conference, 2007, 15 p., available at http://works.bepress.com/karen_mccullagh/10/; R. Turn, ‘Classification of personal information for privacy protection purposes’, in AFIPS ’76 Proceedings of the June 7–10, 1976, National Computer Conference and Exposition, New York, ACM, 1976, pp. 301–307, available at http://portal.acm.org/citation.cfm?id=1499799&picked=prox&CFID=8583698&CFTOKEN=72455603

  141.

    See also M. Simitis, Les données sensibles revistées, 1999, p. 7 (‘Simitis, Les données sensibles revistées, 1999’), available at http://www.coe.int/t/f/affaires_juridiques/coop%E9ration_juridique/protection_des_donn%E9es/documents/Rapports%20et%20%E9tudes%20des%20experts/1Report_Simitis_1999_fr.pdf

  142.

    See below Part III, Chap. 7, § 63. Furthermore, if a data subject is physically or legally incapable of giving his consent, and the processing is necessary to protect the vital interests of the data subject or another person, the processing of ‘sensitive data’ is also allowed (Article 8 (2) (c)).

  143.

    About Art. 8(4) Directive, see also Part III, Chap. 7, § 166 and Chap. 8, § 262 and § 276, where we plead for the adoption of legislation, rather than subjecting controllers to authorizations of supervisory authorities for biometric applications.

  144.

    In Part II, Chap. 4, Sect. 4.1, we will describe in detail with more examples to what extent biometric data could reveal sensitive information.

  145.

    Wet 8 december 1992 tot bescherming van de persoonlijke levenssfeer ten opzichte van de verwerking van persoonsgegevens, B.S., 18.03.1993, pp. 5801–5814 (‘Data Protection Act 1992’).

  146.

    More in particular, it was expressly stated in a ‘report to the King’ upon the enactment of the Royal Decree N° 7 for the fixation of the purposes, the criteria and the conditions of the authorized processing of the data which relate to ‘race, ethnic origin (…)’ in execution of article 6 of the Data Protection Act 1992 that not only so-called ‘direct sensitive data’, but also ‘indirect sensitive data’ fall under the specific protection mechanisms of Article 6 of the Belgian Data Protection Act. See Verslag aan de Koning met betrekking tot het K.B. nr. 7 tot vaststelling van de doeleinden, de criteria en de voorwaarden van toegestane verwerkingen van de gegevens bedoeld in artikel 6 van de wet van 8 december 1992 tot bescherming van de persoonlijke levenssfeer ten opzichte van de verwerking van persoonsgegevens, B.S. 28.02.1995, (4409), pp. 4412–4413 (‘Royal Decree N° 7 of 1995’) (‘Report to the King relating to Royal Decree N° 7 of 1995’).

  147.

    CBPL, Advies nr. 07/93 van 6 augustus 1993 betreffende de verwerking van gevoelige gegevens, in de zin van artikel 6 van de wet van 8 december 1992 tot bescherming van de persoonlijke levenssfeer ten opzichte van de verwerking van persoonsgegevens, B.S. 28.02.1995, (4416), p. 4420 (‘CBPL, Opinion N° 7/93 relating to the processing of sensitive data in the sense of article 6 of the law of 8 December 1992 in relation with the Royal Decree N° 7’).

  148.

    See the ‘Report to the King’ relating to Royal Decree N° 7 of 1995, pp. 4412–4413.

  149.

    Article 6 Data Protection Act 1992 (stating that the processing of such sensitive data ‘is only permitted for the purposes specified by or on the basis of the law’ (emphasis added) (‘slechts door de door of krachtens de wet vastgestelde doeleinden toegestaan’/‘n’est autorisé qu’aux fins déterminées par ou en vertu de la loi’).

  150.

    On the need for regulation by law, as we will argue, see below Part II and Part III, Chap. 8, Sects. 8.1 and 8.2. Moreover, the Data Protection Act 1992 required that various other conditions, which were not set forth in the Directive 95/46/EC, had to be met for the processing of such data. See also Ch. van Oldeneel, ‘Protection de la vie privée. Incidences pratiques de la directive européenne sur le droit belge’, C.J. 1996, (21), p. 23. On the other hand, the initial Data Protection Act 1992 was not as strict as the Directive 95/46/EC, which laid down a prohibition on the processing of sensitive data as a matter of principle.

  151.

    Article 148 of the Act of 21.12.1994, B.S. 23.12.1994. See on this shift and issue also F. Robben, ‘De verwerking van gevoelige en gerechtelijke gegevens en de bescherming van de persoonlijke levenssfeer’, in J. Dumortier and F. Robben (eds.) Persoonsgegevens en privacybescherming. Commentaar op de wet tot bescherming van de persoonlijke levenssfeer, Brugge, Die Keure, 1995, (119), pp. 127–129 (‘Robben, De verwerking van gevoelige en gerechtelijke gegevens, in Dumortier and Robben, Persoonsgegevens en privacybescherming, 1995’).

  152.

    K.B. nr. 14 tot vaststelling van de doeleinden, de criteria en de voorwaarden van toegestane verwerkingen van de gegevens bedoeld in artikel 6 van de wet van 8 december 1992 tot bescherming van de persoonlijke levenssfeer ten opzichte van de verwerking van persoonsgegevens, B.S. 30.05.1996, pp. 14532–14534 (‘Royal Decree N° 14 of 1996 for the fixation of the purposes, the criteria and the conditions of the authorized processing of the data mentioned in article 6 of the Act of 8 December 1992’).

  153.

    Wet 11 december 1998 tot omzetting van de richtlijn 95/46/EG van 24 oktober 1995 van het Europees Parlement en de Raad betreffende de bescherming van natuurlijke personen in verband met de verwerking van persoonsgegevens en betreffende het vrij verkeer van die gegevens, B.S., 03.02.1999, pp. 3049–3065 (‘Act of 1998 modifying the Data Protection Act 1992’).

  154.

    The formulation adopted in the original Article 6 was: ‘personal data concerning race, ethnic origin, (…)’ (‘met betrekking tot ras, etnische afstamming’/‘relatives aux origines raciales ou ethniques, …’) (emphasis added).

  155.

    Memorie van Toelichting bij Wetsontwerp tot omzetting van de Richtlijn 95/46/EC en bij Wetsontwerp tot wijziging van de wet van 8 december 1992, Hand. Kamer 1997–98, pp. 33–34. The legislator clarified in a report to the King relating to Royal Decree N° 14 of 1996 that such deduction shall be ‘direct and imperative’ and interpreted in a strict way (B.S. 30.05.1996, (14515), p. 14515). The legislator stated that it hereby followed the interpretation of the CBPL in its Opinion N° 7/93, but without using the terms ‘directly and indirectly sensitive data’. The example was given of being named in a membership list of a political party, from which one can deduce one’s political conviction with certainty, as opposed to being on a client list of a particular newspaper or for the purchase of a book, from which one could not deduce political beliefs with sufficient certainty. These examples, however, are in our view not very convincing.

  156.

    See also Docquir, Vie Privée, 2008, pp. 208–209, no 488 (‘Si l’on veut bien rappeler par ailleurs que la violation de l’article 6 de la loi est sanctionnée pénalement, ce manque de clarté dans la définition légale est fortement critiquable’) and references mentioned.

  157.

    These consequences are that in case ‘sensitive personal data’ are processed, one of the exemption grounds needs to be present, for example, the written consent (if permitted) of the data subject, or an exemption by law or by the supervisory authority.

  158.

    T. Léonard and Y. Poullet, ‘La protection des données à caractère personnel en plein (r)évolution. La loi du 11 décembre 1998 transposant la directive 95/46/EG du 24 octobre 1995’, J.T. 1999, p. 386, no 36 (‘Léonard and Poullet, La protection des données à caractère personnel en plein (r)évolution, 1999’).

  159.

    Other approaches which have been defended in general in relation to sensitive data include the context-based approach, whereby data become ‘sensitive’ depending on the context in which they are processed. This would, for example, be the approach taken in Germany. See Simitis, Les données sensibles revistées, 1999.

  160.

    De Bot, Verwerking Persoonsgegevens, 2001, p. 141, no 184.

  161.

    See above § 236.

  162.

    Art. 6 § 1 Royal Decree N° 7 of 1995.

  163.

    Report to the King relating to Royal Decree N° 7 of 1995, p. 4428.

  164.

    See below Part II, Chap. 4, §§ 71–91.

  165.

    See, e.g., as a more recent illustration, the opinion of the EDPS on the Turbine project, EDPS, Opinion 1.02.2011 on a research project funded by the European Union under the 7th Framework Programme (FP 7) for Research and Technology Development (Turbine (TrUsted Revocable Biometric IdeNtitiEs)), p. 3 (‘EDPS, Opinion on Turbine, 2011’), available at https://secure.edps.europa.eu/EDPSWEB/webdav/shared/Documents/Consultation/Opinions/2011/11-02-01_FP7_EN.pdf, where the qualification of sensitive data is not used; rather, it is stated that biometric data ‘due to their specific nature, present special risks in their implementation which have to be mitigated’. But: see and compare with other wording used by the EDPS in an opinion a few months later in EDPS, Opinion on a notification for prior checking received from the Data Protection Officer of the European Commission related to the “Fingerprint recognition study of children below the age of 12 years”, 25.7.2011 (Case 2011-0209), p. 4 (‘EDPS, Opinion Fingerprint recognition study, 2011’), cited in Part II, Chap. 4, footnote 263.

  166.

    On the concept of medical data and its broad interpretation under the Data Protection Act 1992 (before its modification), see Callens, Goed geregeld, 1995, pp. 80–90 and the various references mentioned therein.

  167.

    For example, one can deduce from a picture that the data subject has a disability. See De Bot, Verwerking Persoonsgegevens, 2001, p. 154, no 204.

  168.

    See, e.g., in general about the processing of data concerning health and the set up of an ehealth platform, AH (Belgium), N° 15/2008, 14.02.2008.

  169.

    De Bot, Verwerking Persoonsgegevens, 2001, p. 155, no 205.

  170.

    Art. 7 § 2 a) of the Data Protection Act 1992 as modified.

  171.

    K.B. 13 februari 2001 ter uitvoering van de wet van 8 december tot bescherming van de persoonlijke levenssfeer ten opzichte van de verwerking van persoonsgegevens, Art. 27, B.S. 13.03.2001, (7908), p. 7913 (‘Royal Decree of 13 February 2001’). This Royal Decree replaced various previous royal decrees, including the Royal Decree N° 14 of 1996.

  172.

    Art. 7 § 4 and Art. 7 § 5 of the Data Protection Act 1992 as modified.

  173.

    See and compare with the quote of Susan Sontag, an American essayist, saying that we all carry and travel on two passports: one that allows us into the kingdom of the well and another, which we are less inclined to use, that takes us into the realm of the sick. One may wonder whether, with the introduction of the new ePassports, this is no longer merely a metaphor but is becoming reality in a stricter sense.

  174.

    CBPL, Opinion N° 17/2008 biometric data, p. 9, n° 29.

  175.

    Ibid., p. 9, n° 30. See also P. De Hert, Biometrics: legal issues and implications. Background paper for the Institute of Prospective Technological Studies, DG JRC, 2005, Seville, European Commission, p. 17, (‘De Hert, Background paper, 2005’) available at http://cybersecurity.jrc.ec.europa.eu/docs/LIBE%20Biometrics%20March%2005/LegalImplications_Paul_de_Hert.pdf. De Hert states: ‘(…) it may well be that judges and policy makers do not regard biometric data as sensitive data as long as the purpose of the processing is not to identify sensible data.’

  176.

    See, e.g., Council of Europe, Progress report of application of the Convention to biometric data, 2005, pp. 19–20, §74 (‘Council of Europe, Progress report biometric data, 2005’). The report suggests the use of templates, as its authors believe that the choice of data to be extracted in generating a template could and should avoid revealing sensitive data ‘as, in general, these data will not be able to verify the data subject’s identity or identify him or her’. This may not be entirely correct, as precisely such distinguishing information is sought in biometric applications (for example, to improve the functioning or for use for specific purposes (e.g., ethnic profiling)); see also Hes, Hooghiemstra and Borking, At Face Value, p. 39. The position in the report of the Council of Europe may be inspired by the general belief and intention in DNA matters to use only identifying information (see below).

  177.

    The Dutch general data protection legislation is currently set forth in the general Act on Protection of Personal data which entered into force on 1 September 2001 (Wet bescherming persoonsgegevens, Stb. 2000, 302) (‘WBP’). It replaced the previous and first data protection Act (Wet op de Persoonsregistratie, Stb. 1988, 665 (‘Wet op de Persoonsregistratie 1988’ or ‘Wpr’).

  178.

    Besluit Gevoelige Gegevens, 19 February 1993, Stb. 1993, nr. 158 (‘BGG’). In this regulatory act, it was stated that ‘sensitive personal data’ as defined in the WPR could not be registered (‘opgenomen’) other than as provided by a (formal) law or by the BGG.

  179.

    For example, the registration of the nationality or the place of birth is in principle not the registration of ‘sensitive personal data’, but could become ‘sensitive personal data’ if the purpose of the registration is to find out the ethnic origin. See on the BGG also F. Kuitenbrouwer, ‘Een zwak ontwerp-besluit over gevoelige (medische) gegevens’, Tijdschrift voor Gezondheidsrecht 1990, pp. 130–138 (‘Kuitenbrouwer, Een zwak ontwerp-besluit over gevoelige (medische) gegevens, 1990’). Kuitenbrouwer refers to several reports and authors discussing and rejecting the view that the nature rather than the use of the data would determine its sensitivity (e.g., referring to J. Sentrop, Privacy-bescherming in Nederland, Deventer, 1984, p. 61, criticising the choice of nature as the criterion).

  180.

    See G.-J. Zwenne and L. Mommers, ‘Zijn foto’s en beeldopnamen ‘rasgegevens’ in de zin van artikel 126nd Sv en artikel 18 Wbp?’, P&I 2010, (237), p. 237 (‘Zwenne and Mommers, Rasgegevens, 2010’) who cite from the preparatory works, referring to Kamerstukken II, 1997/98, 25 892, nr. 3, p. 101. Facial images on badges of employees were further discussed during the preparatory works as falling under the sensitive data provision. Ibid., p. 237. In another context, it was accepted during parliamentary discussions about minority politics, that data revealing race comprises data about ethnic origin, which include data about the country of birth of the data subject, parents and grandparents. See Kamerstukken II 1996/97, 25 001, nr. 24, Minderhedenbeleid 1997, Brief van de Ministers van Justitie en van Binnenlandse Zaken, 29.04.1997, p. 1. A broad interpretation that external characteristics (deduced on the basis of DNA) may reveal information about health was also maintained in an opinion of the CBP on the use of DNA for the specification of external characteristics of a suspect. See Registratiekamer, Wetsontwerp DNA-onderzoek pers. kenmerken, 22.12.2000, p. 3 (‘Registratiekamer, DNA-onderzoek Pers. kenmerken, 2000’), available at http://www.cbpweb.nl/downloads_adv/z2000-1143.pdf. Further to Art. 8 (4) Directive, Art. 23 1 (e) Wbp provides for exceptions for reasons of ‘substantial public interest’, and several additional explicit legal provisions allow for particular reasons the processing of sensitive data, although not in a way that allows for an overview. See also J. Dierx and D. Ernste, Verwerking van persoonsgegevens over etnische afkomst. Verkenning onder achttien gemeenten, 2010, De Beuk, p. 9.

  181.

    CBP, Richtsnoeren. Publicatie van persoonsgegevens op internet, Den Haag, CBP, 2007, p. 15, available at http://wetten.overheid.nl/BWBR0033572/geldigheidsdatum_15-11-2013, and also published in the official publication gazette (‘Staatscourant’) of 11 December 2007 (‘CBP, Richtsnoeren. Publicatie van persoonsgegevens op internet, 2007’). About these guidelines, see, e.g., J. Berkvens, ‘Richtsnoeren publicatie persoonsgegevens op internet’, Computerrecht 2008, pp. 199–201; see also below § 292 et seq. This approach has been identified as a ‘principle problem’ in a report on the evaluation of the data protection legislation: see G.-J. Zwenne, A.-W. Duthler, M. Groothuis, H. Kielman, W. Koelewijn en L. Mommers, Eerste fase evaluatie Wet bescherming persoonsgegevens. Literatuuronderzoek en knelpuntenanalyse, Leiden, 2007, p. 75, available at http://www.wodc.nl/onderzoeksdatabase/1382a-evaluatie-wet-bescherming-persoonsgegevens-wbp-1e-fase.aspx. For a similar position, where the aim or intention of the controller is taken as criterion, see also EDPS, The EDPS Video-surveillance Guidelines, Brussels, March 2010, pp. 28–29.

  182.

    Hoge Raad, 23.03.2010, LJN BK6331. A request for sensitive data requires a prior authorisation as laid down in art. 126nf of the Dutch Code of Criminal Procedure (see also Part III, Chap. 8, § 283). See also and compare with Murray v. Express Newspapers & Big Pictures (UK) Ltd [2007] EWHC 1908. In this case in the United Kingdom where a photo was taken covertly by a photographer using a long range lens, of a child accompanied by his parents in a public street, it was ‘pleaded fact and therefore a given (…) that the photograph does show the colour of the Claimant’s hair and the colour of his skin’ (§77). According to the claimant, ‘the information conveyed by his image in the photograph does consist of information about his racial or ethnic origin and his physical health precisely because it shows him to be a white Caucasian male child with no obvious physical infirmities or defects’ (§ 78). The High Court thereupon stated that ‘(…) if a photograph and the information it contains constitutes personal data then it is hard to escape from the conclusion that insofar as it indicates the racial or ethnic origin of the data subject it also consists of sensitive personal data’ (emphasis added) (§ 80). The High Court hence confirmed that the photograph constituted in its view sensitive personal data but did not follow the argument about the health condition on the basis that a picture ‘of an apparently healthy individual in fact tells one nothing about his actual state of health’.

  183.

    About this case, and (extensive) case law from lower courts which do not follow the reasoning of the Supreme Court, see Zwenne and Mommers, Rasgegevens, 2010.

  184.

    Hoge Raad, 3.03.2009, LJN BG9218. In this case, the lower court considered mere identity data of persons who had requested medical treatment at a hospital as indirectly providing information about health, and therefore as sensitive data, which was not incorrect.

  185.

    Hes, Hooghiemstra and Borking, At Face Value, p. 39. Confusing terminology relating to the processing steps of the biometric data was used and the images in this case were called ‘initial templates’. See on this point and on this opinion also below, Part II.

  186.

    Registratiekamer, Biometrisch toegangscontrolesysteem VIS 2000, 19.03.2001, p. 7 (‘Registratiekamer, Discopas opinion 2001’), also available at http://www.cbpweb.nl/downloads_uit/z2000-0080.pdf

  187.

    See Article 18(a) WBP.

  188.

    J. Prins and J. Berkvens, ‘De Wet bescherming persoonsgegevens’ in J. Berkvens and J. Prins (eds.), Privacyregulering in theorie en praktijk, Deventer, Kluwer, 2007, p. 36, footnote 50. Although this aspect of data processing is important, it was only mentioned in a footnote.

  189.

    See e.g., C. Prins, ‘Making our body identify for us: Legal implications of biometric technologies’, Computer Law & Security Report, 1998, p. 162 (‘Prins, Making our body identify for us, 1998’); J. Holvast, ‘Elektronische overheid’, in J. Berkvens and J. Prins (eds.), Privacyregulering in theorie en praktijk, Deventer, Kluwer, 2007, p. 120.

  190.

    Loi n° 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers et aux libertés, as amended in 2004 and 2009.

  191.

    Criminal sanctions are imposed in case of unlawful processing of sensitive data. See also Part III, Chap. 9, § 384, § 543 and § 569. About the processing of sensitive data in France, see also D. Korff (ed.), Comparative Study on Different Approaches to new privacy challenges, in particular in the light of technological developments. Country studies. A.3 France, May 2010, pp. 11–14. Korff, e.g., mentions that the French DPA can allow the processing of certain categories of sensitive data for certain purposes for a “brief period” after which they are anonymised (p. 14).

  192.

    See TGI Privas, 3.09.1997, Expertises 1999, no 213, p. 79. See also F. El Atmani, ‘Données sensibles: la notion de consentement de la personne concernée’, Lamy droit de l’informatique 1996, N° 86, pp. 1–11. About the French DPA: see CNIL, La CNIL et l’e-santé. Note de synthèse, 8.03.2001, 3 p., available at http://www.cnil.fr/fileadmin/documents/approfondir/dossier/sante/e_sante.pdf

  193.

    The four Member States which will be mentioned became EU Member States in 2004.

  194.

    Article 4 (b) of the Personal Data Protection Act N° 101/2000 of 4 April 2000 on the Protection of Personal Data and on Amendment to Some Acts (‘Czech Republic Personal Data Protection Act N° 101/2000 of 4 April 2000’), of which an unofficial translation of the coordinated version in English is available at http://ec.europa.eu/justice/policies/privacy/docs/implementation/czech_republic_act_101_en.pdf

  195.

    Article 6 (19) of the Personal Data Protection Act (ZVOP-1) of the Republic of Slovenia, as amended (‘Republic of Slovenia Personal Data Protection Act (ZVOP-1)’), of which an unofficial translation in English is available at http://www.ip-rs.si/index.php?id=339. The Republic of Slovenia Personal Data Protection Act ZVOP-1 was adopted on 15 July 2004 and came into force on 1 January 2005. The wording is however unclear; for example, how should one understand ‘in connection with’?

  196.

    Act N° 428/2002 Coll. On the Protection of Personal Data, as amended by the Act No. 602/2003 Coll., Act No. 576/2004 Coll and the Act No. 90/2005 Coll. (‘Slovak Republic Act N° 428/2002 Coll. On the Protection of Personal Data’), of which an unofficial translation in English is available at http://www.dataprotection.gov.sk/buxus/docs/act_428.pdf

  197.

    § 4, (2), 5) of the Personal Data Protection Act of 12 February 2003, as amended (‘Estonia Personal Data Protection Act of 12 February 2003’), available at http://www.legaltext.ee/et/andmebaas/tekst.asp?loc=text&dok=XXXX041&keel=en&pg=1&ptyyp=RT&tyyp=X&query=isikuandmete+kaitse

  198.

    ECJ, C-101/01, Bodil Lindqvist, 6.11.2003, ECR 2003, p. I-12971, § 50 (‘ECJ, Lindqvist, 2003’). The questions submitted for preliminary ruling, however, did not relate to the interpretation of other ‘sensitive personal data’. It would especially have been interesting to have clarified whether the listing of the names could constitute ‘data revealing religious beliefs’.

  199.

    ECtHR, Z. v. Finland, no. 22009/93, 25 February 1997, §§ 95–96 and § 113 (‘Z. v. Finland 1997’). See and compare with ECtHR, I. v. Finland, 20511/03, 17 July 2008, where Finland was considered to have been in breach of Article 8, §1 of the Convention because of its failure to ensure that medical data of the applicant were adequately secured against unauthorized access (see §§ 35–49) (‘I. v. Finland 2008’). See also Part III, Chap. 8, § 332.

  200.

    ECtHR, S. and Marper v. United Kingdom, nos. 30562/04 and 30566/04, 4 December 2008, § 103 (‘S. and Marper 2008’).

  201.

    Ibid., §120.

  202.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 10.

  203.

    Ibid., p. 10.

  204.

    This conclusion has come forward in various reports and opinions on biometric data processing. See, e.g., Hes, Hooghiemstra and Borking, At Face value report, 1999, p. 39; Registratiekamer, Discopas opinion 2001, p. 7; the WP 29 Working Document on Biometrics 2003 (WP80), p. 10 and, more generally about sensitive data, the Council of Europe, Progress report of application of the Convention to biometric data, 2005, p. 19.

  205.

    See also Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, pp. 34–35, where the skin color of the face is mentioned as being sensitive information.

  206.

    See De Hert, Background paper, 2005, p. 36. However, no references are made to literature or reports which would confirm this.

  207.

    For research in the medical field, see, e.g., H. Maricq, ‘“Ethnic” Differences in the Fingerprint Data in an “All White” Control Sample’, Human Heredity 1972, pp. 573–577. The research indicated that the frequency of whorls in fingerprint of specific ethnic (sub)groups contained statistically significant differences; H. Swofford, ‘Fingerprint Patterns: A Study on the Finger and Ethnicity Prioritized Order of Occurrence’, from the Journal of Forensic Identification 2005, available at https://www.ncjrs.gov/App/publications/Abstract.aspx?id=210510; S. Globe, ‘Which Fingerprint patterns is Most Common: Arch, Whorl, or Loop?’, Project summary, available at http://www.usc.edu/CSSF/History/2003/Projects/J1005.pdf. About fingerprint revealing gender, see e.g., M. Nithin, B. Manjunatha, D. Preethi and B. Balaraj, ‘Gender differentiation by finger ridge count among South Indian population’, Journal of Forensic and Legal Medicine 2011, pp. 79–81. See also Part II, Chap. 4, §§ 72–78 and the references mentioned there. Characteristics, other than face, which are sometimes not well pronounced for specific ethnic groups, may reveal origin as well, such as the very fine finger ridge structure of Asian people, and may lead to increased failures to capture. See about this issue, e.g., J. Schneider, Ultrasonic Sensors for Capturing Lifescan of Fingerprint Biometrics, p. 9, available at http://www.ultrascan.com/Portals/16/Schneider%20%20Ultrasonic%20Fingerprint%20Sensors.pdf

  208.

    S. Cole, ‘The myth of fingerprints: The Legacy of Forensic Fingerprinting and Arrestee Databases’, GeneWatch 2006, Number 6, pp. 3–6.

  209.

    See also De Hert, Background paper, 2005, p. 17.

  210.

    Other information that is included in biometric characteristics is randotypic information (also called phenotypic, without genetic parts), behavioral information and information about unchanging marks (e.g., scars, but also chronic disease). M. Bromba, On the reconstruction of biometric raw data from template data, 2003 (first release), (‘Bromba, On the reconstruction of biometric raw data, 2003’), available at http://www.bromba.com/knowhow/temppriv.htm. Randotypic information is completely random and behavioral information is completely determined by training. Bromba states that these four types of information are usable and used for authentication purposes in biometric systems.

  211.

    See M. Bromba, ‘What factors contribute to a biometric characteristic’s development?’, available at http://www.bromba.com/faq/biofaqd.htm#entstehen

  212.

    Ibid.

  213.

    See also the Council of Europe, Progress report of application of the Convention to biometric data, 2005, p. 20: ‘The precautionary principle demands that where new techniques may uncover unexpected new information one should be reticent to start with systems where there can be reasonable doubt that in the long run unwanted and possibly irreversible side effects may appear’.

  214.

    Ibid., pp. 19–20.

  215.

    Biological characteristics may indeed also contain information concerning health. Almost all characteristics have the potential to contain information concerning health. This will be further explained in detail in Part II. This can also partly be explained because several health related problems have genetic causes and may therefore be contained in the genotypic information of the characteristics.

  216.

    CBPL, Opinion biometric data and authentication, p. 9.

  217.

    See Bromba, On the reconstruction of biometric raw data, 2003.

  218.

    See, for work in this area, S. Lu and A. Jain, ‘Ethnicity identification from face images’ in Proceedings SPIE Defense and Security Symposium, Orlando, April 2004, 10 p., available at http://www.cse.msu.edu/biometrics/Publications/Face/LuJain_EthnicityFace_SPIE04.pdf

  219.

    See, e.g., Bromba, On the reconstruction of biometric raw data, 2003; Adler, A., ‘Sample images can be independently restored from face recognition templates’, in Electrical and Computer Engineering 2003, IEEE CCECE 2003. Canadian Conference, pp. 1163–1166, also available at http://www.sce.carleton.ca/faculty/adler/publications/2003/adler-2003-ccece-restore-face-recognition-templates.pdf (‘Adler, Sample images can be independently restored from face recognition templates, 2003’); see also C. Hill, Risk of Masquerade Arising from the Storage of Biometrics, 2001, Australian National University, 116 p. on the reconstruction of a similar biometric artefact from fingerprint templates equivalent to the original biometric data provided to the system. For iris image reconstruction, see Part II, Chap. 4, footnote 304.

  220.

    See, e.g., Registratiekamer, Discopas opinion 2001, p. 7.

  221.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 10. The Article 29 Working Party takes in its Opinion 3/2012 now the more clear view that ‘the template should be a one-way process, in that it should not be possible to regenerate the raw biometric data from the template’ (emphasis added) (p. 4). See also Part III, Chap. 8, footnote 248.

  222.

    The Council of Europe, Progress report of application of the Convention to biometric data, 2005, pp. 19–20; see also Article 29 Data Protection Working Party, Opinion on Implementing the Council Regulation (EC) No 2252/2004 of 13 December 2004 on standards for security features and biometrics in passports and travel documents issued by Member States, WP 112, 30 September 2005, p. 8 (‘WP 29 Opinion on Implementing Regulation No 2252/2004 (WP 112)’). In this opinion on the biometric ePassport, the Article 29 Working Party stated ‘(…) governmental institutions and other public authorities will be able to collect and store a huge number of sensitive information about their citizens. In this context it should be particularly pointed out that collecting biometric features means collecting data of the body of a person’. The Article 29 Working Party has in 2012 stated that some biometric systems reveal health information and process sensitive data (see Part II, Chap. 4, § 86).

  223.

    Term 37.03.06 ISO Vocabulary for Biometrics 2012. Note however that this term is defined in the adopted version more precisely as ‘biometric sample or aggregation of biometric samples at any stage of processing, e.g., biometric reference, biometric probe, biometric feature or biometric property’ while the SD2 Version 12 – Harmonized Biometric Vocabulary proposed the same definition which did however not contain ‘e.g.’. Based on the definition of the other terms (e.g. features are not the same as or an example of samples), we believe that adding the word ‘e.g.’ may lead to confusion. About the Vocabulary, see also Chap. 2, § 97.

  224.

    A biometric model is the stored function generated from the biometric data, such as a Gaussian mixture model for speaker recognition (see term 37.03.13 ISO Vocabulary for Biometrics 2012).

  225.

    The definition has been expanding over the years when working on this document. Compare, e.g., ISO/IEC JTC 1/SC 37, Standing Document 2Harmonized Biometric Vocabulary, version 5, N 1480, New York, ANSI, 31 January 2006, working draft text, 54 p. and the SD2 Version 12 – Harmonized Biometric Vocabulary, the latter about 200 pages.

  226.

    Term 37.03.11 ISO Vocabulary for Biometrics 2012. An example of biometric features are, e.g., the numbers or labels for the minutiae or the patterns of a fingerprint.

  227.

    Term 37.03.15 ISO Vocabulary for Biometrics 2012.

  228.

    In particular circumstances, however (e.g., a very small group), such distinctive characteristics could also be used to single a person out of a group of persons, which is sometimes referred to as identifying someone. We referred to these characteristics, which are universal (‘common’) but not unique or sufficiently distinctive for (identity) verification, as ‘soft biometric characteristics’ (see Chap. 2, §§ 82–83).

  229.

    P. van Hengel, Detecting verbal aggression in a real-life inner city environment, 31 January 2008, Werkgemeenschap voor Informatie- en Communicatietheorie, conference ‘Public Safety’, Eindhoven, previously available at http://www.sas.el.utwente.nl/wic2008mwm/PresentatieVanHengel.pdf

  230.

    Such profiling could also be done by the use of soft biometric data (for example, length, …). On the use of soft biometric data for profiling purposes, see Kindt, Need for Legal Analysis of Biometric Profiling. Reply, in Hildebrandt and Gutwirth, Profiling the European Citizen, 2008, pp. 139–144; for a critical view, see also I. van der Ploeg, ‘Chapter 3. Normative Assumptions in Biometrics: On Bodily Differences and Automated Classifications’, in S. van der Hof and M. Groothuis (eds.), Innovating Government. Normative, policy and technological dimensions of modern government, The Hague, Asser, 2011, pp. 29–40. However, not only soft biometric data could be used for profiling purposes. Facial images, e.g., from which gender or age is deduced, could also be used for profiling, e.g., for targeted advertisement purposes. See, e.g., Immersive Labs, a company that builds software for digital billboards, registering faces of customers, for delivering tailored messages in real-time (see home page at http://immersivelabs.com/). For applications for tailored services in bars based on profiling, see, e.g., SceneTap, claiming to use ‘anonymous facial detection technology’ (see home page http://www.scenetap.com/) (about the – in our view- contradiction of terms, see Part III). In these cases, however, no claim is made by the data subject or controller. But: see WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 5–6, which seems to include the use of biometric systems for categorisation/segregation purposes as well.

  231.

    Soft biometric data would hence only be included as biometric data in our definition in so far as the data are of a biological nature and are fit for or actually used for the purposes of identification or (identity) claim verification. See and compare with, e.g., tattoos, as discussed below in § 276.

  232.

    The term ‘biometrics’ generally encompasses the counting, the measuring and the analysis of any kind of data in the biological and medical sciences, e.g., of particular sizes (length, weight, etc.) of newborns. See and compare also with term 37.01.03 ISO Vocabulary for Biometrics 2012. ‘Biostatistics’ is another general term (sometimes also referred to as biometrics) which is different from the processing of biometric data as we will define it. It is about the application of statistics to (a wide range of topics in) biology (and most commonly to medicine).

  233.

    See also the SD2 Version 12 – Harmonized Biometric Vocabulary where biometric data was further described in a note as ‘any form of information (numerical value, statistic, type, class, etc.) obtainable from biometric characteristic for the purpose of using ‘biometric system” (p. 56).

  234.

    See also OECD, Biometric-based technologies, 2004, p. 10.

  235.

    Such attendant could assist the data subject during enrolment or for later comparison or attend the system for security reasons. An attendant would from a legal point of view be considered an agent or personnel of the controller or the processor, acting on behalf of the controller.

  236.

    E.g., wrinkles may presently not be a common (distinctive) characteristic or feature used in biometric systems, but future use thereof may not be excluded.

  237.

    A. Cavoukian, Biometrics and Policing: Comments from a Privacy Perspective, Information and Privacy Commissioner, Ontario, Canada, 1999, p. 2, available at http://www.ipc.on.ca/images/Resources/biometric.pdf

  238.

    See above §§ 214–225.

  239.

    However, this does not imply that tattoos cannot be used in automated systems for identification or identity verification. See also Chap. 2, footnote 77 above. Because this type of physical characteristic is more susceptible to manipulation than other characteristics, we do not deem it necessary to include tattoos in our definition of biometric data for discussions about biometric data processing.

  240.

    Since the score relates to the measuring of a particular human characteristic, there is an (indirect) relationship with the biometric data. A score could therefore also be biometric data. About scores as biometric data, see also above and below § 320.

  241.

    See and compare with the suggested components of a definition in A. Cavoukian, Consumer Biometric Applications: A Discussion Paper, Information and Privacy Commissioner, Ontario, Canada, 1999, available at http://www.ipc.on.ca/images/Resources/cons-bio.pdf, p. 5. The reference to ‘measurable’ in the definition suggested by Cavoukian is in our opinion meant to refer to ‘persistence’.

  242.

    For example, the measurement of the chemical components of a characteristic (e.g., odor).

  243.

    For example, the way someone is positioned on and uses a bike.

  244.

    Moreover, adding persistence may also provoke debate on the applicability of the definition. For example, fingerprints of manual labor workers are sometimes difficult to read over time, or are sometimes removed or made ‘unreadable’ (e.g., by criminals who try to delete their fingerprints), so that one could argue that they are not persistent and might not fall under the definition. This would of course not be acceptable. Universality may also provoke debate, as it is known that with some ethnic groups, some characteristics are less pronounced (e.g., ridges of fingerprints of Asian people). About this aspect in the context of ethical and societal concerns, see also above footnote 207 and Part II, Chap. 4.

  245.

    This would further require that the data can be processed and understood by a machine. It would in principle imply the digital representation of biometric data.

  246.

    While the use of the identification functionality will in most cases be fully automated, the identification system may in some cases provide only a list of several potential candidates to be identified. From this list, the exact person has to be chosen by the agent appointed by the controller for attending the system, for example after obtaining further information. This human decision is an additional complication. Nevertheless, if the automated processing covers nearly the whole process, this type of use of personal data is also included in our definition of biometric data. See also above, § 268.

  247.

    WP 29 Opinion personal data 2007 (WP 136), p. 15.

  248.

    About these rapid improvements, see also above, e.g., Chap. 2, § 128.

  249.

    But, see above footnote 246.

  250.

    See also above Chap. 2, § 91 et seq.

  251.

    About the claims which may be authenticated by a biometric system, see above, Part I, Chap. 2, Sect. 2.2.2. For a further analysis, see below Part III, Chap. 8, Sect. 8.3.

  252.

    For an overview of which characteristics may qualify, see above, Chap. 2, Sect. 2.2.1.

  253.

    In ISO Vocabulary for Biometrics 2012, the term ‘anonymized biometric data’ is used and defined (see term 37.03.01). This term is confusing and not in conformity with a legal interpretation of biometric data. See also below Part III, Chap. 7, §§ 107–109.

  254.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 2.

  255.

    WP 29 Opinion personal data 2007 (WP136), p. 8. See also WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 3–4, which adds ‘behavioural aspects’ without explicitly mentioning that addendum. See also and compare with Article 29 Data Protection Working Party, Opinion 01/2012 on the data protection reform proposals, WP191, 23.03.2012, p. 10 (WP 29 Opinion on reform proposals 2012 (WP191)). The Article 29 Working Party therein suggested to ‘focus on what types of data are to be considered biometric data instead of focussing on what they allow’. We do not agree however with this suggestion. Additional comments of the Working Party on the reform proposals were adopted, in particular in relation to the definition of personal data, which should in its view also mention ‘(…) natural person who can be (…) singled out and treated differently’. See Article 29 Data Protection Working Party, Opinion 08/2012 providing further input on the data protection reform discussions, WP199, 5.10.2012, 45 p. (WP 29 Opinion further input reform discussions 2012 (WP199)).

  256.

    ‘Biometrics’ is often used by some as referring to biometric characteristics, biometric data, the technologies and/or the field relating to biometric identification (see also OECD, Biometric-based technologies report, 2004, p. 11). Because of the varying meanings of the term, for which other more precise terms should be used, the term ‘biometric’ should, as stated, not be used instead of ‘biometric characteristic’ (other than in ‘biometric recognition’). It is also deprecated (see ISO Vocabulary for biometrics 2012, term 37.01.02).

  257.

    OECD, Biometric-based technologies, 2004, pp. 10–11. The OECD referred to definitions proposed by the International Biometric Group (IBG) and a definition of G. Roethenbaugh. Roethenbaugh defined a ‘biometric’ as ‘a unique, measurable characteristic or trait of a human being for automatically recognizing or verifying identity’ in G. Roethenbaugh, ‘An introduction to Biometrics and General History’, Biometrics Explained, 1998.

  258.

    ‘Biometric characteristics’ are defined to be sensitive personal data if their use makes it possible to identify an individual in connection with sensitive personal data (sic) (see also above).

  259.

    Republic of Slovenia Personal Data Protection Act (ZVOP-1), Article 6 (21).

  260.

    Slovak Republic Act N° 428/2002 Coll. On the Protection of Personal Data Act, Section 4 (1) n).

  261.

    See, e.g., in the State of Ontario, in the Ontario Works Act, 1997, where in Chapter 25 biometric information is defined as ‘information derived from an individual’s unique characteristics but does not include a photographic or signature image’ (Article 2), available at http://www.e-laws.gov.on.ca/html/statutes/english/elaws_statutes_97o25a_e.htm#BK1

  262.

    See, e.g., in the State of Illinois, Act 095-0232 concerning education, adopted in 2007, which amended the School Code to require school districts to have a policy before collecting any student biometric information, to prohibit the sale or disclosure of biometric information and to require parental consent. Biometric information is therein defined as ‘any information that is collected through an identification process for individuals based on their unique behavioral or physiological characteristics, including fingerprint, hand geometry, voice, or facial recognition or iris or retinal scans’ (105 ILCS Sec.10-20.40 new), available at http://www.ilga.gov/legislation/publicacts/fulltext.asp?Name=095-0232. This definition only contains a few examples of biometric characteristics (e.g., there is no reference to vein characteristics). Furthermore, it seems to require ‘an identification process’ in which biometric characteristics are collected. See also and compare with the Biometric Information Privacy Act adopted in the same state (740 ILCS 14/1), available at http://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57. In the latter, many (important!) exceptions are included in the definition of ‘biometric identifier’. For other (attempts at) legislation in the United States, see Y. Welinder, ‘A face tells more than a thousand posts: developing face recognition privacy in social networks’, Harvard Journal of Law and Technology, Vol. 26, No. 1, 2012, pp. 28–38 (‘Welinder, A face tells more, 2012’).

  263.

    In the Australian Privacy Code of the Biometrics Institute, ‘biometric’ is defined as the ‘biological or behavioral unique characteristic of an individual which is captured for the purposes of identification and/or verification of the individual’ while ‘biometric information’ is ‘any data that can be used to biometrically identify an individual. This data includes, but is not limited to, images, sounds, chemical or geometric properties. It also includes any information encrypted or unencrypted that is derived from these raw acquired biometrics, such as biometric templates or filtered or pre-processed data. It does not include non-biometric information such as name or address. It also does not include single factor biometric measurements, such as age, height, eye color and place of birth, unless such simple factor biometric measurements are used for automated verification purposes’. This definition is in our view one of the better definitions. About the Code, see Part III, Chap. 8, § 227.

  264.

    The three other considerations in the subsequent recitals repeated the other existing criteria for image and sound data to be qualified as personal data (i.e., automation or intention or inclusion in a filing system, falling within the scope of the Directive and exemptions for specific fields of use).

  265.

    CBPL, Advies 34/1999 uit eigen beweging betreffende de verwerkingen van beelden, in het bijzonder verricht door middel van systemen van video-toezicht, 13 December 1999.

  266.

    CBPL, Advies 14/1995 betreffende de toepassing van de wet van 1992 op de beeldopnamen, 7 June 1995.

  267.

    See also P. De Hert, O. De Schutter, S. Gutwirth, ‘Pour une reglementation de la videosurveillance’, J.T. 1996, (569), p. 576. The authors kindly disagree with this interpretation of the CBPL on this issue; for the position of the Dutch DPA: Registratiekamer (now CBP), In Beeld gebracht. Privacyregels voor het gebruik van videocamera’s voor toezicht en beveiliging, Den Haag, 1997, p. 18 (‘Registratiekamer, In beeld gebracht, 1997’).

  268.

    See and compare with Registratiekamer, In beeld gebracht, 1997, p. 15. The Dutch DPA hereby makes distinctions based on whether the video surveillance is carried out with analogue or digital means.

  269.

    In the Belgian legislation relating to camera surveillance, Article 4 states clearly that the data protection legislation applies unless the Act relating to camera surveillance explicitly contains a provision otherwise (see Wet 21 maart 2007 tot regeling van de plaatsing en het gebruik van bewakingscamera’s, B.S. 31.05.2007, pp. 29529–29532, as modified by Wet 12 november 2009 (‘Act of 21 March 2007 on camera surveillance, as modified’); see also Verslag namens de Commissie voor de Binnenlandse Akten en voor de Administratieve Aangelegenheden uitgebracht door de heer Noreilde, Cameratoezicht, Parl. St. Senaat 2005–06, n° 3-1413/1, p. 54, also available at http://www.senate.be/www/?MIval=/publications/viewPubDoc&TID=50348521&LANG=nl (‘Verslag Noreilde’); for legal authors in Belgium commenting on the Act of 21 March 2007 on camera surveillance, see P. Van Eecke and B. Ooms, ‘De nieuwe wet op de camerabewaking: een analyse van een langverwachte wet na een rechtsonzekere periode’, Computerrecht 2008, 99.

  270.

    See, e.g., in the Netherlands, CBP, Opinion, z2007-00875, 10 December 2007; CBP, Cameratoezicht in kleedruimtes. Zwembad past na klacht situatie goed aan, 2007, available at http://www.cbpweb.nl/Pages/uit_z2007-00875.aspx. Compliance for a similar situation was stressed by the CBPL later as well. See, E. Kindt, ‘Camera’s mogen intimiteit van personen niet schenden’, P&I 2010, p. 209, nr. 199.

  271.

    See, e.g., EDPS, The EDPS video-surveillance guidelines, 2010, 64 p.

  272.

    See also on this notion under Belgian law, De Bot, Verwerking Persoonsgegevens, 2001, pp. 154–155. The broad interpretation is maintained in the Act of 21 August 2008 for the organization of the eHealth platform in Belgium (B.S. 13.10.2008) (see Article 3, 9°) defining personal data concerning health as ‘all data of a personal nature from which information can be deduced concerning the previous, present or future physical or psychological health condition (…)’ (emphasis added).

  273.

    CBPL, Respecteer de Privacywet bij het nemen en publiceren van foto’s en (video)beelden, previously available at http://www.privacycommission.be/nl/in_practice/recht-op-afbeelding/index.html

  274.

    See, e.g., CBPL, Advies N° 008/2006 van 12 april 2006 betreffende het gebruik van cameratoezicht in een kinderdagverblijf, 9 p.

  275.

    Other, sector-specific legislation relating to camera surveillance in Belgium exists, for example in the employment context (Collective Labor Agreement N° 68 relating to the protection of the privacy of employees in the context of camera surveillance in the workplace, made generally applicable by Royal Decree of 20 September 1998), for use in soccer stadia (Royal Decree of 12 September 1999 relating to the installation and the operation of camera surveillance in soccer stadia), for road traffic purposes and for use for law enforcement purposes (Act of 6 January 2003 relating to special search methods and any other investigation methods).

  276.

    Criminal sanctions are provided in case of non-compliance with Article 10 or in case of possession of such images (Article 13). Article 10 is hence broad and seems to cover both options (‘shall not furnish (…) nor be intended’).

  277.

    Registratiekamer, In beeld gebracht 1997, p. 18. The CBP hereby refers to R. de Corte, In beeld gebracht, privacyregels voor gebruik van videocamera’s voor toezicht en beveiliging. In an earlier opinion of 1 December 1993 of the same DPA, relating to the Decision on regulated exemptions (from the notification duty), the use of pictures in an access control and visitors registration system, although race can be detected, was not considered a problem.

  278.

    Registratiekamer, Discopas opinion 2001, p. 7: ‘Uit de stukken (…) is gebleken dat het gebruikte algoritme kan ‘terugrekenen’ en zo de oorspronkelijke scan kan reconstrueren (…) Uit het oorspronkelijke gezicht zijn in principe steeds rasgegevens af te leiden. De gezichtstemplate is daarom als een bijzonder gegeven aan te merken’ [free translation: ‘From the documents (…) it appears that the algorithm used can ‘reverse engineer’ and reconstruct the original scan. (…) One is able to deduce in principle always racial information from the original face. The facial template is for this reason a special category of data’]. See also above § 252. For a further analysis of this opinion, see Part II.

  279.

    CBP, Richtsnoeren. Publicatie van persoonsgegevens op internet, 2007, p. 17 and p. 22.

  280.

    On these two decisions, see also above § 251. See also WP 29 Opinion on facial recognition 2012, p. 4 explicitly referring to the decision of 23.3.2010. However, immediately thereafter, the Article 29 Working Party seems to return to a subjective approach: ‘Specifically where digital images of individuals or templates are further processed to derive special categories of data, they would certainly be considered within this category’ (emphasis added).

  281.

    See also Council of Europe, Recommendation No. Rec(97) 5 on the protection of medical data, 1997, p. 2 defining medical data as data ‘concerning the health of an individual’, stating that the concept is also referring to ‘data which have a clear and close link with health (…)’.

  282.

    See e.g., Graux and Dumortier, Privacywetgeving in de praktijk, 2009, p. 132. The authors discuss the inclusion of biometric identifiers in the Belgian biometric e-Passport and hesitate to qualify the facial image included in the chip of the passport as biometric data.

  283.

    See, e.g., X., ‘Kamer eist stop op opslag gelaatsscans en vingerafdrukken’, De Volkskrant, 25 februari 2006, available at http://www.volkskrant.nl/den_haag/article231026.ece

  284.

    See Antwoorden op kamervragen over het bericht dat het ministerie al begonnen is met het aanleggen van een databank met gelaatsscans en vingerafdrukken zonder wettelijke grondslag, 7 maart 2006, previously available at http://www.minbzk.nl//onderwerpen/persoonsgegevens-en/reisdocumenten/kamerstukken/@80736/antwoorden_op_21

  285.

    Conseil d’Etat, App. N° 297888, 297896, 298085, 12 March 2007, available at http://arianeinternet.conseil-etat.fr/arianeinternet/ViewRoot.asp?View=Html&DMode=Html&PushDirectUrl=1&Item=3&fond=DCE&texte=2007+ELOI&Page=1&querytype=simple&NbEltPerPages=4&Pluriels=True. Two years later, the Conseil d’Etat again annulled specific legal provisions for the creation of the ELOI database, adopted a second time but now by décret, in particular relating to the 3-year retention period of particular personal data and the registration of a national identification number. See Conseil d’Etat, App. N° 312051, 30 December 2009, available at http://arianeinternet.conseil-etat.fr/arianeinternet/ViewRoot.asp?View=Html&DMode=Html&PushDirectUrl=1&Item=2&fond=DCE&texte=2009+ELOI&Page=1&querytype=simple&NbEltPerPages=4&Pluriels=True

  286.

    Ligue des droits de l’homme, Annulation du fichier ELOI: après la victoire du droit, la CNIL et le Conseil d’Etat doivent faire prévaloir les droits, available at http://www.ldh-toulon.net/spip.php?article1853

  287.

    See above, § 275.

  288.

    The facial image may be submitted in paper or other document form, whereafter an analog or digital representation of the image containing the biometric characteristics is taken for storage in the system. Digitizing analog images is nowadays very simple. With the advent of digital cameras, and because handheld personal digital assistants (whether in the form of mobile phones or otherwise) almost always contain a picture functionality, facial images are increasingly submitted in digital form for storage.

  289.

    See WP 29 Opinion on facial recognition 2012, p. 5: ‘Therefore, facial recognition constitutes an automated form of processing of personal data, including biometric data’. See also Cavoukian and Marinelli, Privacy-Protective Facial Recognition, 2010, p. 13.

  290.

    See also, and for references, Part III, Chap. 7, § 163. See and compare with the use of facial recognition systems in China at the opening and closing ceremony of the Olympic Games in Beijing in 2008, when the audience passed 100 gates for speedy identity verification based on the photos provided when tickets were bought. X. Zhai, The Status Quo and Ethical Governance in Biometric in Mainland China, presentation at the Rise Third International Conference on Ethics and Policy. Biometrics and International Data Sharing, 4–5.1.2010, Hong Kong, available at http://www.riseproject.eu/events.9.html (‘Zhai, The Status Quo and Ethical Governance in Biometric in Mainland China, 2010’).

  291.

    This launch received much media attention as well. The then existing website http://face.com invited members of social networks to use ‘Phototagger’ and developers to make face recognition applications. In May 2010, seven billion pictures had been scanned, increasing every minute of the day (as indicated by the automated counter on the website), allowing for the identification of (millions of) individuals. This number later increased to over 30 billion pictures. Face.com was subsequently acquired by Facebook, and its website and service are no longer active.

  292.

    In 2010, however, Google revised its plan to make the face recognition tools available on the Internet, under pressure from privacy advocates. At the same time, Google already offered face recognition on Picasa, where users have a tool at their disposal for editing and tagging their uploaded photos. About Google see also Part II, Chap. 4, § 97. Other companies presented at the Mobile World Congress in Barcelona in 2010 face recognition for use on pictures taken with a mobile phone through a search of the Internet, as mentioned in footnotes 126–127.

  293.

    Early June 2011, it was reported that Facebook modified its privacy settings (again) and rolled out its face recognition technology, now also outside the United States, for its at that time more than 750 million users. The technology allowed by default for (semi-automated) tagging of names (only ‘friends’ are suggested) to pictures uploaded by users, unless such other users (‘friends’) would opt out; see also the United States Federal Trade Commission (FTC) complaint over Facebook’s use of facial recognition technology, filed by four consumer protection organizations, led by the Electronic Privacy Information Center (EPIC), of 10 June 2011, 34 p., available at http://epic.org/privacy/facebook/EPIC_FB_FR_FTC_Complaint_06_10_11.pdf (‘EPIC et al., Complaint In re Facebook, 2011’). Facebook later modified its privacy settings (again), whereby users need to opt in and agree to the tagging function. In September 2012, Facebook halted face recognition under the pressure of various (legal) proceedings, including investigations by the Article 29 Working Party, the Irish Data Protection Authority and a German DPA (see Part III, Chap. 7, footnote 395) and the FTC in the United States. See also the comments filed by EPIC to the FTC’s Face Facts forum, in which EPIC strongly pleaded for a moratorium on the commercial deployment of facial recognition techniques: Electronic Privacy Information Center, Comments of the Electronic Privacy Information Center to the Federal Trade Commission. Face Facts: A Forum on Facial Recognition, project number P115406, 31 January 2012, 24 p. (‘EPIC, Comments to the FTC. Face Facts, 2012’), available at http://epic.org/privacy/facerecognition/EPIC-Face-Facts-Comments.pdf. The investigation of the FTC resulted inter alia in Federal Trade Commission, Facing Facts. Best Practices for Common Uses of Facial Recognition Technologies, October 2012, 30 p., available at http://ftc.gov/os/2012/10/121022facialtechrpt.pdf (‘FTC, Best Practices 2012’). See further also J. Lynch, Written testimony, Senate Committee on the Judiciary, What Facial Recognition Technology Means for Privacy and Civil Liberties, 18.7.2012, 24 p. (‘Lynch, What Facial Recognition Technology Means 2012’).

  294.

    See also a 2011 study of Carnegie Mellon, mentioned in Part III, Chap. 7, footnote 383, which confirms that facial images can be collected and used for identification purposes. The researchers state that the technology has improved considerably, as companies can use 3D face images to reconcile pose variations in different images. That facial images are to be considered biometric data is further reinforced by the fact that one of the first (and few) pieces of legislation regulating the use of biometric data, i.e., the Ontario Works Act of 1997, expressly excludes inter alia facial images (‘photographic images’) from the definition of ‘biometric information’ (Article 2 – definitions). If facial images did not fall under the concept or definition, it would not be necessary to exclude them. About this Act, see also Part III, Chap. 8, § 269 footnote 171. See also and compare with the Biometric Information Privacy Act adopted in the State of Illinois, 740 ILCS 14/1, which excludes photographs as well.

  295.

    DNA is generally known and referred to as the code which contains the instructions needed to construct the components of cells. The parts of DNA that carry the genetic information are called genes.

  296.

    On the uniqueness of DNA profiles, however, see, e.g., Weir, DNA Profiles. The author states for example that there is no satisfactory probabilistic or statistical genetic theory supporting the growing acceptance of DNA profiles as unique, because of the possible dependencies between loci and between individuals.

  297.

    This differs from fingerprints, which are unique even for identical (monozygotic) twins.

  298.

    In particular, DNA is included in the definition of biometric data in the data protection legislation of the Slovak Republic. See above § 254.

  299.

    For the general description in this section about the DNA technology used for the (chemical) extraction, use and comparison of DNA information, we made inter alia use of the following publications: J. Cassiman, H. Nys, I. Vinck and R. Decorte, ‘Genetica, patiëntenrechten, mensenrechten: DNA van recht tot krom’, in J. Cassiman et al., Wat zit er in mijn genen, 2008, Leuven, Davidsfonds, pp. 180–202, Nuffield Council on Bioethics, The forensic use of bioinformation: ethical issues, London, Cambridge Publishers, 2007, 139 p. (‘Nuffield, Bioinformation 2007’) (which also contains a useful glossary), Callens, Goed geregeld, 1995, pp. 122–124, JRC, Biometrics at the Frontiers, 2005, pp. 124–130 and OECD, Biometric-based technologies, 2004, Annex III DNA-based technologies. About DNA, see also WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 25–27.

  300.

    See also JRC, Biometrics at the Frontiers, 2005, p. 124.

  301.

    In other words, although this non-coding DNA was until recently thought to have little biological function (“junk DNA”), it is now becoming more and more clear that these fragments also contain additional control elements that regulate the expression of the so-called coding DNA segments.

  302.

    The regions used for DNA identification are limited to regions composed of non-coding ‘in tandem’ repetitive DNA (as opposed to non-coding randomly repetitive DNA). According to some older sources, only about 10 % of the total DNA would be coding, while the remaining 90 % would be non-coding but could be used for identification purposes. See Callens, Goed geregeld, 1995, p. 121, no. 140 and the reference there mentioned. About the forensic use of DNA and non-coding regions, see also e.g., D. Kaye, ‘Science Fiction and Shed DNA’, 101 Northwestern University Law Review Colloquy 2006, pp. 62–67, also available at http://www.law.northwestern.edu/lawreview/colloquy/2006/7/#fn13down (‘Kaye, Science Fiction and Shed DNA, 2006’).

  303.

    The collection of the biological samples may take place either with or without the cooperation of the individual.

  304.

    For a useful overview of the different steps and techniques with images and schemes, see, e.g., L. D’Haeninck, L. Dekeersmaeker, B. Hempen, K. Geris, R. Goossens, P. Vernemmen, Biogenie 6.1, Antwerpen, De Boeck, 2009, pp. 223–237.

  305.

    The bases or nucleotides are adenine (A), cytosine (C), guanine (G) and thymine (T).

  306.

    A always pairs with T and G always pairs with C.
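
    For readers with a technical interest, this pairing rule can be illustrated with a short Python sketch (an illustrative aside; the function name and example sequence are our own, not drawn from any cited source):

    ```python
    # Illustrative sketch of Watson-Crick base pairing: A<->T, G<->C.
    PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def complementary_strand(sequence: str) -> str:
        """Return the complementary DNA strand for a given base sequence."""
        return "".join(PAIRS[base] for base in sequence.upper())

    print(complementary_strand("GATTACA"))  # -> CTAATGT
    ```

    Because each base determines its partner, one strand of the double helix fully determines the other, which is what makes such a mapping possible.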

  307.

    See Nuffield, Bioinformation 2007, p. 121; see also the article in which the double helix structure was first presented, published in Nature in 1953: J. Watson and F. Crick, ‘A Structure for Deoxyribose Nucleic Acid’, Nature 171, 1953, pp. 737–738, available at http://www.nature.com/nature/dna50/watsoncrick.pdf

  308.

    At given loci, also referred to as markers on a chromosome, the number of repeated DNA fragments is subject to variation from one person to another. These repeated short sequences of DNA that vary in length from person to person have a strong discriminating power.

  309.

    Research however indicates that there may be minor differences in DNA of even identical twins.

  310.

    The result is transferred to a (nylon or nitrocellulose) sheet. The specific DNA fragments are visualized with a complementary labeled probe (see Fig. 3.2) (about probing, see footnote 311).

  311.

    To further examine the DNA patterns, probing (which consists of adding radioactive or colored probes) and various other techniques are used. One of these techniques involves adding enzymes by which the DNA is multiplied many times, permitting the recognition and comparison of the DNA patterns; this is also called the PCR reaction.

  312.

    See A. Jeffreys, V. Wilson & S. Lay Thein, ‘Hypervariable ‘minisatellite’ regions in human DNA’, Nature 1985, pp. 67–73, available at http://www.nature.com/nature/journal/v314/n6006/abs/314067a0.html. Because DNA profiles are represented in a particular format, some refer to DNA profiles as a second type of template of DNA, besides the DNA template which consists of the image with black bands, resembling a ‘bar code’, as shown in Fig. 3.2.

  313.

    See, for an example of the DNA profile representation for the 13 core loci of the Combined DNA Index System (CODIS) (U.S.A.) at Blackett Family DNA Activity 2, available at http://www.biology.arizona.edu/human_bio/activities/blackett2/str_codis.html. About CODIS, see also § 386 below.

  314.

    See and compare with the representation of templates, e.g., for fingerprint, by a sequence of numbers as well (see above Fig. 2.2).
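
    Such a numeric representation of a profile can be sketched in a few lines of Python (a simplified illustration; the locus names and allele repeat counts below are hypothetical, not actual CODIS values):

    ```python
    # Illustrative sketch: a DNA profile represented as a pair of allele
    # repeat counts per locus. All locus names and values are hypothetical.
    def profiles_match(p1: dict, p2: dict) -> bool:
        """Profiles match when every shared locus carries the same allele pair."""
        shared = p1.keys() & p2.keys()
        return bool(shared) and all(
            sorted(p1[locus]) == sorted(p2[locus]) for locus in shared
        )

    profile_a = {"LOC1": (8, 11), "LOC2": (11, 12), "LOC3": (21, 24)}
    profile_b = {"LOC1": (11, 8), "LOC2": (11, 12), "LOC3": (21, 24)}

    print(profiles_match(profile_a, profile_b))  # -> True
    ```

    The point of the sketch is only that, once reduced to numbers per locus, a DNA profile can be stored and compared like any other numeric template.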

  315.

    For an explanation about sequencing, see, e.g., How do we Sequence DNA?, available at http://seqcore.brcf.med.umich.edu/doc/educ/dnapr/sequencing.html

  316.

    Similar arguments are sometimes raised with regard to e.g., fingerprint images.

  317.

    Several of these elements were put forward by the U.K. Government in defense of the retention of DNA profiles in a database in the case S. and Marper v. the United Kingdom, discussed below.

  318.

    An allele is one of the two (or more) versions of a gene at a given locus (site) on a chromosome. An individual inherits two alleles for each gene, one from each parent.

  319.

    Nuffield, Bioinformation 2007, p. 20. In the aforementioned S. and Marper v. the United Kingdom case, the ECtHR further stated that some police practices do exist whereby suspects are routinely classified into ‘ethnical appearance’ categories based on their DNA profile (§ 40). These ethnicity tests, which could be used as a tool to reduce a ‘suspect pool’, in combination with other factors that lead to a disproportionate number of people from black and ethnic minority groups being stopped, whose DNA profiles are recorded and classified, entail – according to the Court – a prominent risk of reinforcing racist views of a propensity to criminality.

  320.

    For identical twins, however, this is not true (see above). Furthermore, the discriminating power of a profile will depend on several factors, including the markers used. See Human Genome Project Information, available at http://www.ornl.gov/sci/techresources/Human_Genome/elsi/forensics.shtml#2

  321.

    See, for example, the Danish Newborn Screening BioBank, which keeps residual dried blood spot samples, taken during routine newborn screening, from people born after 1982. The practice was only regulated in 1993. Newborn screening samples are taken and collected in many countries. About this (type of) database, see also below, e.g., at footnotes 398–399.

  322.

    DNA markers consist of repeated short sequences of DNA that vary in length between different people. Examples of DNA marker systems are SGM+, used in Europe (ten markers), and the CODIS system (13 markers).

  323.

    For a survey and inquiry about the use of DNA databases around 2002, see also INTERPOL, Global DNA Database Inquiry. Results and Analysis, 2003, 31 p., available at http://www.interpol.int/Public/Forensic/dna/Inquiry/InquiryPublic2002.pdf

  324.

    For example, for the largest DNA database in the world, the CODIS database, 13 short tandem repeats (STRs) are used (about CODIS, see below § 386; see also footnote 313 above).

  325.

    See also, e.g., OECD, Biometric-based technologies, 2004, p. 11. Our reference to the biometric comparison process includes the capture, extraction and comparison. Once the DNA information, in particular the profile, is stored in a database, the comparison between the information in two databases can be done in ‘real time’.
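
    Why stored profiles can be compared in ‘real time’ can be illustrated with a minimal Python sketch (all keys and record identifiers are hypothetical; once profiles exist as digital records, matching reduces to a fast lookup):

    ```python
    # Illustrative sketch: once DNA profiles are stored as digital records,
    # comparison against a database reduces to a near-instant lookup.
    # Profile keys (tuples of allele pairs) and record names are hypothetical.
    database = {
        ((8, 11), (11, 12)): "record-001",
        ((9, 10), (13, 14)): "record-002",
    }

    def lookup(profile):
        """Return the stored record for an exact profile match, or None."""
        return database.get(profile)

    print(lookup(((8, 11), (11, 12))))  # -> record-001
    print(lookup(((7, 7), (9, 9))))     # -> None
    ```

    The time-consuming steps are thus the capture and extraction of the profile from a sample, not the comparison itself.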

  326.

    In the past, confusion may have been created by giving DNA data (DNA profile) as an example of biometric data (biometric template) (see, e.g., the statement in Registratiekamer, Ontwerpbesluit DNA-onderzoek in strafzaken, 17.02.2000, p. 4 (‘Registratiekamer, DNA-onderzoek in strafzaken, 2000’), available at http://www.cbpweb.nl/Pages/adv_z1999-1201.aspx) and the collection of DNA samples as a ‘begin of a biometric database’ (Registratiekamer, DNA-onderzoek Pers. kenmerken, 2000, p. 4).

  327.

    NSTC, National Biometrics Challenge, 2011, pp. 14–15; see also P. Rincon, Four hours for forensic DNA test, BBC News, 5.08.2010, available at http://www.bbc.co.uk/news/science-environment-10873706. It is expected that future systems will provide more and better automation of the DNA extraction and comparison process, allowing for ‘molecular biometrics’. The capture and extraction in particular currently still take time. Once DNA information is measured and transformed into a digital representation, it can be compared in just a few seconds. Another indication that DNA is likely to become included as biometric data in the future is the ongoing standardization work on (biometric) data formats, including for DNA (e.g., the proposed DNA data format in draft ISO standard 19794-14). See also WP 29 Opinion on developments in biometric technologies 2012 (WP193), p. 25: ‘The continuous advances made over the years by academic research and biotechnology developers have reduced the time needed for the generation of a DNA profile from days to hours and even a fraction of an hour’ (emphasis added) and ‘It is very likely that in the near future it will be possible to perform real-time (or near real-time) DNA profiling and matching (…)’.

  328.

    JRC, Biometrics at the Frontiers, 2005, p. 128 and references therein. Contamination plays an important role for accuracy.

  329.

    See, however, M. Prinsen, ‘De bestaande forensische DNA-databank en een verkenning van de mogelijkheden tot uitbreiding’, P&I 2006, (54), p. 57. The author considers DNA profiles as such – if the profile is not linked with a name – not to be personal data to which the data protection legislation applies, because from such a profile it is ‘not possible to see to whom the material belongs’. We do not agree with this point of view (compare with a fingerprint left on a door knob: as such it is equally unclear to whom it belongs, yet it is clearly personal data, as we have argued above). The author hereby makes a distinction between DNA material (by which we believe the author wants to refer to the samples or the biological material from which the samples are taken) and DNA profiles. A similar point of view has been taken by some before with regard to biometric data as well.

  330.

    For Belgium, there is for example a research project based on analysis of DNA in order to establish family links in a particular region.

  331.

    See, for example, the project to unravel the genetic information from the entire Icelandic population.

  332.

    Article 29 Data Protection Working Party, Working Document on Genetic Data, WP 91, 17 March 2004, p. 4 (‘WP 29 Working document genetic data 2004 (WP91)’).

  333.

    See Sect. 3.2.4, Genetic and DNA analysis.

  334.

    As stated above, the insights are evolving rapidly and non-coding fragments may also be relevant for the coding fragments.

  335.

    Adler, Sample images can be independently restored from face recognition templates, 2003. The reverse-engineered biometric samples are in this article referred to as the ‘source images.’ The term ‘match score values’ is also used instead of the recommended term ‘comparison scores’. See and compare with the term 37.03.27 ISO Vocabulary for Biometrics 2012.

  336.

    In the SD2 Version 12 – Harmonized Biometric Vocabulary, however, the comparison scores are mentioned in a scheme as non-biometric data produced in biometric processing (p. 75).

  337.

    See, e.g., J. Prins, ‘Eigendom op informatie: economische realiteit maar juridische fictie?’, Nederlands Juristenblad 2005, p. 623; J. Litman, ‘Information Privacy/Information Property’, 52 Stan. L. Rev. 1999–2000, pp. 1283–1313; P. De Hert and S. Gutwirth, ‘Hoofdstuk 2: Informatie: wel beschermd, doch niet vatbaar voor diefstal. Denkoefeningen over het juridisch statuut van informatie vanop het grensvlak tussen het strafrecht en de intellectuele rechten’, in K. Byttebier, E. de Batselier and R. Feltkamp (eds.), Tendensen in het economische recht, Antwerpen, Maklu, 2006, pp. 85–116; see also Lessig, focusing on the economic aspects of personal information control. He is of the opinion that ‘the protection of privacy would be stronger if people conceived of the right as a property right’. L. Lessig, Code: Version 2.0, New York, U.S., Basic books, 2006, p. 244, available at http://codev2.cc/

  338.

    See also J. Prins, ‘The propertization of Personal Data and Identities’, EJCL 2004, available at http://www.ejcl.org/83/art83-1.html (‘Prins, The propertization of Personal Data and Identities, 2004’). The author holds that the European system seems to offer room for a proprietary rights model for personal data. It is based on the view that the Directive 95/46/EC allows for a contractual approach to protect personal data.

  339.

    See about this discussion also E. Kindt, ‘Ownership of Information and Database Protection’, in J. Dumortier, F. Robben and M. Taeymans (eds.), A Decade of Research @ the Crossroads of Law and ICT, Gent, Larcier, 2001,(145) pp. 151–152 (‘Kindt, Ownership of information, in Dumortier, Robben and Taeymans, A Decade of Research, 2001’).

  340.

    See and compare, e.g., with the economic value, estimated at several billion dollars, at the moment of the stock market launch of (social) network sites such as LinkedIn, which consist mainly of (personal) information.

  341.

    Article 7 (1) of the Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, O.J. L 77, 27.03.1996, pp. 20–28. The Directive 96/9/EC does not define who is to be considered as the maker. Recital 41, however, explains that the maker entitled to the sui generis right ‘is the person who takes the initiative and the risk of investing’. For an (early) overview of case law on the new database right, see P. Hugenholtz, ‘The New Database Right: Early Case Law from Europe’, paper presented at the Ninth Annual Conference on International IP Law & Policy, Fordham University School of Law, New York, 19–20 April 2001, available at http://www.ivir.nl/medewerkers/hugenholtz.html

  342.

    Databases are hardly ever the result of the work and investment of one party, but rather of several parties working together. In that case, several parties could be co-makers and ‘co-owners’ of a (biometric) database and could exercise the rights. In view of this possibility, it is important that parties to a project which includes investment in databases, including biometric databases, provide an express clause in their agreement with regard to the rights in the database. See Kindt, Ownership of information, in Dumortier, Robben and Taeymans, A Decade of Research, 2001, p. 150.

  343.

    See also, on this subject, Th. Hoeren, presentation at 30 ans du C.R.I.D., conference, Namen, 22.01.2010.

  344.

    The sui generis right would in our opinion not prevent other ‘makers’ of biometric databases from collecting the data from the data subjects again (provided all other legislation, in particular privacy and data protection legislation, is respected).

  345.

    This would also include respect for the rights of the data subjects to have their data corrected and/or deleted in some cases.

  346.

    We base our view on the nature of data protection rights and obligations, which could be considered as having a mandatory or even public order effect, especially since data protection is now recognized in the EU Charter as a fundamental human right (see below § 409 et seq).

  347.

    For example, because the database is used to test and improve (proprietary) algorithms, or in the case of a transfer of assets of a company-employer that has installed a biometric access control system.

  348.

    See also Prins, The propertization of Personal Data and Identities, 2004, who holds that even if it were possible to vest a property right in personal data, it is doubtful that this would lead to better protection in our ‘society of pervasive technologies’.

  349.

    For example, of a mail account or a social network site.

  350.

    See, on this issue, E. Visser, ‘Who owns your bits when you die?’, Computerrecht 2007, pp. 195–198. The author refers to a court case in the United States against Yahoo, in which the parents of Ellsworth, a soldier who died in the Iraq war, requested full access to their son’s e-mail account. The judge issued an order directing Yahoo to provide the contents of the e-mail account.

  351.

    For arguments to claim property rights in personal data, see N. Purtova, Property Rights in Personal Data: A European Perspective, Oisterwijk, BOXPress, 2011, 283 p.; see also N. Purtova, Property in Personal Data: Second Life of an Old Idea in the Age of Cloud Computing, Chain Informatisation, and Ambient Intelligence, TILT Law & Technology Working Paper No. 2010/017, 2010, available at SSRN: http://ssrn.com/abstract=1641027

  352.

    See Article 9 of the Belgian Act on Patient Rights of 22 August 2002, B.S. 26.09.2002; see on this issue also K. Schutyser, ‘Eigendomsrecht en medische dossiers’, R.W. 1983–84, pp. 3021–3048; S. Brillon, ‘Le droit d’accès au dossier du patient’, in S. Brillon, S. Callens, N. Gauche, N. Noel, G. Schamps and M.-N. Verhaegen, Memento: Droits du patient et responsabilité médicale, Mechelen, Kluwer, 2004, pp. 69–105.

  353.

    See and compare with the discussion about genetic information, for example in the report Australian Law Reform Commission, Essentially Yours: The Protection of Human Genetic Information in Australia, ALRC Report 96, May 2003, Sydney, ALRC, (‘ALRC, Essentially Yours, 2003’), available at http://www.alrc.gov.au/publications/report-96

  354.

    About the proposal of certification of particular data flow processes and equipment, see Part III as well.

  355.

    See Part III, Chap. 7, §§ 55 et seq.

  356.

    For the need for a thorough public debate and policy decisions, see, e.g., I. Geesink and Ch. Steegers, Nader gebruik nader onderzocht. Zeggenschap over lichaamsmateriaal, 2009, Den Haag, Rathenau Instituut, 168 p., available at http://www.rathenau.nl/uploads/tx_tferathenau/Nader_20gebruik_20nader_20onderzocht._20Zeggenschap_20over_20lichaamsmateriaal.pdf (‘Geesink and Steegers, Nader gebruik nader onderzocht. Lichaamsmateriaal, 2009’); about the subject, see also C. Trouet, Van lichaam naar lichaamsmateriaal. Recht en het nader gebruik van cellen en weefsels, Antwerpen, Intersentia, 2003, 590 p. and M. Ploem, ‘Het verschijnsel biobanking in privacyperspectief’, Computerrecht 2011, pp. 320–328.

  357.

    Council of Europe, Recommendation Rec(92) 3 on genetic testing and screening for health care purposes, 10 February 1992, available at http://www.coe.int/t/dg3/healthbioethic/texts_and_documents/default_en.asp; see also, much earlier and more generally on automated medical data banks, Council of Europe, Recommendation No. R(81)1 on Regulations for Automated Medical Data Banks, 23.1.1981, also available at http://www.coe.int/t/dg3/healthbioethic/texts_and_documents/default_en.asp but replaced by Council of Europe, Recommendation No. Rec(97)5 on the protection of medical data, 17.02.1997; see also, in the Netherlands, P. Ippel, Gegeven: de genen. Morele en juridische aspecten van het gebruik van genetische gegevens, Registratiekamer, 1996, 61 p., available at http://www.cbpweb.nl/downloads_av/av07.pdf

  358.

    Council of Europe, Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the Application of Biology and Medicine: Convention on Human Rights and Biomedicine, ETS No. 164, 4 April 1997, available at http://conventions.coe.int/Treaty/EN/Treaties/html/164.htm (‘Council of Europe, Convention on Human Rights and Biomedicine (ETS No. 164), 1997’); about this Convention, see, e.g., H. Nys (ed.), De conventie Mensenrechten en Biogeneeskunde van de Raad van Europa, Antwerpen, Intersentia, 1998, H. Nys, ‘Het Verdrag Mensenrechten en Biogeneeskunde van de Raad van Europa: enkele krachtlijnen’, in R.W. 1997–98, pp. 666–674.

  359.

    For an overview of the Member States that signed and ratified the Convention and the additional protocols, see http://conventions.coe.int/Treaty/Commun/ListeTableauCourt.asp?MA=9&CM=16&CL=ENG

  360.

    UNESCO, Universal Declaration on Bioethics and Human Rights, 2005, available at http://www.unesco.org/new/en/social-and-human-sciences/themes/bioethics/bioethics-and-human-rights/

  361.

    Council of Europe, Recommendation No. Rec(2006)4 of the Committee of Ministers to Member States on research on biological material of human origin, 15 March 2006, available at https://wcd.coe.int/ViewDoc.jsp?id=977859 (‘Council of Europe, Recommendation Rec(2006)4’).

  362.

    Article 3 of the Council of Europe, Recommendation Rec(2006)4.

  363.

    On this issue, see also European Society of Human Genetics (ESHG), Guidelines, a document published under the BIOTECH programme financed by the EU Commission (CEE BIO4-CT98-0550), 2001, p. 30 (‘ESHG, Guidelines, 2001’), available at http://jshg.jp/resources/data/ESHG_BG2_e.pdf

  364.

    Article 2 of the Council of Europe, Convention on Human Rights and Biomedicine (ETS No. 164).

  365.

    See and compare also with provisions relating to information about the use and purposes of the material to the donor and the required consent in the Belgian legislation implementing the Directives (in particular Articles 10, §5 and 11) (see below footnote 382).

  366.

    See above, Chap. 3, Sect. 3.1.

  367.

    The processing of personal data is presently defined as ‘any operation or set of operations’ upon personal data, ‘whether or not by automatic means’, such as ‘collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction’ (emphasis added) (Article 2 (b) Directive 95/46/EC).

  368.

    This is apparently the approach taken by the DPA in Denmark, taking the view that a structured collection of biological material (i.e., a biobank) constitutes a manual (non-electronic) register, as mentioned by L. Bygrave, The body as data? Reflections on the relationship of data privacy law with the human body, 3 p., edited text of speech given at Federation Square, Melbourne, 8 September 2003, p. 3 (‘Bygrave, The body as data? 2003’), available at https://www.privacy.vic.gov.au/privacy/web2.nsf/files/body-as-data-conference-2003-lee-bygrave-presentation/$file/conference_03_no2.pdf

  369.

    See also below on the regulation of the processing of genetic information.

  370.

    See also Bygrave, The body as data? 2003, p. 3. The author states that it is not impossible to apply data protection legislation to biological material. He mentions the New South Wales Privacy and Personal Information Protection Act 1998, which defines ‘personal information’ as encompassing, inter alia, ‘body samples’ (section 4). He also refers to the discussion in Norway and to the report ALRC, Essentially Yours, 2003; see also L. Bygrave, ‘The Body as Data? Biobank Regulation via the ‘Back Door’ of Data Protection Law’, Law, Innovation and Technology 2010, pp. 1–25 (‘Bygrave, The body as data, 2010’). But: see Callens, Goed geregeld, 1995, pp. 40–50. About existing different views, see D. Beyleveld and M. Taylor, ‘Patents for biotechnology and the data protection of biological samples and shared genetic data’ in J. Herveg (ed.), The protection of medical data. Challenges of the twenty first century, Louvain-La-Neuve, Anthemis, 2008, pp. 131–152.

  371.

    In particular, for example, when DNA tests can be applied. In some cases, this may not be possible (anymore), because of the quality of the material.

  372.

    See also above, Chap. 3, Sect. 3.1.2.

  373.

    See, e.g., the European Data Protection Supervisor (EDPS) who states that it is questionable that biological materials ‘as such can be considered as personal data’. EDPS, Opinion on the Proposal for a Directive of the European Parliament and of the Council on standards of quality and safety of human organs intended for transplantation, O.J. C 192, 15.08.2009, p. 7, §12; but, see, e.g., Bygrave, The body as data, 2010, pp. 1–25; see also P. De Hert and E. Ellyne, ‘The Law and Ethics of Belgian Biobanking: A Reversal for the Logic of Regulation?’, Law, Innovation and Technology 2010, pp. 27–50.

  374.

    WP 29 Opinion personal data 2007 (WP136), p. 9; see also WP 29 Opinion genetic data 2004, p. 5; see also the earlier and similar position of the Dutch DPA in its advice on DNA-profile use (Registratiekamer, DNA-onderzoek in strafzaken, 2000).

  375.

    See and compare with a similar critical view concerning the hesitation of the Article 29 Working Party in relation with the legal status of DNA samples as personal data in its Opinion on genetic data: D. Korff, Automated processes of identification, behavioral analysis and risk detection, Paper and presentation at Security, privacy and data protection seminar, organized by the Spanish Data Protection Agency, 9–11 June 2010, pp. 33–34.

  376.

    See also ALRC, Essentially Yours, 2003, in which one of the key recommendations is extending privacy protection to genetic samples (which is a type of biological information). Such extended privacy protection should however also find an adequate balance to allow use for research purposes. This conflict also exists for the use of biometric data for research purposes.

  377.

    About these rapid developments announced in the mid 1990s, see also Callens, Goed geregeld, 1995, pp. 111–120. For definitions and interpretations, and additional information, see Bioinformaticsweb.tk, a bioinformatics resource portal, available at http://bioinformaticsweb.net/definition.html

  378.

    See http://bioinformaticsweb.net/data.html

  379.

    In particular, the prevention of the transmission of diseases.

  380.

    Directive 2004/23/EC of the European Parliament and of the Council of 31 March 2004 on setting standards of quality and safety for the donation, procurement, testing, processing, preservation, storage and distribution of human tissues and cells, O.J.L 102, 7.04.2004, pp. 48–58 (‘Directive 2004/23/EC’).

  381.

    Commission Directive 2006/17/EC of 8 February 2006 implementing Directive 2004/23/EC of the European Parliament and of the Council as regards certain technical requirements for the donation, procurement and testing of human tissues and cells, O.J.L 38, 9.02.2006, pp. 40–52 (‘Commission Directive 2006/17/EC’). Commission Directive 2006/86/EC of 24 October 2006 implementing Directive 2004/23/EC of the European Parliament and of the Council as regards traceability requirements, notification of serious adverse reactions and events and certain technical requirements for the coding, processing, preservation, storage and distribution of human tissues and cells, O.J.L 294, 25.10.2006, pp. 32–50 (‘Commission Directive 2006/86/EC’).

  382.

    See: Wet van 19 december 2008 inzake het verkrijgen en het gebruik van menselijk lichaamsmateriaal met het oog op de geneeskundige toepassing op de mens of het wetenschappelijk onderzoek, B.S., 30.12.2008, pp. 68774–68786 (‘Act of 19 December 2008 on human body material’). This Act was later modified by the Act of 23 December 2009, B.S., 29.12.2009. A biovigilance system is described in general as a system to prevent the risks related to the use of organs, tissues and cells of the human body, and derived products, and to ensure the quality and the security of related procedures. The Act of 19 December 2008 on human body material modifies provisions of the Act of 13 June 1986 relating to the extraction and the transplantation of organs.

  383.

    Art. 2, 1° of the Act of 19 December 2008.

  384.

    Tissue is defined as any constituent part of the human body formed by cells.

  385.

    See Recital 18 of Directive 2004/23/EC.

  386.

    The standard operating procedures are referred to as ‘SOPs’. See further Article 2, 5 of the Commission Directive 2006/17/EC.

  387.

    See also EDPS, Opinion on the Proposal for a Directive of the European Parliament and of the Council on standards of quality and safety of human organs intended for transplantation, O.J. C 192, 15.08.2009, p. 9. Directive 2004/23/EC clearly states that Directive 95/46/EC is applicable (see recital 24). The use of the term ‘anonymity’, however, is not in conformity with the concept as understood in Directive 95/46/EC.

  388.

    More in particular, anonymous and semi-anonymous verification will be recommended as explained in Part III.

  389.

    The suggested definition of biometric data is: ‘personal data which (a) relate directly or indirectly to unique or distinctive biological or behavioral characteristics of human beings and (b) are used or are fit to be used by automated means (c) for purposes of automated identification or verification of the identity of natural persons’ (proposed definition of biometric data as discussed above).

  390.

    WP 29 Opinion personal data 2007 (WP136), p. 9. This is maintained. See WP 29 Opinion on developments in biometric technologies 2012 (WP193), p. 4.

  391.

    This is plausible, because the Article 29 Working Party mentions DNA pattern analysis in the list of biometric techniques in its Working Document on Biometrics 2003 (p. 3). At the same time, the Article 29 Working Party seems to exclude DNA from biometric data. The Article 29 Working Party mentions in a footnote that it will not discuss the use of DNA for biometric identification, mentioning that the ‘generation of a DNA profile in real time as an authentication tool seems not currently possible’ (WP 29 Working Document on Biometrics 2003 (WP80), p. 3, footnote 7). See however WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 25–27 which discusses DNA extensively because ‘it is very likely that in the near future it will be possible to perform real-time (or near real-time) DNA profiling and matching’ (p. 25).

  392.

    Registratiekamer, DNA-onderzoek in strafzaken, 2000, p. 2.

  393.

    Another reason is that human cells and tissue contain DNA information which first needs to be extracted from these parts of the human body, while for biometric data such extraction is in principle not required.

  394.

    See above § 311.

  395.

    Human tissues and cells are, as stated, also often accompanied by additional information about the data subject. In this way, they allow a fortiori for identification, e.g., where traceability is mandatory in case of donation.

  396.

    For example, in the context of donation of tissues and cells, various laboratory tests are required on the donor’s serum or plasma (which are part of blood).

  397.

    On the (privacy) principles applicable to such tests, see Hendrickx, Privacy en Arbeidsrecht, 1999, pp. 243–250. See also, in Belgium, CAO no 100 betreffende een preventief alcohol- en drugsbeleid, which since 1 April 2009 allows employees to be tested in the workplace for drugs and alcohol for reasons of ‘prevention’. This collective labor agreement describes various safeguards for the use of these tests in the employer-employee relation, which are gradually being introduced because of the interference with the fundamental right to respect for privacy.

  398.

    See, e.g., in the Netherlands, for (partial) information to the public about the use of blood samples of newborns for the detection of particular (future) health problems, which often have a genetic cause, listing (presently) eighteen (18) diseases for which the blood samples are tested, Rijksinstituut voor Volksgezondheid en Milieu, Hielprik. Bevolkingsonderzoek, available at http://www.rivm.nl/Onderwerpen/Onderwerpen/H/Hielprik

  399.

    For an example of misuse of this type of samples in the State of Texas, U.S.A., see Part II, Chap. 4, § 176; see, in Belgium, Orde van geneesheren, Bewaartermijn van Guthrie-kaartjes, 15 December 2001, available at http://www.ordomedic.be/nl/adviezen/advies/bewaartermijn-van-guthrie-kaartjes. In this advice, the organization of doctors refers to possible misuse and agrees with the suggestion to limit the duration of the keeping of the cards on which dried blood is kept. A regulation relating to the taking of blood samples, if any, is often (only) concerned with the procedures for the operation and the accreditation of the laboratories involved in the analysis (see, for Belgium, Ministerial Decree of 23 March 1998 relating to the operational and certification procedure of the centers for the detection of metabolic diseases at birth, modified by Ministerial Decree of 16 January 2008, B.S. 21.02.2008, p. 10891). About the use of blood samples of newborns for law enforcement, see also C. Mund, Biobanks – Data Sources without Limits?, 3 October 2005, p. 3, available at http://www.privacyconference2005.org/fileadmin/PDF/mund_e.pdf. The author discusses inter alia the example of the Swedish police, who, investigating the murder of Foreign Minister Anna Lindh, succeeded in obtaining access to a neonatal database which kept blood samples of the last 30 years, in order to compare crime scene traces with the blood samples of a suspect.

  400.

    Wet tot invoering van speekseltesten op drugs in het verkeer, 31.07.2009, B.S. 15.09.2009, pp. 62185–62190.

  401.

    On the collection and use of saliva of persons offending train conductors for identification purposes, see X., ‘DNA-spuugkit verdeelt juristen’, P&I 2010, pp. 85–86; see also A. Pollack, ‘Firm Brings Gene Tests to Masses’, 28.01.2010, The New York Times, available at http://www.nytimes.com/2010/01/29/business/29gene.html

  402.

    See §§ 358–392.

  403.

    In terms of the Directive 95/46/EC, a ‘filing system’ means ‘any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis’ (Article 2(c)).

  404.

    See the definition of the scope of the Directive 95/46/EC in Art. 3.

  405.

    See above §§ 294–297.

  406.

    About Art. 8 ECHR, Art. 7 and 8 of the EU Charter and the fundamental rights to privacy in national constitutions, see below.

  407.

    For example, as stated, facial images have since late 2010 been available in a more prominent manner on Facebook profile pages, linked with additional information, for example about residence, education and civil status.

  408.

    See, e.g., Rb. Antwerpen, 9.05.2003, AM 2003, p. 400, reversed by Antwerpen, 11.10.2005, referred to by D. Voorhoof, ‘Johan Demol krijgt gelijk voor Hof van Beroep Antwerpen’, Juristenkrant 2005, p. 13. The publication on the cover of a book of the image of an ex-policeman, together with some defamatory text, was (also) considered to be in breach of the right to one’s own image (see below).

  409.

    See, e.g., ECtHR, Von Hannover v. Germany, no. 59320/00, 24 June 2004, §§ 76–78 (‘Von Hannover 2004’). See also ECtHR, Schüssel v. Austria, no. 42409/98, 21 February 2002 (‘Schüssel v. Austria 2002’). In that case, the applicant, vice-chancellor running for public office and hence a public figure, complained about an electoral poster containing his distorted picture accompanied by a disparaging text. While the Court extended in this case the protection under art. 8 ECHR to one’s image, it declared the application inadmissible.

  410.

    See Von Hannover 2004, § 24 and §§ 76–80.

  411.

    ECtHR, Sciacca v. Italy, no. 50774/99, 11 January 2005, §§29–30. (‘Sciacca 2005’). See and compare with the practice in some countries of fingerprinting convicts, without legal basis (see below § 356).

  412.

    ECtHR, Reklos and Davourlis v. Greece, no 1234/05, 15 January 2009 (‘Reklos and Davourlis 2009’). In this case, the taking of images of all newborns by a commercial company requested by the hospital and the subsequent storage of a photograph of a newborn baby in a hospital without consent of the parents was considered a violation of Article 8 ECHR.

  413.

    See, e.g., E. Guldix, ‘Algemene systematische beschouwingen over het persoonlijkheidsrecht op eigen afbeelding’, R.W. 1980–81, pp. 1161–1192; M. Isgour and B. Vincotte, Le droit à l’image, Brussel, Larcier, 1998, 155 p.; D. Voorhoof, Actuele vraagstukken van mediarecht. Doctrine en jurisprudentie, Antwerpen, Kluwer, 1992, pp. 490–506. The (commercial) portrait right is deemed a subpart of the right to one’s own image. For France, the right to one’s image was recognized first in case law in 1858 (in relation to the publication of a painting), and was later more generally integrated in the Civil Code (Article 9) by an Act of 17.07.1970 as an ‘exclusive and absolute’ right. See M. Moulla, ‘Vie privée et droit à l’image des personnes’, 23.09.2003, available at http://www.avocats-publishing.com/Vie-privee-et-droit-a-l-image-des,142

  414.

    See L. Dierickx, Het recht op afbeelding, Antwerpen – Oxford, Intersentia, 2005, p. 55 (‘Dierickx, Recht op afbeelding, 2005’).

  415.

    For a further explanation of these characteristics, see, e.g., Dierickx, Recht op afbeelding, 2005, pp. 2–3.

  416.

    Article 1382 of the Civil Code relating to torts was initially, and still is, often invoked.

  417.

    These provisions are often stated in criminal law. Provisions in the Penal Code forbid for example the use of images of minors in particular cases (e.g., Article 433bis Penal Code, forbidding the distribution of pictures or other images of minors who are prosecuted or under a specific youth protection regime).

  418.

    See and compare, e.g., with the sections 22 and 23 of the German Copyright Act (‘Kunsturhebergesetz’), discussed extensively in Von Hannover 2004, § 25. The provisions concern the publication of photographical representations of persons. Section 22, para 1 states that pictures can only be disseminated or exposed to the public eye with the express approval of the person represented. Section 23 (1) excludes pictures relating to contemporary society, unless the dissemination interferes with the legitimate interest of the person represented.

  419.

    Article 20 of the Copyright Act of 1886 stated: ‘The author or the owner of a portrait is not entitled to reproduce it or to show it in public without the consent of the person who has been portrayed (…)’ [free translation]. The Article was slightly amended in 1994 but the present version is similar.

  420.

    See and compare also Voorhoof, who refers to Van Isacker in D. Voorhoof, Commercieel portretrecht in België, 2009, p. 151, also available at http://www.psw.ugent.be/Cms_global/uploads/publicaties/dv/05recente_publicaties/VOORHOOF.finalversion.14.05.2009.pdf

  421.

    See, e.g., a recent case in which tennis player Kim Clijsters claimed and obtained her right of image: Gent, 21.02.2008, AM 2008, p. 318.

  422.

    Various grounds are invoked in such cases, not only the aforementioned Article 10 but also the fundamental right to privacy and data protection legislation.

  423.

    See Dierickx, Recht op afbeelding, 2005, p. 62.

  424.

    See e.g., Dierickx, Recht op afbeelding, 2005, p. 19 and the references therein mentioned.

  425.

    See Dierickx, Recht op afbeelding, 2005, pp. 21–23, where the author argues and illustrates that a literal interpretation is not made in the case law. But see e.g., E. Guldix, De persoonlijkheidsrechten, de persoonlijke levenssfeer en het privéleven in hun onderling verband, Doctoral thesis, Brussels, VUB, 1986, pp. 121–123 (‘Guldix, De persoonlijkheidsrechten, 1986’). Guldix’s work contains further interesting references to the protection of one’s right to his or her (unique) voice. See Guldix, De persoonlijkheidsrechten, 1986, pp. 123–126.

  426.

    A discussion of these restrictions on the right to one’s image is outside our subject of research. We refer to prominent legal scholars who have analyzed this topic in depth. See, e.g., D. Voorhoof, ‘Artikel 10 Portretrecht’ in F. Brison and H. Vanhees (eds.), De Belgische auteurswet. Artikelsgewijze commentaar, Gent, Larcier, 2008, pp. 61–66; see also K. Lemmens, La presse et la protection juridique de l’individu. Attention aux chiens de garde !, Brussel, Larcier, 2004, 603 p.

  427.

    Compare with Peck v. United Kingdom, where the images were rendered public to promote the use of CCTV. Such additional use could in our view also include identification, such as by tagging, in social networks.

  428.

    E.g., for feature extraction, for storage, for later comparison, …

  429.

    See in this context for the need of consent, CBPL, Advies no 33/2007 inzake de verspreiding van beeldmateriaal, 28.11.2007, 4 p. (‘CBPL, Opinion no 33/2007 dissemination of images’). The consent obtained according to the advice would in our opinion not pertain to the use of the images in a biometric system.

  430.

    In this case, also Article 10 of the Copyright Act could be relied upon.

  431.

    About the consent of an individual and the restrictive interpretation of the consent for use in a particular context, see also Dierickx, Recht op afbeelding, 2005, pp. 114–124. For case law, forbidding the use (publication) of an image distributed in a particular context (in particular the image of a toddler distributed at a funeral) (but allowing publication of the identity details of the brother in the press), see Brussel, 14.09.1999, AM 2000, p. 92.

  432.

    E.g., photographs taken by a photo booth in a train station. See about this issue, e.g., Guldix, De persoonlijkheidsrechten, 1986, p. 123. This discussion about the need of copyright protection of ‘mechanical’ photographs is increasingly relevant, for example, in the case of pictures taken by satellites for global positioning and road mapping systems.

  433.

    These are, for example, the two conditions for copyright protection set forth by the Belgian Supreme Court.

  434.

    The practice, now customary in Belgium, of publishing legislative acts passed at the end of a calendar year (usually the 30th or 31st of December) and containing legal provisions in various domains was not yet in vogue.

  435.

    See and compare with the use of the facial images in social networks. The dissemination of tagged images is precisely at the core of the debate about the use of the biometric function in these networks.

  436.

    For some history of dactyloscopy, see Chap. 2, §§ 34–37. The collection and use of fingerprints is an identification technique which is still most successful for law enforcement purposes. For Belgium, e.g., see Develtere, Dactyloscopy, in Van de Voorde, Goethals and Nieuwdorp, Multidisciplinair forensisch onderzoek, 2003, p. 326. New nanotechnology techniques, described as revolutionary, would now also make it possible to recover and use old and weak latent fingerprints, so far not detectable by existing techniques. See J. van Dooren, Using gold nanoparticles to recover old fingerprints, 8.06.2011, available at http://www.bitsofscience.org/gold-nanoparticles-recover-fingerprints-1676/

  437.

    Ibid., p. 322. In Belgium, a legal basis for the procedure of criminal investigation as such was only provided by an Act adopted in 1998 (Wet van 12 maart 1998 tot verbetering van de strafrechtspleging in het stadium van het opsporingsonderzoek en het gerechtelijk onderzoek, B.S. 2.4.1998, pp. 10027–10041).

  438.

    Develtere, Dactyloscopy, in Van de Voorde, Goethals and Nieuwdorp, Multidisciplinair forensisch onderzoek, 2003, p. 322. The reliability of fingerprint identification has, however, been criticized and is debated, including in the United States, where criteria were developed in case law for the approach taken by latent fingerprint analysts. These should meet the criteria developed in Daubert v. Merrell Dow Pharmaceuticals (509 U.S. 579 (1993)) for the reliability of scientific evidence. These standards are (i) general acceptance, (ii) testing or testability, (iii) peer review and publication, (iv) known or potential error rate and (v) standards for controlling application of the technique.

  439.

    The automated systems were largely adopted without much public discussion (compare, e.g., with the controversy surrounding the use of DNA databases). One of the reasons is probably the fact that the AFIS fingerprint databases were built up by scanning the fingerprint cards that were already in the possession of police departments. See also S. Garfinkel, Database Nation. The Death of Privacy in the 21st Century, Sebastopol, CA, United States, O’Reilly, 2000, p. 46.

  440.

    See Part II, Chap. 4, §§ 175–179.

  441.

    In Belgium, e.g., Wet van 15.12.1980 betreffende de toegang tot het grondgebied, het verblijf, de vestiging en de verwijdering van vreemdelingen, in particular art. 30bis (as modified several times, including in 2004 and 2011).

  442.

    We refer, e.g., to Eurodac, which allows the automated processing of fingerprint data.

  443.

    Genetic and DNA information, however, also contains information relevant to family members.

  444.

    As mentioned above, this distinction between coding and non-coding fragments is far from sharp and is evolving, as part of the non-coding DNA also contains important information. On the distinction, see also Callens, Goed geregeld, 1995, pp. 120–124 and the references mentioned there.

  445.

    See, e.g., Article 2 of the Prüm Treaty. See also WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 25–27.

  446.

    See also EDPS, Opinion on the Initiative of the Federal Republic of Germany, with a view to adopting a Council Decision on the implementation of Decision 2007/…/JHA on the stepping up of cross-border cooperation, particularly in combating terrorism and cross-border crime, O.J. C 89/1, 10.4.2008, p. 4 (‘EDPS, Opinion cross-border cooperation, 2008’).

  447.

    See also M. Meints, Privacy and Security Challenges in Deploying Biometrics in the EU. A Data Protection Commissions Perspective, 2008, Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, Biometrics Symposium, Tampa, Florida, U.S.A., slide 37 (‘Meints, Privacy and Security Challenges in Deploying Biometrics in the EU, presentation, 2008’); I. Rigoutsos, T. Huynh, K. Miranda, A. Tsirigos, A. McHardy, D. Platt, ‘Short Blocks from the noncoding parts of the human genome have instances within nearly all known genes and relate to biological processes’, Proceedings of the National Academy of Science of the United States, 2006, Washington, pp. 6605–6610. This is also acknowledged in WP 29 Opinion on developments in biometric technologies 2012 (WP193), pp. 25–27.

  448.

    See also Council Resolution 2001/C187/01 of 25 June 2001 on the exchange of DNA analysis results, O.J. C187, 03.07.2001, Annex 1, p. 1.

  449.

    This is the general ‘belief’. It is, however, not very clear, as so-called non-codifying parts may also contain information about someone’s health, as mentioned. See Callens, Goed geregeld, 1995, p. 124. See also on this issue Kaye, Science Fiction and Shed DNA, 2006: ‘Recent discoveries establish that some intergenic DNA (not “markers”) is biologically significant, but no forensic STR locus has been found to be predictive’. The author refers to J. Butler, ‘Genetics and Genomics of Core Short Tandem Repeat Loci Used in Human Identity Testing’, 51 Journal Forensic Science 2006, (253), pp. 259–260, also available at http://www.cstl.nist.gov/div831/strbase/pub_pres/Butler2006JFS_coreSTRreview.pdf and D. Kaye, ‘Two Fallacies About DNA Data Banks for Law Enforcement’, 67 Brooklyn Law Review, 2001, (179), pp. 187–188, also available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=301650

  450.

    It shall be noted that the general data protection legislation as set forth in Directive 95/46/EC, including the concept of sensitive data as we described it, is in many cases not applicable to the collection and use of personal data (in this case DNA data) for police and law enforcement purposes, subject however to specific national (data protection) regulation or European or international regulation (see, e.g., the prohibition to process sensitive data in art. 56 SIS II Decision). This does not imply, however, that no principles apply at all. The Council of Europe’s Convention of 1981 (see below) and the recommendations formulated in Council of Europe, Recommendation No. R(87) 15 regulating the use of personal data in the police sector, 1987, 4 p., available at http://www.coe.int/t/dghl/cooperation/economiccrime/organisedcrime/Rec_1987_15.pdf remain fully applicable (see also art. 57 SIS II Decision).

  451.

    Recent legislation, however, does not always maintain these guarantees, e.g., by allowing storage without such intervention. We will provide a more extensive overview of the regulation of DNA analysis of non-codifying parts and of the use of DNA samples and profiles for identification purposes in criminal matters, which is different from the regulation of genetic data, below in §§ 378–388. For a description of the techniques for the extraction and use of DNA information for identification purposes, as compared with the techniques for processing biometric information, we refer to §§ 300–310 above.

  452.

    See also ALRC, Essentially Yours, 2003, where genetic testing is divided into three broad categories based on the purposes of the testing: medical testing, identification (forensic) testing and kinship testing. Medical testing includes diagnostic testing, predictive or presymptomatic testing, genetic carrier testing, screening testing, pre-implantation and prenatal testing and testing for medical or scientific research.

  453.

    The whole of the genetic information in an organism (e.g., of a human, but also of an animal or a plant) is referred to as the ‘genome’. The genomes of some persons have been unraveled as prestigious undertakings, e.g., those of Nobel Prize winner James Watson and of Craig Venter, whose genome has also been published (see S. Levy and others, ‘The Diploid Genome Sequence of an Individual Human’, Plos Biology, 2007, available at http://www.plosbiology.org/article/info:doi/10.1371/journal.pbio.0050254).

  454.

    An example of such collection of genetic data is the Estonian Genome Project, which started collecting tissue samples from gene donors in 2002. The aim is to create a database of health, genealogical and genome data representing 10 % of Estonia’s population, making it possible for researchers both in Estonia and outside to look for links between genes, environmental factors and common diseases. On this case, see London Economics, Study on the economic benefits of privacy-enhancing technologies (PETs). Final Report to the European Commission DG Justice, Freedom and Security, July 2010, pp. 205–209, also available at http://ec.europa.eu/justice/policies/privacy/docs/studies/final_report_pets_16_07_10_en.pdf

  455.

    WP 29 Opinion genetic data 2004, p. 2.

  456.

    See also WP 29 Working document genetic data 2004, p. 11. The term ‘biobank’ is also used for the collection of other biological material, such as blood samples (for an example, see above § 341).

  457.

    A definition given by the American National Bioethics Advisory Commission in 1999 to DNA banks, for example, is ‘a facility that stores extracted DNA, transformed cell lines, frozen blood or other tissue, or biological materials, for future DNA analysis’, as mentioned in B. Godard, J. Schmidtke, J.-J. Cassiman and S. Aymé, ‘Data storage and DNA banking for biomedical research: informed consent, confidentiality, quality issues, ownership, return of benefits. A professional perspective’, European Journal of Human Genetics 11, 2003 (‘Godard et al., Data storage and DNA banking, 2003’), available at http://www.nature.com/ejhg/journal/v11/n2s/pdf/5201114a.pdf

  458.

    See, e.g., UNESCO, International Declaration on Human Genetic Data, 16 October 2003, Article 2, (i). Other definitions have been proposed as well. See WP 29 Working document genetic data 2004, p. 4. Some more precise descriptions of terms are given in Council of Europe, Additional Protocol to the Convention on Human Rights and Biomedicine, concerning Genetic Testing for Health Purposes, 21 November 2008, (‘ETS No. 164. Additional Protocol 203’), Article 2, available at http://conventions.coe.int/Treaty/EN/Treaties/html/203.htm

  459.

    For a very comprehensive description and discussion of the special categories of genetic data, see Callens, Goed geregeld, 1995, pp. 100–135.

  460.

    A chromosome is generally described as ‘a threadlike body in the cell nucleus that carries the genes in a linear order’ (Webster’s Online Dictionary).

  461.

    Callens, Goed geregeld, 1995, p. 125. See and compare with the (in our view too narrow) proposed definition of genetic data in the Proposal for General Data Protection Regulation, art. 4 (10).

  462.

    About this distinction in the context of the discussion of the use of genetic and medical information in the insurance sector, see, e.g., B. Desmet, ‘Genetisch onderzoek en verzekeringen. De wet van de (genetisch) sterkste’, Jura Falconis 2005–2006, pp. 505–548, available at http://www.law.kuleuven.be/jura/art/42n4/desmet.html (‘Desmet, Genetisch onderzoek, 2005’).

  463.

    See Part II for the discussion about the risks of biometric data processing, in particular Chap. 4, § 79 and the footnotes there mentioned.

  464.

    We will therefore make recommendations in this regard in Part III.

  465.

    WP 29 Working document genetic data 2004, p. 5.

  466.

    Ibid., p. 5.

  467.

    About the review of the Directive 95/46/EC and this suggestion, see also Part III.

  468.

    See also Part II.

  469.

    Ibid., Article 7; Council of Europe, Convention on Human Rights and Biomedicine (ETS No. 164), Article 11; ETS No. 164. Additional Protocol 203, Article 4, available at http://conventions.coe.int/Treaty/EN/Treaties/html/203.htm

  470.

    For a more extensive overview of principles and rules which apply, see, e.g., Godard et al., Data storage and DNA banking, 2003; See also the extensive study made in Australia by the Law Reform Commission and the Health Ethics Committee on the protection of human genetic information resulting in 144 recommendations for reform: ALRC, Essentially Yours, 2003.

  471.

    In (only) 50 years, science proceeded from the description of the DNA double helix (see above footnote 307) to the mapping of the entire human genome.

  472.

    For example, a society may have an interest in the (compulsory) acquisition of DNA samples, e.g., by law enforcement authorities.

  473.

    For example, employers and insurance companies may defend their interests in genetic testing.

  474.

    European Commission, Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions. A comprehensive approach on personal data protection in the European Union, 4.11.2010, COM(2010) 609 final, p. 9, available at http://ec.europa.eu/justice/news/consulting_public/0006/com_2010_609_en.pdf (‘Commission, Communication. Personal Data Protection, 2010’)

  475.

    For an overview of the emerging regulatory framework, see e.g., K. Grimaldi, M. Look, A. Scioli, J. Coll Clavero, S. Marinos, and T. Tagaris, ‘Personal genetics: regulatory framework in Europe from a service provider’s perspective’, European Journal of Human Genetics, 2011, pp. 382–388. The regulation of personal genetic information use is a specialized field of research.

  476.

    See Act on Terrestrial Insurance Contracts of 1992, Articles 5 para 1 and 95; see also Desmet, Genetisch onderzoek, 2005, pp. 505–548.

  477.

    See, about proposals submitted to the Belgian parliament to regulate the use of such tests, such as the proposal of an act submitted by Blanpain relating to the predictive genetic test (Parl. St. Senaat 1988–89, nr. 513-1), Hendrickx, Privacy en Arbeidsrecht, 1999, pp. 254–255; about the ethical and legal framework for the use of genetic information, see, e.g., K. Jacobs, ‘Verzekeringen en genetica’, in E. Guldix, J. Stuy, K. Jacobs and A. Rigo, Het gebruik van genetische informatie. Het ethisch en juridisch kader voor het maatschappelijk gebruik van geïndividualiseerde genetische informatie, Brussel, Federale Diensten voor Wetenschappelijke, Technische en Culturele Aangelegenheden, 1994, pp. 129–150; see also B. Godard, S. Raeburn, M. Pembrey, M. Bobrow, P. Farndon and S. Aymé, ‘Genetic information and testing in insurance and employment: technical, social and ethical issues’, European Journal of Human Genetics 2003, Suppl. 2, available at http://www.nature.com/ejhg/journal/v11/n2s/full/5201117a.html (‘Godard, Raeburn, Pembrey, Bobrow, Farndon and Aymé, Genetic information and testing in insurance and employment, 2003’).

  478.

    Article 25, I, 2° Act N° 78-17.

  479.

    Article 21, 4 Act of 6 July 2000 (‘Dutch Data Protection Act 2000’).

  480.

    The so-called ‘Gendiagnostikgesetz’ or ‘GenDG’ of 31.07.2009 (BGBl. I S. 2529, p. 3672).

  481.

    See §§ 8–9 GenDG and § 7 GenDG. This implies that the tests shall generally only be done by doctors or specialized persons. See, e.g., for affiliation tests, § 17 (4) GenDG.

  482.

    The Act limits for example genetic tests on fetuses to purely medical reasons under specific conditions (see § 15).

  483.

    See §§ 18–19 GenDG. For example, for a life insurance contract in case the payment exceeds 300,000 euros or 30,000 euros per year (§ 18 (1) para. 2). A genetic test may also be permitted if a specific job entails potential health risks.

  484.

    The rationale behind this legislation is the fear that the newly gained knowledge about genetic risks may lead to discrimination against persons with Alzheimer’s disease, Huntington’s disease, etc. For a discussion about the use of genetic information in the insurance and employment context, see, e.g., Godard, Raeburn, Pembrey, Bobrow, Farndon and Aymé, Genetic information and testing in insurance and employment, 2003.

  485.

    In some countries, for example, chemical tests, blood tests and common lab tests have been excluded from the application field. Genetic research, however, is often based on such tests.

  486.

    See I. van Hoyweghen and K. Horstman, ‘European practices of genetic information and Insurance. Lessons for the Genetic Information Nondiscrimination Act’, JAMA 2008, pp. 326–327. Others have cast doubt on whether such regulation was necessary and useful.

  487.

    For an interesting overview of use in various countries, see also ALRC, Essentially Yours, 2003.

  488.

    In France, see Art. 706-54 of the Code of Criminal Procedure. The database is held under the control of an independent judge. In the United States, the CODIS system is operated by the FBI (about CODIS, see also footnote 540 below).

  489.

    See, for example, Loi n° 2003-239 pour la sécurité intérieure, 18.03.2003, as modified, available at http://www.legifrance.gouv.fr/affichTexte.do?cidTexte=JORFTEXT000000412199. See also CNIL, FNAEG: Fichier national des empreintes génétiques, 25 June 2009, available at http://www.cnil.fr/en-savoir-plus/fichiers-en-fiche/fichier/article/41/fnaeg-fichier-national-des-empreintes-genetiques/. On 30.1.2010, the FNAEG database contained 972,042 DNA profiles of accused persons (‘mise en cause’), 285,140 of condemned persons and 64,774 traces.

  490.

    See Cons. const. N° 2010-25, 16.09.2010, also available at http://www.conseil-constitutionnel.fr/conseil-constitutionnel/francais/les-decisions/acces-par-date/decisions-depuis-1959/2010/2010-25-qpc/decision-n-2010-25-qpc-du-16-septembre-2010.49343.html. See, also in France, X., ‘A 8 et 11 ans, ils sont menacés de fichage génétique pour vol de jouets’, Le Monde, 7.05.2007, available at http://www.lemonde.fr/societe/article/2007/05/05/a-8-et-11-ans-ils-sont-menaces-de-fichage-genetique-pour-vol-de-jouets_906026_3224.html. In the article, it is reported that between 2003 and 2006, the number of DNA records had increased from about 3,000 to more than 330,000. About the continuing extensions of the DNA databases in the Netherlands and the U.K., see also M. Prinsen, ‘De bestaande forensische DNA-databank en een verkenning van de mogelijkheden tot uitbreiding’, P&I 2006, pp. 54–58 (‘Prinsen, De bestaande forensische DNA-databank, 2006’).

  491.

    Council of Europe, Recommendation Rec(92) 1 of the Committee of Ministers to Member States on the use of analysis of deoxyribonucleic acid (DNA) within the framework of the criminal justice system, 10 February 1992, (‘Council of Europe, Recommendation R(92) 1 DNA’), available at https://wcd.coe.int/com.instranet.InstraServlet?command=com.instranet.CmdBlobGet&InstranetImage=1518265&SecMode=1&DocId=601410&Usage=2

  492.

    Council of Europe, Recommendation Rec(97) 5 on the Protection of Medical Data, 17 February 1997, available at https://wcd.coe.int/wcd/com.instranet.InstraServlet?command=com.instranet.CmdBlobGet&InstranetImage=564487&SecMode=1&DocId=560582&Usage=2. About 15 years later, the Council of Europe calls for legislation for biometric data processing as well. See Part III, Chap. 8, § 370.

  493.

    E.g., in Belgium (§ 378 et seq.), but also in France and the Netherlands. For an overview, see, e.g., C. van den Heuvel, J. Nijboer, A. van Rijsewijk, Th. de Roos, Forensic DNA-onderzoek: een rechtsvergelijkende verkenning, 2006, Kluwer, 186 p.

  494.

    See Orde van geneesheren, Toenemend en ongeregeld uitvoeren van vaderschapstests, 16 June 2001, available at http://www.ordomedic.be/nl/adviezen/advies/toenemend-en-ongeregeld-uitvoeren-van-vaderschapstests and Orde van geneesheren, Het uitvoeren van vaderschapstests, 21 February 2009, available at http://www.ordomedic.be/nl/adviezen/advies/Het-uitvoeren-van-vaderschapstests. In the latter opinion of the National Association of Doctors, a negative advice was given to doctors regarding collaboration in the use of paternity tests outside any judicial procedure. Among the reasons invoked are the fundamental rights of the child.

  495.

    Wet betreffende de medische onderzoeken die binnen het kader van de arbeidsverhoudingen worden uitgevoerd, 28.1.2003, B.S. 9.4.2003.

  496.

    Loi n° 94-653 relative au respect du corps humain, 29.07.1994.

  497.

    In these cases, consent is still required.

  498.

    See F. El Atmani, ‘Données sensibles: la notion de consentement de la personne concernée’, Lamy droit de l’informatique 1996, N° 86, (1), p. 4. ‘Res extra commercium’ is a doctrine originating from Roman law and is Latin for ‘a thing outside commerce’.

  499.

    See Article L. 111-6 Code de l’entrée et du séjour des étrangers et du droit d’asile, as modified (Loi n° 2007/1631, 20.11.2007). The immigration bill raised controversy. The French Constitutional Council made several reservations with regard to the proposed legislation, including that DNA testing should not be applied if the family link with the mother can be proven by any other admissible means of proof under the applicable law. See Cons. const. N° 2007-557 of 15 November 2007, § 16, available at http://www.legifrance.gouv.fr/affichTexte.do?cidTexte=JORFTEXT000000341640&dateTexte=

  500.

    Wet DNA-onderzoek bij veroordeelden, 16 September 2004, Stb. 2004, p. 465 (‘DNA Analysis Act 2004’) also available at http://www.st-ab.nl/wetten/0461_Wet_DNA-onderzoek_bij_veroordeelden.htm. See also the opinion of the Dutch DPA on the bill: Registratiekamer, DNA-onderzoek in strafzaken, 2000. Before this DNA Analysis Act 2004, the rules for the existing DNA database were regulated in the DNA Analysis in Criminal Cases Decree (‘Besluit DNA-Onderzoek in strafzaken’), 27 August 2001, Stb. 2001, p. 400, also available at http://www.st-ab.nl/wettennr04/0475-049_Besluit_DNA-onderzoek_in_strafzaken.htm. About the situation in the Netherlands, see also Prinsen, De bestaande forensische DNA-databank, 2006, pp. 54–58. See also the doctoral thesis of the same author: M. Prinsen, Forensisch DNA-onderzoek. Een balans tussen opsporing en fundamentele rechten, Nijmegen, Wolf, 2008, 256 p. (‘Prinsen, Forensisch DNA-Onderzoek, 2008’).

  501.

    For example, if the DNA investigations can play no meaningful role for the solution of the crime for which the person has been convicted (e.g., forgery as opposed to e.g., rape).

  502.

    For example, if a person is most unlikely able to commit an offence in respect of which DNA investigation might be of use. See and compare also with the requirement of relevancy in case of interference with fundamental rights, explained and applied for biometric applications in Part II, Chap. 5, §§ 347–353.

  503.

    DNA Analysis Act 2004, Section (2) sub b.

  504.

    The first regulation of DNA analysis use in criminal cases dates from 1993 (Act DNA Analysis in Criminal Cases (‘Wet DNA-onderzoek in strafzaken’), Stb. 1993, p. 596, as modified).

  505.

    Wet van 8 mei 2003 tot wijziging van de regeling van het DNA-onderzoek in strafzaken in verband met het vaststellen van uiterlijk waarneembare persoonskenmerken uit celmateriaal, Stb. 2003, 201. See the (new) Art. 151d, 2 of the Code of Criminal Procedure (introduced by the aforementioned Act). The characteristics should be restricted to those visible at ‘the time of birth’. About this Act, and DNA analysis for forensic purposes, see Prinsen, Forensisch DNA-Onderzoek, 2008, p. 203 et seq.

  506.

    See, for example, Switzerland and the Verordnung über das DNA-Profil-Informationssystem, 31 May 2000, available at http://www.admin.ch/ch/d/as/2000/1715.pdf. About this system, see W. Bär, A. Kratzer & M. Strehler, Swiss Federal DNA Profile Information System EDNA, available at http://www.promega.com/geneticidproc/ussymp12proc/abstracts/bar.pdf. In this system, all samples taken are ‘anonymized’ by a unique identification number, as a result of which the name of the suspect is not revealed to the lab. The information is also transferred to the AFIS services of the federal police in Berne, for linking with the corresponding names of the suspects and the crimes. See also and compare with the storage of DNA information in other national databases, e.g., Belgium, and the information mentioned therewith (see below, § 378 et seq.).
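    The separation described above, in which the lab sees only a unique identification number while the police hold the number-to-name mapping, can be sketched as follows. This is our own illustrative simplification, not the actual EDNA implementation, and all class and variable names are hypothetical.

```python
import uuid

class Lab:
    """Receives only (sample_id, profile); suspects' names never reach the lab."""
    def __init__(self):
        self.profiles = {}

    def store(self, sample_id, profile):
        self.profiles[sample_id] = profile

class PoliceRegistry:
    """Holds the mapping from unique identification numbers to names."""
    def __init__(self):
        self.links = {}

    def register(self, name):
        sample_id = str(uuid.uuid4())  # the 'anonymizing' unique number
        self.links[sample_id] = name
        return sample_id

    def resolve(self, sample_id):
        # Only the holder of the registry can re-link a result to a person.
        return self.links.get(sample_id)

registry = PoliceRegistry()
lab = Lab()
sid = registry.register("suspect X")
lab.store(sid, "DNA-profile-data")
```

    The design choice mirrors the footnote: the pseudonymizing number, not the name, travels with the sample, and re-identification requires the separate mapping held by the federal police.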

  507.

    ECtHR, Van der Velden v. the Netherlands, no 29514/05, 7 December 2006.

  508.

    BVerfG, 2 BvR 1741/99, 14.12.2000 (‘Zur Speicherung des “genetischen Fingerabdrucks” verurteilter Personen’).

  509.

    S. and Marper 2008. See also the Protection of Freedoms Act 2012, available at http://www.legislation.gov.uk/ukpga/2012/9/pdfs/ukpga_20120009_en.pdf (‘Protection of Freedoms Act 2012’ or ‘Freedoms Act 2012’). The Act modified PACE.

  510.

    Wet 22 maart 1999 betreffende de identificatieprocedure via DNA-onderzoek in strafzaken, B.S. 20 May 1999, err. B.S. 24.6.1999, pp. 17547–17552 (‘Act DNA analysis’ or ‘Act’). The Act, which regulates the identification procedure through DNA in criminal matters, has been implemented by various Royal Decrees taken some years thereafter. The Act only came into force on 31 March 2002. Detailed rules specify the procedures for the taking, the keeping, the examination and the deletion of the samples, for the counter-expertise, for the licensing of the labs and for the storage, the processing and the use of the DNA profiles in the databases. For legal commentary on the Act DNA analysis, see, e.g., Ch. Van den Wyngaert, Strafrecht, strafprocesrecht en internationaal strafrecht, Antwerpen-Apeldoorn, Maklu, 2006, p. 948 (‘Van den Wyngaert, Strafrecht, strafprocesrecht en internationaal strafrecht, 2006’). On the use of DNA and this Act, see, e.g., Graux and Dumortier, Privacywetgeving in de praktijk, 2009, pp. 109–112.

  511.

    See R. Decorte and J.-J. Cassiman, ‘DNA-analyse in strafzaken. Een synthese van de mogelijkheden en beperkingen voor het gerechtelijk onderzoek’, in W. Van de Voorde, J. Goethals en M. Nieuwdorp (eds.), Multidisciplinair forensisch onderzoek, Brussel, Politeia, 2003, (369), p. 384 (‘Decorte and Cassiman, DNA-analyse in strafzaken, in Van de Voorde, Goethals and Nieuwdorp, Multidisciplinair forensisch onderzoek, 2003’); see and compare, e.g., with the set-up of the National Registry in 1968 in Belgium without any legal regulation (see below footnote 702) or the use of several ‘special investigation methods’ in Belgium before the adoption of the Act of 2003 regulating these practices (see also footnote 437) (see Graux and Dumortier, Privacywetgeving in de praktijk, 2009, p. 107).

  512.

    This Act has been modified by the legislator in 2011: Wet 7 november 2011 houdende wijziging van het Wetboek van strafvordering en van de wet van 22 maart 1999 betreffende de identificatieprocedure via DNA onderzoek in strafzaken, B.S. 30.11.2011, pp. 70716–70727 (‘Act DNA analysis 2011’). The modifications will only enter into force subject to a royal decree to be adopted. For this reason, we discuss herein in essence the Act of 22 March 1999, while we make some references to the most important modifications of 2011.

  513.

    These crimes include, e.g., sexual assault, rape and manslaughter.

  514.

    The databases are set up with the Belgian National Institute for Criminalistics and Criminology (‘NICC’).

  515.

    Article 44ter (Art. 44quater–Art. 44sexies after the Act DNA analysis 2011) regulates the collection of (human) cellular material (traces) and DNA profiles and their use and storage in the context of a criminal investigation (for which the consent of the person involved is required), while a new Article 90undecies provides for DNA analysis without consent, ordered by the investigation judge.

  516.

    The Act DNA analysis 2011 states now explicitly that such decision of the public prosecutor may also pertain to a comparison of the profiles of the found human cellular material.

  517.

    Under the Act of 1999, this is upon order of the public prosecutor (Art. 44ter §2 Code of Criminal Proceedings). Additional information summed up in the law is stored with the DNA information as well. This information is 1° the number of the criminal file, 2° the name of the magistrate in charge, 3° the name and address of the lab where the DNA profile was made, including the file number, 4° information about the biological nature of the trace, 5° the gender of the person to whom the DNA belongs, and 6° if applicable, a code issued by the magistrate by which the DNA profile can be linked to the name of a person involved. The Act DNA analysis 2011 adds ‘if applicable, the positive link between the DNA profiles’. The Act DNA analysis 2011, however, now foresees that the expert sends the profiles resulting from his or her analysis ex officio to the national database for comparison purposes as foreseen in the law, unless the public prosecutor decides otherwise in a motivated decision. Under the Act DNA analysis 2011, the profiles are hence, as a rule, sent for storage in the DNA database Criminalistics within 15 days after the report. The expert needs to submit such report to the public prosecutor within 1 month after the request for analysis.
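    The statutory items listed above (plus the positive-link item added in 2011) amount to a simple record structure. The sketch below is our own illustration; the field names are ours, not statutory terms, and the sample values are invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StoredDnaProfileRecord:
    # 1° number of the criminal file
    criminal_file_number: str
    # 2° name of the magistrate in charge
    magistrate_in_charge: str
    # 3° name and address of the lab that made the profile, with its file number
    lab: str
    lab_file_number: str
    # 4° biological nature of the trace (e.g., blood, saliva)
    biological_nature: str
    # 5° gender of the person to whom the DNA belongs
    gender: str
    # 6° optional code issued by the magistrate linking the profile to a person
    person_link_code: Optional[str] = None
    # added by the Act DNA analysis 2011: positive link between DNA profiles
    positive_link: Optional[str] = None

record = StoredDnaProfileRecord(
    criminal_file_number="2011/0001",
    magistrate_in_charge="(magistrate)",
    lab="NICC, Brussels",
    lab_file_number="L-123",
    biological_nature="saliva",
    gender="M",
)
```

    Making items 6° and the 2011 positive link optional reflects the ‘if applicable’ wording: a profile of an unmatched trace carries neither.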

  518.

    The Act of 1999 states that only adults can be requested to give their consent. The law is not clear as to whether this means that minors are excluded from the sample taking.

  519.

    Art. 44ter §3 para 2 Code of Criminal Proceedings. It is thus a minimum condition that traces have been found which may be compared with the sample taken.

  520.

    Van den Wyngaert, Strafrecht, strafprocesrecht en internationaal strafrecht, 2006, p. 948.

  521.

    Art. 44ter §4 Code of Criminal Proceedings. There are precise rules on how and at whose cost this counter expertise shall be done.

  522.

    Once the expert is informed that (a) there will be no counter expertise or (b) that the counter expertise results have been communicated to the person involved, he shall destroy the DNA samples taken and confirm this to the prosecution office (Art. 44ter §5 Code of Criminal Proceedings); see also Article 23 § 2 of the Royal Decree of 4 February 2002. Some commentators criticized this measure. See, e.g., Decorte and Cassiman, DNA-analyse in strafzaken, 2003, p. 385.

  523.

    See Art. 90undecies Code of Criminal Proceedings. The investigation judge hereto issues a motivated (written) order which is communicated to the public prosecutor (Art. 90undecies §1 para 5 Code of Criminal Proceedings).

  524.

    Samples can be taken from either blood, buccal mucous membrane or hair roots (Art. 90undecies §3 Code of Criminal Proceedings).

  525.

    Art. 90undecies §2 Code of Criminal Proceedings. This Article 90undecies has been modified by the Act DNA analysis 2011. The modifications pertain inter alia to the motivated decision of the judge and the elements which this decision shall contain, in particular in relation with the comparisons, any positive links and the storage of the profiles. Moreover, a new Article 90duodecies is added, providing a legal basis for the investigation judge to take a sample, without consent, from non-suspects, provided there are indications that the individual has a direct link with the investigation.

  526.

    If physical force has to be exercised, this is done by the police under the supervision of an officer of the judicial police. Where force is needed, no blood samples shall be taken.

  527.

    Art. 44ter §1 Code of Criminal Proceedings. The definition does not explicitly refer to the comparison with the profiles in the DNA databases. However, the DNA profiles of traces found or samples taken are sent to and stored in the DNA databases as stipulated in the Act (in the case of samples taken from the suspect(s) these will only be stored after a conviction as explained) and such comparison with the stored DNA profiles is further described in the Act.

  528.

    Hence, it is not explicitly stated in the Act of 1999 that the DNA data can also be used for exculpatory evidence (i.e., evidence which is favorable to the defendant in a criminal trial). But see the Act DNA analysis 2011. Furthermore, this definition was not maintained in the Act DNA analysis 2011, which clearly has a broader scope. Involvement in a crime is no longer explicitly mentioned in the definition of the new Art. 44ter 3° as a condition for the comparison, and the purposes of comparison are broader than identification only.

  529.

    The database contained on 29 September 2009 18,712 profiles. See Vr. & Antw. Senaat 2009–10, 7 December 2009 (Question no 4-5675 of 7 December 2009 of Wille), also available at http://www.senate.be/www/?MIval=/Vragen/SchriftelijkeVraag &LEG=4&NR=5675&LANG=nl

  530.

    See Art. 4 §1 Act DNA analysis.

  531.

    Art. 4 §2 Act DNA analysis. The prosecutor or the investigation judge may, upon a duly motivated decision, request an expert of the National Institute for Criminalistics and Criminology to compare DNA profiles of traces found (in which case the DNA samples will in most cases not (yet) be linked to a person) or samples taken (in this case, the DNA samples would in principle be attributed to a known person) with the DNA profiles stored in the database (these DNA profiles are sometimes linked to other DNA profiles of the database, or sometimes linked, through the use of a code, to a person). In case of a positive link, the expert has to inform the magistrates ex officio (‘ambtshalve’/‘d’office’).

  532.

    See Art. 4 §3 para. 2 Act DNA analysis. Because no other identifying information (e.g., name, …) is mentioned with the DNA profiles, some say that the database is ‘anonymous’. This terminology is, however, misleading, as the DNA profiles per se relate to identified or identifiable persons and the database is therefore not anonymous. About a similar discussion in relation to biometric data, see also below Part III, Chap. 7, §§ 102–109.

  533.

    The deletion of the DNA sample is in this case done because (a) the suspect is acquitted or (b) the suspect is found guilty and condemned. Only in very specific cases, and for specific crimes only, will the DNA profiles be stored in another database, the DNA database of Convicted Persons (see above).

  534.

    See Art. 5 Act DNA analysis. These crimes are typically serious facts, in principle punishable with a maximum term of imprisonment of five years or more. If a DNA profile is already available, it is stored in the database upon order of the public prosecutor. In other cases, a DNA sample is taken and a profile established and stored, with physical force if needed. On 29 September 2009, the database contained 17,292 profiles. See Question no 4-5675 Wille.

  535.

    Art. 5 §3 Act DNA analysis. The prosecutor or the investigation judge may, upon a duly motivated decision, request an expert of the National Institute for Criminalistics and Criminology to compare DNA profiles of traces found with the DNA profiles stored in this database. A similar procedure as set out above applies. The database contains the results of any comparative DNA analysis, i.e. a positive link with other profiles and/or a code which links the profile to a person (Art. 5 §4 para. 4 Act DNA analysis).

  536.

    See Art. 5 §5 Act DNA analysis.

  537.

    Art. 6 Act DNA analysis.

  538.

    See the Annex to K.B. 4 februari 2002 ter uitvoering van de wet van 22 maart 1999 betreffende de identificatieprocedure via DNA-onderzoek in strafzaken, B.S. 30.03.2002, 1ste ed., (13471), p. 13475; see also the seven DNA markers which constitute the European Standard Set (ESS) as set forth in Council Resolution 2001/C187/01 of 25 June 2001 on the exchange of DNA analysis results, O.J. C 187, 03.07.2001. DNA profiles (a code of numbers and letters) may also be established on the basis of the seven DNA markers of this European Standard Set.

  539.

    JRC, Biometrics at the Frontiers, 2005, p. 129; see also Nuffield, Bioinformation 2007, p. 121. If it were to become known that one of these loci contains hereditary (genetic) information, Member States are advised to no longer use such loci.

  540.

    CODIS is a software system and database developed and launched by the FBI in 1998 to support DNA testing by local, state and national laboratories and the exchange of DNA profiles in the U.S. with the use of a matching algorithm. CODIS is presently one of the largest DNA databases in the world; it originally consisted of the Convicted Offender Index and the Forensic Index, but has been extended with DNA of arrested and missing persons. The database uses 13 ‘Short Tandem Repeats’ (STRs, referring to repeated patterns) as the core loci. The DNA Identification Act of 1994 authorized the FBI to operate CODIS. About CODIS in more detail, see also http://www.fbi.gov/about-us/lab/codis/codis-and-ndis-fact-sheet. For the average match probability of different markers, see also Nuffield, Bioinformation 2007, p. 122.

  541.

    See Annex to the Royal Decree of 4 February 2002 for the execution of the Act of 22 March 1999 relating to the identification procedure via DNA analysis in law enforcement, B.S. 30 March 2002, (13471), p. 13475.

  542.

    For criticism on this point, see Decorte and Cassiman, DNA-analyse in strafzaken, 2003, p. 385.

  543.

    This happens if the same guarantees are not imposed upon such third country authorities (e.g., in international agreements) or not enforced. Another reason is generally the lack of accountability of third country authorities to citizens of other countries. See and compare also with the SWIFT case, where U.S. authorities accessed financial information of inter alia EU citizens via the SWIFT network without respecting fundamental privacy and data protection principles valid in the EU.

  544.

    See, for Belgium, a Bill to extend the law with DNA databases for inter alia missing persons (see Parl. St. Senaat 2011–12, n° 5-1633/1).

  545.

    As already mentioned, presently, DNA profiles of suspects, after (negative) comparison with the DNA database Criminalistics, need to be destroyed under the Act DNA analysis 1999.

  546.

    Electronic Privacy Information Center and Privacy International, Privacy and Human Rights 2006. An International Survey of Privacy Laws and Developments, Washington – London, Electronic Privacy Information Center and Privacy International, 2007, p. 28 (‘EPIC and Privacy International, An International Survey, 2007’).

  547.

    In the Belgian federal parliament, for example, several questions have been put to the Minister of Justice relating to the exchange of DNA and fingerprint data. As compared to fingerprint data, international requests for the comparison of DNA are not yet significant (the NICC received in 2008 fewer than 50 requests for the comparison of DNA from countries outside Belgium), but this is likely to increase. See Vr. & Antw. Kamer 2008–09, 16 February 2009, pp. 629–630 (Question no 301 of 15 January 2009 of Logghe).

  548.

    See EDPS, Opinion on the Initiative of the Federal Republic of Germany, with a view to adopting a Council Decision on the implementation of Decision 2007/…/JHA on the stepping up of cross-border cooperation, particularly in combating terrorism and cross-border crime, O.J. C 89, 10.4.2008, pp. 1–7 (‘EDPS, Opinion cross-border cooperation 2008’).

  549.

    About the Prüm Treaty, see also Chap. 2, §§ 155–156. For a (critical) discussion of the Prüm Treaty (in the U.K.), see, e.g., House of Lords, Prüm: an effective weapon against terrorism and crime?, London, HL Paper 90, 9 May 2007, 98 p. (‘House of Lords, Prüm, 2007’) also available at http://www.publications.parliament.uk/pa/ld200607/ldselect/ldeucom/90/90.pdf

  550.

    Belgium, e.g., was in December 2010 the tenth EU Member State to reach an agreement with the United States for the exchange of inter alia DNA data (but also fingerprint and other biometric data). The agreement was signed on 20 September 2011 but remained subject to approval by the national parliament. The Netherlands concluded a similar agreement with the United States at the end of November 2010.

  551.

    See, e.g., Kaye, Science Fiction and Shed DNA, 2006, replying to an article of E. Joh, ‘Reclaiming “Abandoned” DNA: The Fourth Amendment and Genetic Privacy’, 100 Northwestern University Law Review 2006, p. 857, available at http://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=86638

  552.

    ESHG, Guidelines, 2001, p. 37.

  553.

    It becomes increasingly clear that certain combinations of genetic information can provide important knowledge of a person’s risk of developing a disease. For instance, carriers of the APOE4 genotype have an increased risk of developing Alzheimer’s disease at an earlier age than APOE3 carriers.

  554.

    See also Cole, who pointed to the similarity of the issues debated for DNA and fingerprinting in criminal matters: ‘Many of the most urgent issues now being debated with reference to DNA have been debated before with reference to fingerprinting. Indeed, our current discourse over DNA typing in many ways uncannily echoes the discourse in the early twentieth century when fingerprinting was a powerful new criminal identification technology sweeping the world’, in S. Cole, ‘Fingerprint Identification and the Criminal Justice System: Historical Lessons for the DNA Debate’, in D. Lazer (ed.), The Technology of Justice: DNA and the Criminal Justice System, available at http://www.ksg.harvard.edu/dnabook/

  555.

    One of the reasons is also that the results of a DNA analysis are believed and perceived by the public to be very accurate and irrefutable. Another reason is that through some high-profile (criminal) cases, its use as well as its risks became known.

  556.

    On the other hand, some have criticized, with regard to for example the Belgian Act DNA analysis of 1999, that uncertainty seems to remain insofar as the guarantees relating to the security and the confidentiality of the processed data, and the criteria for the administration of the databases, have not yet been determined. See Article 7 Act DNA analysis of 1999 and Articles 13 and 14 of the Royal Decree of 4 February 2002 (as modified).

  557.

    Other guarantees with regard to the protection of private life include a clear professional secrecy obligation for the staff members of the NICC and access control specifications (only personnel of the entity DNA Index System within the NICC have access) (see Art. 15 Royal Decree of 4 February 2002).

  558.

    By the due date, i.e., 24 October 1998, some countries had not yet (fully) implemented Directive 95/46/EC. These countries were France, Luxembourg, the Netherlands, Germany and Ireland.

  559.

    As we indicated before, it is not our aim to give an overview of the rights and obligations which are generally applicable to biometric data under Directive 95/46/EC. Such overviews have been made before and we refer to these documents. See also § 181 and footnotes 6 and 7.

  560.

    Hijmans, Recent developments, 2010, p. 224.

  561.

    Title V of the original Treaty on European Union (TEU).

  562.

    Title VI of the original Treaty on European Union (TEU). In the Maastricht Treaty, mechanisms were established for co-operation and intergovernmental decision making in these areas of CFSP and JHA. About the development of the Third Pillar in the EU, see R. Bieber and J. Monar (eds.), Justice and Home Affairs in the European Union. The Development of the Third Pillar, Brussels, European Interuniversity Press, 1995, 437 p. The Third Pillar (JHA), created by the Maastricht Treaty in 1992, was later integrated into the European Community Treaty by the Treaty of Amsterdam in 1997.

  563.

    About the abolishment of the pillar structure and the main consequences thereof, see, e.g., R. Barents, Het Verdrag van Lissabon. Achtergronden en commentaar, Deventer, Kluwer, 2008, p. 148. About the Lisbon Treaty, see footnote 578.

  564.

    See also, on this issue, H. Hijmans and A. Scirocco, ‘Shortcomings in EU Data Protection in the Third and the Second Pillars. Can the Lisbon Treaty be expected to help?’, Common Market Review 2009, pp. 1485–1525 (‘Hijmans and Scirocco, Shortcomings in the EU Data Protection, 2009’); see also D. Blas, ‘Ensuring effective data protection in the field of police and judicial activities: some considerations to achieve security, justice and freedom’, ERA Forum, 2010, pp. 233–250, also published online on 13.07.2010 at http://www.springerlink.com/content/u6566750w5954262/

  565.

    See, e.g., P. De Hert and A. Sprokkereef, ‘Regulation for biometrics as a primary key for interoperability?’ in E. Kindt and L. Müller (eds.), D.3.10. Biometrics in identity management, Frankfurt, FIDIS, 2007, pp. 47–55 (‘De Hert and Sprokkereef, Biometrics as a primary key, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007’).

  566.

    See Chap. 2, §§ 148–154.

  567.

    For example, the debate of access by law enforcement to financial information (see the SWIFT case) or airline passengers’ data (see the Passenger Name Record (‘PNR’) discussion). About a new proposal for access to PNR, see also Part II, Chap. 5, footnote 344.

  568.

    See, e.g., European Parliament, Resolution on the First Report on the implementation of the Data Protection Directive (95/46/EC) (COM(2003) 265 – C5-0375/2003 – 2003/2153(INI), 9.03.2004, O.J. C102E, 28.4.2004, pp. 147–153.

  569.

    Council Framework Decision 2008/977/JHA of 27 November 2008 on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters, O.J. L 350, 30.12.2008, pp. 60–71 (‘Framework Decision 2008/977/JHA’); on this issue, see, e.g., also E. De Busser, Data Protection in EU and US Criminal Cooperation. A Substantive Law Approach to the EU Internal and Transatlantic Cooperation in Criminal Matters between Judicial and Law Enforcement Authorities, Antwerpen, Maklu, 2009, 473 p.

  570.

    European Commission, ‘Safeguarding Privacy in a Connected World: A European Data Protection Framework for the 21st Century’, COM(2012) 9 final, 25.1.2012, 13 p.; European Commission, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final, 25.1.2012, 118 p. (‘European Commission, Proposal for General Data Protection Regulation COM(2012) 11 final’) and European Commission, Proposal for a Directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, COM(2012) 10 final, 25.1.2012, 54 p. (‘European Commission, Proposal for Directive COM(2012) 10 final’); the two Proposals taken together will be referred to as the ‘Proposals for Reform 2012’ or ‘Reform proposals’; see also European Commission, Consultation on the legal framework for the fundamental right to protection of personal data, July 2009. The consultation was closed 31 December 2009. The Commission received 168 responses to the consultation. These can all be consulted in extenso at DG Justice, Public consultations, available at http://ec.europa.eu/justice/news/consulting_public/news_consulting_0003_en.htm

  571.

    About the concept, see Part III.

  572.

    See also P. Hustinx, ‘Recent developments in EU data protection: stepping up to more comprehensive and more effective protection’, speech at the RISE conference ‘Ethics and Governance of Biometrics and Identification Technologies’, Brussels, 9 December 2010, available at www.edps.europa.eu

  573.

    Article 1 Directive 95/46/EC.

  574.

    See ECJ, C-222/84, Johnston v. Chief Constable of the Royal Ulster Constabulary, 15.5.1986, ECR 1986, § 18 (‘ECJ, Johnston v. Chief Constable 1986’) (see also Part II, Chap. 5, footnote 37); for the inspiration of the data protection legislation by the fundamental rights, see ECJ, Joint Cases C-465/00, C-138/01 and C-139/01, Rechnungshof v. Österreichischer Rundfunk and others and Christa Neukomm and Joseph Lauermann v. Österreichischer Rundfunk, 20.05.2003, ECR 2003, p. I-04989, § 68: ‘It should also be noted that the provisions of Directive 95/46/EC, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy’, as we contend biometric data processing does, ‘must necessarily be interpreted in the light of fundamental rights, which, according to settled case law, form an integral part of the general principles of law whose observance the Court ensures (…)’ (‘ECJ, Rechnungshof v. Österreichischer Rundfunk, 2003’). The latter case, involving the collection of data by name relating to an individual’s professional income above a certain level, with a view to communicating it to third parties, in the case at hand a public authority, which the Court found to infringe the right of the persons concerned to respect for private life (see § 74), was one of the first decisions of the Court of Justice on Directive 95/46/EC. See also Part II, Chap. 5, § 268.

  575.

    Charter of Fundamental Rights of the European Union, O.J. C 364, 18.12.2000, pp. 1–22.

  576.

    The Charter was re-proclaimed on 12 December 2007, after some minor amendments.

  577.

    After its original adoption in 2000, the EU Charter was slightly amended in 2007.

  578.

    Treaty of Lisbon amending the Treaty on European Union and the Treaty establishing the European Community, 13 December 2007, O.J. C 306, 17.12.2007, pp. 1–229, also available at http://eur-lex.europa.eu/JOHtml.do?uri=OJ:C:2007:306:SOM:en:HTML; see also the consolidated versions of the Treaty on European Union and the Treaty on the Functioning of the European Union (previously named Treaty establishing the European Community), O.J. C 115, 9.05.2008, in particular Article 6 (1) of the (revised) Treaty on European Union, p. 19, available at http://eur-lex.europa.eu/JOHtml.do?uri=OJ:C:2008:115:SOM:en:HTML; in an intermediate step, the EU Charter was first inserted in the Treaty Establishing a Constitution for Europe (O.J. C 310, 16.12.2004, pp. 1–474). Because of ratification problems, this was, however, not carried through. See also footnote 584. About the evolution of EU human rights law, see, e.g., G. de Búrca, ‘The evolution of EU Human Rights law’, in P. Craig and G. de Búrca (eds.), The evolution of EU Law, Oxford, Oxford University Press, 2011, pp. 465–497 and the several references therein.

  579.

    This was previously named the Treaty establishing the European Economic Community (EEC Treaty) of 1957, often referred to as the Treaty of Rome.

  580.

    The United Kingdom and Poland negotiated restrictions regarding the application of the EU Charter. No justiciable rights were created by Title IV of the EU Charter (on solidarity) for these countries except if such rights are provided in their national laws. See Protocol on the Application of the Charter of Fundamental Rights of the European Union to Poland and to the United Kingdom, O.J. C 306, 17.12.2007, pp. 156–157. The Lisbon Treaty contains profound changes, including the move away from unanimity in the Council of Ministers and a far more important role for the European parliament.

  581.

    See, e.g., G. de Búrca, ‘The evolution of EU Human Rights law’, in P. Craig and G. de Búrca (eds.), The evolution of EU Law, Oxford, Oxford University Press, 2011, pp. 465–497; see also M. Kumm, ‘Internationale Handelsgesellschaft, Nold and the New Human Rights Paradigm’, in M. Maduro and L. Azoulai (eds.), The Past and Future of EU Law, Oxford and Portland, Hart, 2010, p. 106 (‘Kumm, New Human Rights Paradigm, 2010’). This author comments, e.g., that hereby, and by earlier decisions of the ECJ in this sense, ‘the authority of EC law against potential challenges before national courts in the name of domestic constitutional rights’ is strengthened.

  582.

    Article 52 (3) of the EU Charter states that ‘the meaning and scope of those rights shall be the same’. About the relationship between the EU Charter and the ECHR, see for example, F. Tulkens, ‘Towards a Greater Normative Coherence in Europe: The Implications of the Draft Charter of Fundamental Rights of the European Union’ (2000) 21 HRLJ 329.

  583.

    See, however, below about how the ECJ connects other rights of the Charter, in particular the right to data protection (which is not mentioned in the ECHR), with rights known from the ECHR.

  584.

    The ratification of the Treaty establishing a Constitution for Europe was stopped in 2005 due to rejection of voters in some Member States.

  585.

    The fundamental right to data protection as applied by the European Court of Human Rights is deduced and based on Article 8 ECHR (see also below § 431).

  586.

    With ‘Union law’, reference is in fact made to the EU Charter.

  587.

    See R. Barents, Het Verdrag van Lissabon. Achtergronden en commentaar, Deventer, Kluwer, 2008, p. 160 (‘Barents, Het Verdrag van Lissabon, 2008’). The author further refers to the acceptance of the horizontal effect of the market freedoms by the Court of Justice.

  588.

    See K. Lenaerts and E. de Smijter, ‘The Charter and the Role of the European Courts’, MJ 2001, (90), p. 92 (‘Lenaerts and de Smijter, The Charter and the Role of the Courts, 2001’). For a discussion of the complex system of human rights protection in Europe, including some milestones in case law, see, e.g., S. Smis, Ch. Janssens, S. Mirgaux and K. Van Laethem, Handboek Mensenrechten. De Internationale bescherming van de rechten van de mens, Antwerpen, Intersentia, 2011, pp. 213–379 (‘Smis, Janssens, Mirgaux and Van Laethem, Handboek Mensenrechten, 2011’); about the European Court of Human Rights, see below § 427. The ECJ is further competent to ensure that, in the interpretation and application of the TEU and the TEC, the law is observed.

  589.

    See Lenaerts and de Smijter, The Charter and the Role of the Courts, 2001, pp. 100–101. See also Accession of the European Union, available at http://hub.coe.int/what-we-do/human-rights/eu-accession-to-the-convention

  590.

    See P. Lemmens, ‘The Relationship between the Charter of Fundamental Rights of the EU and the ECHR: Substantive Aspects’, MJ 2001 (49), p. 55 (‘Lemmens, The Relationship between the Charter and the ECHR, 2001’).

  591.

    Art. 263 para. 4 TFEU. See Lenaerts and Van Nuffel, Europees recht, 2011, pp. 665–688.

  592.

    Art. 276 TFEU.

  593.

    Lenaerts and Van Nuffel, Europees recht, 2011, pp. 375–376.

  594.

    The right to data protection has already been recognized as a national constitutional right, for example in the Netherlands.

  595.

    See, for other applications, ECJ, C-70/10, Scarlet Extended v. Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM), 24.11.2011, §§ 50–51 (‘ECJ, Scarlet 2011’) and ECJ, C-360/10, Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM) v. Netlog NV, 16.02.2012.

  596.

    See, for previous studies, e.g., L. Bygrave, ‘Data Protection Pursuant to the Right to Privacy in Human Rights Treaties’, International Journal of Law and Information Technology 1998, pp. 247–284 (‘Bygrave, Data Protection 1998’); L. Bygrave, ‘The Place of Privacy in Data Protection Law’, University of NSW Law Journal 2001, 6 p. (‘Bygrave, Place of Privacy, 2001’), available at http://www.austlii.edu.au/au/journals/UNSWLJ/2001/6.html

  597.

    P. De Hert and S. Gutwirth, ‘Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action’, in S. Gutwirth, Y. Poullet, P. De Hert, C. de Terwangne, S. Nouwt (eds.), Reinventing Data Protection?, Springer, 2009, (3), p. 9 (‘De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, in Gutwirth et al., Reinventing Data Protection, 2009’); about privacy and the right to dignity, see E. Bloustein, ‘Privacy as an Aspect of Human Dignity: An answer to Dean Prosser’, 39 New York University Law Review, 1964, p. 962 et seq.

  598.

    Bygrave, Place of Privacy, 2001.

  599.

    P. De Hert and S. Gutwirth, ‘Annex 1: Making sense of privacy and data protection: A prospective overview in the light of the future of identity, location-based services and virtual residence’ in Institute for Prospective Technological Studies, Security and Privacy for the Citizen in the Post-September 11 Digital Age: A Prospective Overview, European Commission, 2003, pp. 111–162; Ibid., p. 126: ‘From the start the Data Protection Directive was based on a double logic: the achievement of an Internal Market (in this case the free movement of personal information) and the protection of fundamental rights and freedoms of individuals. It is said that in the directive, both objectives are equally important, but in legal terms the economic perspective and internal market arguments prevailed.(…) The rights-objective was less clear, especially since the Directive 95/46/EC contained several business-friendly regulations that were far from inspired by human rights arguments’.

  600.

    N. Robinson, H. Graux, M. Botterman, L. Valeri, Review of the European Data Protection Directive, Cambridge, Rand, 2009, (‘Rand, Review of the European Data Protection Directive 95/46/EC, 2009’), p. 81.

  601.

    ‘Ambient environment’ refers to a vision of an environment where individuals are surrounded on all sides by ‘intelligent interfaces supported by computing and networking technology that is everywhere, embedded in everyday objects such as furniture, clothes, vehicles, roads and smart materials. It is a vision where computing capabilities are connected, everywhere, always on, enabling people and devices to interact with each other and with the environment’ (from Y. Punie, A social and technological view of Ambient Intelligence in Everyday Life: What bends the trend?, IPTS, Technical Report, EUR 20975 2003, p. 6, available at http://ftp.jrc.es/EURdoc/20975-ExeSumm.pdf). Eric Schmidt, CEO of Google, used the term ‘augmented humanity’ and announced in September 2010 that we are entering the ‘Age of augmented humanity’, by which he refers to an environment where connected devices will tell you what you want and tell you what to do. See L. Gannes, ‘Eric Schmidt: Welcome to “Age of Augmented Humanity”’, 7.09.2010, Gigaom, available at http://gigaom.com/2010/09/07/eric-schmidt-welcome-to-the-age-of-augmented-humanity/. For some existing social networking services for smart phones which started to explore this vision (and location-based data), see Gowalla and Foursquare.

  602.

    See also Article 29 Data Protection Working Party, Recommendation 4/99 on the inclusion of the fundamental right to data protection in the European catalogue of fundamental rights, WP 26, 7 September 1999, 3 p., in which no further clarifications are provided either.

  603.

    ECJ, C-92/09 and C-93/09, Volker und Markus Schecke and Hartmut Eifert, 09.11.2010, §§ 47–48 (‘ECJ, Volker und Markus Schecke, 2010’). It is interesting to note that the Court in this case, which involved the processing of data related to legal entities that obtained agricultural funds of the Union, extended protection under Articles 7 and 8 of the EU Charter to legal persons ‘in so far as the official title of the legal person identifies one or more natural persons’.

  604.

    De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, in Gutwirth et al., Reinventing Data Protection, 2009, p. 10.

  605.

    Article 8 ECHR also contains references to legitimate aims, however only for exceptions to the fundamental right (see below).

  606.

    De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, in Gutwirth et al., Reinventing Data Protection, 2009, p. 9; but see ECtHR, Gaskin v. the United Kingdom, no. 10454/83, 7 July 1989, Series A no. 160, involving a claim by an applicant, placed in public care as a baby until reaching majority, for inter alia access to records; see and compare with I. v. Finland 2008, where in fact the failure to provide practical and effective protection against unauthorized access to health-related data was considered an unlawful interference with Article 8 ECHR.

  607.

    See, e.g., ECtHR, Goodwin v. United Kingdom, no. 28957/95, 11 July 2002, § 92 (‘Goodwin, 2002’). In this case, although the Court acknowledged inter alia a breach of Article 8 ECHR, it did not expressly acknowledge a right of a transsexual to rectify (or to correct) gender. Nor was this recognized in the previous cases mentioned by the Court in § 73.

  608.

    For a recent case on this issue, see ECJ, C-518/07, European Commission v. Federal Republic of Germany, 9.3.2010, ECR 2010, p. I-01885 (‘ECJ, Commission v. Germany, 2010’). The Court declared that Germany failed to fulfill its obligations under the Directive 95/46/EC by submitting DPAs of the Länder supervising data processing in the private sector ‘to State scrutiny’.

  609.

    See S. Gutwirth, ‘Biometrics between opacity and transparency’, Ann Ist Super Sanita, 2007, pp. 61–65 (‘Gutwirth, Biometrics between opacity and transparency, 2007’); JRC, Biometrics at the Frontiers, 2005, p. 77. Such an opacity tool for biometric data processing could be a general right entitling the data subject, as a matter of default, to control identifying representations of him- or herself. See also De Hert, Background paper, p. 23.

  610.

    Lemmens, The Relationship between the Charter and the ECHR, 2001, p. 58.

  611.

    One of the first known pieces of privacy legislation, however, dates from long before: England’s 1361 Justices of the Peace Act, which legislated for the arrest of eavesdroppers and stalkers. See A. Beresford and F. Stajano, ‘Location Privacy in Pervasive Computing’, Pervasive Computing, IEEE, 2003, pp. 46–55, also available at www.psu.edu. See also the Fourth Amendment to the US Constitution (ratified with the other first nine Amendments as the Bill of Rights, effective December 15, 1791) (‘The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, (…)’), which combined with the First Amendment (right to free speech and the right to assemble peacefully) and the Fifth Amendment (protection against self-incrimination), as interpreted, provides the U.S. nationals with a constitutional right to privacy.

  612.

    There are 167 parties to the Covenant, which has been signed by 72 states (status as of 9.02.2011). For an overview, see http://treaties.un.org/pages/ViewDetails.aspx?src=TREATY&mtdsg_no=IV-4&chapter=4&lang=en. The Covenant was adopted by, e.g., Belgium, France and the Netherlands. Several declarations and reservations, however, were made. For an analysis of the challenges under Article 17 of the International Covenant on Civil and Political Rights (protection against inter alia arbitrary or unlawful interference with one’s privacy), see T. Parker, ‘Are we protected? The Adequacy of Existing Legal Frameworks for Protecting Privacy in the Biometric Age’, Ethics and Policy of Biometrics, Lecture Notes in Computer Science, 2010, pp. 40–46 (‘Parker, Are we protected, 2010’).

  613.

    See M. Hogg, ‘The Very Private Life of the Right to Privacy’, Privacy and Property, Edinburgh, Edinburgh University Press, 1994, p. 2: ‘A perpetual problem dogging any discussion of the subject is that there are as many definitions of the notion of privacy as there are commentators on the issue’.

  614.

    The term privacy is hereby used as a synonym of the term ‘private life’ (‘privé-leven’ or ‘persoonlijke levenssfeer’/‘vie privée’). About the term ‘private life’, see P. Lemmens, ‘Het recht op eerbiediging van het privé-leven’, in Liga voor Mensenrechten, Privacy in de administratiefrechtelijk praktijk, Gent, 1989, pp. 10–16 (‘Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989’).

  615.

    It is neither our aim nor possible to mention all important scholars discussing privacy. We just mention a few hereunder and in the following footnotes (see, e.g., footnote 643). For a comparative study (in particular for the United States, Germany and Italy) of the emergence of the right to respect for private life, see F. Rigaux, La protection de la vie privée et des autres biens de la personnalité, Brussels/Paris, Bruylant, 1990, pp. 639–652 (‘Rigaux, Protection vie privée, 1990’). This author gives in his elaborate work also a very detailed study of the nature of the right to respect for private life, including methods of applying restrictions. For another comparative study, see H. Janssen, Constitutionele interpretatie. Een rechtsvergelijkend onderzoek naar de vaststelling van de reikwijdte van het recht op persoonlijkheid, Den Haag, Sdu, 2003, 493 p. See also, e.g., K. Lemmens, ‘The Protection of Privacy between a Rights-Based and a Freedom-Based Approach: What the Swiss Example can teach us’, MJ 2003, pp. 381–403 and S. Gutwirth, R. Gellert, R. Bellanova, M. Friedewald, P. Schütz, D. Wright, E. Mordini and S. Venier, Legal, social, economic and ethical conceptualisations of privacy and data protection, Prescient, D.1, 23.03.2011, 76 p., available at http://www.prescient-project.eu/prescient/inhalte/download/PRESCIENT-D1---final.pdf; see also Guldix, De persoonlijkheidsrechten, 1986, p. 382 et seq.

  616.

    The right to respect for the home (‘one’s home is one’s castle’) also tends to be considered under the broader scope of the protection afforded by the right to privacy. However, it is one of the oldest constitutional rights (and for this reason sometimes also the most detailed), which can be found in separate provisions in many countries (see, e.g., Article 15 of the Belgian Constitution, Article 12 of the current version of the Constitution of the Netherlands and Article 13 § 1 of the German Constitution).

  617.

    J. Dumortier, ICT-recht, Leuven, Acco, 2008, p. 83 (‘Dumortier, ICT-recht, 2008’); P. Lemmens, ‘Het recht op eerbiediging van de persoonlijke levenssfeer, in het algemeen en ten opzichte van de verwerking van persoonsgegevens in het bijzonder’ in P. Arnou, L. Delwaide, E. Dirix, L. Dupont and R. Verstraeten (eds.), Liber amicorum J. Armand Vandeplas, Gent, Mys & Breesch, 1994, pp. 316–322 (‘Lemmens, Het recht op eerbiediging van de persoonlijke levenssfeer, 1994’); see also R. Leenes, B.-J. Koops, P. De Hert (eds.), Constitutional Rights and New Technologies. A Comparative Study, The Hague, Asser, 2008, 301 p., in which the constitutional protection against new technologies in various countries is discussed along these components of privacy; see also R. Decorte, ‘De achterkant van de privacy. Kan het beroep op privacy leiden tot straffeloosheid?’, N.J.W. 2003, pp. 800–801.

  618.

    See Hendrickx, Privacy en Arbeidsrecht, 1999, pp. 5–14. New categories, however, may be added, especially if new technologies are adopted. For example, ‘privacy of location information’. See, e.g., A. Beresford and F. Stajano, ‘Location Privacy in Pervasive Computing’, Pervasive Computing, IEEE, 2003, pp. 46–55, also available at www.psu.edu

  619.

    Dumortier, ICT-recht, 2008, p. 79. A rather new trend is the so-called ‘post-democracy’, described by some as the weakening of democratic states (and their values) ‘by acts of trans-national corporations, global financial speculations and other influences which limit the possibilities of democratic decisions within the states’. See R. Drulàkovà, Post-democracy within the EU: Internal security vs. human rights – unavoidable conflict?, Paper prepared for the Central and East European International Studies Association (CEEISA) 4th Convention, Tartu, Estonia, 25–27 June 2006, p. 6. In Part II, the risks of the use of biometric data, including the risk of surveillance, will be explained.

  620.

    Article 3 EU Charter. Paragraph 2 of the article refers in particular to some practices in the field of biomedicine (and bioethics).

  621.

    Hendrickx, Privacy en Arbeidsrecht, 1999, p. 12. About the concepts, see ibid., pp. 11–12. In some countries, e.g. Belgium, the right to physical integrity may also be considered as a general principle of law.

  622.

    See also Bygrave, The body as data, 2003, p. 1.

  623.

    Some also name this component ‘relational privacy’ (‘relationele privacy’). It will be relevant in decisions placing prisoners in isolation, but also in decisions about asylum applications in which the right to family reunification is important.

  624.

    This also includes provisions in the Belgian Penal Code.

  625.

    See, e.g., R. Clarke, Introduction to Dataveillance and Information Privacy, and Definitions of Terms, 15.08.1997, updated, available at http://www.rogerclarke.com/DV/Intro.html#InfoPriv; see also V. Schönberger, ‘Strands of Privacy: DNA databases and informational privacy and the OECD Guidelines’, in Lazer (ed.), The Technology of Justice: DNA and the Criminal Justice System, available at http://www.ksg.harvard.edu/dnabook/

  626.

    See also C. Cuijpers, Privacyrecht of privaatrecht? Een privaatrechtelijk alternatief voor de implementatie van de Europese privacyrichtlijn, Wolf Legal Publishers, 2004, 441 p. (‘Cuijpers, Privacyrecht of privaatrecht?, 2004’); about information privacy, also e.g., E. Volokh, ‘Freedom of Speech and Information Privacy: The Troubling Implications of a Right to stop People from Speaking About You’, 52 Stan. L. Rev. 1999–2000, pp. 1049–1124 and J. Kang, ‘Information Privacy in Cyberspace Transactions’, 50 Stan. L. Rev. 1997–1998, pp. 1193–1294.

  627.

    S. Warren and L. Brandeis, ‘The Right to Privacy’, 4 Harv. L. Review 1890, p. 193 et seq.

  628.

    Warren and Brandeis wrote as follows: ‘Recent inventions and business methods call attention to the next step which must be taken for the protection of the person (…)’. ‘Instantaneous photographs and newspaper enterprise have invaded the sacred precincts of private and domestic life; and numerous mechanical devices threaten to make good the prediction that “what is whispered in the closet shall be proclaimed from the house-tops”’. Warren and Brandeis in fact focused on the divulgation of that information as a privacy threat. The authors also referred to an essay written in 1890 by E. Godkin, a famous social commentator at that time. Godkin recognized the growing threats to privacy but remained cynical about a solution; Warren and Brandeis, by contrast, thought that law could and should provide protection for privacy. See D. Solove, M. Rotenberg and P. Schwartz, Information Privacy Law, New York, Aspen, 2006, pp. 10–11.

  629.

    For an interesting account of the developments in the Netherlands, see J. Holvast, ‘Vijftig jaar discussie over de informationele privacy in vogelvlucht’, P&I 2011, pp. 234–246 (‘Holvast, Informationele privacy in vogelvlucht, 2011, pp. 239–244’). In the United States, the right to privacy, however, was initially mainly based, such as in some other countries including Germany, on the right to property and the prohibition to interfere with it. For an interesting legal analysis of the right to privacy in the United States until the end of the 1960s by (and through the eyes of) a German legal scholar, see R. Kamlah, Right of privacy. Das allgemeine persönlichkeitsrecht in Amerikanischer sicht unter berücksichtigung neuer technologischer entwicklungen, Köln, Carl Heymanns, 1969, 195 p.

  630.

    A. Westin, Privacy and Freedom, New York, Atheneum, 1970, p. 7 (‘Westin, Privacy and Freedom, 1970’).

  631.

    For a discussion of Westin’s work, see also A. Jóri, ‘Data protection law – an introduction’, available at http://www.dataprotection.eu/pmwiki/pmwiki.php?n=Main.Privacy. See also Lemmens, who pointed to this gradual development of a right to the autonomous development of one’s personality: ‘Deze ontwikkeling is een haast logisch antwoord op de steeds toenemende massificatie van de samenleving, die tot gevolg heeft dat het individu meer en meer gedwongen wordt om, willens nillens, in allerlei sociale verhoudingen een deel van zijn beslotenheid en autonomie prijs te geven’ (emphasis added). Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989, p. 20. Other authors have also stressed that privacy should be seen as a form of freedom and informational self-determination. See e.g., S. Gutwirth, Privacy and the information age, Oxford, Rowman & Littlefield, 2002, 146 p.

  632.

    For those early developments of the concept of ‘privacy’, see P. Lemmens, ‘De veiligheidsgordel en het recht op eerbiediging van het privé-leven’, R.W. 1979–80, (838), pp. 837–839 (‘Lemmens, De veiligheidsgordel, 1979’); see also Arthur R. Miller who published in 1971 in the United States the book ‘The Assault on Privacy’, in which he examined the effect of the technological revolution (of that time) on individual privacy. He made various proposals to reconcile technology with society values, which aroused discussion and controversy. See A. Miller, The Assault on Privacy: Computers, Data Bases and Dossiers, Ann Arbor, University of Michigan press, 1971.

  633.

    ECtHR, Pretty v. the United Kingdom, no. 2346/02, 29.04.2002, § 66 (‘Pretty 2002’). The case concerned a person paralyzed and suffering from the devastating effects of a degenerative and incurable illness who requested immunity from prosecution for her husband if he assisted her in committing suicide. In this context of distress and of being prevented from managing her own death by not being entitled to receive assistance in suicide, the Court, similarly to a case before the Supreme Court of Canada, mentioned in particular the deprivation of autonomy and the justification required under the principles of justice.

  634.

    Goodwin 2002, § 90. The case concerned the rights of transsexuals.

  635.

    Pretty 2002, § 65.

  636.

    See Von Hannover 2004, § 69.

  637.

    In Belgium, see, e.g., H. Vandenberghe, ‘Bescherming van het privé-leven en recht op informatie via de massamedia’, R.W. 1969–70, p. 1462; J. Velu, Le droit au respect de la vie privée, Namur, Presses Universitaires de Namur, 1974, 160 p. (‘Velu, Le droit au respect de la vie privée, 1974’); J. Velu, ‘Preadvies’ in X., Privacy en de rechten van de mens. Handelingen van het Derde Internationaal Colloquium over het Europees Verdrag tot Bescherming van de Rechten van de Mens, Leuven, Acco, 1974, pp. 19–107. (‘Velu, Preadvies, 1974’); Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989, pp. 10–16. See also about the right to forget, K. Lemmens, ‘Sic transit gloria mundi: over vergeten en vergaan in het recht’, in Mensenrechten. Jaarboek 2000, Antwerpen-Apeldoorn, Maklu, pp. 51–53; for the Netherlands, see, e.g., J. Berkvens and J. Prins (eds.), Privacyregulering in theorie en praktijk, Deventer, Kluwer, 2007, 288 p.

  638.

    See, for example, for Belgium, F. Rigaux, ‘La protection de la vie privée à l’égard des données à caractère personnel’, Annales de droit de Louvain 1993, (49), p. 53; S. Gutwirth, ‘De toepassing van het finaliteitsbeginsel van de privacywet van 8 december 1992 tot bescherming van de persoonlijke levenssfeer ten opzicht van de verwerking van persoonsgegevens’, T.P.R. 1993, (1409), pp. 1416–1417 (‘Gutwirth, De toepassing van het finaliteitsbeginsel, 1993’).

  639.

    See D. Solove, Understanding Privacy, Cambridge (Massachusetts, U.S.A.), Harvard University Press, 2008, p. 101 et seq. Solove hereby admits the ‘conceptual jungle’ of defining privacy.

  640.

    Ibid., p. 101. Solove focuses on four categories of data processing activities, in particular the collection, the processing and the dissemination of information, and invasion, and defines what the problems are in each context.

  641.

    While Article 8 ECHR contains the right to respect for private and family life, home and correspondence, the right to respect for family life is not further analyzed, as the focus will remain on the right to respect for (individual) private life. References to Article 8 ECHR in this work are therefore essentially references to the right to respect for private life.

  642.

    The European Court of Human Rights was set up in 1959 by the Council of Europe together with a European Commission of Human Rights (‘Commission’) to decide upon claims for alleged violations of the European Convention on Human Rights of 1950. The Commission had a ‘filtering role’ in relation to the petitions filed: as individuals did not have direct access to the Court, they had to apply to the Commission, which, if it found the case well-founded, would launch the case in the Court on the individual’s behalf (see Section II of the Convention before Protocol N° 11). Protocol N° 11 to the Convention (signed on 11 May 1994), which entered into force on 1 November 1998, abolished the Commission and established the European Court of Human Rights as a single and permanent court. The Court has its seat in Strasbourg. The decisions of the Court are published in the Reports of Judgments and Decisions, the Court’s official series, and are also electronically available via the HUDOC Portal of the Court available at http://www.echr.coe.int/Pages/home.aspx?p=caselaw/HUDOC&c=, which provides free online access to the case-law.

  643.

    For legal authors who have analyzed the concept for Europe and under the Convention, see, e.g., R. Beddard, Human rights and Europe, Cambridge, Cambridge University Press, 1993, pp. 93–128 (‘Beddard, Human Rights, 1993’); A. Clapham, Human rights in the private sphere, Oxford, Clarendon Press, 1993, 385 p.; D. Harris, M. O’Boyle and C. Warbrick, Law of the European Convention on Human Rights, Oxford, Oxford University Press, 2009, 902 p.

  644.

    See also below § 450 et seq.

  645.

    We further explain the notion of private sector in Part III, Chap. 7, §§ 119–121.

  646.

    One of the reasons for this issue is the distinction generally made between public and private law. For more on this distinction and debate, see A. Clapham, Human Rights. A Very Short Introduction, Oxford, Oxford University Press, 2007, pp. 112–114; F. Hendrickx, Privacy en Arbeidsrecht 1999, pp. 26–32; see also B. de Witte, ‘Direct effect, primacy, and the nature of the legal order’, in P. Craig and G. de Bùrca (eds.), The evolution of EU Law, Oxford, Oxford University Press, 2011, pp. 323–362; L. Verhey, ‘Horizontale werking van grondrechten: de stille Straatsburgse revolutie’ in Barkhuysen, T., van Emmerik, M. and Loof, J. (eds.), Geschakeld recht. Liber Amicorum voor Evert Alkema, Deventer, Kluwer, 2009, pp. 517–535 (‘Verhey, Horizontale werking van grondrechten, 2009’); L. Verhey, Horizontale werking van grondrechten, in het bijzonder van het recht op privacy, Zwolle, Tjeenk Willink, 1992, 487 p.; B. Oversteyns, ‘Het recht op eerbiediging van het privé-leven’, R.W. 1988–1989, (488), pp. 490–492 (‘Oversteyns, Recht op eerbiediging van het privéleven. 1988’).

  647.

    On the concept of ‘Drittwirkung’ in Germany, see also Rigaux, Protection vie privée, 1990, pp. 674–683; A. Clapham, ‘The ‘Drittwirkung’ of the Convention’, in R. Macdonald, F. Matscher and H. Petzold (eds.), The European System for the Protection of Human Rights, Dordrecht, Martinus Nijhoff, 1993, pp. 163–206 (‘Clapham, The ‘Drittwirkung’ of the Convention, 1993’).

  648.

    P. van Dijk, F. van Hoof, A. van Rijn and L. Zwaak (eds.), Theory and Practice of the European Convention on Human Rights, Antwerp, Intersentia, 2006, p. 29 (‘van Dijk, van Hoof, van Rijn and Zwaak (eds.), Theory and Practice of the European Convention 2006’).

  649.

    See van Dijk, van Hoof, van Rijn and Zwaak (eds.), Theory and Practice of the European Convention, 2006, p. 29 and further references to the Verein gegen Tierfabriken case.

  650.

    See A. Drzemczewski, ‘The domestic status of the European Convention on Human Rights; new dimensions’, Legal issues of European Integration, no 1, 1977, pp. 1–85; A. Drzemczewski, ‘The European Human Rights Convention and relations between private parties’, N.I.R.L. 1979, p. 163. Other terms used are ‘vertical’ application (‘doorwerking’/‘application verticale’), horizontal effect (‘horizontale werking’/‘application horizontale’) and private effect (‘privaatrechtelijke werking’/‘effet droit privé’) of the fundamental rights; see also Oversteyns, Recht op eerbiediging van het privéleven. 1988, pp. 491–492. Indirect horizontal effect refers to the theory that the fundamental rights can be enforced against private parties by using other concepts of private law such as good faith, equity, etc. However, some scholars have, in our view, correctly pointed out that this discussion as to whether fundamental rights also have effect in relations between private parties is in fact an academic discussion (Oversteyns, Recht op eerbiediging van het privéleven. 1988, p. 492).

  651.

    See, for example, without limitation, A. Clapham, Human rights obligations of non-state actors, Oxford, Oxford University Press, 2006, 613 p.; van Dijk, van Hoof, van Rijn and Zwaak (eds.), Theory and Practice of the European Convention 2006, p. 32; B. de Witte, ‘Direct effect, Primacy, and the nature of the legal order’, in P. Craig and G. de Bùrca (eds.), The evolution of EU Law, Oxford, Oxford University Press, 2011, pp. 323–362; O. De Schutter, Fonction de juger et droits fondamentaux. Transformation du contrôle juridictionnel dans les ordres juridiques américain et européen, Brussels, Bruylant, 1999, p. 302.

  652.

    K. Rimanque and P. Peeters, ‘De toepasselijkheid van de grondrechten in de betrekkingen tussen private personen. Algemene probleemstelling’, in Rimanque, K., (ed.), De toepasselijkheid van de grondrechten in private verhoudingen, Antwerpen, Kluwer, 1982, pp. 1–34; Hendrickx, Privacy en Arbeidsrecht 1999, pp. 23–24; E. Dirix, ‘Grondrechten en overeenkomsten’, in Rimanque, K., (ed.), De toepasselijkheid van de grondrechten in private verhoudingen, Antwerpen, Kluwer, 1982, (35), pp. 43–47 (‘Dirix, Grondrechten en overeenkomsten, in Rimanque, De toepasselijkheid van grondrechten in private verhoudingen, 1982’). For the Netherlands, see, e.g., C. Kortmann, Constitutioneel recht, Kluwer, 2008, p. 375. Since, according to some, fundamental rights have no direct horizontal effect, so-called ‘personality rights’ (‘persoonlijkheidsrechten’) have been developed in private law in some countries, in particular in Belgium. See Hendrickx, Privacy en Arbeidsrecht 1999, pp. 24–25; see also Guldix, De persoonlijkheidsrechten, 1986, pp. 526–583.

  653.

    Velu, Le droit au respect de la vie privée, 1974, pp. 49–50 and the references to several reports and numerous scholars of various countries in the footnotes, in particular footnotes 85 and 93; see on this issue in particular also J. De Meyer, ‘Preadvies. Het recht op eerbiediging van het privé-leven, van de woning en van mededelingen in de betrekkingen tussen particulieren en de verplichtingen die daaruit voortvloeien voor de staten die partij zijn bij het verdrag. H/Coll.(70)4’, in X., Privacy en rechten van de mens. Handelingen van het Derde internationaal Colloquium over het Europees Verdrag tot Bescherming van de Rechten van de Mens, Leuven, Acco, 1974, pp. 251–284; see also Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, p. 1422: ‘De vrijheid van de privacy belichaamt de erkenning van de weerstand die in alle machtsverhoudingen wordt verdrongen. Haar bescherming situeert zich bijgevolg op het niveau van de individualiteit, en niet alleen tegen de totaliserende ingrepen van de verzorgingsstaat, maar ook tegen de steeds verdergaande beheersings- en managementaspiraties van private actoren’ (emphasis added); about the growing acceptance, see also A. Clapham, Human rights in the private sphere, Oxford, Clarendon Press, 1993, pp. 90–91. Clapham specifically points to the decision of the ECtHR in X and Y. v. the Netherlands (no. 8978/80, 26 March 1985, Series A no. 91) and the positive obligations (see also below) of the Member States. The Court therein stated that the Convention creates obligations for States which involve ‘the adoption of measures designed to secure respect for private life even in the sphere of the relations of individuals between themselves’ (§ 23). See also, with reference to Rees v. United Kingdom 1986 (see Part II, Chap. 5, footnotes 269 and 270), Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989, pp. 19–20. See also E. Brems, ‘Introduction’ in E. Brems (ed.), Conflicts Between Fundamental Rights, Antwerp, Intersentia, 2008, p. 2 and A. Alen and K. Muylle, Compendium van het Belgisch staatsrecht, Syllabusuitgave, Mechelen, Kluwer, 2012, § 81 (‘Alen en Muylle, Belgisch Staatsrecht, 2012’) (see also footnote 710 below); see also, for views by common law specialists, M. Hurt, ‘The “horizontal effect” of the Human Rights Act: moving beyond the public-private distinction’, in J. Jowell and J. Cooper (eds.), Understanding Human Rights Principles, Oxford and Portland, Oregon, Hart, 2001, pp. 161–177 and G. Phillipson, ‘Transforming Breach of Confidence? Towards a Common Law Right of Privacy under the Human Rights Act’, 66 MLR, 2003, (726), pp. 726–728.

  654.

    See also Verhey, Horizontale werking van grondrechten, 2009, (517), p. 534: ‘(…) het gaat niet meer om de vraag of maar hoe grondrechten in horizontale verhoudingen doorwerken’. Depending on whether States adhere to a monistic or a dualistic view, some States accept direct effect, while other States do not so easily (see also below). See also van Dijk, van Hoof, van Rijn and Zwaak (eds.), Theory and Practice of the European Convention 2006, pp. 26–27. For the Netherlands, see e.g., L. Prakke and C. Kortmann, Het staatsrecht van 15 landen van de Europese Unie, Deventer, Kluwer, 2009, p. 601 (‘Prakke and Kortmann, Het staatsrecht van 15 landen, 2009’): ‘Bij de grondwetsherziening van 1983 heeft de regering overwogen dat, ofschoon grondrechten primair waarborgen bieden tegen inbreuken door de overheid in haar publiekrechtelijke of privaatrechtelijke hoedanigheid, bepaalde rechten tevens horizontale werking kunnen hebben, d.w.z. werking in rechtsverhoudingen tussen private instellingen en personen. De ontwikkeling op dit terrein is in hoofdzaak aan de rechter overgelaten’ (emphasis added).

  655.

    Hendrickx, Privacy en Arbeidsrecht 1999, p. 26.

  656.

    Fundamental rights are also considered as being derived from the law of nature, which implies that they shall have effect in both the public and the private sphere, and therefore have an absolute effect. See also J. De Meyer, ‘The right to respect for private and family life, home and communications in relations between individuals, and the resulting obligations for state parties to the Convention’, in A. Robertson (ed.), Privacy and human rights, 1973, p. 264.

  657.

    See also Oversteyns, Recht op eerbiediging van het privéleven. 1988, p. 495. See also Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989, p. 20, pointing to evolutions in the case law of the ECtHR in this regard. See also Part II, Chap. 5, § 300 and § 321.

  658.

    E.g., in Belgium. See W. Ganshof van der Meersch, ‘L’ordre public et les droits de l’homme’, J.T. 1968, p. 663 (‘Ganshof van der Meersch, L’ordre public, 1968’); J. Velu, Les effets directs des instruments internationaux en matière de droits de l’homme, Brussels, Swinnen-Prolegomena, 1982, p. 30. The Supreme Court in Belgium explicitly acknowledged in 1971 the precedence of international treaty rules which have direct effect. See Cass., 27.05.1971, Pas. 1971, I, pp. 886–920, Arr. Cass. 1971, p. 959 (‘If there is a conflict between a national rule and an international rule which has direct effect in the national legal system, the rule of the Treaty has precedence. This precedence follows from the nature of the law stipulated by the Treaty’ (free translation)); see also J. Velu and R. Ergec, La convention européenne des droits de l’homme, Brussels, Bruylant, 1990, p. 84. For France and the (complicated) priority of international treaties over national law, see Prakke and Kortmann, Het staatsrecht van 15 landen, 2009, p. 324, including in particular Cass. Fr., 24.05.1975, D., 1975, 497 and Cons. d’Etat, 20 October 1989, D., 1990, 135. For the effect of binding international treaty provisions in the Netherlands, see Prakke and Kortmann, Het staatsrecht van 15 landen, 2009, p. 601.

  659.

    See also E. Kindt, E. Lievens, E. Kosta, Th. Leys, and P. de Hert, ‘Chapter 2. Constitutional rights and new technologies in Belgium’, in R. Leenes, B.-J. Koops, P. De Hert (eds.), Constitutional Rights and New Technologies. A Comparative Study, The Hague, Asser, 2008, (11), 19–20 (‘Kindt, Lievens, Kosta et al., Constitutional rights and new technologies in Belgium, in Leenes et al., Constitutional Rights and New Technologies, 2008’).

  660.

    Cons. const. (France), n° 94-343-344 DC, 27 July 1994, Respect du corps humain. See also § 455 below.

  661.

    Some of the early cases involved the use of new technology such as the tape-recorder. See and compare the German Supreme Court decision of 20 May 1958, which found the use thereof in breach of Article 8 § 2, with the Austrian Supreme Court decision of April 1965, which did not find such a breach, both described by Velu, Preadvies, 1974, (19), pp. 67–68. The latter case (Scheichelbauer v. Austria, app. no. 2645/65, 16.12.1979) was subsequently submitted to the Commission. The outcome however remains unclear (Velu reports that the case was initially inadmissible in relation to Article 8, which was later overturned). Belgian national decisions remained limited. See Velu, Preadvies, 1974, p. 67, no 113. For an interesting discussion of Article 8 ECHR, including the views of experts, see Velu, Le droit au respect de la vie privée, 1974, 160 p.

  662.

    In general, more than 90 % of the Court’s judgments since its establishment in 1959 have been delivered between 1998 and 2008. See also ECtHR, Ten years of the ‘new’ Court, available at http://www.echr.coe.int/ECHR/EN/Header/The+Court/Events+at+the+Court/10+years+of+the+new+Court/

  663.

    See on this issue, P. De Hert, ‘Grondrechten die bijna niet verdedigd kunnen worden; De bescherming van persoonlijke gegevens op het Internet’, De rechten van de mens op het internet, Maklu, Antwerpen – Apeldoorn, 2000, (21), p. 33. The Court, however, initially did not interpret the right to data protection in the same way as the right laid down in data protection legislation, for example in the Convention of 1981. The Court, for example, made a distinction between privacy-sensitive information and non-privacy-sensitive information.

  664.

    See ECtHR, Storck v. Germany, no. 61603/00, 16 June 2005, § 143 (‘Storck 2005’). The case concerned someone who was involuntarily placed in a clinic and, while detained, medically treated with various medicines against her will.

  665.

    See, e.g., I. van der Ploeg, D.3.3a, Ethical Brief on Biometrics & Embedded Technology, Hide, 2009, p. 7 (‘van der Ploeg, Ethical Brief, 2009’).

  666.

    For example, in the case of the scanning of the retina, because of fear of medical effects (thermal damage (see Chap. 2, § 139)). The use of this biometric method, however, is declining.

  667.

    See and compare also with Velu, Le droit au respect de la vie privée, 1974, p. 70. The author refers to and cites the expert committee of the Council of Europe: ‘En ce qui concerne l’observation, le Comité considère comme une atteinte illicite à la vie privée, le fait de procéder, en quelque lieu que ce soit, à l’observation de personnes ou de biens: (….) b) si elle est effectuée clandestinement à l’aide de moyens techniques qui renforcent sensiblement les possibilités d’une perception naturelle. (…)’.

  668.

    E.g., by surgery of the fingertips. See Part II, Chap. 4, § 193.

  669.

    See Von Hannover 2004, § 50. See also ECtHR, Peck v. U.K., no. 44647/98, 28 January 2003, § 57 (‘Peck 2003’), discussed below. About the notion of identity, see, e.g., also S. Gutwirth, ‘Beyond identity?’, IDIS 2008, pp. 123–133.

  670.

    See and compare also with I. van der Ploeg, Identity, Biometrics and Behavior Predictability, presentation at the Rise/Hide Conference, 9-10.12.2010, Brussels, available at http://riseproject.webtrade.ie/_fileupload/RISE%20Conference/Presentations/Irma%20van%20der%20Ploeg.pdf

  671.

    Peck 2003, § 87.

  672.

    Sciacca 2006, § 48. See also Schüssel v. Austria 2002 and Von Hannover 2004, §§ 50–53.

  673.

    ECmHR, Pierre Herbecq and Ligue des droits de l’homme v. Belgium, nos. 32200/96 and 32201/96, 14 January 1998, A.J.T., 1998 (‘Herbecq 1998’) with note P. De Hert and O. De Schutter, pp. 501–511. Video monitoring or the use of photographic equipment which does not record visual data as such was considered to fall outside the scope of application of Article 8 of the Convention. See and compare also with the opinion of the Belgian DPA in relation to video surveillance, discussed above in § 288.

  674.

    ECtHR, P.G. and J.H. v. U.K., no. 44787/98, 25 September 2001, § 57 (‘P.G. and J.H. 2001’).

  675.

    P.G. and J.H. 2001, §§ 59–60. In this case, the permanent recording of the voices of P.G. and J.H. while they answered questions in a police cell as police officers listened to them, for further analysis, was regarded as interfering with their right to respect for their private lives. Other cases involving covert audio surveillance and recording include ECtHR, Armstrong v. U.K., no. 48521/99, 16 October 2002 and ECtHR, Allan v. U.K., no. 48539/99, 5 November 2002.

  676.

    See also ECtHR, Rotaru v. Romania, no. 28341/95, 4.05.2000 (‘Rotaru 2000’) and ECtHR, Amman v. Switzerland, no. 27798/95, 16.02.2000 (‘Amman, 2000’). Rotaru 2000 involved the holding and use by the Romanian Intelligence Service of a file containing personal information. The latter case involved the tapping of telephone conversations and the collection and registration of secret information (about an investigation by the police) in the Swiss national security card index concerning the applicant, which was judged to be an interference because it was ‘not in accordance with the law’. See about these decisions also Part II, Chap. 4, § 133.

  677.

    ECtHR, Reklos and Davourlis v. Greece, no. 1234/05, 15 January 2009 (‘Reklos and Davourlis 2009’).

  678.

    With ‘Commission’, we refer to the European Commission of Human Rights. From 1954 until 1998, individuals had no direct access to the ECtHR but first had to lodge an application with the Commission, which would, if it found the case well-founded, refer the case to the Court. See footnote 642 above.

  679.

    ECmHR, Friedl v. Austria, no. 28/1994/475/556, §§ 49–51 (‘Friedl 1994’). See also ECtHR, Friedl v. Austria, no. 15225/89 (28/1994/475/556), 31 January 1995, Series A no. 305-B (striking of the case), with in Annex the Commission of Human Rights decisions of 1992 (on admissibility) and 1994.

  680.

    See and compare with a similar case of that time, in which a photograph and personal details were taken but no violation was found: ECtHR, Murray v. the United Kingdom, no. 14310/88 (13/1993/408/487), 28 October 1994.

  681.

    ECmHR, F. Reyntjens v. Belgium, no. 16810/90, 9 September 1992, D.R. 73, p. 136. The Commission noted that the identity information on the card, which ‘may not carry any information other than the bearer’s name, forenames, sex, date and place of birth, and main address, and his spouse’s name and forenames, where appropriate’, is not ‘information relating to private life’, hereby implicitly making a distinction between ‘public’ and ‘private’ data. About this case, see also N. Van Leuven, ‘Privacy: een onrustig begrip in volle ontplooiing’, in Lemmens, P. (ed.), Uitdagingen door en voor het E.V.R.M., Mechelen, Kluwer, 2005, pp. 8–9.

  682.

    ECmHR, Kinnunen v. Finland, no. 24950/94, 15 May 1996, p. 4; similarly, ECmHR, McVeigh, O’Neill and Evans v. the United Kingdom, nos. 8022/77, 8025/77, 8027/77, 18 March 1981, D.R. 25, p. 15.

  683.

    ECtHR, Perry v. the United Kingdom, no. 63737/00, 17 July 2003 (‘Perry 2003’).

  684.

    Perry 2003, § 43.

  685.

    S. and Marper 2008, § 80.

  686.

    Ibid., §§ 84–85.

  687.

    Ibid., § 121. For a further analysis of this decision, see, e.g., De Beer, D., De Hert, P., González Fuster, G. and Gutwirth, S., ‘Nouveaux éclairages de la notion de “donnée personnelle” et application audacieuse du critère de proportionnalité’, Rev. trim. dr.h. 2010, pp. 141–161.

  688.

    See also, for an interesting case in the United Kingdom, EWCA, Wood v. Commissioner of Police for the Metropolis [2009] EWCA Civ 414 (‘EWCA, Wood 2009’), also available at http://www.bailii.org/ew/cases/EWCA/Civ/2009/414.html. This case involved the police taking photographs in a public street of an individual who had attended a general meeting of a company associated with the arms and defense industry. The Court of Appeal found that, while the mere taking of pictures by the State in a public street has in general been consistently held to be no interference with the right to privacy, aggravating circumstances could make it such, and held that the keeping and storage of the photographs beyond the need of the police (once the fear of a demonstration had passed) infringed Article 8 (2) ECHR. The Court referred several times to the position of the ECtHR in S. and Marper 2008.

  689.

    See, e.g., the invitation to the public to use face recognition technology mentioned above and the roll-out of face recognition technology on Facebook in 2011.

  690.

    As stated, the protection of information privacy under Article 8 ECHR was, however, initially problematic (see also footnote 663 above).

  691.

    Von Hannover 2004, § 70.

  692.

    E.g., the Court found in Storck that, because of the lack of effective State control over private psychiatric institutions at the relevant time, the State had failed to comply with its positive obligation to protect the applicant against interferences with her private life. Storck 2005, §§ 149–150. See also ECtHR, X and Y v. the Netherlands, no. 8978/80, 26 March 1985, Series A no. 91, § 23, as mentioned in § 429 and footnote 653 above.

  693.

    Van Dijk, van Hoof, van Rijn and Zwaak (eds.), Theory and Practice of the European Convention 2006, p. 739.

  694.

    For example, the provisions with regard to the court competent to annul the regulation. About the legal remedies in case of contradiction with the fundamental legal rights of the Convention, see J. Velu and R. Ergec, La convention européenne des droits de l’homme, Brussels, Bruylant, 1990, pp. 531–532 and, for Belgium, for example, A. Van Oevelen, ‘Schade en schadeloosstelling bij de schending van grondrechten door private personen’, in K. Rimanque (ed.), De toepasselijkheid van de grondrechten in private verhoudingen, Antwerpen, Kluwer, 1982, p. 423 et seq.; about biometric systems and civil liability, see also Part III, Chap. 9, § 542; see also Rigaux, Protection vie privée, 1990, pp. 765–770.

  695.

    See and compare with Article 13 ECHR which only refers to ‘an effective remedy before a national authority’.

  696.

    See also Hendrickx, Privacy en Arbeidsrecht 1999, pp. 73–74.

  697.

    To take account of developments in technology, the word ‘correspondence’ has been replaced by ‘communications’.

  698.

    ECJ, Case 29/69, Erich Stauder v. City of Ulm, 12.11.1969, ECR 1969, p. 419. In this case, Mr. Stauder contested the requirement that he had to identify himself in order to obtain coupons allowing him to purchase butter at a reduced price.

  699.

    See also our discussion of this issue above.

  700.

    ECJ, C-92/09 and C-93/09, Volker und Markus Schecke and Hartmut Eifert, 09.11.2010. For a discussion of the conditions for interference under the Union fundamental rights, see Part II, Chap. 5.

  701.

    E.g., the Act of 1982 on a Population Census in Germany, which gave rise to the groundbreaking Volkszählungsurteil decision of the Constitutional Court in 1983 (see § 457). About the ‘Volkstelling’ in 1970 in the Netherlands, which was one of the triggers for the emergence of privacy concerns, see Holvast, Informationele privacy in vogelvlucht, 2011, pp. 239–244.

  702.

    See and compare also with the coming into existence of data protection legislation in 1978 in France. About the rise of privacy concerns in Belgium, when plans were made at the end of the 1960s to set up a National Registry (which was operational in 1968 without any legal regulation), see the detailed analysis in S. Gutwirth, Waarheidsaanspraken in recht en wetenschap, Brussel and Antwerpen, VUB Press and Maklu, 1993, pp. 668–670 (‘Gutwirth, Waarheidsaanspraken, 1993’).

  703.

    See, generally, Prakke and Kortmann, Het staatsrecht van 15 landen, 2009, 1063 p.; E. Brems (ed.), Conflicts Between Fundamental Rights, Antwerp, Intersentia, 2008, 690 p.; for Belgium, see e.g., A. Alen and K. Muylle, Compendium van het Belgisch staatsrecht, Syllabusuitgave, Mechelen, Kluwer, 2008, §§ 80–81 and § 461 (‘Alen and Muylle, Belgisch staatsrecht, 2008’) and Alen en Muylle, Belgisch Staatsrecht, 2012; for the Netherlands, see, e.g., Van der Pot, reworked by D. Elzinga and R. de Lange with collaboration of H. Hoogers, Handboek van het Nederlandse staatsrecht, Deventer, Kluwer, 2006, 1073 p. (‘Van der Pot et al., Handboek Nederlandse staatsrecht, 2006’); for France, see, e.g., F. Sudre, Droit Européen et international des droits de l’homme, Paris, Presses Universitaires de France, 2008, 843 p. (‘Sudre, Droit Européen, 2008’).

  704.

    L. Burgorgue-Larsen, ‘L’appréhension constitutionnelle de la vie privée en Europe. Analyse croisée des systèmes constitutionnels allemand, espagnol et français’, in F. Sudre (ed.), Le droit au respect de la vie privée au sens de la Convention européenne des droits de l’homme, Brussels, Bruylant, 2005, p. 104 (‘Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005’): ‘Le résultat est une homogénéisation de l’interprétation des droits grâce à une « étanchéité » des systèmes constitutionnels au « droit venu d’ailleurs… »: le droit de la Convention mais aussi, de plus en plus, le droit de l’Union européenne’. On this emerging transnational or European constitutionalism, see also below § 462.

  705.

    Article 22bis (‘Every child has the right to respect for his moral, physical, psychological and sexual integrity. (…)’) confirmed the rights of children as laid down in international conventions. See also Article 15 of the Belgian Constitution providing that the domicile is inviolable, Article 23 of the Belgian Constitution, which states that ‘Everyone has the right to lead a life in conformity with human dignity. (…)’, and Article 29 stating that the confidentiality of letters is inviolable. Before the right to respect for privacy was included as a fundamental right in Article 22 of the Constitution, there were many discussions amongst legal scholars about the nature of the right to privacy. See on this issue also Gutwirth, Waarheidsaanspraken, 1993, pp. 649–658.

  706.

    Parl. St. Senaat, 1991–92, n° 100-4/5, p. 3; about the concept, see also P. Lemmens, ‘Het recht op eerbiediging van de persoonlijke levenssfeer, in het algemeen en ten opzichte van de verwerking van persoonsgegevens in het bijzonder’ in P. Arnou, L. Delwaide, E. Dirix, L. Dupont and R. Verstraeten (eds.), Liber amicorum J. Armand Vandeplas, 1994, Gent, Mys & Breesch, pp. 313–326; for the concept of privacy in employment relations under Belgian law, see Hendrickx, Privacy en Arbeidsrecht, 1999, 358 p.

  707.

    See also on this subject, Kindt, Lievens, Kosta et al., Constitutional rights and new technologies in Belgium, in Leenes et al., Constitutional Rights and New Technologies, 2008, pp. 11–55.

  708.

    Parl. St. Kamer, 1993–94, 997/5, p. 2.

  709.

    Alen and Muylle, Belgisch staatsrecht, 2008, § 461 (and § 713). Alen and Muylle mention that this is a rather new and important development in (Belgian) case law: ‘Een belangrijke nieuwigheid is gelegen in de rechtspraak van het Grondwettelijk Hof, volgens welke dat Hof, bij zijn toetsing aan de grondrechten in Titel II van de grondwet, rekening houdt met bindende verdragsbepalingen die analoge rechten of vrijheden waarborgen.’

  710.

    Alen and Muylle also prudently state that case law seems to accept that Art. 22 can be invoked and enforced in relations between private parties. See Alen and Muylle, Belgisch staatsrecht, 2008, § 714. In 2012, the same authors are firmer on this issue: Alen en Muylle, Belgisch Staatsrecht, 2012, § 81, p. 40: ‘Er wordt aangenomen dat de grondrechten ook derdenwerking, horizontale werking of Drittwirkung kunnen hebben, dit is werking kunnen hebben tussen particulieren onderling. Grondrechten zijn tegen schending door private personen beschermd, hetzij rechtstreeks, hetzij onrechtstreeks; (…)’ (see also same authors, § 723).

  711.

    AH (Belgium), N° 131/2005, 19.07.2005, B.5.1. In other words, an interference with this fundamental right shall meet the requirements of legality, requiring a legal provision meeting precision and foreseeability, has to pursue a legitimate aim (‘wettig doel’ (sic)/‘but légitime’) and shall be proportionate (‘in een juist verband van evenredigheid’/‘proportionnée à l’objectif légitime poursuivi’) to this aim (B.5.5.). In fact, the Court, by referring to the need for a legitimate aim, however, uses the words ‘nagestreefde wettige doelstelling’/‘l’objectif légitime poursuivie’ (sic). About the confusion of the terms on this point, we refer to Part II, Chap. 5 on the proportionality principle. About the role of the Court and the various principles applied, see P. Popelier, ‘The Role of the Belgian Constitutional Court in the Legislative Process’, Statute Law Review 2005, pp. 22–40.

  712.

    AH (Belgium), N° 131/2005, 19.07.2005, B.5.2. Although Article 8 ECHR, which has direct effect in Belgium (see above § 430), does not require a law in the formal sense (see Part II, Chap. 5, §§ 304–310), this (national) requirement for a formal law is imposed by the Belgian Constitutional Court and remains important in the Belgian constitutional tradition. See also Alen and Muylle, Belgisch staatsrecht, 2008, § 81; however, case law of the Constitutional Court and the Supreme Court is conflicting in this regard: see, e.g., Cass., 2 May 1990, J.T.T. 1990, p. 469, where the Supreme Court stated that any regulation of national law, whether written or not, provided it is accessible and precise, can be a law in the sense of Article 8 ECHR. For other case law of the Belgian Supreme Court, see also Lemmens, De veiligheidsgordel, 1979, pp. 839–840.

  713.

    See Art. 19 Constitution.

  714.

    See Art. 26 Constitution.

  715.

    See also Alen and Muylle, Belgisch staatsrecht, 2008, § 80 and the references. Alen and Muylle criticize this extension to other fundamental rights without an explicit provision in the Constitution. One should question whether one cannot invoke the other grounds and conditions for interference following Article 8 §2 ECHR, which will be discussed in Part II, Chap. 5. As stated, it was the intention of the legislator that Article 22 of the Belgian Constitution be interpreted in the same way as Article 8 ECHR. Therefore, the case law under Article 8 §2 ECHR will be relevant for Article 22 of the Constitution as well.

  716.

    See, for example, AH (Belgium), N° 50/2003, 30.04.2003, considerans B.8.10: ‘(…) Deciding otherwise would mean that the competences of the Communities and the Regions would become without subject. The fact that an intrusion in the private life and the family life is the result of a regulation of a specific matter which belongs to the competence of the regional legislator, does not result in a breach of his competence’.

  717.

    GwH (Belgium), N° 59/2010, 27.05.2010. See also Part II, Chap. 4, § 27.

  718.

    The right was only set forth in the Code civil (article 9) and was to be applied by the ordinary judges.

  719.

    See, e.g., Cons. Const. (France), N° 2010-25 of 16 September 2010, §11. See also Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005, pp. 98–100.

  720.

    See above footnote 660.

  721.

    Cons. const. (France), n° 94-352 DC, 18 January 1995, Vidéosurveillance; see also Cons. const. (France) n° 2012-652, 22 March 2012 (Loi protection de l’identité) mentioned in Part III, Chap. 7, § 186 below: ‘Considérant (…) que la liberté proclamée par l’article 2 de la Déclaration des droits de l’homme et du citoyen de 1789 implique le droit au respect de la vie privée’ (§ 8).

  722.

    See E. Zoller, ‘Le contrôle de constitutionnalité en France’, in Droit constitutionnel, Paris, Presses Universitaires de France, 1999, pp. 257–260 (‘Zoller, Droit constitutionnel, 1999’).

  723.

    Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005, pp. 105–106; see also Drzemczewski, A., European Human Rights Convention in domestic law. A comparative study, Oxford, Clarendon Press, 1983, pp. 70–81.

  724.

    The Conseil d’Etat is an entity of the French government and examines and provides advisory approval of several statutory instruments such as draft legislation and ‘décrets’ (enacted by ministers and which further define the scope and application of statutes or acts of parliament), but also functions as a judicial body of last resort reviewing appeals against administrative decisions and decisions from administrative courts. See, for a recent decision relating to the central storage of biometric data for the e-passport and the eID, Conseil d’Etat, N° 317827, 317952, 318013, 318051, 26 October 2011, mentioned in Part III, Chap. 7, § 186 below. The Conseil d’Etat could be considered a supreme court in administrative matters, while the Cour de Cassation is the supreme court for civil and criminal matters and courts. About the French system, see, e.g., C. Dadamo and S. Farran, The French Legal System, London, Sweet & Maxwell, 1993, pp. 111–113; J. Schwarze, European Administrative Law, London, Sweet and Maxwell, 2006, pp. 108–111 (‘Schwarze, European Administrative Law, 2006’); Bousta, R., ‘La « spécificité » du contrôle constitutionnel français de proportionnalité’, R.I.D.C. 2007, pp. 859–877 (‘Bousta, La “spécificité” du contrôle constitutionnel français de proportionnalité, 2007’); E. Zoller, ‘Le contrôle de constitutionnalité en France’, in Zoller, Droit constitutionnel, 1999, pp. 181–281.

  725.

    See Ordinance N° 58-1067 constituting an Institutional Act on the Constitutional Council, Section 17 et seq., also available at http://www.conseil-constitutionnel.fr/conseil-constitutionnel/root/bank_mm/anglais/en_ordinance_58_1067.pdf; see also ECJ, Melki and Abdeli, 2010 (Chap. 4, footnote 58 and Chap. 5, § 373, footnote 408).

  726.

    Van der Pot et al., Handboek Nederlandse staatsrecht, 2006, p. 387.

  727.

    B.-J. Koops and M. Groothuis, ‘Constitutional Rights and New Technologies in the Netherlands’, in R. Leenes, B.-J. Koops, P. De Hert (eds.), Constitutional Rights and New Technologies. A Comparative Study, The Hague, Asser, 2008, (159) p. 165 (‘Koops and Groothuis, Constitutional rights in the Netherlands, Leenes et al., Constitutional Rights and New Technologies 2008’).

  728.

    About the effect of Art. 8 ECHR, see Van der Pot et al., Handboek Nederlandse staatsrecht, 2006, pp. 390–391; the necessity criterion is in some cases, in particular in the case of the processing of sensitive data such as racial or ethnic information, interpreted as ‘indispensable’. See Kamerstukken II 1996/97, 25 001, nr. 24, Minderhedenbeleid 1997, Brief van de Ministers van Justitie en van Binnenlandse Zaken, 29.04.1997, p. 1.

  729.

    Koops and Groothuis, Constitutional rights in the Netherlands, in Leenes et al., Constitutional Rights and New Technologies 2008, p. 167.

  730.

    But see, e.g., also the United States and the widely cited dissenting opinion of Brandeis in Olmstead v. United States of 1928 (277 U.S. 438), where the Supreme Court held that the use of wiretapped private telephone conversations did not constitute a violation of the Fourth and Fifth Amendments (reversed by Katz v. United States in 1967). In his opinion, Brandeis shifted the focus, making personal privacy a relevant matter of constitutional law. He stated that ‘discovery and invention have made it possible for the Government, by means far more effective than stretching upon the rack, to obtain disclosure in court of what is whispered in the closet’. About the constitutional status of the right to privacy in the United States, see also footnote 611.

  731.

    Article 1(1) of the German Federal Constitution reads as follows: ‘Die Würde des Menschen ist unantastbar. Sie zu achten und zu schützen ist Verpflichtung aller staatlichen Gewalt’. Article 2(1) reads as follows: ‘Jeder hat das Recht auf die freie Entfaltung seiner Persönlichkeit, soweit er nicht die Rechte anderer verletzt und nicht gegen die verfassungsmäßige Ordnung oder das Sittengesetz verstößt’. Neither the right to respect for privacy nor the right to data protection is explicitly mentioned in the German Constitution.

  732.

    BVerfG, 15.01.1970, BVerfGE 27, 344, 1 BvR 13/68 (‘Ehescheidungsakten’).

  733.

    BVerfG, 15.12.1983, BVerfGE 65, 1 (‘Volkszählung’). Free translation of the following phrase: ‘[…] Das Grundrecht gewährleistet insoweit die Befugnis des Einzelnen, grundsätzlich selbst über die Preisgabe und Verwendung seiner persönlichen Daten zu bestimmen’ (p. 46). About this decision, see also, e.g., S. Fischer-Hübner, IT-security and Privacy. Design and use of privacy-enhancing Security Mechanisms, Springer, 2001, pp. 8–10 (‘Fischer-Hübner, IT-security and Privacy, 2001’); see also G. Hornung and CH. Schnabel, ‘Data protection in Germany I: The population census decision and the right to informational self-determination’, Computer Law & Security Review 2009, pp. 84–88; see also D. Flaherty, ‘On the Utility of Constitutional Rights to Privacy and Data Protection’, 41 Case W. Res. L. Rev., 1990–1991, (831), p. 852.

  734.

    BVerfG, 1 BvR 2378/98 and 1 BvR 1084/99, 3.03.2004 (‘Grosser Lauschangriff’); about this Kernbereich, see § 54 of the decision. In this decision, the Court decided that the new provisions in the Code of Criminal Procedure implementing acoustic domicile surveillance, in the form they had at that time, violated the general right of personality.

  735.

    Examples which were given include a very personal conversation with a close family member, a conversation with a religious counselor (‘Grosser Lauschangriff’, § 132) and expressions of intimate feelings or sexuality (‘Grosser Lauschangriff’, § 123).

  736.

    Online Durchsuchung, paragraph 169; see also the Volkszählungsurteil decision of 1983, stating about the general right of personality that it ‘requires, under the present and future conditions of automated data processing, protection to a particular extent’ (emphasis added) (‘Diese Befugnis bedarf unter den heutigen und künftigen Bedingungen der automatischen Datenverarbeitung in besonderem Maße des Schutzes’) (p. 44); see also Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005, p. 101.

  737.

    Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005, p. 105, in particular footnote 111.

  738.

    See the conclusions of Šušnjar, after a review of the case law by the Federal Constitutional Court in D. Šušnjar, Proportionality, fundamental rights and balance of powers, Leiden, Martinus Nijhoff, 2010, pp. 145–146; in a similar sense, see also W. Van Gerven, ‘The Effect of Proportionality on the Actions of Member States of the European Community: National Viewpoints from Continental Europe’, in E. Ellis (ed.), The Principle of Proportionality in the Laws of Europe, Oxford, Hart Publishing, 1999, p. 44 (‘Van Gerven, Proportionality. National Viewpoints, 1999’).

  739.

    In 2004, for example, the House of Lords ruled in Campbell v. MGN Ltd that publication of the pictures of Ms. Campbell leaving the drugs treatment center violated Article 8 ECHR. EPIC and Privacy International, An International Survey, 2007, p. 991; see also the Human Rights Act 1998, available at http://www.legislation.gov.uk/ukpga/1998/42/contents

  740.

    See also above at footnote 611.

  741.

    In the Netherlands, a similar opinion exists among some politicians, e.g., the Minister of Justice Donner in 2000. See B.-J. Koops, Tendensen in opsporing en technologie. Over twee honden en een kalf, Nijmegen, Wolf Legal Publishers, 2006, p. 31.

  742.

    An ordinary superficial search of the bags of, for example, airport passengers can, in the view of the courts in the United Kingdom, hardly attain a certain level of seriousness (see R (Gillan) v. Commissioner of Police for the Metropolis [2006] 2 AC 307, § 28).

  743.

    See EWCA, Wood 2009, §§ 22–26.

  744.

    See and compare also with the testimony of Wayman in 1998 before the Subcommittee on Domestic and International Monetary Policy of the Committee on Banking and Financial Services, U.S. House of Representatives on ‘Biometrics and the future of money’, 20.05.1998, pp. 12–17, available at http://commdocs.house.gov/committees/bank/hba48784.000/hba48784_0f.htm

  745.

    This could, however, be debated for biometric testing and research databases. At the same time, protection of the data subjects participating in building this type of biometric database is required as well, and application of Directive 95/46/EC justified.

  746.

    Our research also revealed that the position taken by some data protection authorities, for example in relation to the qualification of biometric data (e.g., templates) as personal data or as sensitive data, sometimes relies on only a few opinions of scholars or merely on the position taken in neighboring countries, without a thorough (public) debate on the subject.

  747.

    See also the analysis of inter alia Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe, Sudre, Le droit au respect de la vie privée, 2005, pp. 69–115.

  748.

    See above § 272 et seq. Our working definition proposes to define biometric data as ‘all personal data which (a) relate directly or indirectly to unique or distinctive biological or behavioral characteristics of human beings and (b) are used or are fit to be used by automated means (c) for purposes of identification, identity verification or verification of a claim of natural persons’.

  749.

    But: see for example Section 4 n) of the Slovakian data protection legislation, as discussed.


Copyright information

© 2013 Springer Science+Business Media Dordrecht

Cite this chapter

Kindt, E.J. (2013). Biometric Data, Data Protection and the Right to Privacy. In: Privacy and Data Protection Issues of Biometric Applications. Law, Governance and Technology Series, vol 12. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7522-0_3
