
Legal safeguards for privacy and data protection in ambient intelligence


To reap the maximum benefit from ambient intelligence (AmI), we need to anticipate possible drawbacks and threats emerging from the new technologies and devise appropriate safeguards against them. The SWAMI project took a precautionary approach in its exploration of the privacy risks in AmI and sought ways to reduce them. It constructed four “dark scenarios” showing possible negative implications of AmI, notably for privacy protection. Legal analysis of the depicted futures exposed the shortcomings of the current legal framework in providing adequate privacy protection in the AmI environment. In this paper, the authors, building upon their involvement in SWAMI research as well as subsequent advances in EU privacy analysis, identify various outstanding issues regarding the legal framework that still need to be resolved in order to deal with AmI in an equitable and efficacious way. The article points out some of the lacunae in the legal framework and proposes several privacy-specific safeguards aimed at overcoming them.



  1. The Safeguards in a World of Ambient Intelligence (SWAMI) project brought together researchers from several disciplines (technologists, sociologists, economists and lawyers) with the aim of taking an interdisciplinary and holistic approach to AmI. The project finished in July 2006. Its results can be found in Wright, Gutwirth et al. [32].

  2. For more on the SWAMI dark scenarios and methodology, as well as on identified threats and vulnerabilities, see Wright, Gutwirth et al. [32].

  3. The term dataveillance (data + surveillance) appears to have been coined by Roger Clarke in a paper he wrote entitled “Information Technology and Dataveillance”, published in the Communications of the ACM, Vol. 31, Issue 5, May 1988.

  4. Future of Identity in the Information Society (FIDIS) is a multidisciplinary Network of Excellence supported by the European Commission’s Sixth Framework Programme.

  5. This refers not only to privacy, but also to the transparency of the systems and access to the generated knowledge, which would allow individuals to become aware of mistakes, understand the decisions taken and react if they feel the decisions are wrong, discriminatory or too intrusive. Crucial here are the means individuals have at their disposal to remain in control of their data, and their empowerment in the new environment.

  6. A stable connection between the item and the individual is then necessary. Such a link can be established in the case of personal products carried by their owners. The Article 29 Data Protection Working Party illustrated such a situation, and the attendant concerns it raises, in its document on RFID [2]. However, such a stable link between the item and the owner has been contested. See Hildebrandt and Meints [23].

  7. Information about an object and its environment (e.g. humidity) can actually contain information on a person. We refer here, inter alia, to an example given by a speaker at the UbiComp Workshop 2007 [18], and subsequent discussion of participants.

  8. In particular, the Universal Declaration of Human Rights 1948 [33], Article 12, and the International Covenant on Civil and Political Rights 1966 [35], Article 17.

  9. In Niemietz versus Germany [40], the European Court of Human Rights stated that there is no reason why the notion of “private life” should be taken to exclude activities of a professional or business nature. In Halford versus United Kingdom [41], Miss Halford, a senior police officer whose telephone calls were intercepted without warning, was granted privacy protection in her office space, although not absolute protection.

  10. In cases such as Khan [42] and P.G. and J.H. versus the United Kingdom [43], the European Court of Human Rights decided that a violation of ECHR Article 8 had taken place, but it nevertheless accepted the use of the evidence in criminal proceedings.

  11. Article 6 of the Data Protection Directive.

  12. Article 6 of the Data Protection Directive.

  13. Article 7 of Data Protection Directive [37].

  14. Such correlated information can offer a comprehensive picture of the individual. See Hildebrandt M [20].

  15. The Data Protection Directive [37] applies to the processing of “personal data”, defined as any information relating to an identified or identifiable natural person (“data subject”). Article 2 of the Directive defines an identifiable person as one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.

  16. The Data Protection Directive’s definition of personal data can be assumed to cover the data stored by a tag for the purpose of identification (e.g. tags in passports or identity cards), or when a reference database can be used for making a connection between information on a tag and an individual. However, taking into account the increasing availability of data, as well as computing and data mining capabilities, it is possible to establish such links between information on a tag and the identity of an individual even in the absence of direct reference data. On this point, see, for example, Hildebrandt and Meints [23]. In the context of RFID and similar technologies, the usefulness of the concept of personal data can be contested. The Article 29 Data Protection Working Party states in its Working document on data protection issues related to RFID technology [2] that, if the processing of data collected via RFID systems is to be covered by the Data Protection Directive, we must determine whether such data relates to an individual, and whether such data concerns an individual who is identified or identifiable. In assessing whether information concerns an identifiable person, one must apply Recital 26 of the Data Protection Directive, which says that “account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify the said person”. And further, “Finally, the use of RFID technology to track individual movements which, given the massive data aggregation and computer memory and processing capacity, are if not identified, identifiable, also triggers the application of the data protection Directive.” This case-by-case approach was upheld in a recent document from the Article 29 Data Protection Working Party on the definition of personal data [1].

  17. For a broader overview of legal safeguards in the field of privacy and data protection and in other legal fields, see Wright, Gutwirth et al. [32]. The SWAMI consortium also proposed some general safeguards addressing the regulation of AmI. The SWAMI and FIDIS research [23, 32] made it clear that a comprehensive approach is needed to protect privacy, and that such an approach should also address related issues such as liability and anti-discrimination rules.

  18. For example, such an approach was adopted by the Privacy in an Ambient World (PAW) project, which developed a language enabling the distribution of data in a decentralised architecture, using sticky policies attached to the data to state what kind of use has been licensed to a particular actor (the licensee). Enforcement relies on auditing. The FIDIS consortium considered automated management of data and control of privacy policies; see Schreurs et al. [31]. The PRIME project also discussed the matter; see Hansen et al. [19]. See also the UbiComp 2007 presentation by Le Métayer [7]. The management and auditing possibilities offered by technology should be coupled with effective liability for breaches of privacy rules.
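To make the sticky-policy mechanism mentioned above concrete, the following is a minimal sketch (not drawn from the PAW project itself; all class and field names are ours) of data that travels together with machine-readable usage terms, where every access attempt is recorded so that enforcement can rely on auditing:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class StickyPolicy:
    """Machine-readable usage terms that travel with the data."""
    licensee: str                  # actor to whom use has been licensed
    allowed_purposes: frozenset    # purposes the data subject has licensed

@dataclass
class PolicyWrappedData:
    payload: dict
    policy: StickyPolicy
    audit_log: list = field(default_factory=list)

    def access(self, actor: str, purpose: str) -> dict:
        """Grant access only if actor and purpose match the policy.
        Every attempt, granted or refused, is appended to the audit log,
        which is what a later compliance audit would inspect."""
        granted = (actor == self.policy.licensee
                   and purpose in self.policy.allowed_purposes)
        self.audit_log.append((actor, purpose, granted))
        if not granted:
            raise PermissionError(f"{purpose!r} not licensed to {actor!r}")
        return self.payload

# Hypothetical usage: a clinic may read the record for treatment only.
record = PolicyWrappedData(
    payload={"name": "Alice"},
    policy=StickyPolicy("clinic", frozenset({"treatment"})),
)
```

The design choice this illustrates is that the policy is inseparable from the data it governs, so a downstream recipient cannot plead ignorance of the licensed uses; the audit log is the hook for the liability rules discussed in the text.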

  19. See Beslay and Hakala [5]. An in-depth analysis of the concept and the various categories of digital territories can be found in a recent IPTS report by Daskala et al. [10].

  20. Idem.

  21. An overview of the existing identity management systems has been given by Bauer et al. [4]; Hildebrandt and Backhouse [21] and Müller et al. [29]. Development of identity (information) management systems has been discussed in Hansen et al. [19], Leenes et al. [27] and within the FIDIS project (See Schreurs et al. [30]).

  22. For more on RFID safeguards, see Wright, Gutwirth et al. [32].

  23. As already mentioned, such information on a tag can be a unique identifier enabling profiling activities. See Kardasiadou et al. [24].

  24. Some standards have already been adopted in the RFID domain. The International Organization for Standardization has developed sector-specific standards, as well as more generic standards. Some standards have also been developed by EPCglobal, an industry-driven organisation that creates standards to connect servers containing information relating to items identified by EPC (Electronic Product Code) numbers.
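As an illustration of why EPC numbers raise the profiling concerns discussed in these notes, the sketch below decodes the published EPCglobal SGTIN-96 field layout (8-bit header, 3-bit filter, 3-bit partition, 44 bits shared between company prefix and item reference, 38-bit serial). The function name and example values are ours; the point is that, unlike a barcode, the 38-bit serial identifies the individual item, making item-level tracking possible:

```python
def decode_sgtin96(epc_hex: str) -> dict:
    """Split a 96-bit SGTIN EPC into its fixed-width fields."""
    bits = bin(int(epc_hex, 16))[2:].zfill(96)
    # Company-prefix bit length for partition values 0..6 (SGTIN-96 table);
    # the item reference takes the remainder of the shared 44 bits.
    prefix_bits = {0: 40, 1: 37, 2: 34, 3: 30, 4: 27, 5: 24, 6: 20}
    partition = int(bits[11:14], 2)
    p = prefix_bits[partition]
    return {
        "header": int(bits[0:8], 2),        # 0x30 identifies SGTIN-96
        "filter": int(bits[8:11], 2),
        "partition": partition,
        "company_prefix": int(bits[14:14 + p], 2),
        "item_reference": int(bits[14 + p:58], 2),
        "serial": int(bits[58:96], 2),      # unique to the individual item
    }

# Hypothetical tag: partition 5 gives a 24-bit company prefix
# and a 20-bit item reference.
epc = (0x30 << 88) | (1 << 85) | (5 << 82) \
      | (614141 << 58) | (812345 << 38) | 6789
fields = decode_sgtin96(f"{epc:024x}")
```

Two tags with the same company prefix and item reference but different serials denote two copies of the same product, which is exactly the stable item-to-owner link that footnote 6 discusses.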

  25. Researchers and legislators should also seek further solutions addressing the issue of profiling enabled by such technologies. See supra: “Dangers of AmI Enabling Technology: RFIDs”; see also Hildebrandt and Meints [23].

  26. An example of such (emerging) initiatives are the EPCglobal Ltd. guidelines regarding privacy in RFID technology [14] and the CDT (Centre for Democracy and Technology) Working Group on RFID Privacy Best Practices [8].


  1. Article 29 Data Protection Working Party (2007) Opinion 4/2007 on the concept of personal data (01248/07/EN WP 136)

  2. Article 29 Data Protection Working Party (2005) Working document on data protection issues related to RFID technology (10107/05/EN WP 105). Last accessed 02.07.2007

  3. Article 29 Data Protection Working Party (2004) Opinion on more harmonised information provisions (11987/04/EN WP 100)

  4. Bauer M, Meints M, Hansen M (eds) (2005) Structured overview on prototypes and concepts of identity management systems, Future of Identity in the Information Society (FIDIS) Deliverable D3.1.

  5. Beslay L, Hakala H (2003) Digital territory: bubbles. Draft version, last accessed 02.07.2007

  6. Beslay L, Punie Y (2003) The virtual residence: identity, privacy and security. In: Security and privacy for the citizen in the post-September 11 digital age: a prospective overview, IPTS report to the European Parliament Committee on Citizens’ Freedoms and Rights, Justice and Home Affairs (LIBE). Last accessed 02.07.2007

  7. Bryce C, Dekker M, Etalle S, Le Metayer D, Le Mouel MF, Minier M, Moret-Bailly J, Ubeda S (2007) Ubiquitous privacy protection: position paper. In: Bajart A, Muller H, Strang T (eds) UbiComp 2007 Workshops Proceedings, Innsbruck, Austria September 2007, pp 397–402

  8. Centre for Democracy and Technology (CDT) Working Group on RFID (2006) Privacy best practices for deployment of RFID technology, interim draft. Last accessed 02.07.2007

  9. Casassa Mont M, Pearson S, Bramhall P (2003) Towards accountable management of identity and privacy: sticky policies and enforceable tracing services. HP Labs Technical Report HPL-2003-49, Bristol, 2003

  10. Daskala B, Maghiros I (2007) Digital territories: towards the protection of public and private spaces in a digital and ambient intelligence environment. EUR 22765 EN

  11. De Hert P (2006) What are the risks and what guarantees need to be put in place in view of interoperability of police databases? Standard Briefing Note ‘JHA & Data Protection’, No. 1, produced on behalf of the European Parliament

  12. De Hert P, Gutwirth S (2003) Making sense of privacy and data protection: a prospective overview in the light of the future of identity, location-based services and virtual residence. In: Security and privacy for the citizen in the post-September 11 digital age: a prospective overview, IPTS Report to the European Parliament Committee on Citizens’ Freedoms and Rights, Justice and Home Affairs (LIBE). Last accessed 02.07.2007

  13. De Hert P, Gutwirth S (2005) Privacy, data protection and law enforcement. Opacity of the individual and transparency of power. In: Claes E, Duff A, Gutwirth S (eds) Privacy and the criminal law. Oxford, Antwerp


  14. EPCglobal Ltd (2007) Guidelines regarding privacy in RFID technology. Last accessed 02.07.2007

  15. Faull J (2007) Heard by the House of Lords, minutes of evidence taken before the Select Committee on the European Union (Sub-Committee F), The EU-US PNR Agreement, 22 March 2007, p 5. Last consulted 02.07.2007

  16. Gutwirth S (2002) Privacy and the information age, Lanham/Boulder/New York/Oxford, Rowman & Littlefield Publ. 158 p

  17. Gutwirth S, De Hert P (2008) Regulating profiling in a democratic constitutional state In: Hildebrandt M, Gutwirth S (eds) Profiling the European citizen. Cross disciplinary perspectives, Springer Press, Dordrecht, pp 271–291

  18. Han J, Shah A, Luk M, Perrig A (2007) Don’t sweat your privacy, using humidity to detect human presence. In: Bajart A, Muller H, Strang S (eds) UbiComp 2007 Workshops Proceedings, Innsbruck, Austria, pp 422–427

  19. Hansen M, Krasemann H (eds) (2005) Privacy and identity management for Europe, PRIME White Paper. Deliverable 15.1

  20. Hildebrandt M (2006) “Profiles and correlatable humans”. In: Stehr N (ed) Who owns knowledge? Transaction Books, New Brunswick

  21. Hildebrandt M, Backhouse J (eds) (2005) Descriptive analysis and inventory of profiling practices, FIDIS (Future of Identity in the Information Society) Deliverable D7.2,

  22. Hildebrandt M, Koops BJ (eds) A vision of ambient law, FIDIS (Future of Identity in the Information Society) D7.9, version as of 15.11.2007

  23. Hildebrandt M, Meints M (eds) (2006) RFID, profiling, and AmI, FIDIS (Future of Identity in the Information Society) Deliverable D7.7.

  24. Kardasiadou Z, Talidou Z (2006) Report on legal issues of RFID Technology, LEGAL IST (Legal Issues for the Advancement of Information Society Technologies). Deliverable 15

  25. Lahlou S (2005) Living in a goldfish bowl: lessons learned about privacy issues in a privacy-challenged environment, Workshop on UbiComp Privacy, privacy in context

  26. Lahlou S, Langheinrich M, Rocker C (2005) Privacy and trust issues with invisible computers. Commun ACM 48(3):59–60


  27. Leenes R, Schallabock J, Hansen M (2007) PRIME white paper v2, 2007. Last accessed 02.07.2007

  28. Meints M (2006) AmI: the European perspective on data protection legislation and privacy policies, presentation at the SWAMI International Conference on Safeguards in a World of Ambient Intelligence, 21

  29. Müller G, Wohlgemuth S (eds) (2005) Study on mobile identity management, FIDIS (Future of Identity in the Information Society) Deliverable D3.3.

  30. Rouvroy A (2007) Privacy, data protection, and the unprecedented challenges of ambient intelligence, 11 September 2007, available at SSRN

  31. Schreurs W, Hildebrandt M, Gasson M, Warwick K (eds) (2005) Report on actual and possible profiling techniques in the field of ambient intelligence, FIDIS (Future of Identity in the Information Society) Deliverable D7.3.

  32. Wright D, Gutwirth S, Friedewald M, Punie Y, Vildjiounaite E (eds) (2008) Safeguards in a world of ambient intelligence, Springer Press, Dordrecht, p 291

Legal Acts

  1. Universal Declaration of Human Rights, United Nations, 1948

  2. European Convention on Human Rights of 4 November 1950

  3. International Covenant on Civil and Political Rights, United Nations, 1966

  4. Charter of Fundamental Rights of the European Union, OJ C 341, 18.12.2002, pp 1–22

  5. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23/11/95, pp 31–50

  6. Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) OJ L 201, 31/07/2002, pp 37–47

  7. Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC, OJ L 105, 13/4/2006, pp 54–63

Case Law

  1. ECHR, Niemietz v. Germany (23.11.1992)

  2. ECHR, Halford v. the United Kingdom (27.03.1997)

  3. ECHR, Khan v. the United Kingdom (12.03.2000)

  4. ECHR, P.G. and J.H. v. the United Kingdom (25.09.2001)

  5. ECHR, Copland v. the United Kingdom (03.04.2007)


Corresponding author

Correspondence to Serge Gutwirth.


De Hert, P., Gutwirth, S., Moscibroda, A. et al. Legal safeguards for privacy and data protection in ambient intelligence. Pers Ubiquit Comput 13, 435–444 (2009).



  • Personal Data
  • Data Protection
  • Privacy Protection
  • Ambient Intelligence
  • Electronic Product Code