Flexible Regulation with Privacy Points

  • Hanno Langweg
  • Lisa Rajbhandari
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7449)

Abstract

We propose a utilitarian approach to a uniform regulatory framework for assessing privacy impact and establishing compensatory actions. “Privacy points” gauge the effect of measures on people’s privacy. Privacy points are exchangeable and hence give companies room for innovation in how they improve people’s privacy. Regulators lose control over details while gaining the opportunity to extend their power to a larger portion of the market.
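
The abstract only outlines the mechanism; the paper does not prescribe concrete point values or a scoring algorithm. As a purely illustrative sketch (all measure names, point values, and the compliance threshold below are hypothetical assumptions, not taken from the paper), exchangeable privacy points could be netted against a regulatory minimum like this:

```python
# Hypothetical illustration of exchangeable "privacy points": each measure
# carries a signed point value (negative = privacy impact, positive =
# compensatory action), and a portfolio complies if its net balance meets
# a regulator-defined minimum. Values below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    points: int  # negative = privacy impact, positive = compensatory action

def net_privacy_points(measures: list[Measure]) -> int:
    """Sum the point values of all deployed measures."""
    return sum(m.points for m in measures)

def compliant(measures: list[Measure], required_minimum: int = 0) -> bool:
    """A portfolio complies if its net balance meets the regulatory minimum."""
    return net_privacy_points(measures) >= required_minimum

# Example: a location-tracking feature offset by compensatory actions.
portfolio = [
    Measure("continuous location tracking", -40),
    Measure("data retention shortened to 7 days", +25),
    Measure("opt-in consent with granular controls", +20),
]
print(net_privacy_points(portfolio), compliant(portfolio))  # 5 True
```

The exchangeability is what creates room for innovation: a company may offset an intrusive feature with any mix of compensatory actions, rather than satisfying a fixed checklist.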

Keywords

Minimum Requirement · Emission Trading · Building Code · Location Privacy · Compensatory Action
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Hanno Langweg (1)
  • Lisa Rajbhandari (1)
  1. NISlab Norwegian Information Security Laboratory, Gjøvik, Norway
