
Regulating Algorithms’ Regulation? First Ethico-Legal Principles, Problems, and Opportunities of Algorithms

Transparent Data Mining for Big and Small Data

Part of the book series: Studies in Big Data (SBD, volume 32)

Abstract

Algorithms are regularly used for mining data, offering unexplored patterns and deep non-causal analyses in what we term the “classifying society”. In the classifying society, individuals are no longer targetable as individuals; they are instead selectively addressed according to how the clusters of data that they (or one or more of their devices) share with a given model fit into the analytical model itself. In this way the classifying society might bypass data protection as we know it. Thus, we argue for a change of paradigm: to consider and regulate anonymities, not only identities, in data protection. This requires a combined regulatory approach that blends together (1) the reinterpretation of existing legal rules in light of the central role of privacy in the classifying society; (2) the promotion of disruptive technologies for disruptive new business models, enabling more market control by data subjects over their own data; and, eventually, (3) new rules aiming, among other things, to provide data generated by individuals with some form of property protection similar to that enjoyed by the data and models generated by businesses (e.g. trade secrets). The blend would be completed by (4) the timely insertion of ethical principles into the very generation of the algorithms sustaining the classifying society.


Notes

  1. “Contrary to some claims, privacy and data protection are a platform for a sustainable and dynamic digital environment, not an obstacle” [1, p. 9].

  2. The Cambridge Advanced Learner’s Dictionary & Thesaurus (Cambridge University Press) gives as the first meaning: “to divide things into groups according to their type: The books in the library are classified by/according to subject. Biologists classify animals and plants into different groups”.

  3. On the risks and benefits of big data see e.g., Tene and Polonetsky [5]; contra Ohm [6].

  4. “Personalization is using more (demographic, but also behavioral) information about a particular individual to tailor predictions to that individual. Examples are Google’s search results based on individual’s cookies or GMail contents” [7, p. 261]. See also https://www.google.com/experimental/gmailfieldtrial.

  5. “Algorithms nowadays define how we are seen, by providing a digital lens, tailored by statistics and other biases.” [7, p. 256].

  6. Amazon, for instance, aims to ship goods to us even before we place an order [8]. This approach is very similar to Google’s attempt to understand what we want before we know we want it. “Google is a system of almost universal surveillance, yet it operates so quietly that at times it’s hard to discern” [9, p. 84].

  7. See for more examples Citron and Pasquale [10].

  8. By using previous direct interaction, Target knew a teenage girl was pregnant well before her family did [11].

  9. See, for instance, the following list of horrors in Gray and Citron [12, p. 81, footnotes omitted]: “Employers have refused to interview or hire individuals based on incorrect or misleading personal information obtained through surveillance technologies. Governmental data-mining systems have flagged innocent individuals as persons of interest, leading to their erroneous classifications as terrorists or security threats. … In one case, Maryland state police exploited their access to fusion centers in order to conduct surveillance of human rights groups, peace activists, and death penalty opponents over a 19 month period. Fifty-three political activists eventually were classified as ‘terrorists,’ including two Catholic nuns and a Democratic candidate for local office. The fusion center subsequently shared these erroneous terrorist classifications with federal drug enforcement, law enforcement databases, and the National Security Administration, all without affording the innocent targets any opportunity to know, much less correct, the record.”

  10. On the chilling effect of dataveillance for autonomy and freedom of speech see, for instance, the literature in [13,14,15,16,17].

  11. The limits of antidiscrimination law in coping with data-driven discrimination have already been highlighted by Barocas and Selbst [19].

  12. At least some forms of notification have already been advocated in the context of the debate surrounding the USA’s Fourth Amendment [20]. In the EU, specific rules on automation are in place [21]. However, some authors claim that automation as such does not require higher scrutiny [22].

  13. See also Moss [24], stressing the ability of algorithms to discriminate “in practically and legally analogous ways to a real world real estate agent”.

  14. “It is not just the amount of data but also novel ways to analyze this data that change the playing field of any single individual in the information battle against big companies and governments. Data is becoming a key element for profit and control, and computers gain in authority” [7, p. 256; 25].

  15. See infra footnotes 73–85 and accompanying text.

  16. According to EU Competition Commissioner Margrethe Vestager, the EU Commission is considering the proposal of a specific directive on big data.

  17. See in general [26].

  18. See also Rajagopal [28].

  19. This is the way in which data collection and sharing is supposedly justified in the eyes of customers.

  20. For a general description, see Perzanowski [30].

  21. Apple [31], for instance, imposes acceptance of the following: “Notwithstanding any other provision of this Agreement, Apple and its licensors reserve the right to change, suspend, remove, or disable access to any products, content, or other materials comprising a part of the Service at any time without notice. In no event will Apple be liable for making these changes. Apple may also impose limits on the use of or access to certain features or portions of the Service, in any case and without notice or liability.”

  22. Companies already make extensive use of algorithms to select employees. For documented cases, see Behm [32].

  23. The NSA spying story is nothing new (for the timeline, see [33, 34]).

  24. Facebook tracks micro-actions such as mouse movements as well [35].

  25. See Privacy SOS [36].

  26. “TrapWire is a unique, predictive software system designed to detect patterns indicative of terrorist attacks or criminal operations. Utilizing a proprietary, rules-based engine, TrapWire detects, analyzes and alerts on suspicious events as they are collected over periods of time and across multiple locations” [37].

  27. See, on the risk of re-identification of anonymized data, Ohm [39].

  28. “We are constantly tracked by companies and governments; think of smart energy meters, biometric information on passports, number plate tracking, medical record sharing, etc.” [7, p. 256].

  29. Often acting synergistically: see Hoofnagle [40]; Singer [41].

  30. See Article 29 Data Protection Working Party [43].

  31. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation, or “GDPR”).

  32. See art. 6 of the GDPR on subsequent processing and pseudo-anonymous data.

  33. The argument that “free” services actually command a price (in data) and the suggestion that “free users should be treated as consumers for the purposes of consumer protection law” have already been advanced [45, pp. 661–662]; on the economic value of data see Aziz and Telang [46].

  34. These algorithms use a model built on other people’s similar behavioral patterns to make suggestions for us if they think we fit the model (i.e. the classification) they have produced [47]; a minimal illustration is sketched below.
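
The following toy example is our own minimal sketch, not the chapter’s method: the user profiles, the Jaccard similarity, and the nearest-neighbour vote are hypothetical stand-ins for whatever model a real system builds. It shows how a person can be addressed purely through the cluster of behaviours they share with others.

```python
# Minimal sketch (hypothetical data and design choices, not from the chapter):
# a toy user-based collaborative filter in which a person is addressed only
# through the cluster of behaviours they share with other users.
from collections import Counter

profiles = {
    "u1": {"runs", "buys_shoes", "streams_jazz"},
    "u2": {"runs", "buys_shoes", "streams_rock"},
    "u3": {"bakes", "streams_jazz"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two behaviour sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest(target: set, k: int = 2) -> list:
    """Vote the unseen behaviours of the k users most similar to the target:
    the target is treated not as an individual but as a member of the
    cluster it best fits (the 'classification' note 34 refers to)."""
    neighbours = sorted(profiles.values(),
                        key=lambda p: jaccard(target, p), reverse=True)[:k]
    votes = Counter(b for p in neighbours for b in p - target)
    return [behaviour for behaviour, _ in votes.most_common()]

print(suggest({"runs"}))  # most widely shared unseen behaviour first, e.g. 'buys_shoes'
```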

  35. As beautifully described by Pasquale and Citron [48, p. 1421]: “Unexplained and unchallengeable, Big Data becomes a star chamber… secrecy is a discriminator’s best friend: unknown unfairness can never be challenged, let alone corrected”. On the importance of transparency and accountability in the algorithms of powerful internet intermediaries see also Pasquale [49, 50]. But see, on the role of transparency and the various levels of anonymity, Zarsky [51, 52]; Cohen [53].

  36. The point is clearly illustrated by Zarsky [52].

  37. In their description [54, pp. 264–265, footnotes omitted]: “Fusion centers access specially designed data-broker databases containing dossiers on hundreds of millions of individuals, including their Social Security numbers, property records, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, social network activity, and drug- and food-store records. Some gather biometric data and utilize facial-recognition software.”

  38. See the official description [55].

  39. See also Cohen [56]. For the psychological impact of surveillance see Karabenick and Knapp [57].

  40. “More unsettling still is the potential combination of surveillance technologies with neuroanalytics to reveal, predict, and manipulate instinctual behavioral patterns of which we are not even aware” [54, p. 265]. This extends to the fear that, “[b]ased on the technology available, the emergence of a ‘Walden 3.0’ with control using positive reinforcements and behavioral engineering seems a natural development” [7, p. 265]. Walden 3.0 would be the manifestation of “Walden Two,” the utopian novel written by behavioral psychologist B. F. Skinner (first published in 1948) and embracing the proposition that even human behaviour is determined by environmental variables; thus, systematically altering environmental variables can generate a sociocultural system driven by behavioral engineering.

  41. See also Citron [59]; Coleman [60]; Marwick [61, p. 22].

  42. This phenomenon is particularly problematic for jurists since “[o]ne of the great accomplishments of the legal order was holding the sovereign accountable for decisionmaking and giving subjects basic rights, in breakthroughs stretching from Runnymede to the Glorious Revolution of 1688 to the American Revolution. New algorithmic decisionmakers are sovereign over important aspects of individual lives. If law and due process are absent from this field, we are essentially paving the way to a new feudal order of unaccountable reputational intermediaries” [63, p. 19].

  43. Government actions have triggered and driven a critical debate. See for instance Ramasastry [68]; Slobogin [69]; Solove [70].

  44. On the issue see also Solove [71]; Cate [72]; Strandburg [73].

  45. See also Schwartz [13].

  46. A serious concern shared both in Europe [75] and in the USA [76], where the systematic suppression of conservative news has been alleged.

  47. Of course, it is not only Google that is the problem [77].

  48. This was the case in the Google/Anti-Defamation League episode [78]; see also Woan [79]; Wu [80].

  49. 2014 WL 1282730, at 6 (SDNY 2014) (“[A]llowing Plaintiffs to sue Baidu for what are in essence editorial judgments about which political ideas to promote would run afoul of the First Amendment.”).

  50. Contra e.g., Case C-131/12.

  51. A need already signalled in the literature [82, 83].

  52. The high risks of enabling such a free speech approach have been highlighted [84; 7, p. 269].

  53. See also Pasquale and Citron [48]; Zarsky [87].

  54. According to R. Calo [89], harm must be “unanticipated or, if known to the victim, coerced”.

  55. This is the case in both the EU and the USA. See for instance California Online Privacy Protection Act, CAL. BUS & PROF. CODE §§ 22575–22579 (West 2004) (privacy policy requirement for websites on pages where they collect personally identifiable information); CAL. CIV. CODE §§ 1785.11.2, 1798.29, 1798.82 (West 2009); CONN. GEN. STAT. ANN. § 36a-701b (West 2009 & Supp. 2010); GA. CODE ANN. § 10-1-910, 911 (2009).

  56. See footnotes 35 and 40 and accompanying text.

  57. Mining itself generates new data that change the model and the reading of the clusters.

  58. Clarke [92] defines dataveillance as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons”.

  59. Several technical definitions of data mining are available, but they all refer to the discovery of previously unknown, valid patterns and relationships; a toy illustration follows.
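
To make that definition concrete, here is a minimal sketch of pattern discovery with entirely hypothetical transactions and an arbitrary support threshold; real systems differ vastly in scale and technique, but the logic of surfacing previously uncounted co-occurrences is the same.

```python
# Minimal sketch (hypothetical data, not from the chapter): a toy
# frequent-pattern miner surfacing item pairs that co-occur often
# enough to count as a "previously unknown, valid pattern".
from collections import Counter
from itertools import combinations

transactions = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "jam"},
    {"butter", "jam"},
]

def frequent_pairs(data, min_support=0.5):
    """Return pairs appearing in at least min_support of all transactions."""
    counts = Counter(pair for t in data for pair in combinations(sorted(t), 2))
    n = len(data)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

print(frequent_pairs(transactions))
# {('bread', 'butter'): 0.5, ('bread', 'jam'): 0.5, ('butter', 'jam'): 0.5}
```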

  60. This is the case for a recent study on pancreatic cancer [94].

  61. For an explanation of the actual mechanisms see Solove [70].

  62. Meaning that agents have an ethical, and sometimes legal, obligation to answer for their actions, wrongdoing, or mistakes.

  63. Transparency is understood here as the enabling tool for actual accountability.

  64. The differing cost-impact of the level of transparency required is analysed by Zarsky [52].

  65. Efforts to generate Transparency Enhancing Tools (TETs) are producing an expanding body of research at the crossroads of law and technology [96]. But on the side effects and risks of an excess of transparent information see Shkabatur [97].

  66. On this issue see in general Zarsky [22, 98].

  67. The literature concentrates on the potential harms of predictive algorithms [67].

  68. The “undiscovered observer represents the quintessential privacy harm because of the unfairness of his actions and the asymmetry between his and his victim’s perspective” [89, p. 1160].

  69. The literature on the obstacles to obtaining acceptable levels of anonymity on the web is immense [39, 99,100,101]; see also the Datalogix [102] privacy policy. One formal benchmark from this literature is sketched below.
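
A classic such benchmark is k-anonymity (Sweeney [123]): a release is k-anonymous when every combination of quasi-identifiers is shared by at least k records. The sketch below, with hypothetical records and quasi-identifiers of our own choosing, checks that property and illustrates how a single overly specific record breaks anonymity for its holder.

```python
# Minimal sketch (hypothetical records, not from the chapter): checking
# k-anonymity in the sense of Sweeney [123].
from collections import Counter

records = [
    {"zip": "561**", "age": "30-40", "diagnosis": "flu"},
    {"zip": "561**", "age": "30-40", "diagnosis": "asthma"},
    {"zip": "562**", "age": "20-30", "diagnosis": "flu"},
]

def is_k_anonymous(rows, quasi_identifiers, k):
    """True iff every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return all(count >= k for count in groups.values())

print(is_k_anonymous(records, ("zip", "age"), 2))  # False: the last record is unique
```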

  70. Art. 4 GDPR.

  71. See the seminal work of Solove [103]; see also Zarsky [104].

  72. According to the EU GDPR (art. 4), “‘pseudonymisation’ means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”.
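
The mechanics behind this definition can be sketched in a few lines: direct identifiers are swapped for tokens, and the token-to-identity table is the “additional information” that must be kept separately and safeguarded. The example below is a bare illustration with hypothetical data, not a GDPR-compliant implementation.

```python
# Minimal sketch (hypothetical data, not a compliant implementation):
# pseudonymisation in the spirit of art. 4 GDPR.
import secrets

lookup = {}  # the "additional information": stored separately, under safeguards

def pseudonymise(record: dict, identifier_field: str) -> dict:
    """Replace the direct identifier with a random token; keep the mapping apart."""
    token = secrets.token_hex(8)
    lookup[token] = record[identifier_field]
    out = dict(record)
    out[identifier_field] = token
    return out

row = {"name": "Jane Doe", "purchase": "running shoes"}
print(pseudonymise(row, "name"))  # {'name': '<token>', 'purchase': 'running shoes'}
# Without access to `lookup`, the row can no longer be attributed to Jane Doe.
```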

  73. An average of 56 parties track activities on a website [106]. On the evolution of personal data trade see World Economic Forum [107].

  74. “No one can challenge the process of scoring and the results because the algorithms are zealously guarded trade secrets” [10, p. 5]. As illustrated by Richards and King [66, p. 42], “[w]hile Big Data pervasively collects all manner of private information, the operations of Big Data itself are almost entirely shrouded in legal and commercial secrecy”.

  75. But see, for concerns over propertization, Noam [113]; Cohen [114]; Bergelson [115].

  76. The role of data aggregation and data brokers is vividly illustrated by Kroft [118].

  77. See e.g. Kosner [120].

  78. Other authors have already pointed out that one key reading of privacy in the digital age is the lack of choice about the processes that involve us and the impossibility of understanding them [121, p. 133].

  79. See now the GDPR; for a technical analysis see Borcea-Pfitzmann et al. [122].

  80. See Gritzalis [128]. Indeed, several authors have already highlighted the risks to privacy and autonomy posed by the expanding use of social networks: see, for instance, the call for a “Social Network Constitution” by Andrews [129], the principles of network governance proposed by MacKinnon [82], and the worries expressed by Irani et al. [130, 131]; see also Sweeney [123]; Spiekermann et al. [132].

  81. See Fujitsu Res. Inst. [134].

  82. We are not discussing a science-fiction conspiracy to control human beings but the actual side effects of embracing specific technological advancements together with specific business models and their surrounding legal constraints.

  83. This holds true even when the code is verified or programmed by humans, with the risk of embedding in it, even unintentionally, the biases of the programmer: “Because human beings program predictive algorithms, their biases and values are embedded into the software’s instructions, known as the source code and predictive algorithms” [10, p. 4].

  84. See also Danezis [136].

  85. See also Bengtsson et al. [137]; Wesolowski et al. [138, 139].

  86. See also Wood and Neal [141]; Buttle and Burton [142]; Pew Research Centre [143]; Reinfelder [144].

  87. See also Elkin-Koren and Weinstock [145]; FTC Staff Report [146,147,148]; Canadian Offices of the Privacy Commissioners [149]; Harris [150].

  88. See for instance Zhang v. Baidu.com, Inc. (2014). But see, e.g., Case C-131/12; Pasquale [153].

  89. On the privacy concerns and their social impact see Latar and Norsfors [155]; Ombelet and Morozov [156].

  90. The authors also propose “mandatory active choice between payment with money and payment with data, ex post evaluation of privacy notices, democratized data collection, and wealth or income-responsive fines”. Their proposals enrich an already expanding host of regulatory suggestions. See Hajian and Domingo-Ferrer [158]; Mayer-Schonberger and Cukier [159]; Barocas and Selbst [19]. For a more technical account of fostering discrimination-free classification, see Calders and Verwer [160]; Kamiran et al. [161]. Recently, the establishment of an ad hoc authority has also been advocated [162]. On market manipulation through the use of predictive and descriptive algorithms see the seminal work of Calo [38].

  91. See EU Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005, concerning unfair business-to-consumer commercial practices in the internal market.

  92. We do not report here those topics that are exclusively related to the USA and on which the FTC has authority, such as the relevance of the Fair Credit Reporting Act.

  93. For a recent account of algorithms’ disparate impact see Barocas and Selbst [19, p. 671].

  94. See an analysis of some techniques potentially available to promote transparency and accountability [165]. See also Moss [24, p. 24], quoting relevant American statutes. Yet if action is not taken at a global level, online auditing can be run in countries where it is not forbidden and the results transferred as information elsewhere. Analogously, a technical attempt to create auditing through volunteers’ profiles, in a sort of crowdsourced empowerment exercise, might make online auditing permissible even in those jurisdictions that forbid violating websites’ PPTCs by using bots. There is an ongoing debate on this issue. See Benkler [166]; Citron [167]. But see contra Barnett [168]. For a critical analysis urging differentiation of the approach targeting the specific or general public see Zarsky [52].

  95. On the potential for discriminatory and other misuses of health data regularly “protected” by professional secrecy see Orentlicher [170].

  96. Indeed, it has been estimated that on average we would need 244 h per year to read every privacy policy we encounter [173].

  97. Here, app is used as a synonym for software.

  98. See also Lipford et al. [174]; Passera and Haapio [175].

  99. See Case C–51/94, para 34, holding that consumers who care about ingredients (contained in a sauce) read labels (sic); see also Phillips [176]; Gardner [177]; Ayres and Schwartz [178].

  100. E.g. arts. 5, 6 and 10 of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’).

  101. See also Bar-Gill and Ben-Shahar [179]; Luzak [180, 181]; Purnhagen and Van Herpen [182].

  102. See, for an information mandate approach: Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts; the Directive on electronic commerce; Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market; Directive 2008/48/EC of the European Parliament and of the Council of 23 April 2008 on credit agreements for consumers and repealing Council Directive 87/102/EEC; Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights; and Regulation (EU) No 531/2012 of the European Parliament and of the Council of 13 June 2012 on roaming on public mobile communications networks within the Union.

  103. See McDonald et al. [183].

  104. See in the economics literature: Stigler [184]; Akerlof [185]; Macho-Stadler and Pérez-Castrillo [186].

  105. On the failure of the disclosure model see in general Ben-Shahar and Schneider [189], Radin [190], and, with reference to privacy, Ben-Shahar and Chilton [191].

  106. The issue of actual market alternatives is not addressed here.

  107. See above footnotes 80–81 and accompanying text.

Abbreviations

AI: Artificial intelligence
CAL. BUS & PROF. CODE: California Business and Professions Code
CAL. CIV. CODE: California Civil Code
CONN. GEN. STAT. ANN.: Connecticut General Statutes Annotated
DAS: Domain Awareness System
DHS: U.S. Department of Homeland Security
DNA: Deoxyribonucleic acid
EDPS: European Data Protection Supervisor
EFF: Electronic Frontier Foundation
EU: European Union
EU GDPR: European Union General Data Protection Regulation
EUCJ: European Union Court of Justice
FTC: Federal Trade Commission
GA. CODE ANN.: Code of Georgia Annotated
GPS: Global Positioning System
GSM: Global System for Mobile Communications
GSMA: GSM Association
ICT: Information and communications technology
NSA: National Security Agency
PETs: Privacy-enhancing technologies
PPTCs: Privacy policy terms and conditions
SDNY: United States District Court for the Southern District of New York
ToS: Terms of service
WEF: World Economic Forum
WPF: World Privacy Forum

References

  1. European Data Protection Supervisor, Opinion No 4/2015: Towards a new digital ethics: Data, dignity and technology. https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2015/15-09-11_Data_Ethics_EN.pdf. Accessed 24 Oct 2016

  2. Angwin, J.: The web’s new gold mine: your secrets. Wall Street J. http://online.wsj.com/article/SB10001424052748703940904575395073512989404.html (2010). Accessed 24 Oct 2016

  3. Bain & Company: Using Data as a Hidden Asset. http://www.bain.com/publications/articles/using-data-as-a-hidden-asset.aspx (2010). Accessed 24 Oct 2016

  4. Pariser, E.: The Filter Bubble. Penguin Press, New York (2011)


  5. Tene, O., Polonetsky, J.: Big data for all: privacy and user control in the age of analytics. Northwest. J. Technol. Intellect. Prop. 11(5), 239–273 (2013)


  6. Ohm, P.: Response, the underwhelming benefits of big data. Univ. Pa. Law Rev. Online. 161, 339–346 (2013)


  7. Van Otterlo, M.: Automated experimentation in Walden 3.0: the next step in profiling, predicting, control and surveillance. Surveill. Soc. 12(2), 255–272 (2014)


  8. Lomas, N.: Amazon patents “anticipatory” shipping—to start sending stuff before you’ve bought it. https://techcrunch.com/2014/01/18/amazon-pre-ships/ (2014). Accessed 24 Oct 2016

  9. Vaidhyanathan, S.: The Googlization of Everything. University of California Press, Berkeley (2011)


  10. Citron, D.K., Pasquale, F.: The scored society: due process for automated predictions. Wash. Law Rev. 89(1), 1–33 (2014)


  11. Duhigg, C.: How Companies Learn Your Secrets. The New York Times, New York (2012). http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html. Accessed 24 Oct 2016.

  12. Gray, D., Citron, D.K.: The right to quantitative privacy. Minn. Law Rev. 98, 62–144 (2013)


  13. Schwartz, P.M.: Privacy and democracy in cyberspace. Vanderbilt Law Rev. 52, 1609–1701 (1999)


  14. Schwartz, P.M.: Internet privacy and the state. Conn. Law Rev. 32, 815–859 (2000)


  15. Cohen, J.E.: Examined lives: informational privacy and the subject as object. Stanford Law Rev. 52, 1373–1438 (2000)


  16. Cohen, J.E.: Cyberspace as/and space. Columbia Law Rev. 107(1), 210–256 (2007)


  17. Solove, D.J.: The Digital Person. New York University Press, New York (2004)


18. Mattioli, D.: On Orbitz, Mac users steered to pricier hotels. Wall Street J. (2012). http://www.wsj.com/articles/SB10001424052702304458604577488822667325882. Accessed 24 Oct 2016

  19. Barocas, S., Selbst, A.D.: Big data’s disparate impact. Calif. Law Rev. 104, 671–732 (2016)


  20. Colb, S.F.: Innocence, privacy, and targeting in fourth amendment jurisprudence. Columbia Law Rev. 56, 1456–1525 (1996)


  21. Korff, D.: Data protection laws in the EU: the difficulties in meeting the challenges posed by global social and technical developments. In: European Commission Directorate-General Justice, Freedom and Security, Working Paper No. 2. http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_working_paper_2_en.pdf (2010). Accessed 24 Oct 2016

  22. Zarsky, T.Z.: Governmental data mining and its alternatives. Penn State Law Rev. 116(2), 285–330 (2011)


  23. Bamberger, K.A.: Technologies of compliance: risk and regulation in a digital age. Tex. Law Rev. 88(4), 669–739 (2010)


  24. Moss, R.D.: Civil rights enforcement in the era of big data: algorithmic discrimination and the computer fraud and abuse act. Columbia Hum. Rights Law Rev. 48(1) (2016).


  25. Exec. Office of The President: Big data: seizing opportunities, preserving values. http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf (2014). Accessed 24 Oct 2016

  26. Turow, J.: Niche Envy. MIT Press, Cambridge, MA (2006)


  27. Al-Khouri, A.M.: Data ownership: who owns “my data”? Int. J. Manag. Inf. Technol. 2(1), 1–8 (2012)


  28. Rajagopal, S.: Customer data clustering using data mining technique. Int. J. Database Manag. Syst. 3(4), 1–11 (2011)


  29. Frischmann, B.M., Selinger, E.: Engineering humans with contracts. Cardozo Legal Studies Research Paper No. 493. https://ssrn.com/abstract=2834011 (2016). Accessed 24 Oct 2016

30. Perzanowski, A., Hoofnagle, C.J.: What we buy when we ‘buy now’. Univ. Pa. Law Rev. 165, 317 (2017). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2778072 (forthcoming 2017). Accessed 24 Oct 2016

  31. Apple: Terms and Conditions—Game Center. http://www.apple.com/legal/internet-services/itunes/gamecenter/us/terms.html (2013). Accessed 24 Oct 2016

  32. Behm, R.: What are the issues? Employment testing: failing to make the grade. http://employmentassessment.blogspot.com/2013/07/what-are-issues.html (2013). Accessed 24 Oct 2016

  33. EFF: Timeline of NSA domestic spying. https://www.eff.org/nsa-spying/timeline (2015). Accessed 24 Oct 2016

  34. Schneier, B.: Want to evade NSA spying? Don’t connect to the internet. Wired Magazine. http://www.wired.com/opinion/2013/10/149481 (2013). Accessed 24 Oct 2016

  35. Rosenbush, S.: Facebook tests software to track your cursor on screen. CIO J. http://blogs.wsj.com/cio/2013/10/30/facebook-considers-vast-increase-in-data-collection (2013). Accessed 24 Oct 2016

  36. PrivacySOS: NYPD’s domain awareness system raises privacy, ethics issues. https://privacysos.org/blog/nypds-domain-awareness-system-raises-privacy-ethics-issues/ (2012). Accessed 24 Oct 2016

  37. TrapWire: The intelligent security method. http://www.trapwire.com/trapwire.html (2016). Accessed 24 Oct 2016

38. Calo, M.R.: Digital market manipulation. George Wash. Law Rev. 82(4), 995–1051 (2014)


  39. Ohm, P.: Broken promises of privacy: responding to the surprising failure of anonymization. UCLA Law Rev. 57, 1701–1777 (2010)


  40. Hoofnagle, C.Y.: Big brother’s little helpers: how choicepoint and other commercial data brokers collect and package your data for law enforcement. N. C. J. Int. Law Commer. Regul. 29, 595–637 (2004)


  41. Singer, N.: Mapping, and sharing, the consumer genome. The New York Times. http://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html (2012). Accessed 24 Oct 2016

  42. Tucker, P.: Has big data made anonymity impossible? MIT Technology Review. http://www.technologyreview.com/news/514351/has-big-data-madeanonymity-impossible/ (2013). Accessed 24 Oct 2016

43. Article 29 Data Protection Working Party: Opinion 5/2014 on anonymization techniques. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp216_en.pdf (2014). Accessed 24 Oct 2016


  44. Ruggieri, S., Pedreschi, D., Turini, F.: Data mining for discrimination discovery. ACM Trans. Knowl. Discov. Data. 4(2), 1–40 (2010)


  45. Hoofnagle, C.Y., Whittington, J.: “Free”: accounting for the costs of the Internet’s most popular price. UCLA Law Rev. 61, 606–670 (2014)


  46. Aziz, A., Telang, R.: What is a digital cookie worth? https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757325 (2016). Accessed 24 Oct 2016

  47. Bozdag, E.: Bias in algorithmic filtering and personalization. Ethics Inf. Technol. 15(3), 209–227 (2013)


  48. Pasquale, F., Citron, D.K.: Promoting innovation while preventing discrimination: policy goals for the scored society. Wash. Law Rev. 89(4), 1413–1424 (2014)


  49. Pasquale, F.: Beyond innovation and competition: the need for qualified transparency in Internet intermediaries. Northwest. Univ. Law Rev. 104(1), 105–174 (2010)


  50. Pasquale, F.: Restoring transparency to automated authority. J. Telecommun. High Technol. Law. 9, 235–256 (2011)


  51. Zarsky, T.Z.: Thinking outside the box: considering transparency, anonymity, and pseudonymity as overall solutions to the problems in information privacy in the Internet society. Univ. Miami Law Rev. 58, 1301–1354 (2004)


  52. Zarsky, T.Z.: Transparent predictions. Univ. Ill. Law Rev. 2013(4), 1503–1570 (2013)


  53. Cohen, J.E.: Configuring the Networked Self: Law, Code, and the Play of Everyday Practice. Yale University Press, New Haven, CT (2012)


  54. Citron, D.K., Gray, D.: Addressing the harm of total surveillance: a reply to professor Neil Richards. Harv. L. Rev. F. 126, 262 (2013)


  55. DHS: National network of fusion centers fact sheet. https://www.dhs.gov/national-network-fusion-centers-fact-sheet (2008). Accessed 24 Oct 2016

  56. Cohen, J.E.: Privacy, visibility, transparency, and exposure. Univ. Chicago Law Rev. 75(1), 181–201 (2008)


  57. Karabenick, S.A., Knapp, J.R.: Effects of computer privacy on help-seeking. J. Appl. Soc. Psychol. 18(6), 461–472 (1988)


  58. Peck, D.: They’re watching you at work. The Atlantic. http://www.osaunion.org/articles/Theyre%20Watching%20You%20At%20Work.pdf (2013). Accessed 24 Oct 2016

  59. Citron, D.K.: Data mining for juvenile offenders. Concurring Opinions. http://www.concurringopinions.com/archives/2010/04/data-mining-for-juvenile-offenders.html (2010). Accessed 24 Oct 2016

  60. Coleman, E.G.: Coding Freedom. Princeton University Press, Princeton (2013)


61. Marwick, A.E.: How your data are being deeply mined. The New York Review of Books. http://www.nybooks.com/articles/2014/01/09/how-your-data-are-being-deeply-mined/ (2014). Accessed 24 Oct 2016

  62. Abdou, H.A., Pointon, J.: Credit scoring, statistical techniques and evaluation criteria: a review of the literature. Intell. Syst. Account. Finance Manag. 18(2–3), 59–88 (2011)


  63. Balkin, J.M.: The constitution in the national surveillance state. Minn. Law Rev. 93(1), 1–25 (2008)


  64. Kerr, O.S.: Searches and seizures in a digital world. Harv. Law Rev. 119(2), 531–585 (2005)


  65. Citron, D.K.: Technological due process. Wash. Univ. Law Rev. 85(6), 1249–1313 (2008)


  66. Richards, N.M., King, J.H.: Three paradoxes of big data. Stanford Law Rev. 66, 41–46 (2013)


  67. Crawford, K., Schultz, J.: Big data and due process: toward a framework to redress predictive privacy harms. Boston Coll. Law Rev. 55(1), 93–128 (2014)


  68. Ramasastry, A.: Lost in translation? Data mining, national security and the “adverse inference” problem. Santa Clara Comput. High Technol. Law J. 22(4), 757–796 (2004)


  69. Slobogin, C.: Government data mining and the fourth amendment. Univ. Chicago Law Rev. 75(1), 317–341 (2008)


  70. Solove, D.J.: Data mining and the security-liberty debate. Univ. Chicago Law Rev. 75, 343–362 (2008)


  71. Solove, D.J.: Privacy and power: computer databases and metaphors for information privacy. Stanford Law Rev. 53(6), 1393–1462 (2001)


  72. Cate, F.H.: Data mining: the need for a legal framework. Harv. Civil Rights Civil Liberties Law Rev. 43, 435 (2008)


  73. Strandburg, K.J.: Freedom of association in a networked world: first amendment regulation of relational surveillance. Boston Coll. Law Rev. 49(3), 741–821 (2008)


  74. Bloustein, E.J.: Individual and Group Privacy. Transaction Books, New Brunswick, NJ (1978)


  75. Conseil National Numerique, Platform Neutrality: Building an open and sustainable digital environment. http://www.cnnumerique.fr/wp-content/uploads/2014/06/PlatformNeutrality_VA.pdf (2014). Accessed 24 Oct 2016

  76. Nunez, M.: Senate GOP launches inquiry into Facebook’s news curation. http://gizmodo.com/senate-gop-launches-inquiry-into-facebook-s-news-curati-1775767018 (2016). Accessed 24 Oct 2016

  77. Chan, C.: When one app rules them all: the case of WeChat and mobile in China. Andreessen Horowitz. http://a16z.com/2015/08/06/wechat-china-mobile-first/ (2015). Accessed 24 Oct 2016

  78. ADL: Google search ranking of hate sites not intentional. http://archive.adl.org/rumors/google_search_rumors.html (2004). Accessed 24 Oct 2016

  79. Woan, T.: Searching for an answer: can Google legally manipulate search engine results? Univ. Pa. J. Bus. Law. 16(1), 294–331 (2013)


  80. Wu, T.: Machine speech. Univ. Pa. Law Rev. 161, 1495–1533 (2013)


81. Volokh, E., Falk, D.: First amendment protection for search engine search results. http://volokh.com/wp-content/uploads/2012/05/SearchEngineFirstAmendment.pdf (2012). Accessed 24 Oct 2016

  82. MacKinnon, R.: Consent of the Networked. Basic Books, New York (2012)


  83. Chander, A.: Facebookistan. N. C. Law Rev. 90, 1807 (2012)


  84. Pasquale, F.: Search, speech, and secrecy: corporate strategies for inverting net neutrality debates. Yale Law and Policy Review. Inter Alia. http://ylpr.yale.edu/inter_alia/search-speech-and-secrecy-corporate-strategies-inverting-net-neutrality-debates (2010). Accessed 24 Oct 2016

85. Richtel, M.: I was discovered by an algorithm. The New York Times. http://archive.indianexpress.com/news/i-was-discovered-by-an-algorithm/1111552/ (2013). Accessed 24 Oct 2016

  86. Slobogin, C.: Privacy at Risk. University of Chicago Press, Chicago (2007)


  87. Zarsky, T.Z.: Understanding discrimination in the scored society. Wash. Law Rev. 89, 1375–1412 (2014)


  88. Nissenbaum, H.F.: Privacy in Context. Stanford Law Books, Stanford, CA (2010)


  89. Calo, M.R.: The boundaries of privacy harm. Indiana Law J. 86(3), 1131–1162 (2011)


  90. Goldman, E.: Data mining and attention consumption. In: Strandburg, K., Raicu, D. (eds.) Privacy and Technologies of Identity. Springer Science + Business Media, New York (2005)


  91. Pasquale, F.: The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, Cambridge, MA (2015)


  92. Clarke, R.: Profiling: a hidden challenge to the regulation of data surveillance. J. Law Inf. Sci. 4(2), 403 (1993)


  93. Fayyad, U.M., Piatetsky-Shapiro, G., Smyth, P.: From data mining to knowledge discovery: an overview. In: Fayyad, U. (ed.) Advances in Knowledge Discovery and Data Mining. AAAI Press, Menlo Park, CA (1996)


  94. Paparrizos, J., White, R.W., Horvitz, E.: Screening for pancreatic adenocarcinoma using signals from web search logs: feasibility study and results. J. Oncol. Pract. 12(8), 737–744 (2016)


  95. Friedman, B., Nissenbaum, H.: Bias in computer systems. ACM Trans. Inf. Syst. 14(3), 330–347 (1996). In: Friedman, B. (ed.). Human Values and the Design of Computer Technology. CSLI Publications, Stanford, CA (1997)


  96. Hildebrant, M.: Profiling and the rule of law. Identity Inf. Soc. 1(1), 55–70 (2008)


  97. Shkabatur, J.: Cities @ crossroads: digital technology and local democracy in America. Brooklin Law Rev. 76(4), 1413–1485 (2011)


  98. Zarsky, T.Z.: “Mine your own business!”: making the case for the implications of the data mining of personal information in the forum of public opinion. Yale J. Law Technol. 5(1), 1–56 (2003)


  99. Mayer, J: Tracking the trackers: where everybody knows your username. http://cyberlaw.stanford.edu/node/6740 (2011). Accessed 24 Oct 2016

  100. Narayanan, A: There is no such thing as anonymous online tracking. http://cyberlaw.stanford.edu/node/6701 (2011). Accessed 24 Oct 2016

  101. Perito, D., Castelluccia, C., Kaafar, M.A., Manilsr, P.: How unique and traceable are usernames? In: Fischer-Hübner, S., Hopper, N. (eds.) Privacy Enhancing Technologies. Springer, Berlin (2011)


  102. Datalogix: Privacy policy. https://www.datalogix.com/privacy/ (2016). Accessed 24 Oct 2016

  103. Solove, D.J.: Nothing to Hide. Yale University Press, New Haven, CT (2011)


  104. Zarsky, T.Z.: Law and online social networks: mapping the challenges and promises of user-generated information flows. Fordham Intell. Prop. Media Entertainment Law J. 18(3), 741–783 (2008)


  105. Himma, K.E., Tavani, H.T.: The Handbook of Information and Computer Ethics. Wiley, Hoboken, NJ (2008)


  106. Angwin, J.: Online tracking ramps up—popularity of user-tailored advertising fuels data gathering on browsing habits. Wall Street J. http://www.wsj.com/articles/SB10001424052702303836404577472491637833420 (2012). Accessed 24 Oct 2016

107. World Economic Forum: Rethinking personal data: strengthening trust. http://www3.weforum.org/docs/WEF_IT_RethinkingPersonalData_Report_2012.pdf (2012). Accessed 24 Oct 2016

  108. Posner, R.A.: The economics of privacy. Am. Econ. Rev. 71(2), 405–409 (1981)


  109. Calzolari, G., Pavan, A.: On the optimality of privacy in sequential contracting. J. Econ. Theory. 130(1), 168–204 (2006)


  110. Acquisti, A., Varian, H.R.: Conditioning prices on purchase history. Mark. Sci. 24(3), 367–381 (2005)


  111. Schwartz, P.M.: Property, privacy, and personal data. Harv. Law Rev. 117, 2056–2128 (2003)


112. Purtova, N.: Property rights in personal data: a European perspective. Dissertation, Uitgeverij BOXPress, Oistervijk (2011)


  113. Noam, E.M.: Privacy and self-regulation: markets for electronic privacy. In: Wellbery, B.S. (ed.) Privacy and Self-Regulation in the Information Age. U.S. Dept. of Commerce, National Telecommunications and Information Administration, Washington, D.C. (1997)


  114. Cohen, J.E.: Examined lives: informational privacy and the subject as object. Stanford Law Rev. 52, 1373–1437 (1999)


  115. Bergelson, V.: It’s personal but is it mine? Toward property rights in personal information. U.C. Davis Law Review. 37, 379–451 (2003)


  116. Laudon, K.C.: Markets and privacy. Commun. ACM. 39(9), 92–104 (1996)


  117. Aperjis, C., Huberman, B.: A market for unbiased private data: paying individuals according to their privacy attitudes. First Monday 17(5) (2012)


  118. Kroft, S.: The data brokers: selling your personal information. 60 Minutes. http://www.cbsnews.com/news/data-brokers-selling-personal-information-60-minutes/ (2014). Accessed 24 Oct 2016

  119. Jentzsch, N., Preibusch, S., Harasser, A.: Study on monetizing privacy: an economic model for pricing personal information. ENISA Publications. https://www.enisa.europa.eu/publications/monetising-privacy (2012). Accessed 24 Oct 2016

  120. Kosner, A.W.: New Facebook policies sell your face and whatever it infers. Forbes. http://www.forbes.com/sites/anthonykosner/2013/08/31/new-facebook-policies-sell-your-faceand-whatever-it-infers/ (2013). Accessed 24 Oct 2016

  121. Solove, D.J.: Understanding Privacy. Harvard University Press, Cambridge, MA (2008)


  122. Borcea-Pfitzmann, K., Pfitzmann, A., Berg, M.: Privacy 3.0: = data minimization + user control + contextual integrity. Inf. Technol. 53(1), 34–40 (2011)


  123. Sweeney, L.: K-anonymity: a model for protecting privacy. Int. J. Uncertain Fuzziness Knowl Based Syst. 10(5), 557–570 (2002)


  124. Machanavajjhala, A., Kifer, D., Gehrke, J., Venkitasubramaniam, M.: L-diversity: privacy beyond k-anonymity. ACM Trans. Knowl. Discov. Data 1(1), 1–52, Art. 3 (2007)


  125. Li, N., Li, T., Venkatasubramanian, S.: t-closeness: privacy beyond k-anonymity and l-diversity. In: IEEE 23rd International Conference on Data Engineering, pp. 106–115. IEEE, Istanbul (2007)


  126. Karjoth, G., Schunter, M., Waidner, M.: Privacy-enabled services for enterprises. http://www.semper.org/sirene/publ/KaSW_02.IBMreport-rz3391.pdf (2002). Accessed 24 Oct 2016

  127. Cranor, L.F., Guduru, P., Arjula, M.: User interfaces for privacy agents. ACM Trans. Comput. Hum. Interact. 13(2), 135–178 (2006)


  128. Gritzalis, S.: Enhancing web privacy and anonymity in the digital era. Inf. Manag. Comput. Secur. 12(3), 255–288 (2004)


  129. Andrews, L.: I Know Who You Are and I Saw What You Did: Social Networks and The Death of Privacy. Free Press, New York (2012)


  130. Irani, D., Webb, S., Li, K., Pu, C.: Large online social footprints—an emerging threat. http://cobweb.cs.uga.edu/~kangli/src/SecureCom09.pdf (2009). Accessed 24 Oct 2016

  131. Irani, D., Webb, S., Pu, C., Li, K.: Modeling unintended personal-information leakage from multiple online social networks. IEEE Internet Comput. 15(3), 13–19 (2011)


  132. Spiekermann, S., Dickinson, I., Günther, O., Reynolds, D.: User agents in e-commerce environments: industry vs. consumer perspectives on data exchange. In: Eder, J., Missikoff, M. (eds.) Advanced Information Systems Engineering. Springer, Berlin (2003)


  133. Bott, E.: The do not track standard has crossed into crazy territory. http://www.zdnet.com/the-do-not-track-standard-has-crossed-into-crazy-territory-7000005502/ (2012). Accessed 24 Oct 2016

  134. Fujitsu Res. Inst.: Personal data in the cloud: a global survey of consumer attitudes. http://www.fujitsu.com/downloads/SOL/fai/reports/fujitsu_personal-data-in-the-cloud.pdf (2010). Accessed 24 Oct 2016

  135. Brunton, F., Nissenbaum, H.: Vernacular resistance to data collection and analysis: a political theory of obfuscation. First Monday. 16(5), 1–16 (2011)


  136. Danezis, G.: Privacy technology options for smart metering. http://research.microsoft.com/enus/projects/privacy_in_metering/privacytechnologyoptionsforsmartmetering.pdf (2011). Accessed 24 Oct 2016

  137. Bengtsson, L., Lu, X., Thorson, A., Garfield, R., von Schreeb, J.: Improved response to disasters and outbreaks by tracking population movements with mobile phone network data: a post-earthquake geospatial study in Haiti. PLoS Med. 8(8), e1001083 (2011)


  138. Wesolowski, A., Eagle, N., Tatem, A.J., Smith, D.L., Noor, A.M., Snow, R.W., Buckee, C.O.: Quantifying the impact of human mobility on malaria. Science. 338(6104), 267–270 (2012)


  139. Wesolowski, A., Buckee, C., Bengtsson, L., Wetter, E., Lu, X., Tatem, A.J.: Commentary: containing the ebola outbreak—the potential and challenge of mobile network data. http://currents.plos.org/outbreaks/article/containing-the-ebola-outbreak-the-potential-and-challenge-of-mobile-network-data/ (2014). Accessed 24 Oct 2016

  140. Phelps, J., Nowak, G., Ferrell, E.: Privacy concerns and consumer willingness to provide personal information. J. Public Policy Mark. 19(1), 27–41 (2000)


  141. Wood, W., Neal, D.T.: The habitual consumer. J. Consum. Psychol. 19(4), 579–592 (2009)


  142. Buttle, F., Burton, J.: Does service failure influence customer loyalty? J. Consum. Behav. 1(3), 217–227 (2012)


  143. Pew Research Centre: Mobile health 2012. http://www.pewinternet.org/2012/11/08/mobile-health-2012 (2012). Accessed 24 Oct 2016

  144. Reinfelder, L., Benenson, Z., Gassmann, F.: Android and iOS users’ differences concerning security and privacy. In: Mackay, W. (ed.) CHI ’13 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, NY (2013)


  145. Elkin-Koren, N., Weinstock Netanel, N. (eds.): The Commodification of Information. Kluwer Law International, The Hague (2002)


  146. FTC Staff Report: Mobile apps for kids: current privacy disclosures are disappointing. http://www.ftc.gov/os/2012/02/120216mobile_apps_kids.pdf (2012). Accessed 24 Oct 2016

  147. FTC Staff Report: Mobile apps for kids: disclosures still not making the grade. http://www.ftc.gov/os/2012/12/121210mobilekidsappreport.pdf (2012). Accessed 24 Oct 2016

  148. FTC Staff Report: Mobile privacy disclosures: building trust through transparency. http://www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf (2013). Accessed 24 Oct 2016

  149. Canadian Offices of the Privacy Commissioners: Seizing opportunity: good privacy practices for developing mobile apps. http://www.priv.gc.ca/information/pub/gd_app_201210_e.pdf (2012). Accessed 24 Oct 2016

  150. Harris, K.D.: Privacy on the go: recommendations for the mobile ecosystem. http://oag.ca.gov/sites/all/files/pdfs/privacy/privacy_on_the_go.pdf (2013). Accessed 24 Oct 2016

  151. GSMA: User perspectives on mobile privacy. http://www.gsma.com/publicpolicy/wpcontent/uploads/2012/03/futuresightuserperspectivesonuserprivacy.pdf (2011). Accessed 24 Oct 2016

  152. Sundsøy, P., Bjelland, J., Iqbal, A.M., Pentland, A.S., De Montjoye, Y.A.: Big data-driven marketing: how machine learning outperforms marketers’ gut-feeling. In: Greenberg, A.M., Kennedy, W.G., Bos, N. (eds.) Social Computing, Behavioral-Cultural Modeling and Prediction. Springer, Berlin (2013)


  153. Pasquale, F.: Reforming the law of reputation. Loyola Univ. Chicago Law J. 47, 515–539 (2015)


  154. Ombelet, P.J., Kuczerawy, A., Valcke, P.: Supervising automated journalists in the newsroom: liability for algorithmically produced news stories. Revue du Droit des Technologies de l’Information. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2768646 (forthcoming 2016). Accessed 24 Oct 2016

  155. Latar, N.L., Norsfors, D.: Digital identities and journalism content—how artificial intelligence and journalism may co-develop and why society should care. Innov. Journalism. 6(7), 1–47 (2006)


156. Ombelet, P.J., Morozov, E.: A robot stole my Pulitzer! How automated journalism and loss of reading privacy may hurt civil discourse. http://www.slate.com/articles/technology/future_tense/2012/03/narrative_science_robot_journalists_customized_news_and_the_danger_to_civil_discourse_.single.html (2012). Accessed 24 Oct 2016

  157. Hacker, P., Petkova, B.: Reining in the big promise of big data: transparency, inequality, and new regulatory frontiers. Northwest. J. Technol. Intellect. Prop. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2773527 (forthcoming 2016). Accessed 24 Oct 2016

  158. Hajian, S., Domingo-Ferrer, J.: Direct and indirect discrimination prevention methods. In: Custers, B., Calders, T., Schermer, B., Zarsky, T. (eds.) Discrimination and Privacy in the Information Society. Springer, New York (2013)


  159. Mayer-Schonberger, V., Cukier, K.: Big Data. A Revolution That Will Transform How We Live, Work, And Think. Eamon Dolan/Houghton Mifflin Harcourt, Boston, MA (2014)


  160. Calders, T., Verwer, S.: Three naïve Bayes approaches for discrimination-free classification. Data Min. Knowl. Disc. 21(2), 277–292 (2010)


  161. Kamiran, F., Calders, T., Pechenizkiy, M.: Techniques for discrimination-free predictive models. In: Custers, B., Calders, T., Schermer, B., Zarsky, T. (eds.) Discrimination and Privacy in the Information Society. Springer, New York (2013)


  162. Tutt, A.: An FDA for algorithms. Adm. Law Rev. 67, 1–26 (2016)


  163. FTC: Spring privacy series: alternative scoring products. http://www.ftc.gov/news-events/events-calendar/2014/03/spring-privacy-series-alternative-scoring-products (2014). Accessed 24 Oct 2016

  164. Ramirez, E.: The privacy challenges of big data: a view from the lifeguard’s chair. https://www.ftc.gov/public-statements/2013/08/privacy-challenges-big-data-view-lifeguard%E2%80%99s-chair (2013). Accessed 24 Oct 2016

  165. Sandvig, C., Hamilton, K., Karahalios, K., Langbort, C.: Auditing algorithms: research methods for detecting discrimination on internet platforms. Data and discrimination: converting critical concerns into productive inquiry. http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf (2014). Accessed 24 Oct 2016

  166. Benkler, Y.: The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, New Haven, CT (2006)


  167. Citron, D.K.: Open code governance. Univ. Chicago Legal Forum. 2008(1), 355–387 (2008)


  168. Barnett, J.M.: The host’s dilemma: strategic forfeiture in platform markets for informational goods. Harv. Law Rev. 124(8), 1861–1938 (2011)


  169. Moses, L.: Marketers should take note of when women feel least attractive: what messages to convey and when to send them. ADWEEK. http://www.adweek.com/news/advertising-branding/marketers-should-take-note-when-women-feel-least-attractive-152753 (2013). Accessed 24 Oct 2016

  170. Orentlicher, D.: Prescription data mining and the protection of patients’ interests. J. Law Med. Ethics. 38(1), 74–84 (2010)


  171. WPF: Data broker testimony results in new congressional letters to data brokers about vulnerability-based marketing. http://www.worldprivacyforum.org/2014/02/wpfs-data-broker-testimony-results-in-new-congressional-letters-to-data-brokers-regarding-vulnerability-based-marketing/ (2014). Accessed 24 Oct 2016

  172. Bakos, Y., Marotta-Wurgler, F., Trossen, D.R.: Does anyone read the fine print? Consumer attention to standard-form contracts. J. Leg. Stud. 43(1), 1–35 (2014)


173. McDonald, A.M., Cranor, L.F.: The cost of reading privacy policies. J. Law Policy Inf. Soc. 4(3), 540–565 (2008)


  174. Lipford, H.R, Watson, J., Whitney, M., Froiland, K., Reeder, R.W.: Visual vs compact: a comparison of privacy policy interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1111–1114 (2010)


  175. Passera, S., Haapio, H.: Transforming contracts from legal rules to user-centered communication tools: a human-information interaction challenge. Commun. Des. Q. Rev. 1(3), 38–45 (2013)


  176. Phillips, E.D.: The Software License Unveiled. Oxford University Press, Oxford (2009)


  177. Gardner, T.: To read, or not to read… the terms and conditions. The Daily Mail. http://www.dailymail.co.uk/news/article-2118688/PayPalagreement-longer-Hamlet-iTunes-beats-Macbeth.html (2012). Accessed 24 Oct 2016

  178. Ayres, I., Schwartz, A.: The no-reading problem in consumer contract law. Stanford Law Rev. 66, 545 (2014)


  179. Bar-Gill, O., Ben-Shahar, O.: Regulatory techniques in consumer protection: a critique of European consumer contract law. Common Mark. Law Rev. 50, 109–126 (2013)


180. Luzak, J.: Passive consumers vs. the new online disclosure rules of the Consumer Rights Directive. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2553877 (2014). Accessed 24 Oct 2016

  181. Luzak, J.: To withdraw or not to withdraw? Evaluation of the mandatory right of withdrawal in consumer distance selling contracts taking into account its behavioral effects on consumers. J. Consum. Policy. 37(1), 91–111 (2014)


  182. Purnhagen, K., Van Herpen, E.: Can bonus packs mislead consumers? An empirical assessment of the ECJ’s mars judgment and its potential impact on EU marketing regulation. In: Wageningen Working Papers Series in Law and Governance 2014/07, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2503342 (2014)

183. McDonald, A.M., Reeder, R.W., Kelley, P.G., Cranor, L.F.: A comparative study of online privacy policies and formats. In: Goldberg, I., Atallah, M.J. (eds.) Privacy Enhancing Technologies. Springer, Berlin (2009)


  184. Stigler, G.: The Economics of information. J. Polit. Econ. 69(3), 213–225 (1961)


185. Akerlof, G.A.: The market for “lemons”: quality uncertainty and the market mechanism. Q. J. Econ. 84(3), 488–500 (1970)


  186. Macho-Stadler, I., Pérez-Castrillo, J.D.: An Introduction to the Economics of Information. Oxford University Press, Oxford (2001)


  187. Evans, M.B., McBride, A.A., Queen, M., Thayer, A., Spyridakis, J.H.: The effect of style and typography on perceptions of document tone. http://faculty.washington.edu/jansp/Publications/Document_Tone_IEEE_Proceedings_2004.pdf (2004). Accessed 24 Oct 2016

  188. Masson, M.E.J., Waldron, M.A.: Comprehension of legal contracts by non-experts: effectiveness of plain language redrafting. Appl. Cogn. Psychol. 8, 67–85 (1994)


  189. Ben-Shahar, O., Schneider, C.E.: More Than You Wanted to Know: The Failure of Mandated Disclosure. Princeton University Press, Princeton (2014)


  190. Radin, M.: Boilerplate. Princeton University Press, Princeton, NJ (2013)


  191. Ben-Shahar, O., Chilton, A.S.: “Best practices” in the design of privacy disclosures: an experimental test. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2670115 (2015). Accessed 24 Oct 2016

  192. Miller, A.A.: What do we worry about when we worry about price discrimination? The law and ethics of using personal information for pricing. J. Technol. Law Policy. 19, 41–104 (2014)


  193. Mittlestadt, B.D., Allo, P., Taddeo, M., Wachter, S., Floridi, L.: The ethics of algorithms: mapping the debate. Big Data Soc. 1–21 (2016)



Author information


Correspondence to Giovanni Comandè.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Comandè, G. (2017). Regulating Algorithms’ Regulation? First Ethico-Legal Principles, Problems, and Opportunities of Algorithms. In: Cerquitelli, T., Quercia, D., Pasquale, F. (eds) Transparent Data Mining for Big and Small Data. Studies in Big Data, vol 32. Springer, Cham. https://doi.org/10.1007/978-3-319-54024-5_8


  • DOI: https://doi.org/10.1007/978-3-319-54024-5_8


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-54023-8

  • Online ISBN: 978-3-319-54024-5

  • eBook Packages: Engineering (R0)
