Abstract
Conventional wisdom holds that the market for digital privacy fails owing to widespread informational asymmetry between digital firms and their customers, behavioral biases exhibited by those customers, and negative externalities from data resale. This paper supplies both theoretical and empirical reasons to question the standard market failure conclusion. On the theoretical side, I argue that digital markets are not qualitatively different from markets for other consumer goods. To wit, just as in traditional markets, it is costly to measure product attributes (such as “privacy”) and, just as in more traditional settings, some firms offer credible commitments to reduce the threat of potential opportunism. On the empirical side, I conduct a survey of Google’s users. The most important results of this survey suggest that, at least with respect to Google, (a) the extent of informational asymmetry is minimal and (b) the demands for “unconstrained” and “constrained” privacy diverge substantially. Significantly, 86% of respondents express no willingness to pay for additional privacy when interacting with Google. Among the remaining 14%, the average expressed willingness to pay is low.
Notes
See here for more information on Google’s decision to drop the phrase: https://gizmodo.com/google-removes-nearly-all-mentions-of-dont-be-evil-from-1826153393.
Less consensus exists regarding which specific policy interventions should be implemented. Some scholars favor outright bans on information collection, others call for a legally mandated opt-in, and still others argue that greater transparency should be required of firms. The EU, Japan, Canada, Singapore, and South Africa all have passed comprehensive digital privacy legislation. For an analysis of intervention, see Fuller (2018).
Haven Insights used Lucid’s platform for academic research. See more details here: https://luc.id/lucid-for-academics/.
Of course, to get a library card, one usually provides a name, physical address, email address, and so on. The information collected by digital firms tends to be a browser’s location, browsing history, and (often) purchase history. If one wishes to avoid surrendering information to a library, it is possible to use the library without checking out any items.
The company earns revenue by displaying ads based solely on the search terms a user enters; it does not track the user.
Although DuckDuckGo has grown steadily, it averaged only a little more than 20 million queries daily as of early 2018, far less than 1% of Google’s daily traffic. See https://duckduckgo.com/traffic.html for statistics on DuckDuckGo’s traffic over time. See http://www.internetlivestats.com/google-search-statistics/ for a daily count of Google searches.
Respondents who indicated unawareness of Google’s information-collection practices were not asked questions four, five, or six.
Google may collect any of the data listed in question four of the survey’s text (Appendix A) except: “Your driver’s license”, “Your social security number”, “Your medical information” and “Your credit card information.”
The results show that individuals are least aware of the fact that Google gathers information about their devices. Still, 51% of “aware respondents” know that device information is collected. Arguably, for most users, device information is the least “sensitive” or “important” piece of information that Google collects. It also is possible that some consumers are unfamiliar with the term “device information.”
Google may use collected information to “target ads based on your search history and location”, to “aggregate large quantities of anonymized data”, and to “store your data indefinitely”, but its privacy policy does not permit any of the other uses listed in question five of the survey (see the survey’s text in Appendix A).
As noted by Acquisti et al. (2016), information asymmetry provides another explanation, but I am ruling that out for a moment so as to isolate the purported effect of behavioral biases.
“[S]tudies in which consumers are… asked to consider paying… to protect their privacy are… scarcer” (Acquisti et al. 2013, p. 254).
Goldfarb and Tucker (2011) examine the economic impact of the EU’s switch to an opt-in rather than an opt-out default option. They find that the switch reduced the effectiveness of the average digital ad dramatically because of the inability to target advertisements.
Unconstrained surveys also are common in other contexts. For example, see Clark and Powell’s (2013) analysis of “non-economic” or “unconstrained” survey approaches in the literature on sweatshops.
Acquisti et al. (2016, pp. 444–445) affirm that both costs and benefits are associated with disclosure of personal information.
Non-money differentials may include preferences for beauty, love, discrimination and so on (Boettke and Candela 2017), but those differentials come in the form of personal information in the case of digital privacy.
Note that 149 respondents indicated a willingness to pay for privacy on Google, but when they subsequently were prompted to state the amount they would be willing to pay, they entered $0. Those 149 respondents were re-categorized as being unwilling to pay for privacy and thus included amongst the 86% of all respondents not willing to pay for privacy.
The survey began with a sample of 6864 respondents, but 781 were eliminated because they did not use Google. It is unclear how those non-users would respond to the remainder of the survey. At one extreme, it is possible that 100% of them refrain from using Google because of privacy concerns and all of them would also be willing to pay for privacy on Google. If that were the case, 23% of the Internet-using population would be willing to pay for privacy on Google. At the other extreme, 100% of them could also be unwilling to pay for privacy on Google because they never use Google (for reasons other than privacy concerns). If that were the case, only 12% of the Internet-using population would be willing to pay for privacy. The truth probably lies somewhere between the extremes.
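The bounds in this note follow from simple proportions. A minimal sketch of the arithmetic, using the 14% willing-to-pay share among Google users reported in the abstract (the rounding of the respondent count is my own assumption; the paper’s exact counts may differ slightly):

```python
# Bounding the share of the Internet-using population willing to pay (WTP)
# for privacy on Google, using the figures reported in the note above.
total = 6864          # initial sample
non_users = 781       # eliminated because they do not use Google
users = total - non_users  # 6083 Google users

# 14% of Google users expressed a positive WTP (share taken from the abstract)
wtp_users = round(0.14 * users)

# Upper bound: every excluded non-user shuns Google over privacy and would pay.
upper = (wtp_users + non_users) / total
# Lower bound: no excluded non-user would pay.
lower = wtp_users / total

print(f"lower bound: {lower:.1%}, upper bound: {upper:.1%}")
# prints "lower bound: 12.4%, upper bound: 23.8%" — consistent, up to
# rounding, with the roughly 12% and 23% extremes stated in the note.
```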
It is impossible to determine whether respondents are perfectly consistent between their annual and “per-search” valuations. For example, someone selecting “$1 to $5” may have had $1 in mind, whereas another had $5 in mind. Nonetheless, the answers are “generally consistent” in that both the annual and “per-search” prompts elicit relatively low WTPs.
Respondents who believe that Google does not collect information were excluded from the question about whether Google can change its privacy policy unilaterally. Thus, the relevant sample comprises users who are aware of Google’s information-collection practices and who express a desire for Google not to collect their data.
The respondents comprise those users who were aware that Google engages in information collection (question three) and who expressed a willingness to pay for privacy (question nine).
Of course, other ways of categorizing respondents as “relatively informed” or “relatively uninformed” with respect to question five are possible. My categorization strategy was selected in the interest of generating a sufficiently large sample for both the “informed” and “uninformed” groups, given that most respondents are unwilling to pay. Respondents who selected only two correct answers but no incorrect answers are categorized as “uninformed” because they seemingly exhibit less awareness of overall data-collection practices than those who selected all three correct answers while also exhibiting some degree of misinformation by selecting an incorrect response.
The respondents comprise those who prefer not to have their information collected (including those both willing and unwilling to pay for privacy).
Acquisti et al. (2016) also list “quantity discrimination in insurance and credit markets”, but I did not present respondents with that option because it was the most technical of the possibilities suggested by those authors.
In addition to the four possibilities listed by Acquisti et al. (2016), my survey added: “Advertisers being able to target you directly”, “A government agency forcing an internet entity that has collected your information to hand over the information”, and “Other (please specify)”.
Section 3.2.2’s results provide evidence for doubting this argument.
Question 13 contains a wording error. The question should not have included the phrase: “Enter a whole number in US dollars” because respondents were not offered an open-ended response option.
As described in the paper’s text, some respondents indicated a positive WTP, but then subsequently entered a value of “zero” for question 12. Those respondents (totaling 149) were re-categorized in both the text and in Appendix B’s table as being unwilling to pay.
The paper’s text describes how respondents were assigned to either the “informed” or “uninformed” categories. Respondents who prefer their information to be collected are excluded from this analysis.
References
Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the 5th ACM conference on electronic commerce (pp. 21–29). ACM.
Acquisti, A. (2012). Privacy and market failures: Three reasons for concern, and three reasons for hope. Journal on Telecommunications and High Technology Law, 10, 227–233.
Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the Facebook. In International workshop on privacy enhancing technologies (pp. 36–58). Springer.
Acquisti, A., John, L. K., & Loewenstein, G. (2013). What is privacy worth? The Journal of Legal Studies, 42(2), 249–274.
Acquisti, A., Taylor, C., & Wagman, L. (2016). The economics of privacy. Journal of Economic Literature, 54(2), 442–492.
Akerlof, G. A. (1970). The market for ‘lemons’: Quality uncertainty and the market mechanism. The Quarterly Journal of Economics, 84(3), 488–500.
Alchian, A. A. (1967). Pricing and society. London: Institute of Economic Affairs.
Barzel, Y. (1982). Measurement cost and the organization of markets. The Journal of Law and Economics, 25(1), 27–48.
Benson, B. (1998). How to secede in business without really leaving: Evidence of the substitution of arbitration for litigation. In D. Gordon (Ed.), Secession, state, and liberty. New Brunswick, NJ: Transaction.
Berendt, B., Günther, O., & Spiekermann, S. (2005). Privacy in e-commerce: Stated preferences vs. actual behavior. Communications of the ACM, 48(4), 101–106.
Berman, J., & Mulligan, D. (1998). Privacy in the digital age: Work in progress. Nova Law Review, 23, 551–582.
Boettke, P. J., & Candela, R. A. (2017). Price theory as prophylactic against popular fallacies. Journal of Institutional Economics, 13(3), 725–752.
Brown, I. (2016). The economics of privacy, data protection and surveillance. In Handbook on the economics of the internet. Available at SSRN 2358392 (2013).
Calo, R. (2013). Digital market manipulation. George Washington Law Review, 82, 995–1051.
Clark, J. R., & Powell, B. (2013). Sweatshop working conditions and employee welfare: Say it ain’t sew. Comparative Economic Studies, 55(2), 343–357.
Classroom.com. How much do Americans spend on soft drinks? September 29, 2017. https://classroom.synonym.com/how-much-do-americans-spend-on-soft-drinks-12081634.html.
Cooper, J. C. (2012). Privacy and antitrust: Underpants gnomes, the first amendment, and subjectivity. George Mason Law Review, 20, 1129–1146.
De Corniere, A., & De Nijs, R. (2016). Online advertising and privacy. The Rand Journal of Economics, 47(1), 48–72.
Demsetz, H. (1969). Information and efficiency: Another viewpoint. The Journal of Law and Economics, 12(1), 1–22.
Farrell, J. (2012). Can privacy be just another good? Journal on Telecommunications and High Technology Law, 10, 251–264.
Federal Trade Commission. (2012). Protecting consumer privacy in an era of rapid change. FTC report, 1-112. https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protectingconsumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf. Accessed 23 Jan 2019.
Fuller, C. S. (2018). Privacy law as price control. European Journal of Law and Economics, 45(2), 225–250.
Gertz, J. D. (2002). The purloined personality: Consumer profiling in financial services. San Diego Law Review, 39, 943.
Hermalin, B. E., & Katz, M. L. (2004). Sender or receiver: Who should pay to receive an electronic message? RAND Journal of Economics, 35(3), 423–448.
Hermalin, B. E., & Katz, M. L. (2006). Privacy, property rights and efficiency: The economics of privacy as secrecy. Quantitative Marketing and Economics, 4(3), 209–239.
Hirsch, D. D. (2010). The law and policy of online privacy: Regulation, self-regulation, or co-regulation. Seattle University Law Review, 34, 439–480.
Hoofnagle, C. J. (2003). Reflections on the NC JOLT symposium: The privacy self-regulation race to the bottom. NCJL & Tech., 5, 213–217.
Hoofnagle, C. J. (2006). Privacy self regulation: A decade of disappointment. In J. K. Winn (Ed.), Consumer protection in the age of the ‘information economy’.
Hoofnagle, C. J. (2009). Beyond Google and evil: How policy makers, journalists and consumers should talk differently about Google and privacy. First Monday, 14, 4–6.
Hoofnagle, C. J., & Whittington, J. (2013). Free: Accounting for the costs of the internet’s most popular price. UCLA Law Review, 61, 606–670.
Hoofnagle, C. J., Soltani, A., Good, N., & Wambach, D. J. (2012). Behavioral advertising: The offer you can’t refuse. Harvard Law & Policy Review, 6, 273–296.
Hui, K. L., & Png, I. (2005). Economics of privacy. In Handbook of information systems and economics. Available at SSRN 786846.
Goldfarb, A., & Tucker, C. E. (2011). Privacy regulation and online advertising. Management Science, 57(1), 57–71.
John, L. K., Acquisti, A., & Loewenstein, G. (2011). Strangers on a plane: context-dependent willingness to divulge sensitive information. Journal of Consumer Research, 37(5), 858–873.
Leeson, P. T. (2007). Better off stateless: Somalia before and after government collapse. Journal of Comparative Economics, 35(4), 689–710.
Leeson, P. T. (2008). Social distance and self-enforcing exchange. The Journal of Legal Studies, 37(1), 161–188.
Leeson, P. T. (2014). Pirates, prisoners, and preliterates: anarchic context and the private enforcement of law. European Journal of Law and Economics, 37(3), 365–379.
Leeson, P. T., & Coyne, C. J. (2012). Conflict-inhibiting norms. Oxford handbook of the economics of peace and conflict. Oxford: Oxford University Press.
Madden, M., & Rainie, L. (2015). Americans’ attitudes about privacy, security, and surveillance. http://www.pewinternet.org/2015/05/20/americans-attitudes-about-privacy-security-and-surveillance/. Accessed 20 Nov 2018.
Marthews, A., & Tucker, C. (2017). Government surveillance and internet search behavior. Available at SSRN 2412564.
Milberg, S., Smith, J., & Burke, S. J. (2000). Information privacy: Corporate management and national regulation. Organization Science, 11(1), 35–57.
Newman, N. (2013). The costs of lost privacy: Consumer harm and rising economic inequality in the age of Google. William Mitchell Law Review, 40, 849.
Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126.
Odlyzko, A. (2003). Privacy, economics, and price discrimination on the Internet. In Proceedings of the 5th international conference on electronic commerce (pp. 355–366). ACM.
Ohm, P. (2010). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57, 1701–1777.
Penney, J. (2016). Chilling effects: Online surveillance and Wikipedia use. Berkeley Technology Law Journal, 31, 117–182.
Rose, E.A. (2005). Data users versus data subjects: are consumers willing to pay for property rights to personal information? In Proceedings of the 38th Annual Hawaii International Conference on System Sciences. Washington, D.C.: IEEE Computer Society Press.
Sachs, B. R. (2009). Consumerism and information privacy: How Upton Sinclair can again save us from ourselves. Virginia Law Review, 95(1), 205–252.
Savage, S. J., & Waldman, D. M. (2015). Privacy tradeoffs in smartphone applications. Economics Letters, 137, 171–175.
Solove, D. J. (2004). The digital person: Technology and privacy in the information age. New York: NYU Press.
Statista. (2017). Google’s ad revenue from 2001 to 2017 (in billion US dollars). https://www.statista.com/statistics/266249/advertising-revenue-of-google/. Accessed 20 Nov 2018.
Statista. (2018). Global digital population as of July 2018 (in millions). https://www.statista.com/statistics/617136/digital-population-worldwide/. Accessed 20 Nov 2018.
Stigler, G. J. (1961). The economics of information. Journal of Political Economy, 69(3), 213–225.
Strandburg, K. J. (2013). Free fall: The online market’s consumer preference disconnect. University of Chicago Legal Forum, 5, 95–172.
Tsai, J. Y., Egelman, S., Cranor, L., & Acquisti, A. (2011). The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research, 22(2), 254–268.
Tucker, C. E. (2012). The economics of advertising and privacy. International Journal of Industrial Organization, 30(3), 326–329.
Turow, J., King, J., Hoofnagle, C. J., Bleakley, A., & Hennessy, M. (2009). Americans reject tailored advertising and three activities that enable it. Available at SSRN 1478214.
Varian, H. R. (2002). Economic aspects of personal privacy. In W.H. Lehr & L. Pupillo (Eds.), Cyber Policy and Economics in an Internet Age (pp. 127–137). Boston, MA: Springer.
Vila, T., Greenstadt, R., & Molnar, D. (2004). Why we can’t be bothered to read privacy policies. In L. J. Camp & S. Lewis (Eds.), Economics of information security (pp. 143–153). Berlin: Springer.
Williamson, O. E. (1983). Credible commitments: using hostages to support exchange. The American Economic Review, 73(4), 519–540.
Acknowledgements
I wish to thank Chris Coyne, Peter Boettke, Peter Leeson, William H.J. Hubbard, Alessandro Acquisti, David Lucas, Noah Gould, Nicholas Freiling, and an anonymous reviewer for helpful suggestions. Any errors are my own.
Funding
Caleb Fuller received funding from the Mercatus Center at George Mason University to pay Haven Insights, LLC to conduct the initial survey. In response to a revise and resubmit request, a modified survey was conducted by Haven Insights, LLC and its findings replaced the initial survey’s results. The funding for the second survey was provided by the Charles Koch Foundation.
Author information
Authors and Affiliations
Corresponding author
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Electronic supplementary material
Below is the link to the electronic supplementary material.
Appendices
The response options for questions four, five, and ten were presented to respondents in randomized order. The other questions presented the response options in the order displayed in Appendix A.
Appendix A
1. Do you make web searches on Google.com?
   a. If the respondent indicated they did not, they were disqualified from further questions.
   b. Possible responses:
      i. Yes
      ii. No

2. How often do you make searches on Google.com?
   a. Possible responses:
      i. Once a day
      ii. A few times per day
      iii. Dozens of times per day (or more)

3. Do you believe that Google collects information about you as you use Google.com?
   a. Possible responses:
      i. Yes
      ii. No

4. What information do you believe Google collects and saves about you? Select all that apply.
   a. This question was asked of those who answered “Yes” to question three.
   b. Possible responses:
      i. Your driver’s license number
      ii. Your social security number
      iii. Videos you watch
      iv. Device information
      v. Ads you click on or tap
      vi. Your credit card information
      vii. Websites you visit
      viii. Your location
      ix. Things you search for
      x. Your medical information
      xi. IP address and cookie data
      xii. None of the above

5. Which of the following do you believe Google may use your information for? Select all that apply.
   a. This question was asked of those who answered “Yes” to question three.
   b. Possible responses:
      i. To target ads based on your search history and location
      ii. To link your search history with your personal identity
      iii. To link your search history with your race, gender, religious preferences, or sexual orientation
      iv. To aggregate large quantities of anonymized data
      v. To store your data indefinitely
      vi. To sell your browsing history to potential employers or insurers who are hoping to learn more about you

6. Do you believe that Google could change its privacy policy to allow new uses for user data?
   a. This question was asked of those who answered “Yes” to question three.
   b. Possible responses:
      i. Yes
      ii. No

7. Do you use a tool to protect your privacy while browsing, such as Adblock Plus?
   a. Possible responses:
      i. Yes
      ii. No

8. Would you prefer that Google collected no information about you when you use Google.com?
   a. Those responding that they would prefer Google to collect personal information were disqualified from further queries.
   b. Possible responses:
      i. I would prefer Google collect information about me
      ii. I would prefer Google NOT collect information about me

9. Would you prefer to pay to use Google.com in exchange for a guarantee that Google will NOT collect any information about you?
   a. Those answering “No” to this question were disqualified from further queries.
   b. Possible responses:
      i. Yes
      ii. No

10. Why do you prefer that Google not collect information about you? Select all that apply.
    a. Possible responses:
       i. A government agency forcing an internet entity that has collected your information to hand over the information
       ii. Sellers offering different prices to buyers for the same good
       iii. Uneasiness just not knowing who knows what about you
       iv. The risk of identity theft
       v. The threat of spam
       vi. Advertisers being able to target you directly
       vii. Other (please specify)

11. What do you think about the ads targeted to you based on the information Google collects about you?
    a. Possible responses:
       i. I like seeing the ads customized to my preferences
       ii. I don’t like the ads and would rather not see them

12. How much would you be willing to pay per year to use Google.com without Google collecting any personal information about you? Enter a whole number in US dollars.

13. How much would you be willing to pay per search to use Google.com without Google collecting any personal information about you? Enter a whole number in US dollars. (Footnote 30)
    a. Possible responses:
       i. Less than 1 cent
       ii. 1 cent to ninety-nine cents
       iii. $1 to $5
       iv. More than $5

14. Would you be willing to pay $70 per year for a guarantee that Google will NOT collect any information about you while using Google.com?
    a. Possible responses:
       i. Yes
       ii. No
Appendix B
Appendix B contains the results from all survey questions except for question one (a screener question to determine whether respondents are Google users) and question 12, which asks how much consumers would be willing to pay for privacy. The paper’s text reports the results of question 12.
Rights and permissions
About this article
Cite this article
Fuller, C. S. Is the market for digital privacy a failure? Public Choice 180, 353–381 (2019). https://doi.org/10.1007/s11127-019-00642-2