The New Palgrave Dictionary of Economics

Living Edition
Editors: Palgrave Macmillan

The Economics of Privacy

  • Grazia Cecere
  • Fabrice Le Guel
  • Matthieu Manant
  • Nicolas Soulié
Living reference work entry




The increasing digitalization of the economy and advances in data processing have drawn economists’ attention to the role of personal data in markets. The economics of privacy analyzes how individuals, firms, and policymakers interact in markets where personal data play a key role. The complexity of these markets challenges academics, who rely on different disciplines and methods, including field experiments, to investigate privacy issues. In this article we review this literature, point out the need to take account of the complex interactions between the different economic agents, and highlight specific strategies regarding privacy issues. First, individuals face a puzzling tradeoff between sharing data in order to access customized services and protecting their personal data against potential misuse. Second, the industrial organization and marketing literatures study how firms can fit personal data into their strategies, and how this can spur new business models. Third, privacy regulation faces a dual challenge: protecting individuals’ privacy while preserving firms’ capacity to innovate. In this context, the main difficulty for policymakers is to shape a clear framework for both individuals and firms. Lastly, we draw attention to the need for further research on the role of privacy as a business differentiator: in other words, on establishing a clear link between consumers’ demand and firms’ supply.


Keywords: Privacy; Regulation; Discrimination; Information asymmetry; Personalized advertising


JEL Classifications: D18; D82; D83; K2; L5; M31; M37


The role of personal data in economics has been spurred by the increasing digitalization of the economy, as it is now possible to collect, store and process huge amounts of data. In the economics literature, the concealing of individuals’ personal information was initially thought to lead to information asymmetries between employee and employer, insured and insurer, etc., and thus to possible market inefficiencies (see Stigler 1980; Posner 1981). However, especially in the context of internet economics, little is known about the right balance for individuals between disclosure and protection of their personal information. For example, individuals may or may not know that firms can exploit their data for marketing purposes, or for discrimination in the job market. Recently, large parts of the economics and marketing literatures have begun to focus on individual behaviors in different contexts (Acquisti et al. 2012), and in particular on the effectiveness of firms’ personalized advertising (Lambrecht and Tucker 2013) and the impact of privacy regulation (Campbell et al. 2015).

The literature provides numerous definitions and conceptions of privacy. For instance, Hirshleifer (1980) considers privacy to be an individual’s capability to act autonomously and independently of other individuals’ control. Similarly, Westin (1967) defines privacy as the ability of individuals to control access to, and use of, their personal data, which appears more relevant in the context of the internet. From an economics perspective, personal data have the characteristics of an information good that is non-rivalrous and non-excludable. Consequently, the secondary use of individual personal data by companies can occur without the individual’s awareness. This can lead to negative externalities, particularly in the internet era. While the exploitation of personal data enables companies to propose better offers to users by reducing their search costs, it can also lead to discrimination or unwanted and inappropriate solicitations. The economics of privacy examines the policy implications of the interactions between individuals’ decisions to disclose personal data and firms’ strategies to innovate using these data. The pervasiveness of internet economics (including the internet of things) in various sectors and industries has increased the importance of data to any kind of business activity, in particular in the health sector and when dealing with vulnerable populations such as children or people with disabilities.

The economics of privacy encompasses a range of disciplines such as industrial organization, marketing, behavioral economics, information systems, health economics, computer science and law. This strand of literature largely relies on field experiments to study the behaviors of both individuals and firms, as they allow causal inference. Drawing on the contributions of these disciplines, the present article analyzes the emergence of new forms of complex interactions between the agents in the online market, namely consumers, firms and policymakers. The article is organized as follows. Section “Individuals’ Privacy Behaviors” investigates individuals’ behaviors and awareness regarding privacy matters, and identifies the role of information asymmetry between individuals and firms; it focuses on research in marketing and behavioral economics. Section “How Firms Use Personal Data” focuses on the market for privacy and firms’ strategies (discrimination, hidden market, etc.). The data collected by companies can be used in different contexts, for example to inform individuals during advertising campaigns (e.g. of the best price at their favorite store) or, conversely, to discriminate against them during a job search. Section “Regulation and Policy Intervention” deals with the regulation of privacy and analyzes the factors influencing the adoption of new modes of regulation, such as the spread of privacy-enhancing services. It also shows why policymakers’ analysis of privacy regulation is crucial, since regulation provides instruments to influence firms’ strategies and to disseminate information addressed to individuals (Goldfarb and Tucker 2012b). It considers recent literature suggesting that major events, such as Edward Snowden’s revelations about possible misuses of personal data, drew individuals’ attention to privacy issues.

Individuals’ Privacy Behaviors

The advent of the internet and the increasing adoption of connected devices mean that individuals currently face many more privacy issues than before. Understanding individuals’ privacy concerns is crucial, since they can hamper the diffusion of online services and, more generally, the growth of ICTs. Above all, there is an obvious tradeoff for users between sharing and protecting their personal data. On the one hand, sharing can be beneficial as it allows users access to personalized services. Individuals can also benefit from the externalities of other individuals’ data disclosures. For instance, the (good or bad) recommendations about a product on a website, which reveal people’s preferences, can benefit other users who have access to this information. On the other hand, users need to consider possible misuses of their data (Acquisti et al. 2016). Disclosing personal information is not the only way for individuals to have their data collected, since firms can use activity, behavioral or locational data to infer individuals’ characteristics. For example, Facebook Likes can help to predict a range of highly sensitive personal data, such as political views or sexual orientation (Kosinski et al. 2013). Consequently, how users reason about privacy issues is interesting from an economic point of view. From the perspective of a market for privacy, users may be aware that their personal traits and attributes (age, address, gender, preferences revealed via social media or purchases, comments, etc.) have an economic value for firms.

Many existing theoretical models assume that users have complete information about firms’ strategies; in other words, they suppose that individuals know how firms might use their personal data, and how to react to these strategies. These models relate closely to users’ acceptance of the internet business model. They deal with how markets would be affected if individuals were reluctant to share their data, and with the effectiveness of privacy protection strategies. While the earliest theoretical approaches consider hiding personal information to be a suspicious strategy, which may be inefficient from a social perspective (Hirshleifer 1980; Stigler 1980; Posner 1981), later models are more realistic and consider that this might be a rational strategy for consumers. Thus, Varian (1997) admits that individuals might find it beneficial not to reveal private information, and introduces the idea that individuals may consider secondary usage in their privacy strategies. Within a Coasian perspective, Kahn et al. (2000) explicitly take into account the awareness of consumers who are able to “deal” in their private information and to evaluate the costs and benefits of their privacy choices. Subsequent models provide evidence of consumer strategies which firms need to consider in markets. Fudenberg and Tirole (1998) show that a monopoly selling durable goods may employ different strategies according to the type of consumer – anonymous, semi-anonymous or identified. In a duopoly setting, Villas-Boas (1999) provides evidence that consumers have an interest in revealing their preferences to competitors and in being patient – that is, caring about the future – in order to obtain lower prices from competitors trying to attract them. Thus, theoretical models show that consumers’ preferences drive the strategies of firms whose business models are based on the disclosure of personal data by users.
In some way, these approaches question the consequences of the existence of asymmetric information between users and firms about firms’ privacy strategies. Indeed, the literature points out that individuals may not know what firms or third parties intend to do with their data, and for that reason, may be reluctant to disclose them.

In order to capture the issue of information asymmetry, which is highlighted in theoretical models, and more broadly, the way individuals reason about privacy, researchers propose empirical approaches and field experiments. The literature tries to understand the diversity of privacy behaviors, and to provide evidence of biases which might influence privacy decision making.

Empirical research estimates users’ privacy concerns by emphasizing the motivations and conditions that lead individuals to disclose (or not) personal information. Through an interdisciplinary review, Smith et al. (2011) identify five main factors that might explain an individual’s level of concern about online privacy: privacy experience, privacy awareness, personality, demographics and culture. In the case of online services, consumers tend to disclose personal information to access personalized services (Chellappa and Sin 2005). Nevertheless, the literature provides evidence of a privacy paradox, that is, an inconsistency between what is said and what is done. These differences in privacy choices can be context dependent (Nissenbaum 2004) or can be due to intrinsic characteristics, that is, individual preferences or cognitive biases. Field experiments show that personal data disclosure can be explained by access to immediate gratification, by incomplete information, and by individuals’ bounded rationality (Acquisti 2004; Acquisti and Grossklags 2005). Through a series of diverse field experiments, Acquisti et al. (2012) show that survey respondents disclose more personal data if other respondents do so – the herd effect – and that the level of disclosure increases if the questions progress from more to less intrusive. In a study of people’s interactions, Forman et al. (2008) find that consumer-generated product reviews containing identity-descriptive information are rated more positively by community members, and are associated with an increase in product sales. Moreover, information disclosure tends to become the norm, and to lead other reviewers of a product to do the same.

The literature has underlined that individuals’ privacy concerns increase once consumers have had, or have heard about, bad privacy experiences, such as identity theft or unwanted secondary uses (Smith et al. 1996). The extent of individuals’ privacy concern also depends on their awareness of firms’ privacy practices (Malhotra et al. 2004). In two field experiments, Marreiros et al. (2016) test whether privacy actions and attitudes can be influenced by exposure to information about the advantages and disadvantages of disclosing personal information online. The results suggest that privacy concerns emerge once users are asked to think about privacy issues.

From a market perspective, individuals’ awareness of the potential risks associated with privacy abuses can have an impact on the demand for privacy. In a field experiment using a shopping search engine interface, Tsai et al. (2011) show that individuals’ purchasing intentions increase if online retailers clearly display salient information about privacy protection. Acquisti et al. (2013) disentangle the issue of privacy valuations by suggesting that individuals value privacy more when they have it than when they do not; in other words, they highlight the existence of an endowment effect. Goldfarb and Tucker (2012b) use refusal to disclose personal information (here, individual income) in a large dataset to measure demand for privacy. They show that, overall, an increasing percentage of individuals do not disclose personal data, and that there is a generational pattern: younger individuals disclose more than older people.

Open Research Questions

Regarding individuals’ privacy behaviors, there are several open research questions. For example, understanding how privacy concerns differ among users is essential to designing adequate privacy regulation: a regulation aimed at users who are not concerned about privacy is likely to be ineffective. For this reason, it is of interest to investigate the younger generation’s privacy concerns, and especially those of teenagers. Demand for privacy also appears to be an important feature, and there is a need to design field experiments to estimate it, that is, to establish whether users find a good or a service with privacy characteristics more attractive.

How Firms Use Personal Data

The marketing and industrial organization literatures mostly investigate how companies exploit personal data, and how personal data can spur new business models and thus generate innovation. Big data, data analysis and progress in business analytics permit data to be retrieved and analyzed at an unprecedented level (Acquisti 2014). In a seminal contribution, Varian (1997) distinguishes between first and second usage of personal information by internet companies: first usage facilitates firms’ interactions with customers, while second usage occurs when firms pass on information to one or more other firms – “third parties” – better able to exploit personal data. This distinction defines a primary market involving customers and internet companies, and a secondary market involving internet companies and third parties.

In the primary market involving internet companies and customers, personal data are first used to design more effective advertising campaigns aimed at individuals, and to set prices close to individuals’ willingness to pay. Second, the exploitation of clickstream data, which provide detailed information on how individuals interact with websites and advertising, together with the increased importance of algorithms in internet economics, helps speed up the pace of innovation.

According to price discrimination theory, the link between firms’ strategies and personal information, and thus privacy, is central (Taylor 2004). Exploiting personal information could facilitate first-degree price discrimination – or, more realistically, third-degree discrimination – as these data permit a company to identify an individual’s reservation price. Consistent with a positive effect of discrimination for users, recent theoretical work by Belleflamme and Vergote (2016) suggests that the use of technologies to conceal personal information might reduce consumer surplus.
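The gap between uniform pricing and discrimination based on personal data can be sketched with a stylized numerical example (the reservation prices below are purely illustrative, not drawn from the works cited):

```python
# Stylized comparison: a seller with zero marginal cost faces four
# consumers with (hypothetical) known reservation prices.
reservation_prices = [10, 8, 5, 3]

def uniform_profit(p, values):
    # At a single posted price p, everyone with reservation price >= p buys.
    return p * sum(1 for v in values if v >= p)

# The best uniform price is one of the reservation prices.
best_uniform = max(uniform_profit(p, reservation_prices)
                   for p in reservation_prices)

# First-degree discrimination: each consumer is charged exactly her
# reservation price, so the seller captures the entire surplus.
discrimination_profit = sum(reservation_prices)

print(best_uniform)          # 16 (posting p = 8, two buyers)
print(discrimination_profit) # 26 (full surplus extracted)
```

The example makes the incentive concrete: personal data that reveal reservation prices raise profit (here from 16 to 26) while driving consumer surplus to zero, which is why the welfare effects of discrimination discussed above are ambiguous.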

For most online firms, personalized advertising is a major source of income (Martin and Murphy 2016). However, empirical work based on large field experiments provides some puzzling results regarding the effectiveness of personalized ads. In particular, in a field experiment on a popular social media website, Tucker (2014) shows that the effectiveness of an ad increases if individuals have more control over their personal data. In another field experiment, Lambrecht and Tucker (2013) show that dynamic ad retargeting is, on average, less effective than generic ads; the effectiveness of dynamic retargeting increases once individuals have more information about the products they want to buy. In addition to price discrimination, other forms of discrimination can arise, in particular in the labor market, where recruiters can discriminate among candidates on the basis of information available on social media (Acquisti and Fong 2015; Manant et al. 2016). Lambrecht and Tucker (2016) add to this literature by showing that social media algorithms can reproduce offline discrimination, particularly against women. Overall, these articles highlight how firms can exploit the personal data which users leave online.

Data exploitation strategies presuppose that firms can access individuals’ personal data. This is why the theoretical literature highlights the need for firms to take account of individuals’ privacy concerns. Taylor (2004) shows that firms can employ different strategies depending on privacy regulation and consumers’ expectations: firms prefer a disclosure regime if consumers are naïve, and a confidential regime if consumers expect their personal data to be used by firms. Acquisti and Varian (2005) provide similar results: firms’ strategies rely on consumers’ preferences for personalized services. The need to take account of consumers’ strategies to protect their privacy is confirmed by dynamic approaches (Villas-Boas 2004; Armstrong and Zhou 2010). Internet companies largely rely on the distribution of services or goods in exchange for users’ registration, but the effectiveness of these practices is not straightforward. In a theoretical model, Morath and Münster (2017) show that a monopoly firm can benefit from ex ante registration requirements, in particular if future purchases are considered. Discounts appear to be an appropriate instrument to influence consumers’ decisions to register, and allow consumer surplus to increase.

In the market involving internet companies and third parties, firms can also exploit personal data by selling them. While there are many marketing companies, such as BlueKai and Arvato, which specialize in data management, there is a need to understand the functioning of personal data markets and the role of the companies operating in them. Secondary use of personal information arises when data are sent to third parties or data brokers, that is, to data aggregators, advertisers or, more broadly, to relevant departments within a firm (Akçura and Srinivasan 2005). Third-party use, and secondary use of personal data within the same firm, seem less legitimate if personal data are passed on without the user’s awareness. Taylor (2004) considers that this behavior is welfare-diminishing for consumers. Using a theoretical model, Akçura and Srinivasan (2005) also show that this secondary market can result in a dramatic decrease in consumer welfare. The existence of the market for personal data implicitly raises the question of the value of data to firms (Spiekermann et al. 2015). From this perspective, industrial organization scholars first may have to assess whether personal data are a good per se. Farrell (2012) contributes to this discussion by showing that personal data can be considered a good, and that privacy protection can be seen as a strategic parameter for profit-maximizing firms. In this context, the contribution by Kummer and Schulte (2016) is relevant, since it delves into the business models of smartphone applications by highlighting a tradeoff between price and personal data on both the supply and the demand side of the market, with personal data seen as an alternative business model for free services. An OECD (2013) report details different methods for valuing personal data, and offers some insights into the firms operating in the market involving internet companies and third parties.

Open Research Questions

There remain various research questions related to the industry structure of the personal data market, and to the extent to which personal data are part of the business models of internet companies. This is particularly relevant in the market for smartphone applications, where an increasing number of free applications are available. An important issue is whether privacy can be a business differentiator.

Regulation and Policy Intervention

The emergence of new businesses based on personal data drew regulators’ attention to the need to find the right balance between protecting privacy and promoting data sharing to encourage innovation and improve services. Formulating public intervention in privacy is a complex issue. First, innovation is rapid in sectors where personal data play a key role, and competition among players is intense. Second, the design of privacy regulation can affect the behaviors of both individuals and firms: market interactions, namely those between firms and consumers, but also between firms and third-party companies, can lead to unexpected consequences, which can undermine the effectiveness and complicate the evaluation of the policy. Moreover, while privacy regulation is directed towards consumers and firms, it can also have indirect consequences on market structure. All in all, the direction and size of these effects are unclear. While regulation helps to lay the groundwork for a clear framework for companies, there is a need to understand the overall role of personal data in the markets. For this reason, we focus here on the principles that govern privacy regulation and on theoretical and empirical evidence of the impact of this regulation on the markets, but also on how privacy-enhancing technologies and data breaches can shape these markets.

Regulation and Self-Regulation

Focusing on the instruments used by the regulatory authorities helps to understand how privacy regulation can intervene in markets. In the U.S., where the Federal Trade Commission (FTC) provides guidelines at the sectoral level, self-regulation prevails. The main goal of self-regulation is to stimulate “competition on privacy”, and thereby alleviate market failures. This approach considers both sides of the market: consumers are assumed able to make decisions that enact their privacy preferences, and companies are supposed to respect principles of transparency and control over privacy issues, for example by giving detailed information on data collection and on the use of data. In this perspective, privacy policies rely on a ‘notice-and-consent’ principle whereby individuals are supposed to read privacy policies and consent (or not) to the terms of service (Cranor 2012). Privacy policies are expected to inform individuals about firms’ practices, that is, how their personal data are gathered, used, shared and secured (Marotta-Wurgler 2016). However, empirical evidence shows that these policies are too long to read and too complex for a non-practitioner to understand (McDonald and Cranor 2008). In line with the self-regulation approach, FTC policy has also encouraged the creation of third-party certification services and online seals, such as TRUSTe’s and BBB’s, to help decrease individuals’ cognitive costs of assessing potential privacy threats. Nevertheless, adverse selection can emerge with such private seals: empirical research has shown that websites certified by TRUSTe are more than twice as likely to be untrustworthy as uncertified sites (Edelman 2011). In terms of policy implications, this suggests that regulatory intervention is necessary to ensure the quality of private seals.

In Europe, the regulatory approach focuses more on providing a general framework to protect consumers across sectors, which differs substantially from the self-regulation approach. Regulation aims at alleviating the adverse selection problem by ensuring more guarantees for individuals and a more stringent environment for companies. The current debate was triggered in 2016 by the publication of the General Data Protection Regulation (GDPR), which updates the previous Data Protection Directive 95/46/EC and ePrivacy Directive 2002/58/EC. This new framework allows individuals access to more information about how their data are processed by companies, and gives individuals the right to have their data forgotten – that is, they can ask the data holders to delete or modify their data. The Regulation promotes ‘privacy by design’, which aims to embed privacy in the very early phases of the development of a product or a service, as well as throughout its development. Overall, while the European approach is supposed to bring more transparency and stronger protection for individuals than the U.S. approach, it is also likely to be more costly for firms to comply with, since it imposes stronger obligations on them.

The evolving practices of regulation dealing with privacy issues have also had an impact on commercial relations between the U.S. and Europe. As an illustration, Snowden’s revelations of mass surveillance programs encouraged the replacement of the previous US-EU Safe Harbor Agreement by the EU-US Privacy Shield, which took effect in July 2016. This agreement pushes for more cooperation between U.S. and European Data Protection Authorities, and obliges U.S. companies to ensure transparency about cross-border transfers of personal data and stronger protection of personal data.

Impacts of Privacy Regulation

Since privacy regulation affects both individuals and firms, the literature has identified different levels of impact of this regulation, that is, the impact on agents’ behaviors, but also on market structure.

First, the literature shows that the different principles of privacy regulation have direct effects on individuals’ choices. Looking at the variation in U.S. State genetics privacy laws over time, Miller and Tucker (2015) show that giving individuals control over the redisclosure of their genetic tests by hospitals encouraged the diffusion of this practice, while requiring informed consent deterred individuals from undertaking genetic testing services. In a large-scale field experiment, Goldfarb and Tucker (2011) study how the enactment of European privacy regulation affected the effectiveness of banner ads. They show that purchase intentions decreased after EU regulation restricted the use of data on customers’ past browsing behaviors.

The increased use of open data and mass data collection can also influence consumer behaviors, with potential commercial consequences. Marthews and Tucker (2015) show that Edward Snowden’s revelations about U.S. government surveillance changed the type of queries conducted on Google Search. The results suggest that this event affected individual behaviors by increasing the demand for privacy.

Second, while privacy regulation is directed towards consumers and firms, it can also have indirect consequences on market structure. Campbell et al. (2015) propose a theoretical model showing that a consent-based approach, even if it deters some consumers and imposes costs on all firms, may disproportionately benefit generalist firms offering a large scope of services rather than specialist firms. This regulatory regime thus affects the competitive structure of the market. On the impact of sectoral privacy regulation on organizations’ activities, there is a large literature studying the impact of health regulation in the U.S. on the diffusion of hospital information technologies. U.S. privacy law restricts the cases in which hospitals can exchange patient data. Exploiting the variation in State privacy legislation, Miller and Tucker (2009) show that privacy protection can hamper the adoption of these technologies by hospitals if they are unable to take advantage of patient information from other hospitals. Regulation can also affect the location of internet companies. Using a sample of the most visited websites worldwide, Rochelandet and Tai (2016) show empirically that internet firms prefer to locate in countries where data collection and exploitation are less regulated. Conversely, tax instruments can help reduce data collection by internet platforms, while an opt-out option, whereby users can access the platform with no data collection, induces the platform to raise the level of data exploitation (Bloch and Demange 2016).

The increased importance of data has also stimulated mergers and acquisitions aimed at increasing firms’ competitive advantages. These practices might radically change the structure of the market: on the one hand, they can counterbalance the internet giants currently in place; on the other hand, they can create the conditions for anti-competitive data-driven strategies when these operations aim to prevent rivals from accessing data, or to hamper consumers’ access to competitors’ platforms (Goldfarb and Tucker 2012a). As an example, the Google/DoubleClick merger illustrated how the standard tools of competition policy cannot readily assess consumer privacy issues, since privacy is a non-price attribute. In particular, the degradation of privacy cannot easily be observed by consumers (Grunes and Stucke 2015). In this respect, the role of data brokers can be central. In a theoretical model, Clavorà Braulin and Valletti (2016) show that the first-best allocation is achieved only when data are sold non-exclusively; when a data broker sells the data exclusively, inefficient allocations result.

Privacy-Enhancing Services: Privacy as a ‘Business Differentiator’

The increasing level of individuals’ privacy awareness can have an impact on demand for privacy, which in turn generates conditions conducive to the adoption of privacy-enhancing services. After the Snowden revelations, the number of users of DuckDuckGo, a search engine that does not register users’ IP addresses, sharply increased by about 600% (Wired 2017); in January 2017 it handled more than 14 million searches. In this respect, the protection of personal data can represent a differentiation strategy for internet companies, ‘pushing to the top’ the most privacy-respectful firms. In such a case, the cost of collecting less personal data would be offset by the growing number of consumers who trust the service. An important contribution in this respect is the theoretical model of Casadesus-Masanell and Hervas-Drane (2015), which shows that higher competition intensity in the marketplace need not improve privacy when consumers exhibit low willingness to pay. However, so far there is no clear evidence that people understand the current model of the free internet, in which they give personal data in exchange for services, or that they might be better off paying for those services and protecting their privacy. Privacy-enhancing services can remain a niche market if demand for them does not increase, a situation Farrell (2012) defines as a “dysfunctional equilibrium”. In this respect, moving from free services to paid services is a major future challenge for privacy-enhancing services. Nowadays, the large majority of these services are freely available (examples are blogs, website contents, or smartphone applications), and the main sources of income of internet services are advertising, e-shopping and personal data (Lambrecht et al. 2014). The advance of services that block advertising, together with the increase in individuals’ privacy concerns, might challenge the model of free web services (e.g. Adblock Plus, Google Contributor).
In particular, the demand for privacy can influence the willingness to pay for services which are respectful of privacy.

Open Research Question

With the ‘internet of things’ generating huge amounts of data (‘big data’), and the proliferation of algorithms implementing artificial intelligence and machine learning, many choices can be suggested to individuals. This could have a positive or a negative impact on social welfare. Potential discrimination arising from these technologies will require special attention from regulators, and may call for regulatory innovation on their part.

Further Developments

We identify the existence of a privacy market linking users’ demand for privacy to firms’ supply of privacy-protecting services. Given the drawbacks of privacy policies, privacy-enhancing technologies (PETs) can be seen as an alternative. An example was the P3P project (Platform for Privacy Preferences Project), dedicated to the creation of machine-readable privacy policies; it aimed to standardize privacy policies, help users understand them better, and increase individuals’ trust. However, these solutions have also failed, partly because of their complexity. More advanced solutions, such as encryption, differential privacy, or decentralized personal data systems based on blockchain technology, have been suggested to help build a market for personal data, but these technologies do not yet rule out re-identification of the personal data owner (Zyskind et al. 2015).
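As a rough illustration of one of the technologies mentioned above, the sketch below shows the Laplace mechanism that underlies differential privacy: calibrated random noise is added to an aggregate statistic so that the presence or absence of any single individual's record has only a bounded influence on the published figure. The function names, the example data, and the parameter values are illustrative assumptions, not drawn from the entry or from any specific system discussed in it.

```python
import math
import random


def laplace_noise(scale):
    # Draw from a Laplace(0, scale) distribution via inverse transform sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def private_count(records, predicate, epsilon=0.5):
    """Release a count satisfying epsilon-differential privacy.

    Adding or removing one record changes the true count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical example: publish how many users opted in to data sharing,
# without revealing whether any particular user did.
users = [{"opted_in": i % 3 == 0} for i in range(300)]  # true count is 100
noisy = private_count(users, lambda u: u["opted_in"], epsilon=1.0)
```

The noisy count stays close to the true value in expectation, yet no single user's choice can be inferred with confidence from the released figure, which is the trade-off between data utility and individual privacy that this literature examines.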

Starting from these weaknesses, and by simplifying the signal as much as possible, behavioral economists and psychologists suggest that information related to privacy issues should be conveyed in a simple and standardized manner in order to ensure that consumers understand it (Bhargava and Loewenstein 2015). Nudging privacy seems to be a promising practice, complementary to effective regulation and individuals’ empowerment (Lazer et al. 2009).

In summary, it appears particularly complicated to achieve an appropriate balance between information sharing and information hiding, and there is no single way to reduce information asymmetry: regulation, market-based and technology-based solutions can be seen as complementary. In this perspective, privacy protection as a business differentiator, ‘pushing to the top’ the most privacy-respectful firms, deserves consideration, although it ultimately depends on consumers’ preferences for privacy.

References
  1. Acquisti, A. 2004. Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the 5th ACM conference on electronic commerce, 21–29.
  2. Acquisti, A. 2014. From the economics of privacy to the economics of big data. In Privacy, big data, and the public good: Frameworks for engagement, ed. S. Bender, J. Lane, H. Nissenbaum, and V. Stodden, 76–95. New York: Cambridge University Press.
  3. Acquisti, A., and C. Fong. 2015. An experiment in hiring discrimination via online social networks, WP.
  4. Acquisti, A., and J. Grossklags. 2005. Privacy and rationality in decision making. IEEE Security & Privacy 3 (1): 26–33.
  5. Acquisti, A., and H.R. Varian. 2005. Conditioning prices on purchase history. Marketing Science 24 (3): 367–381.
  6. Acquisti, A., L.K. John, and G. Loewenstein. 2012. The impact of relative standards on the propensity to disclose. Journal of Marketing Research 49: 160–174.
  7. Acquisti, A., L.K. John, and G. Loewenstein. 2013. What is privacy worth? The Journal of Legal Studies 42 (2): 249–274.
  8. Acquisti, A., C. Taylor, and L. Wagman. 2016. The economics of privacy. Journal of Economic Literature 54 (2): 442–492.
  9. Akçura, M.T., and K. Srinivasan. 2005. Research note: Customer intimacy and cross-selling strategy. Management Science 51 (6): 1007–1012.
  10. Armstrong, M., and J. Zhou. 2010. Conditioning prices on search behaviour, Munich Personal RePEc Archive Paper 19985.
  11. Belleflamme, P., and W. Vergote. 2016. Monopoly price discrimination and privacy: The hidden cost of hiding. Economics Letters 149: 141–144.
  12. Bhargava, S., and G. Loewenstein. 2015. Behavioral economics and public policy 102: Beyond nudging. American Economic Review 105 (5): 396–401.
  13. Bloch, F., and G. Demange. 2016. Taxation and privacy protection on internet platforms, halshs-01381044.
  14. Campbell, J., A. Goldfarb, and C. Tucker. 2015. Privacy regulation and market structure. Journal of Economics & Management Strategy 24 (1): 47–73.
  15. Casadesus-Masanell, R., and A. Hervas-Drane. 2015. Competing with privacy. Management Science 61 (1): 229–246.
  16. Chellappa, R.K., and R.G. Sin. 2005. Personalization versus privacy: An empirical examination of the online consumer’s dilemma. Information Technology and Management 6 (2): 181–202.
  17. Clavorà Braulin, F., and T. Valletti. 2016. Selling customer information to competing firms. Economics Letters 149: 10–14.
  18. Cranor, L.F. 2012. Necessary but not sufficient: Standardized mechanisms for privacy notice and choice. Journal on Telecommunications and High Technology Law 10: 273–307.
  19. Edelman, B. 2011. Adverse selection in online ‘trust’ certifications and search results. Electronic Commerce Research and Applications 10 (1): 17–25.
  20. Farrell, J. 2012. Can privacy be just another good? Journal on Telecommunications & High Technology Law 10 (2): 251–261.
  21. Forman, C., A. Ghose, and B. Wiesenfeld. 2008. Examining the relationship between reviews and sales: The role of reviewer identity disclosure in electronic markets. Information Systems Research 19 (3): 291–313.
  22. Fudenberg, D., and J. Tirole. 1998. Upgrades, trade-ins, and buybacks. RAND Journal of Economics 29 (2): 235–258.
  23. Goldfarb, A., and C.E. Tucker. 2011. Privacy regulation and online advertising. Management Science 57 (1): 57–71.
  24. Goldfarb, A., and C.E. Tucker. 2012a. Privacy and innovation. Innovation Policy and the Economy 12 (1): 65–90.
  25. Goldfarb, A., and C.E. Tucker. 2012b. Shifts in privacy concerns. American Economic Review 102 (3): 349–353.
  26. Grunes, A.P., and M. Stucke. 2015. No mistake about it: The important role of antitrust in the era of big data. The Antitrust Source, April, 1–14.
  27. Hirshleifer, J. 1980. Privacy: Its origin, function, and future. The Journal of Legal Studies 9 (4): 649–664.
  28. Kahn, C.M., J. McAndrews, and W. Roberds. 2000. A theory of transactions privacy. Federal Reserve Bank of Atlanta Working Paper 2000-22.
  29. Kosinski, M., D. Stillwell, and T. Graepel. 2013. Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences 110 (15): 5802–5805.
  30. Kummer, M.E., and P. Schulte. 2016. When private information settles the bill: Money and privacy in Google’s market for smartphone applications, ZEW-Centre for European Economic Research Discussion Paper, 16-031.
  31. Lambrecht, A., and C.E. Tucker. 2013. When does retargeting work? Information specificity in online advertising. Journal of Marketing Research 50 (5): 561–576.
  32. Lambrecht, A., and C.E. Tucker. 2016. Algorithmic bias? An empirical study into apparent gender-based discrimination in the display of STEM career ads. Social Science Research Network, WP.
  33. Lambrecht, A., A. Goldfarb, A. Bonatti, A. Ghose, D. Goldstein, R. Lewis, A. Rao, N. Sahni, and S. Yao. 2014. How do firms make money selling digital goods online? Marketing Letters 25: 331–341.
  34. Lazer, D., A. Pentland, L. Adamic, S. Aral, A.-L. Barabasi, D. Brewer, N. Christakis, N. Contractor, J. Fowler, M. Gutmann, T. Jebara, G. King, M. Macy, D. Roy, and M. Van Alstyne. 2009. Social science: Computational social science. Science 323 (5915): 721–723.
  35. Malhotra, N.K., S.S. Kim, and J. Agarwal. 2004. Internet users’ information privacy concerns (IUIPC): The construct, the scale, and a causal model. Information Systems Research 15 (4): 336–355.
  36. Manant, M., S. Pajak, and N. Soulié. 2016. Can social media lead to labour market discrimination: A field experiment, WP.
  37. Marotta-Wurgler, F. 2016. Self-regulation and competition in privacy policies. Journal of Legal Studies 45 (S2): 13–39.
  38. Marreiros, H., M. Tonin, and M. Vlassopoulos. 2016. ‘Now that you mention it’: A survey experiment on information, salience and online privacy, CESifo Working Paper No. 5756.
  39. Marthews, A., and C.E. Tucker. 2015. Government surveillance and internet search behavior. Available at SSRN 2412564.
  40. Martin, K.D., and P.E. Murphy. 2016. The role of data privacy in marketing. Journal of the Academy of Marketing Science, forthcoming.
  41. McDonald, A.M., and L.F. Cranor. 2008. The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society 4 (3): 540–565.
  42. Miller, A.R., and C. Tucker. 2009. Privacy protection and technology diffusion: The case of electronic medical records. Management Science 55 (7): 1077–1093.
  43. Miller, A.R., and C. Tucker. 2015. Privacy protection, personalized medicine and genetic testing, WP.
  44. Morath, F., and J. Münster. 2017. Online shopping and platform design with ex ante registration requirements. Management Science, forthcoming.
  45. Nissenbaum, H. 2004. Privacy as contextual integrity. Washington Law Review 79 (1): 119–158.
  46. Posner, R.A. 1981. The economics of privacy. The American Economic Review 71 (2): 405–409.
  47. Rochelandet, R., and S. Tai. 2016. Do privacy laws affect the location decisions of internet firms? Evidence for privacy havens. European Journal of Law and Economics 42 (2): 339–368.
  48. Smith, H.J., J.S. Milberg, and J.S. Burke. 1996. Information privacy: Measuring individuals’ concerns about organizational practices. MIS Quarterly 20 (2): 167–196.
  49. Smith, H.J., T. Dinev, and H. Xu. 2011. Information privacy research: An interdisciplinary review. MIS Quarterly 35 (4): 989–1015.
  50. Spiekermann, S., A. Acquisti, R. Böhme, and K.L. Hui. 2015. The challenges of personal data markets and privacy. Electronic Markets 25 (2): 161–167.
  51. Stigler, G.J. 1980. An introduction to privacy in economics and politics. The Journal of Legal Studies 9 (4): 623–644.
  52. Taylor, C.A. 2004. Consumer privacy and the market for customer information. RAND Journal of Economics 35 (4): 631–665.
  53. Tsai, J.Y., S. Egelman, L. Cranor, and A. Acquisti. 2011. The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research 22 (2): 254–268.
  54. Tucker, C.E. 2014. Social networks, personalized advertising, and privacy controls. Journal of Marketing Research 51 (5): 546–562.
  55. Varian, H.R. 1997. Economic aspects of personal privacy. In Privacy and self-regulation in the information age. Washington, DC: US Department of Commerce.
  56. Villas-Boas, J.M. 1999. Dynamic competition with customer recognition. RAND Journal of Economics 30 (4): 604–631.
  57. Villas-Boas, J.M. 2004. Price cycles in markets with customer recognition. RAND Journal of Economics 35 (3): 486–501.
  58. Westin, A. 1967. Privacy and freedom. New York: Atheneum Publishers.
  59. Wired. 2017. DuckDuckGo hits 10 billion anonymous searches after big 2016. Last retrieved 30 Jan 2017.
  60. Zyskind, G., O. Nathan, and A. Pentland. 2015. Enigma: Decentralized computation platform with guaranteed privacy. arXiv preprint arXiv:1506.03471.

Copyright information

© The Author(s) 2017

Section Editor: Catherine Tucker