International Workshop on Security and Trust Management

Security and Trust Management, pp. 89-104

Towards Balancing Privacy and Efficiency: A Principal-Agent Model of Data-Centric Business

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9331)


Personal data has emerged as a crucial asset of the digital economy. However, unregulated markets for personal data severely threaten consumers’ privacy. Based upon a commodity-centric notion of privacy, this paper takes a principal-agent perspective on data-centric business. Specifically, this paper presents an economic model of the privacy problem in data-centric business, drawing from contract theory. Building upon a critical analysis of the model, this paper analyzes how regulatory and technological instruments could balance the efficiency of markets for personal data and data-subjects’ right to informational self-determination.


Keywords: Privacy economics · Privacy · Property rights · Accountability · Principal-agent model

1 Personal Data Markets and Privacy

In the information society, superior capacities to analyze data and superior data sets can constitute crucial competitive advantages for companies [30]. Among the different classes of data, personal data in particular is, as the World Economic Forum states, a “critical source of innovation and value” [53]. The now commonplace metaphor of personal data as “the new oil”, coined by European Commissioner Kuneva (as cited in [53]), further illustrates the crucial role attributed to personal data in the economy. Consequently, markets for personal data have emerged, which, however, are barely regulated [45]. In those markets, data subjects participate as suppliers of personal data, often without knowing who collects, transfers and monetizes which data relating to them [41]. While markets with highly transparent data-subjects might be efficient (cf. [37]), not only economic factors have to be considered in the debate on markets for personal data. The human rights aspect of privacy has to be taken into account in order to find a balance between the economic efficiency of markets for personal data and data-subjects’ right to informational self-determination [9].

This paper analyzes from an economic perspective how such a balance can be achieved by technological and regulatory instruments, focusing on the “First Tier Relationship Space” [35] of markets for personal data, i.e., on the direct relation between primary data-controller and data-subject. In particular, this paper focuses on data-centric business as defined by Müller et al. [32] and the privacy problems inherent in this business model [32, 41]. A simplified scheme of data-centric business is depicted in Fig. 1.
Fig. 1.

Schematic representation of transactions in data-centric business (taken from [32])

Data-centric service providers such as Google or Facebook provide (often free of charge) services to consumers and generate revenue by providing third-party companies with the ability to present targeted advertisements to these consumers. Hence, data-centric businesses act as multi-sided platforms [17] that cater to users of their services on the one hand and to advertisers on the other. The collection of personal data and the generation of user profiles are at the core of data-centric business models, as these profiles form the foundation for delivering targeted advertisements. Hence, it is in data-centric businesses’ interest to collect as much data relating to users as possible in order to be able to generate precise targeting profiles [34]. Not only providers of data-centric services, but also their users benefit from profiling [19], e.g., through decreased transaction cost due to automatically personalized recommender systems [49]. However, extensive collection, analysis and usage of data relating to users affect and threaten their privacy [41, 51]. In the context of data-centric business, this paper addresses the following research questions:
  1. How can the relation of users and providers be modeled in economic terms?

  2. Which leverage points for balancing economic efficiency and privacy can be identified, and how can technological and regulatory instruments help in establishing such a balance?


This paper contributes as follows: In order to describe the privacy problems in data-centric business and subsequently be able to identify leverage points for balancing efficiency and privacy, this paper provides an economic model of the privacy trade-offs in data-centric business. The presented model builds upon a novel principal-agent perspective on data-centric business that is rooted in a commodity-centric notion of privacy. Building upon a critical analysis of the model, this paper analyzes how regulatory and technological instruments could balance efficiency of markets for personal data and data-subjects’ right to informational self-determination.

The remainder of this paper is structured as follows: the next section provides an overview on related work. Section 3 presents our economic model of the privacy trade-offs in data-centric business. Section 4 presents an analysis of the model, illustrates regulatory and technological leverage points for balancing market efficiency and privacy and discusses instruments for this balancing. We conclude the paper and provide an outlook on future work in Sect. 5.

2 Related Work

The emergence of barely regulated markets for personal data and their threats to privacy have not gone unnoticed by academia. Scholars in the computer sciences, jurisprudence and IS are investigating legal and technological instruments to regulate such markets and to provide instruments for data subjects to exercise greater control over their personal data. Notable approaches towards organizing markets for personal data have been proposed by Laudon [27], Schwartz [42] or Novotny and Spiekermann [35].

Technological and legal instruments for addressing the privacy problems in markets for personal data have recently been discussed by Spiekermann and Novotny [45]. We build upon their commodity-centric notion of privacy, considering only usage rights to personal data tradable. Acquisti provides an economic model of privacy trade-offs in electronic commerce, focusing on data-subjects’ decision process and arguing for models based on psychological distortions [2]. However, he does not investigate the perspective of the data-controller and the structure of the market. Chellappa and Shivendu provide a model for game-theoretic analysis of property rights approaches towards privacy, considering, however, only monopolistic markets [12].

In contrast to existing work, the model provided in this paper takes a principal-agent perspective and focuses on the market structure in data-centric business.

3 Principal-Agent Model of the Privacy Problems in Data-Centric Business

Identification of leverage points for balancing the efficiency of the market for personal data and data-subjects’ privacy in data-centric business requires a model suited to describe the market, its agents’ behavior and the market’s power structure. Building upon a commodity-centric notion of privacy, we provide a principal-agent model of data-centric business in Sect. 3.2. First, however, we elaborate on data-centric business and the assumptions underlying our model in the following.

3.1 Assumptions and Background

We base our model upon the following three assumptions, on which we elaborate further in the following.

  1. Usage rights to personal data are transferable and tradable.

  2. Providers and users act rationally and have homogeneous utility functions within their constraints.

  3. Users are privacy pragmatists who are willing to substitute privacy for functionality up to a certain degree.

Assumption 1: Among many others, Campbell and Hanson or Davies have argued that the growing economic importance of personal data is paralleled by a shift in the public perception of personal data and a reconceptualization of privacy that moves it from the domain of civil rights to the domain of commodification [10, 16]. That means that personal data is increasingly considered a tradable commodity that is separable from the individual [10, 16]. In the wake of this shift in perception, property rights for personal data have increasingly been debated not only in jurisprudence, but also in IS and the computer sciences (cf. [6, 15, 28, 39, 40, 42]). Recently, Purtova has shown that the current European data protection regime “endorses the ‘property thinking’ with regard to personal data” [40, p. 211] and that data-subjects’ ownership claims to personal data are compatible with the principle of informational self-determination [40]. However, the human rights aspect of privacy excludes full propertization of personal data. In particular, ownership claims to it cannot be alienated [6, 42]. Our model does not build upon full propertization of personal data. Instead, we follow Spiekermann and Novotny and consider only usage rights to personal data transferable and tradable [45].

Assumption 2: For mathematical simplicity we assume that all agents within our model, providers as well as users, act rationally under their constraints. Thus, an agent will perform any action that increases her expected utility and will avoid any action that has no positive expected utility for her. Moreover, again for simplicity of modeling, we assume that users and providers have homogeneous utility functions. This allows for the utilization of just one expected utility function for all users and just one for all providers within our model. These assumptions are also common in, and necessary for, the standard principal-agent model [50].

Assumption 3: Building upon Westin and Ackerman et al., we further put “pragmatic users” at the center of our investigation [1, 52]. Pragmatic users are concerned about their privacy, i.e., the usage of data regarding them, but weigh their concerns against the benefits of disclosing data about themselves. This user model is supported by current research showing that users are willing to engage in online transactions and disclose personal data when the perceived benefits of doing so outweigh the costs, including the perceived cost of reduced privacy [3, 20, 21, 34, 44].

In data-centric business as defined above, users receive benefits from data-centric service providers’ data aggregation and analysis, e.g., personalized search results. However, to be able to reap these benefits from data processing, users need to entrust data relating to them to a provider of a data-centric service, i.e., transfer usage rights to that data to the provider. Hence, given the commodity-centric perspective on privacy presented above, we consider the relation of users and providers of data-centric services a principal-agent relation [43]. In this principal-agent relation, users suffer severe information asymmetries [41], i.e., they are unable to fully comprehend which data a provider collects, how that data is aggregated, which data is inferred from collected data and how personal data is used by the provider. Thus, users face a problem of moral hazard [50], i.e., they face the risk that providers exercise transferred usage rights in ways users do not wish them to be used.

The classic approach towards describing and investigating solutions to moral hazard problems in principal-agent relations is provided by contract theory [50], upon which we build our investigation. However, the classic contract theory model cannot be applied in a straightforward manner in the context of data-centric business. The classic model builds upon the idea that principal and agent negotiate a contract and the principal pays the agent a price that incentivizes the agent to follow strategies that benefit the principal rather than strategies that maximize solely the agent’s own benefit. In current data-centric business, the user (the principal) undoubtedly enters a contract with the provider (the agent) by agreeing to its terms of usage. However, the user is unable to negotiate this contract and has to accept the conditions set by the provider.

Further, in data-centric business, the user transfers usage rights to personal data to the provider so that she can reap benefits from the provider’s usage of the data. However, based on the contract, the user transfers more data and wider-reaching usage rights than necessary (and, possibly, than desired by the user) for receiving the desired benefits; this surplus constitutes the price paid to the provider. This price is set by the provider and, thus, it does not incentivize the provider to act in the user’s interest. In fact, regarding privacy, the price is set such as to maximize benefits for the provider. Although the user is seldom able to fully comprehend common privacy policies or terms of usage [31, 33, 34], we assume that the user expects the provider to be able and eager to collect more data than technologically necessary and to use the data for defined purposes (e.g., for advertising) that go beyond solely providing the desired service (see Assumption 2). Current research supports this assumption and has shown that users engage in “privacy-seeking” behavior when using data-centric services [47].

3.2 Principal-Agent Model

Decisions under complete uncertainty are those in which the probability of a certain outcome is utterly unknown. In our approach, users face a decision under risk rather than under complete uncertainty. While the probabilities cannot necessarily be determined exactly, general outcome probabilities are roughly deducible. For example, users can infer the probabilities for some extreme outcomes based on media reports about data leakage scandals or similar reported events. Hence, following the classic model [50], we represent a user’s expected utility as a concave von Neumann-Morgenstern function. This also allows us to consider different user types, for example, as in this paper, a risk-averse privacy pragmatic user. We formulate the user’s expected utility as follows:
$$\begin{aligned} EU(a,s,x,r,z):=\pi (s,a)-(g(x,r,z) + z) \end{aligned}$$
The user desires the data-centric service provider to perform action \(\hat{a}\), i.e., provide the desired services and exercise the transferred usage rights solely to provide these services. Here, \(\hat{a}\) is an element of a finite set of possible provider actions \(A\). Depending on the random variable \(s\) and the action \(a\) chosen by the provider, the user receives the outcome \(\pi (s,a)\), with \(s \in S, S=\{s_1,\dots ,s_n\}, p(s)\in [0,1]\) being a random variable individually drawn for each user, accounting for the provider’s ability to take action \(a\) (e.g., exercise usage rights for purposes undesired by the user). Hence, because the outcome \(\pi (s,a)\) partly depends on chance, the user is not able to fully compare it to the desired outcome \(\pi (s,\hat{a})\).
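To make these constructs concrete, the user's expected utility can be sketched numerically. The functional forms of \(\pi\) and \(g\) below, as well as all numeric values, are illustrative assumptions only; the model itself merely fixes the boundary behavior of \(g\) described later in this section.

```python
# Illustrative sketch of the user's expected utility
# EU(a, s, x, r, z) = pi(s, a) - (g(x, r, z) + z).
# The concrete forms of pi and g and all constants are assumptions,
# chosen only to be consistent with the boundary conditions stated
# in the text (r = 0 -> g = x; r = 1, z = 1 -> g = x_hat).

X_MIN = 1.0  # minimal technologically necessary disclosure cost (assumed)
X_HAT = 1.5  # maximal disclosure cost the user desires (assumed)

def g(x, r, z):
    """Privacy cost after privacy-seeking behavior: interpolates between
    no effect (r = 0) and full effect (r = 1, z = 1)."""
    target = max(X_HAT, X_MIN)       # best achievable disclosure cost
    return x - r * z * (x - target)

def pi(s, a):
    """Assumed outcome function: benefit of the provider's action a,
    scaled by the chance variable s in [0, 1]."""
    return s * a

def expected_utility(a, s, x, r, z):
    return pi(s, a) - (g(x, r, z) + z)

# Without privacy-seeking behavior (z = 0) the full cost x is borne:
print(expected_utility(a=8.0, s=0.5, x=3.0, r=1.0, z=0.0))  # 1.0
# Fully successful, maximal privacy-seeking reduces the cost to x_hat:
print(expected_utility(a=8.0, s=0.5, x=3.0, r=1.0, z=1.0))  # 1.5
```

Note that in the second call the privacy cost drops from \(x = 3.0\) to \(\hat{x} = 1.5\), but the user also pays the privacy-seeking effort \(z = 1\), in line with the cost construct \((g(x,r,z)+z)\).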

Moreover, the user cannot definitely determine which data and which usage rights are at minimum necessary to provide the desired services. For example, a user cannot estimate which data and which usage of the data are necessary for receiving personalized search results. We represent the cost of disclosing this technologically minimal amount of data and transferring the respective usage rights by \(x_{min} > 0\). The maximum cost of disclosing data and transferring usage rights that the user desires to incur is represented by \(\hat{x}\). As described above, the user suspects the provider to collect more data than technologically necessary for providing the desired service and to use the transferred usage rights for purposes (specified in the terms of usage) beyond solely providing the desired services. For simplicity we do not consider illegal data usage by the provider. We represent the user’s expected cost of data disclosure and transfer of usage rights by \(x\) with \(x \ge x_{min} > 0\).

We take users’ privacy-seeking behavior into account by representing the subjectively expected privacy-related overall cost of using data-centric services by \((g(x,r,z)+z)\) with \(g(x,r,z) \ge x_{min}, z \in [0,1]\). For simplicity and clarity, we neglect the cost of using the service per se, e.g., expenditure of time. In the construct \((g(x,r,z)+z)\), the variable \(z \in [0,1]\) represents the cost that a user incurs when engaging in privacy-seeking behavior, trying to reduce \(x\) by, e.g., the usage of Privacy-Enhancing Technologies (PET) [48] or by adjustment of privacy settings. In case the user does not engage in privacy-seeking behavior, \(z=0\). In case the user engages in privacy-seeking behavior to the maximum extent currently technologically available, \(z=1\). We assume that the user is aware of the fact that privacy-seeking behavior is not necessarily successful, i.e., does not necessarily decrease \(x\) (e.g., data might be inferred anyway and be usable for advertising). We represent this uncertainty by the random variable \(r\) individually drawn for each user, with \(r \in R, R=\{r_1,\dots , r_m\}\), \(r \in [0,1]\) and the probability \(q(r)\in [0,1]\). In this construct, \(r\) represents the chance of success of a user’s privacy-seeking behavior, with \(r = 0\) meaning no success at all, i.e., \(g(x,r,z) = x\). In case \(r = 1\), success depends on the invested \(z\), i.e., \(g(x,r,z) = \hat{x}\) (or \(g(x,r,z) = x_{min}\) if \(\hat{x} < x_{min}\)) provided \(z =1\). In case \(r = 1, z \in ~]0,1[\), then \(g(x,r,z) \in ~]x_{min},x[\). We formulate a provider’s expected utility from providing a user with data-centric services as follows:
$$\begin{aligned} EF(a,x,r,z):= f(h(x,r,z))-c(a) \end{aligned}$$
The provider aims at receiving \(x\) (i.e., data relating to the users and the respective usage rights) and incurs the cost \(c(a)\) of its action (providing the service and exercising usage rights to users’ data), with \(c(a) \ge 0\) and \(c(a) = 0\) only for \(a = 0\). The utility \(f(h(x,r,z))\) that the provider expects depends on the expected effectiveness of users’ privacy-seeking behavior, i.e., on \(x\), \(r\) and \(z\), which we take into account via the outcome function \(h(x,r,z)\). In any case, the provider receives at least \(h(x,r,z) \ge x_{min}\). Given the high information asymmetries in data-centric business [41], the provider is currently in the position to set the price in terms of the transfer of data and usage rights, i.e., to set \(x\) as high as possible for profit maximization. Hence, the provider aims at establishing a contract \(x(\pi )\) that maximizes:
$$\begin{aligned} \sum ^n_{i=1}p(s_i)*\sum ^m_{j=1}q(r_j)*EF(a,g(x(\pi (s_i,a)),r_j,z), r_j, z) \end{aligned}$$
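A numerical sketch of this objective could look as follows, under assumed discrete distributions for \(S\) and \(R\) and assumed linear forms for \(f\), \(h\) and \(c\), none of which the model prescribes. For simplicity, the sketch also treats the contracted \(x\) as a constant rather than as a function \(x(\pi)\).

```python
# Illustrative evaluation of the provider's objective
# sum_i p(s_i) * sum_j q(r_j) * EF(a, g(x, r_j, z), r_j, z).
# Distributions, functional forms and constants are assumptions;
# x is treated as a flat contract value instead of a function x(pi).

S = [(0.5, 0.4), (1.0, 0.6)]  # pairs (s_i, p(s_i)), assumed
R = [(0.0, 0.7), (1.0, 0.3)]  # pairs (r_j, q(r_j)), assumed
X_MIN, X_HAT = 1.0, 1.5       # assumed disclosure-cost bounds

def g(x, r, z):
    # Effect of privacy-seeking behavior (illustrative interpolation).
    return x - r * z * (x - max(X_HAT, X_MIN))

def h(x, r, z):
    # Data and usage rights actually received, never below X_MIN.
    return max(g(x, r, z), X_MIN)

def f(received):
    return 2.0 * received  # assumed linear revenue from received data

def c(a):
    return 0.5 * a         # assumed linear cost of action a

def EF(a, x, r, z):
    return f(h(x, r, z)) - c(a)

def expected_provider_utility(a, x, z):
    # Since x is flat here, the s-dimension only contributes its total
    # probability mass; the r-dimension drives the expectation.
    return sum(p * q * EF(a, x, r, z) for _, p in S for r, q in R)

print(expected_provider_utility(a=2.0, x=3.0, z=1.0))  # ≈ 4.1
```

The provider's expected utility shrinks as users' privacy-seeking behavior becomes more likely to succeed (higher probability mass on \(r = 1\)), which is exactly the asymmetry the outcome function \(h(x,r,z)\) captures.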
While the assumed pragmatic user is willing to trade off privacy and the benefits from using data-centric services, even a pragmatic user [1, 52] is not willing to completely substitute privacy with functionality [34]. This means that the assumed pragmatic user will refrain from using a specific data-centric service in case she expects the provider to collect data and to exercise data usage rights to an extent too far beyond the desired one. In that case, \(EU < U^0\) with \(U^0\) being the user’s expected utility from not using a service at all. As in classic contract theory, we represent this constraint as follows:
$$\begin{aligned} \sum ^n_{i=1}p(s_i)*\sum ^m_{j=1}q(r_j)*EU(a,s_i,x,r_j,z) \ge U^0 \end{aligned}$$
As we assume non-monopolistic markets, users can choose from several providers of data-centric services, e.g., different online search engines, to get similar benefits. Thus, a privacy pragmatic user will only use a specific service if, besides the constraint formulated above, the following constraint holds:
$$\begin{aligned} \sum ^n_{i=1}p(s_i)*\sum ^m_{j=1}q(r_j)*EU(a,s_i,x,r_j,z) \ge \sum ^n_{i=1}p(s_i)*\sum ^m_{j=1}q(r_j)*EU(a',s_i,x',r_j,z) \end{aligned}$$
Here \(a'\) and \(x'\) represent actions of a data-centric service provider’s competitors and the data and data usage rights to be transferred for using their services. Hence, provided competitors exist, the provider can not completely neglect the assumed pragmatic user’s concerns for privacy.
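Under the same kind of illustrative assumptions, the participation constraint \(E[EU] \ge U^0\) can be checked numerically. The sketch below also illustrates the pricing logic discussed in Sect. 4: a provider can raise \(x\) only as far as the constraint allows before a pragmatic user opts out. All functional forms and numbers are assumptions, not part of the model.

```python
# Illustrative check of the participation constraint E[EU] >= U^0 and of
# how far a provider could raise x before a pragmatic user opts out.
# All distributions, functional forms and constants are assumed.

S = [(0.5, 0.4), (1.0, 0.6)]  # pairs (s_i, p(s_i)), assumed
R = [(0.0, 0.7), (1.0, 0.3)]  # pairs (r_j, q(r_j)), assumed
X_MIN, X_HAT, U0 = 1.0, 1.5, 0.0

def g(x, r, z):
    # Illustrative interpolation consistent with the stated boundaries.
    return x - r * z * (x - max(X_HAT, X_MIN))

def EU(a, s, x, r, z):
    return s * a - (g(x, r, z) + z)  # assumed outcome pi(s, a) = s * a

def expected_EU(a, x, z):
    return sum(p * q * EU(a, s, x, r, z) for s, p in S for r, q in R)

def max_acceptable_x(a, z, step=0.01):
    """Largest x on a grid for which the user still participates,
    i.e., for which E[EU] >= U0 still holds."""
    x = X_MIN
    while expected_EU(a, x + step, z) >= U0:
        x += step
    return x

# A pragmatic user investing z = 0.5 still participates up to roughly:
print(round(max_acceptable_x(a=5.0, z=0.5), 2))  # ≈ 3.85
```

Note how the provider's pricing power is bounded only by the user's outside option \(U^0\) (and, per the second constraint, by competitors' offers \(a'\), \(x'\) when they exist), mirroring the take-it-or-leave-it contract structure described in Sect. 3.1.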

In the following, we discuss our model and illustrate possibilities to achieve a balance between privacy and efficiency by means of technological and regulatory instruments.

4 Towards Balancing Privacy and Efficiency

The presented model exhibits some limitations. First, in line with the majority of economic models, it builds upon the assumption of rational agents. Hence, while our model can be adapted to model more privacy-sensitive or privacy-uninterested users by adapting \(U^0\) or \(g(\bullet )\), the model does not take into account psychological distortions [2]. Second, while our model covers the moral hazard problem on the user’s side, it does not consider the inverse information asymmetry suffered by the provider with respect to the user’s characteristics. Thus, while the model does account for the asymmetry regarding users’ privacy-seeking behavior, it does not take into account possible misuse of services by the user, e.g., data crawling and reselling. Third, the model also does not consider illegal behavior by the provider (e.g., privacy policy violations or non-compliance with data protection regulation). Last, the positive network effects between users of the same service are only rudimentarily covered by the outcome \(\pi (s,a)\) and not explicitly included. Taking into account these limitations, we discuss possibilities to achieve a balance between privacy and efficiency by means of technological and regulatory instruments in the following.

In perfect competition with providers as price-takers [50], providers would be forced by competition to set \(x\) as low as possible so that they can just cover their marginal cost of the provided action \(a\). Hence, if the minimal necessary data and usage rights \(x_{min}\) to perform action \(a\) suffice to cover the marginal costs of providing it to yet another user, the \(x\) demanded by all providers would consequentially be \(x_{min}\) and an equilibrium would exist for \(x = x_{min}\) with \(a \ge a'\). However, current data-centric business is far from a state of perfect competition. The actual market situation indicates that each branch is dominated by one powerful provider (e.g., Google with a 90 % market share for online search [46]) that is flanked by a few small competitors competing for the left-over market share. Hence, the current market situation resembles a monopolistic situation where the provider maximizes its revenue and therefore \(x\), with the only constraint being to deliver a service with expected user utility equal to or greater than the user’s utility without any service: \(EU \ge U^0\) (see Eq. 4). Thus, the dominant providers are currently in the position to establish contracts with their users, possibly but not necessarily in all cases, such that \(x > \hat{x} > x_{min}\) [47]. Hence, as long as \(EU \ge U^0\) with \(x > \hat{x}\), there are few incentives for providers to establish contracts that are more “privacy-friendly”, i.e., set \(x = \hat{x}\) and \(a = \hat{a}\). Three scenarios in the first relationship tier [35] to be investigated can be distinguished:
  (S1) Privacy is not considered a competitive factor by users.

  (S2) Privacy is perceived as a competitive factor by users, but they are unable to determine providers’ level of “privacy-friendliness”.

  (S3) Markets for data-centric services are currently monopolistic.

    (a) Users perceive privacy as worthy of protection.

    (b) Users do not perceive privacy as worthy of protection.

It is obvious that these scenarios require different approaches towards balancing market efficiency and privacy. Further, it is to be investigated whether there is a need for privacy protection at all. Some scholars have argued that privacy protection generally decreases efficiency and general welfare [37]. While full transparency and full information might increase efficiency, the human rights aspect of privacy [9] excludes purely efficiency-focused approaches towards markets for personal data. Hence, regardless of which scenario currently exists, a balance between efficiency and privacy in data-centric business has to be established.
Table 1.

Applicability of the high-level approaches in different scenarios

Which instruments are suited for achieving such a balance depends on the market structure in data-centric business. In order to identify and discuss instruments for balancing privacy and efficiency, we analyze the above-provided scenarios in the following. We distinguish between three high-level approaches towards balancing privacy and efficiency: market-centric approaches, regulation-centric approaches and user-centric approaches. The differentiation criterion for these approaches is their primary instrument for balancing privacy and efficiency in data-centric business. Table 1 provides an overview of the approaches and their applicability in the above-described scenarios under the premise that privacy requires protection.

4.1 S1: Privacy is Not Considered a Competitive Factor by Users

In S1, purely market-centric approaches, i.e., regulatory laissez-faire or incentive-centered interventions, are not suited to foster increased privacy-friendliness in data-centric business, as both providers and users have no intrinsic incentives to provide or demand respective services. The same holds true for user-centric approaches. If privacy is to be achieved in S1, only regulatory action and a (soft-)paternalistic regulatory regime can be applied. Our model does not aim at providing insight into the challenges of such an approach and we do not further consider S1 in this paper.

4.2 S3: Markets for Data-Centric Services Are Currently Monopolistic

In S3, market-centric approaches obviously are not suited to balance efficiency and privacy. In S3b, user-centric approaches are not well suited either, as users have no incentive to take action to protect their privacy. In S3a, at least users such as the pragmatic user of our model have incentives to expend \(z > 0\) to protect their privacy. Provided privacy-seeking behavior is effective and users can determine its effectiveness, user-centric approaches can lead to increased privacy in S3a, but only on an individual level when PET [48] are used on the user side. However, purely user-centric approaches would not change the market structure and the monopoly would persist. Hence, regulatory action to weaken or even break the monopoly and to enable and increase market competition would be necessary. Thus, if a balance between efficiency and privacy is to be achieved, only regulation-centric approaches are applicable in S3. Such approaches would need to convert S3 into S1 or S2 and subsequently perform regulatory action as described in Sects. 4.1 and 4.3, respectively, to achieve a balance between privacy and efficiency. Chellappa and Shivendu propose the introduction of property rights to personal information as a regulatory approach towards balancing efficiency and privacy in monopoly [12].

4.3 S2: Privacy is Perceived as a Competitive Factor by Users But They are Unable to Determine Providers’ Level of “Privacy-Friendliness”

Scenario 2 exhibits characteristics that are partly similar to those of “lemon markets”, which are deemed doomed to fail in the long run [4]. While users in S2 value privacy, they are unable to determine the privacy-friendliness of a provider both ex ante and ex post entering a contract and, hence, providers have no incentive to compete on privacy and rather compete on functionality. Thus, in the long run, privacy-friendly providers would leave the market due to their lower profits caused by a lower \(x\), and the market would “fail” in the sense that no balance between privacy and efficiency could be achieved. In classic lemon markets, as in data-centric business, the problem is rooted in asymmetries of information and power [41]. Hence, in S2 classic instruments for reducing asymmetries of information and power seem best suited to achieve a balance between privacy and market efficiency. Because of the principal-agent relation in data-centric business and the human-rights aspect of privacy, however, further instruments, as well as the suitability of classic instruments for reducing information and power asymmetries, have to be investigated for the context at hand. In the following, we analyze and discuss technological and regulatory instruments for balancing privacy and efficiency in S2. Figure 2 provides an overview of the categories of analyzed instruments.
Fig. 2.

Instruments for addressing the privacy problem in data-centric business

Signaling and screening are instruments for reducing information asymmetries ex ante establishment of a contract [29]. The informed party can utilize signaling instruments to signal its characteristics to the uninformed party in order to reduce the information asymmetry and convince the uninformed party to establish a contract with the signaling party instead of with another party. Signaling, however, can only be a successful mechanism if the uninformed party has good reason to trust in the signal, i.e., the cost of falsely signaling a characteristic while not exhibiting it has to be high, ideally exceeding the benefits of doing so. Screening can be seen as inverse signaling, i.e., screening instruments can be utilized by the uninformed party in order to reduce information asymmetries by actively trying to find out the informed party’s characteristics. In the context of data-centric business, a provider of data-centric services has superior information regarding \(x_{min}\), \(x\) and \(a\), i.e., it is the informed party [41]. Signaling and screening are instruments for market-centric approaches towards balancing privacy and efficiency (see Table 1).

Drawing from the literature, we identify Transparency-Enhancing Technologies (TET) [22] that are applied before establishment of a contract [25] (“ex ante TET”) as potential instruments for signaling in data-centric business. Ex ante TET comprise all TET that are applied before using a service and include tools for policy visualization (e.g., “PrivacyBird”), privacy seals (e.g., the “European Privacy Seal”) and other instruments for providing information regarding intended data collection and usage, i.e., information on \(x\), \(a\) and, possibly, \(x_{min}\). A variety of ex ante TET exist; however, their suitability for balancing privacy and efficiency in data-centric business is limited. While tools for policy visualization are able to signal intended data collection and usage, i.e., \(x\), they do not provide users with information regarding the actions a provider actually performs, i.e., \(a\). Privacy seals can constitute a valid instrument for signaling, provided they are issued by a trustworthy party and the criteria for awarding the seals are known to users.

Screening originally refers to actions of the uninformed party that aim at inducing the informed party to actively reveal its characteristics during negotiation of the contract [29]. Technological instruments for policy negotiation exist, e.g., P3P/APPEL [14], XACML [13] or the approaches provided by Pretschner et al. [38], Hanson et al. [23] or Bohrer et al. [7]. However, to the best of our knowledge, these mechanisms are not supported by any provider of data-centric services, which is not surprising given the power relations in data-centric business as described in Sect. 3. Existing, and actively used, mechanisms that resemble classic screening for data-centric services, in the sense that they allow the uninformed party to reduce information asymmetries ex ante establishment of a contract, are reputation services [26]. These include crowd-sourced services such as “Web of Trust” or “TOS;DR” as well as services aimed at allowing users to rate other services. While reputation systems can allow users to gain some insight into a provider’s behavior, crowd-sourced services in particular are hardly suited to provide meaningful information regarding \(x_{min}\), \(x\) or \(a\), as other users (even if they have already established a contract with a specific provider) are unable to fully determine the provider’s actions. In case the provider grants wide-ranging insight into \(x_{min}\), \(x\) or \(a\) after establishment of a contract, however, crowd-sourced reputation services can constitute effective instruments for estimating these quantities. To the best of our knowledge, though, no data-centric provider does so yet (see below). Besides instruments to be applied ex ante establishment of a contract, instruments that can be applied to reduce asymmetries of information and power ex post have to be investigated. Further, some instruments, especially regulatory ones, cannot be categorized as ex ante or ex post instruments.

Users themselves can apply user-side PET [48] at cost \(z\) in order to reduce the information they disclose and a provider’s power to exercise usage rights to data. Such PET include, among many others, tools for anonymity (e.g., “Tor”) or obfuscation (e.g., “TrackMeNot” [24]). While user-side PET could also be seen as instruments for reducing information asymmetry, we consider them instruments for elusion of power asymmetries, as data minimization on the user side does not help users to learn the hidden characteristics of the provider. While such PET do not allow users to estimate \(x_{min}\), \(x\) or \(a\), they can reduce the privacy-related cost of using data-centric services ex post establishment of a contract by reducing \(h(x,r,z)\). Usage control tools, as presented in, e.g., [5, 11], in combination with policy negotiation tools, can also be applied to reduce power asymmetries (and information asymmetries in case policy negotiation can be used for screening) by giving users a means for setting rules for the exercise of usage rights by the provider, i.e., for influencing \(a\).
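The trade-off behind applying user-side PET can be sketched as follows. The functional form of \(h\) below is an assumption of ours purely for illustration (the paper does not specify one): PET lower the privacy-related cost \(h(x,r,z)\), but only pay off if that reduction exceeds the PET cost \(z\).

```python
# Toy sketch of the user-side PET trade-off: the PET effort z dampens the
# privacy cost h(x, r, z), but z itself is a cost. The functional form
# and all numbers are illustrative assumptions.

def h(x: float, r: float, z: float) -> float:
    # Hypothetical privacy cost: disclosed data x, exercised under usage
    # rights r, dampened by the user's PET effort z.
    return x * r / (1.0 + z)

def net_privacy_cost(x: float, r: float, z: float) -> float:
    # Total burden: residual privacy cost plus the cost of running the PET.
    return h(x, r, z) + z

x, r = 10.0, 0.8            # disclosed data and granted usage rights
without_pet = net_privacy_cost(x, r, z=0.0)
with_pet = net_privacy_cost(x, r, z=1.0)

print(without_pet)  # 8.0
print(with_pet)     # 5.0 -> applying the PET pays off for this user
```

Under these assumed numbers the PET is worthwhile; for a user with little at stake (small \(x \cdot r\)), the same \(z\) could exceed the reduction in \(h\), in which case rational users would forgo the PET.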

Ex post TET aim at providing users with insight into actual data collection and usage [25]. The most prominent class of ex post TET are so-called privacy dashboards. Depending on their functionality, they can be considered instruments for reducing information asymmetry, power asymmetry, or both. While read-only ex post TET are instruments for reducing information asymmetry ex post establishment of the contract, interactive ex post TET are instruments for reducing asymmetries of both power and information [54]. While ex post TET, and privacy dashboards in particular, appear to be promising approaches towards balancing privacy and efficiency in data-centric business, current approaches as proposed in, e.g., [8, 18], as well as the privacy dashboards provided by Google or Acxiom, do not provide trustworthy information and, hence, are not well suited for balancing privacy and efficiency (cf. [54]).

Accountability-centric approaches are currently widely discussed as means towards balancing privacy and efficiency [36, 55]. Privacy by accountability inherently requires a combination of technological and regulatory instruments [36, 55]. Such approaches build upon audits in order to determine providers’ adherence to data protection regulation and/or agreed-upon policies. A central concept within accountability-centric approaches towards privacy is liability, i.e., the sanctioning of providers in case of noncompliance with regulation and agreed-upon policies. While accountability-centric approaches towards balancing privacy and efficiency are promising and increasingly investigated, no such solution currently exists.
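The role of liability in accountability-centric approaches can be sketched as a simple deterrence condition. This is our illustration only; audit probabilities, sanction levels and the payoff structure are assumptions, not part of the paper's model: a rational provider complies with the agreed-upon policy iff the expected sanction outweighs the gain from deviating.

```python
# Toy deterrence sketch for accountability via audits and liability:
# a provider deviates from the agreed-upon policy only if the expected
# gain exceeds the expected sanction (audit probability times fine).
# All numbers are illustrative assumptions.

def provider_complies(gain_from_deviation: float,
                      audit_probability: float,
                      sanction: float) -> bool:
    # Complies iff the expected sanction is at least the gain from deviating.
    return audit_probability * sanction >= gain_from_deviation

# Weak enforcement: rare audits and a mild fine fail to deter.
print(provider_complies(gain_from_deviation=100.0,
                        audit_probability=0.05, sanction=500.0))  # False

# Credible enforcement: the same gain is deterred by frequent audits.
print(provider_complies(gain_from_deviation=100.0,
                        audit_probability=0.5, sanction=500.0))   # True
```

The sketch highlights why accountability needs both technological instruments (raising the effective audit probability) and regulatory instruments (making sanctions enforceable).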

Regulatory action towards reducing information asymmetry is currently being taken, e.g., in the new GDPR. Regulatory instruments can set the legal frame so as to reduce information asymmetry ex ante and ex post establishment of contracts and can support both regulation-centric and market-centric approaches towards balancing privacy and efficiency. Another purely regulatory approach towards balancing privacy and efficiency is the assignment of property rights to personal data, which is currently widely debated in jurisprudence, information systems and computer science (see Sect. 3).

5 Conclusion

In this paper, we provided a principal-agent model of the privacy problems and trade-offs in data-centric business, drawing from contract theory. Building upon an analysis of the model, we identified asymmetries of information and power as the primary leverage points for balancing efficiency and privacy in data-centric business. We analyzed and discussed the suitability of existing regulatory and technological instruments for reducing the identified asymmetries. We showed that, in non-monopolistic markets, and provided that privacy is perceived as a competitive factor by users and providers, providers have an incentive to provide users with increased transparency and control regarding their personal data. We also showed that regulatory pressure might be necessary to foster competition in data-centric business. Based upon our analysis, we conclude that a transparency-fostering regulatory regime, in combination with trustworthy ex ante and ex post TET as well as accountability mechanisms, seems best suited for achieving a more privacy-friendly balance of efficiency and privacy in data-centric business. Adopting a commodity-centric notion of privacy as described in Sect. 3 into law might further increase users’ ability to exercise their right to informational self-determination without loss of the benefits of data-centric services. Currently, we are investigating requirements of accountability-oriented instruments for balancing privacy and efficiency in data-centric business. Among other things, this includes investigating the suitability of privacy dashboards as instruments for accountability and the economic implications of such an approach.



References

1. Ackerman, M.S., Cranor, L.F., Reagle, J.: Privacy in e-commerce: examining user scenarios and privacy preferences. In: Proceedings of EC 1999, pp. 1–8. ACM (1999)
2. Acquisti, A.: Privacy in electronic commerce and the economics of immediate gratification. In: Proceedings of EC 2004, pp. 21–29. ACM, New York (2004)
3. Acquisti, A., Gross, R.: Imagined communities: awareness, information sharing, and privacy on the Facebook. In: Danezis, G., Golle, P. (eds.) PET 2006. LNCS, vol. 4258, pp. 36–58. Springer, Heidelberg (2006)
4. Akerlof, G.A.: The market for “lemons”: quality uncertainty and the market mechanism. Q. J. Econ. 84(3), 488–500 (1970)
5. Ashley, P., Powers, C., Schunter, M.: From privacy promises to privacy management. In: Proceedings of NSPW 2002, pp. 43–50. ACM (2002)
6. Bergelson, V.: It’s personal but is it mine? Toward property rights in personal information. U.C. Davis Law Rev. 37(2), 379–452 (2003)
7. Bohrer, K., Liu, X., Kesdogan, D., Schonberg, E., Singh, M., Spraragen, S.: Personal information management and distribution. In: Proceedings of ICECR-4 (2001)
8. Buchmann, J., Nebel, M., Rossnagel, A., Shirazi, F., Fhom, H.S., Waidner, M.: Personal information dashboard: putting the individual back in control. In: Digital Enlightenment Yearbook 2013, pp. 139–164. IOS Press (2013)
9. Bundesverfassungsgericht: BVerfG, Urteil v. 15. Dezember 1983, Az. 1 BvR 209, 269, 362, 420, 440, 484/83 (1983)
10. Campbell, J.E., Carlson, M.: Online surveillance and the commodification of privacy. J. Broadcast. Electron. Media 46(4), 586–606 (2002)
11. Mont, M.C., Pearson, S., Bramhall, P.: Towards accountable management of identity and privacy: sticky policies and enforceable tracing services. In: Proceedings of DEXA 2003, pp. 377–382. IEEE (2003)
12. Chellappa, R.K., Shivendu, S.: An economic model of privacy: a property rights approach to regulatory choices for online personalization. J. Manage. Inf. Syst. 24(3), 193–225 (2007)
13. Cheng, V., Hung, P., Chiu, D.: Enabling web services policy negotiation with privacy preserved using XACML. In: Proceedings of HICSS 2007, pp. 33–33. IEEE (2007)
14. Cranor, L., Langheinrich, M., Marchiori, M.: A P3P Preference Exchange Language 1.0 (APPEL1.0) (2002)
15. Cuijpers, C.: A private law approach to privacy; mandatory law obliged? SCRIPT-ed 4(4), 304–318 (2007)
16. Davies, S.G.: Re-engineering the right to privacy: how privacy has been transformed from a right to a commodity. In: Technology and Privacy, pp. 143–165. MIT Press (1997)
17. Evans, D.S.: The economics of the online advertising industry. Rev. Netw. Econ. 7(3), 1–33 (2008)
18. Fischer-Hübner, S., Hedbom, H., Wästlund, E.: Trust and assurance HCI. In: Camenisch, J., Fischer-Hübner, S., Rannenberg, K. (eds.) Privacy and Identity Management for Life, pp. 245–260. Springer, Heidelberg (2011)
19. Franke, N., Keinz, P., Steger, C.J.: Testing the value of customization: when do customers really prefer products tailored to their preferences? J. Mark. 73, 103–121 (2009)
20. Fujitsu Research Institute: Personal data in the cloud: a global survey of consumer attitudes (2010)
21. Gross, R., Acquisti, A.: Information revelation and privacy in online social networks. In: Proceedings of WPES 2005, pp. 71–80. ACM (2005)
22. Hansen, M.: Marrying transparency tools with user-controlled identity management. In: Fischer-Hübner, S., Duquenoy, P., Zuccato, A., Martucci, L. (eds.) FIDIS 2007. IFIP Advances in Information and Communication Technology, vol. 262, pp. 199–220. Springer, Heidelberg (2008)
23. Hanson, C., Kagal, L., Berners-Lee, T., Sussman, G., Weitzner, D.: Data-purpose algebra: modeling data usage policies. In: Proceedings of POLICY 2007, pp. 173–177. IEEE (2007)
24. Howe, D.C., Nissenbaum, H.: TrackMeNot: resisting surveillance in web search. In: Lessons from the Identity Trail: Anonymity, Privacy, and Identity in a Networked Society, pp. 417–436. Oxford University Press (2009)
25. Janic, M., Wijbenga, J., Veugen, T.: Transparency enhancing tools (TETs): an overview. In: Proceedings of STAST 2013, pp. 18–25 (2013)
26. Josang, A., Ismail, R., Boyd, C.: A survey of trust and reputation systems for online service provision. Decis. Support Syst. 43(2), 618–644 (2007)
27. Laudon, K.C.: Markets and privacy. Commun. ACM 39(9), 92–104 (1996)
28. Lessig, L.: Privacy as property. Soc. Res. 69(1), 247–269 (2002)
29. Mankiw, N.: Principles of Macroeconomics. Cengage Learning, Boston (2014)
30. McAfee, A., Brynjolfsson, E.: Big data: the management revolution. Harv. Bus. Rev. 90(10), 60–68 (2012)
31. McDonald, A.M., Cranor, L.F.: The cost of reading privacy policies. J. Law Policy Inf. Soc. 4, 543–568 (2008)
32. Müller, G., Flender, C., Peters, M.: Vertrauensinfrastruktur und Privatheit als ökonomische Fragestellung. In: Buchmann, J. (ed.) Internet Privacy. acatech STUDY, pp. 143–188. Springer, Heidelberg (2012)
33. Nissenbaum, H.: A contextual approach to privacy online. Daedalus 140(4), 32–48 (2011)
34. Nolte, C.G.: Personal data as payment method in SNS and users’ concerning price sensitivity - a survey. In: Business Information Systems Workshops. LNBIP, vol. 228. Springer (2015, to appear)
35. Novotny, A., Spiekermann, S.: Personal information markets and privacy: a new model to solve the controversy. In: Proceedings of WI 2013, pp. 1635–1649 (2013)
36. Pearson, S., Charlesworth, A.: Accountability as a way forward for privacy protection in the cloud. In: Jaatun, M.G., Zhao, G., Rong, C. (eds.) Cloud Computing. LNCS, vol. 5931, pp. 131–144. Springer, Heidelberg (2009)
37. Posner, R.A.: The economics of privacy. Am. Econ. Rev. 71(2), 405–409 (1981)
38. Pretschner, A., Hilty, M., Basin, D.: Distributed usage control. Commun. ACM 49(9), 39–44 (2006)
39. Purtova, N.: Property rights in personal data: learning from the American discourse. Comput. Law Secur. Rev. 25(6), 507–521 (2009)
40. Purtova, N.: Property rights in personal data: a European perspective. Ph.D. thesis, Universiteit van Tilburg, Tilburg (2011)
41. Schermer, B.W.: The limits of privacy in automated profiling and data mining. Comput. Law Secur. Rev. 27(1), 45–52 (2011)
42. Schwartz, P.M.: Property, privacy, and personal data. Harv. Law Rev. 117(7), 2056–2128 (2004)
43. Shapiro, S.P.: Agency theory. Ann. Rev. Soc. 31, 263–284 (2005)
44. Spiekermann, S., Dickinson, I., Günther, O., Reynolds, D.: User agents in e-commerce environments: industry vs. consumer perspectives on data exchange. In: Eder, J., Missikoff, M. (eds.) CAiSE 2003. LNCS, vol. 2681, pp. 696–710. Springer, Heidelberg (2003)
45. Spiekermann, S., Novotny, A.: A vision for global privacy bridges: technical and legal measures for international data markets. Comput. Law Secur. Rev. 31(2), 181–200 (2015)
46. StatCounter: Worldwide market share of leading search engines from January 2010 to April 2015
47. Stutzman, F., Gross, R., Acquisti, A.: Silent listeners: the evolution of privacy and disclosure on Facebook. J. Priv. Confid. 4(2), 7–41 (2013)
48. Van Blarkom, G., Borking, J., Olk, J. (eds.): Handbook of Privacy and Privacy-Enhancing Technologies. College bescherming persoonsgegevens, The Hague (2003)
49. Varian, H.R.: Economic aspects of personal privacy. In: Lehr, W.H., Pupillo, L.M. (eds.) Internet Policy and Economics, pp. 101–109. Springer, Heidelberg (2009)
50. Varian, H.R.: Intermediate Microeconomics: A Modern Approach, 8th edn. W.W. Norton & Company, New York (2010)
51. Weitzner, D.J.: Google, profiling, and privacy. IEEE Internet Comput. 11(6), 95–97 (2007)
52. Westin, A., Louis Harris & Associates: Harris-Equifax Consumer Privacy Survey. Technical report, conducted for Equifax Inc. (1991)
53. World Economic Forum: Personal Data: The Emergence of a New Asset Class (2011)
54. Zimmermann, C., Accorsi, R., Müller, G.: Privacy dashboards: reconciling data-driven business models and privacy. In: Proceedings of ARES 2014, pp. 152–157. IEEE (2014)
55. Zimmermann, C., Cabinakova, J.: A conceptualization of accountability as a privacy principle. In: Business Information Systems Workshops. LNBIP, vol. 228. Springer (2015, to appear)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

1. University of Freiburg, Freiburg im Breisgau, Germany