The disclosure of private data: measuring the privacy paradox in digital services

  • Henner Gimpel
  • Dominikus Kleindienst
  • Daniela Waldmann
Research Paper


Privacy is a current topic in the context of digital services because such services demand mass volumes of consumer data. Although most consumers are concerned about their personal privacy, they frequently do not behave rationally in terms of the risk-benefit trade-off. This phenomenon is known as the privacy paradox. It is a common limitation in research papers examining consumers' privacy intentions. Using a design science approach, we develop a metric that determines the extent of consumers' privacy paradox in digital services based on the theoretical construct of the privacy calculus. We demonstrate a practical application of the metric for mobile apps. With that, we contribute to validating respective research findings. Moreover, the metric can help protect consumers and companies from unwanted consequences of data privacy issues and enable service marketplaces to provide privacy-customized suggestions.


Keywords: Privacy paradox · Privacy calculus · Metric · Digital services



In the information age, the significant quantity of available data enables organizations to create detailed descriptions of individuals (Hashem et al. 2015). Enabled by information and communication technologies (ICT), the resulting profiles can, for example, be used for personalized marketing campaigns or advertising (Egelman et al. 2013; Hauff et al. 2015). However, this usage does not always occur with the knowledge of consumers. Platform and service operators may be regarded as unreliable actors, in part using data for unauthorized or unintended purposes (Alt et al. 2015). Invasion of privacy can result in serious negative effects; for example, legal consequences may arise if a person acts on behalf of others and, thus, abuses their identities. Financial implications may include losses caused by third parties hacking into a personal account (Hauff et al. 2015). Massive data access facilitates collecting, sharing, buying, or selling of private data, and entails storing, manipulating, mining, and analyzing these data (Malhotra et al. 2004). Currently, most consumers already have a pronounced awareness of privacy and pursue the goal of protecting themselves from their private data being misused (Egelman et al. 2013; Kumaraguru and Cranor 2005). Customers regard the third-party use of their data with particular care (Spiekermann et al. 2015). Put together, the easy dissemination of data raises awareness of privacy and makes privacy a current topic for information hubs such as electronic markets (Alt et al. 2015).

However, despite these risks, consumers are usually unable to estimate the amount and economic value of the personal data they provide (Buck et al. 2014). Consequently, consumers tend not to protect themselves sufficiently against privacy risks and disclose private data despite the associated imminent dangers (Acquisti 2004). This phenomenon reveals that the assumption of individual rationality is unrealistic in the context of personal privacy. Despite stating that they want to protect their privacy, consumers act to the contrary (Acquisti 2004). This phenomenon is called the privacy paradox (Norberg et al. 2007).

The growth of digital services in ICT amplifies the challenges concerning privacy issues. In general, a service is defined as "any activity or benefit that one can offer to another that is essentially intangible" (Kotler and Armstrong 2010, p. 248). A digital service is a service provided over electronic networks (Graupner et al. 2015). Consumers' privacy is even more threatened because digital services such as mobile apps or social media have an enormous demand for consumer data (Stutzman et al. 2013; Wei et al. 2012). For example, mobile apps can easily collect sensitive data, such as photos and files, contact lists, or location information, thus supporting the increase in data collection (Egelman et al. 2013; Wei et al. 2012; Zhou 2013). Consequently, consumers who use smartphones and, thus, mobile apps are faced with a special challenge concerning their privacy (Horbach and Horbach 2013). Yet they continue to download, install, and use a significant number of apps. Current download rates and forecasts show a booming app economy (Buck et al. 2014). In addition, consumers hardly pay attention to or comprehend app permissions (Felt et al. 2012), even though technical capabilities enable the spreading and targeted usage of this mass of personal consumer information. Additional factors reinforce the enormous usage of apps and lead to the collection of consumer information. For example, by integrating smartphones into their daily life (Abdelzaher et al. 2007; Buck et al. 2014) and trusting them, some consumers use mobile apps without any idea of how these apps invade their privacy (Horbach and Horbach 2013). Furthermore, consumers expect several benefits, such as social adjustment, time savings, and pleasure, or face social pressure to use apps (Hui et al. 2006; Smith et al. 2011). These factors also explain the booming app economy with apps as a special type of digital service (Buck et al. 2014; Hui et al. 2006; Krasnova and Veltri 2010; Schreiner and Hess 2015).

The vulnerability of consumers’ privacy in the context of digital services results in the loss of control over personal information and unwanted data disclosure (Bélanger and Crossler 2011; Dinev and Hart 2006). Additionally, this vulnerability supports business models for digital service providers whose most important type of revenue is based on information (Buck et al. 2014). Thus, consumers gather the benefits of free digital services only in return for providing their personal data (Buck et al. 2014). Although the clueless handling of privacy is primarily a consumer problem, it also has implications for digital service providers because they compete for consumers (Culnan and Armstrong 1999). Moreover, their economic success is determined by a strong reputation, which in turn depends on the responsible handling of consumers’ privacy (Culnan and Armstrong 1999; Degirmenci et al. 2013).

Consumer privacy is a well-known research subject. Topics such as the privacy calculus (Culnan and Armstrong 1999; Min and Kim 2015; Smith et al. 2011) and privacy concerns (Buchanan et al. 2007; Krasnova et al. 2009; van Slyke et al. 2006; Zukowski and Brown 2007) are frequently mentioned in the literature. However, many researchers focus on examining factors that induce consumers to disclose personal information or keep them from doing so (Son and Kim 2008). They also focus on factors that affect an individual's privacy concerns, such as privacy experience, personality traits, or privacy awareness (Smith et al. 2011). In these cases, it is theoretically assumed that consumers behave rationally according to a risk-benefit calculation, the privacy calculus: the perceived risks of disclosing personal information are weighed against the perceived benefits expected from doing so (Chellappa and Sin 2005; Dinev et al. 2006; Dinev and Hart 2006; Xu et al. 2009). However, actual consumer behavior, which is affected by bounded rationality or missing information, is neglected. The so-called privacy paradox, which "represents a form of irrational, or bounded-rational decision making" (Keith et al. 2012, p. 3), is already discussed in extant literature. Yet, this literature only discusses and demonstrates that consumers do not keep to their stated privacy concerns. To the best of our knowledge, no research exists that has developed a metric to calculate the extent of the privacy paradox for digital services or any other application domain. To close this research gap, we use a design science research approach following Peffers et al. (2007) to address the following objective in this paper.
  • Design objective: Development of a privacy paradox metric (PPM) as a design artifact that aggregates consumers’ privacy intentions and behavior to a single measure and quantitatively assesses the extent of consumers’ paradoxical privacy behavior in the context of digital services.

We focus on digital services because consumers' privacy is even more vulnerable in this context given that these services require a large volume of personal data. A metric is defined as a standard of measurement (Merriam Webster 2017). Such a standard is a human-created artifact and, as such, should be accurately designed. Prior research considers metrics as artifacts that are objects for design science research (Offermann et al. 2010). As the first quantitative measure of the extent of the privacy paradox, the PPM has several advantages for researchers, consumers, companies offering digital services, ICT platform providers, and consumer protection organizations. Researchers can use the metric to validate empirical results regarding consumers' data privacy intentions, as the privacy paradox is a limitation in many data privacy research papers. To consumers, the PPM could provide transparency about an individual consumer's privacy paradox to save them from careless disclosure of data and thus from unwanted consequences. Accordingly, companies offering digital services can also use the PPM to identify careless consumer decisions towards data disclosure and manage the risk related to such decisions. ICT service providers, such as app stores, can use the PPM to enhance their attractiveness by providing customized warnings, suggestions, sorting, or filtering. Consumer protection organizations can build further empirical studies on the PPM that increase public awareness of the risks related to the privacy paradox.

To achieve the design objective, we follow the design science research methodology (DSRM) (Hevner et al. 2004; Peffers et al. 2007) and contribute a design theory (Gregor 2006; Gregor and Hevner 2013; Gregor and Jones 2007) to measure the extent of the privacy paradox. The structure of this paper follows the publication scheme suggested by Gregor and Hevner (2013). First, we present the theoretical background of the topic in "Theoretical Background" section. Next, we deduce the requirements that the metric must fulfill to guarantee high quality and present the basic notion and calculation of the privacy paradox metric in "Development of the Privacy Paradox Metric" section. Subsequently, we describe an exemplary application of the PPM and its results in "Practical Application of the Privacy Paradox Metric" section. Afterwards, we evaluate the metric against the requirements in "Evaluation of the Privacy Paradox Metric" section. Finally, we show the theoretical contribution, limitations, and managerial implications in the discussion and conclude in "Discussion and Conclusion" section.

Theoretical background

Our research contributes to a stream of literature on data privacy in information systems (IS) and related fields. Thus, we review this research and build our metric on it.

A person’s privacy has evolved into one of the most important ethical topics of the information age (Mason 1986). The main reason for this development is that we live in an age of information overload (Zhan and Rajamani 2008). The broad dissemination of information enables companies to collect significant quantities of information on their consumers in order to meet consumer demands and to remain competitive (Culnan and Armstrong 1999; Nissenbaum 1997; Zhan and Rajamani 2008). Companies expect even greater advantages from promising consumer data, such as improving consumer retention, increasing revenue, having a better understanding of existing and prospective consumer needs, better recommendations or increasing productivity (Heimbach et al. 2015; Spiekermann et al. 2001; Tene and Polonetsky 2012; Zhan and Rajamani 2008). However, as companies collect an increasing amount of data, they tend to forget the fundamental right to privacy (Spiekermann et al. 2001). The same information, which brings significant advantages for companies, also results in increasing privacy concerns on the consumer side (Zhan and Rajamani 2008), such as social, psychological, resource-related, independence-related, legal, and physical consequences (Hauff et al. 2015).

In a contemporary interpretation, privacy refers to an individual's control over sensitive information about oneself (Bélanger and Crossler 2011; Bélanger et al. 2002; Stone et al. 1983). At the individual level, countless differences exist in the desire for privacy (Hawkey and Inkpen 2006). Zukowski and Brown (2007) find that certain demographic factors, such as age, education, and income level, affect the privacy concerns of individuals, whereas factors such as gender or Internet experience have no influence. In contrast, Cho et al. (2009) show that gender and Internet experience do influence individuals' privacy concerns. Likewise, one's own knowledge and preferences affect privacy attitudes (Acquisti et al. 2015). Frequently, people cannot imagine that their data disclosure can have serious consequences for them. In addition to this missing knowledge, people's preferences, emotions, and thoughts change in different situations and stages of life (Acquisti et al. 2015), which also affects information disclosure and privacy conditions. However, privacy differs not only on an individual level but also with respect to cultural and contextual deviations (Acquisti et al. 2015).

Regarding privacy concerns, several individual differences exist (Acquisti and Grossklags 2005). Every consumer makes decisions about his or her own privacy every day, such as when deciding whether or not to use a digital service. A prevalent model for such decisions is the privacy calculus, which represents "the most useful framework for analyzing contemporary consumer privacy concerns" (Culnan and Bies 2003, p. 326). In this way, it is possible to consider individual circumstances by weighing personal preferences for benefits against risks (Dinev and Hart 2006; Laufer and Wolfe 1977). Whereas risks reduce the individual's readiness to disclose private data, benefits have the reverse effect (Laufer and Wolfe 1977). To make the calculus more illustrative, Hui et al. (2006) list two categories of potential benefits, namely extrinsic (monetary saving, time saving, self-enhancement, social adjustment) and intrinsic (pleasure, novelty, altruism) benefits. Roeber et al. (2015) find that most customers disclose their personal information if the benefits fulfil their needs. The greater the benefits of a digital service, the more the consumer is willing to disclose data to be able to use the service. Risks are understood as the possible intrusion of privacy, comprising the risk of losing personal data to a company and the potential danger of the data being misused (Malhotra et al. 2004; Smith et al. 2011; Smith et al. 1996). Hence, risks are viewed as the result of two factors: the perceived likelihood of a potential privacy invasion and the perceived damage it causes (Cunningham 1967). A higher risk of using a digital service results in a lower likelihood that the consumer will use the service.

Most researchers assume that people behave rationally when they decide about their privacy and that they weigh the benefits and risks (Acquisti and Grossklags 2005). The privacy calculus is also based on this assumption (Keith et al. 2012). Nevertheless, the opposite behavior is observed, and behavioral intentions to disclose information are not a precise predictor of actual behavior (Norberg et al. 2007). People who claim to have strong privacy concerns and no intention of revealing their data nevertheless give the information away. The term privacy paradox denotes such behavior (Acquisti and Grossklags 2004; Norberg et al. 2007). Researchers show that people behave contrary to their reported privacy attitudes and concerns (Bélanger and Crossler 2011; Norberg et al. 2007; Smith et al. 2011). Thus, in many cases, stated privacy concerns do not correspond to real behavior, and consumers act boundedly rationally or irrationally (Acquisti and Grossklags 2004).

Prior literature repeatedly verifies this behavior. Spiekermann et al. (2001) show that consumers disclose private data to online shops despite having privacy concerns. They conduct an experiment to measure consumers' self-reported privacy attitudes and compare these with actual disclosure behavior in an online shopping environment. They confirm that most consumers do not keep to their stated privacy preferences. Additionally, Norberg et al. (2007) conduct a study to determine whether or not people live up to their reported intentions toward privacy. Thereby, they reinforce the existence of the privacy paradox because they find that consumers provide substantially more private data than they profess. Additional examples exist for the examination of the privacy paradox in e-commerce scenarios, such as those from Jensen et al. (2005) and Berendt et al. (2005). The privacy paradox can be demonstrated not only in the online shopping and marketing context but also in social media. Acquisti and Gross (2006) point out that privacy-concerned persons are members of Facebook and disclose a large amount of private data, thus behaving paradoxically.

Development of the privacy paradox metric

A metric is a mathematical model that is able to measure aspects of systems, system designs, or behavior in the interaction with systems (Offermann et al. 2010). In general, measuring means assigning a number to an object to express some aspect of it in a quantitative manner. Any form of measurement is an abstraction that reduces the complexity of the object’s attributes to a single number (Böhme and Freiling 2008). In this way, a metric provides measures that managers understand and that academics can replicate and analyze (Palmer 2002). In addition, practitioners and researchers use metrics to make better decisions (Hauser and Katz 1998).

To ensure a high quality for the PPM, we first present the requirements that the metric must fulfill. Subsequently, we introduce the metric’s basic concept and its calculation. Using a metric as a suitable artifact for design science research (Offermann et al. 2010), we meet the design science guideline that suggests that “design science research must produce a viable artifact” (Hevner et al. 2004, p. 83).

Requirements for the privacy paradox metric

Metrics are specifically used to evaluate certain decision alternatives (Kaiser et al. 2007; Linkov et al. 2011). Because metric requirements are context dependent, no general set of requirements exists for evaluating metrics. Moreover, no metric exists to measure the privacy paradox and, thus, no set of appropriate requirements exists to reference. Consequently, we deduce requirements from research on the development of metrics, on measurement instruments, and on requirements for data quality metrics, security metrics, and software quality metrics (Becker et al. 2015; Liggesmeyer 2009; Wallmüller 2001). As a result, we compile the following list of seven requirements, present definitions, and show the related requirements used in research (see Table 1).
Table 1
Requirements for the PPM

1. Definition: "The unit of measure is clearly set, absolute, and appropriate so that the metric can be based on quantitative measurements." (Erl et al. 2013, p. 405)
   Related requirements: Quantification (Böhme and Freiling 2008; Kaiser et al. 2007)

2. Definition: "The degree of mutual agreement among individual measurements made under prescribed conditions […]. Precision captures the notion of the repeatability of accurate measurements under similar conditions." (Herrmann 2007, p. 29)
   Related requirements: Repeatability (Erl et al. 2013; Liggesmeyer 2009); Reliability (Wallmüller 2001)

3. Definition: "The units of measure used by a metric need to be standardized and comparable." (Erl et al. 2013, p. 405)
   Related requirements: Analyzability (Liggesmeyer 2009); Normalization (Kaiser et al. 2007)

4. Definition: "The metric needs to be based on a non-proprietary, common form of measurement that can be easily obtained and understood by […] consumers." (Erl et al. 2013, p. 405)
   Related requirements: Feasibility (Kaiser et al. 2007); Reliability (Böhme and Freiling 2008)

5. Definition: "The actual measure should be easy to interpret by business users." (Even and Shankaranarayanan 2007, p. 83)
   Related requirements: Simplicity (Liggesmeyer 2009)

6. Definition: "A metric is considered useful if the metric corresponds to the intuition of the measurer [and] is actively used in a decision making process." (Bouwers et al. 2013, p. 2)
   Related requirements: (none listed)

7. Definition: "From an economic view, only those measures must be taken that are efficient with regard to costs and benefit." (Kaiser et al. 2007, p. 2)
   Related requirements: (none listed)
In addition to the requirements listed in Table 1, accuracy (Herrmann 2007) is an important requirement for evaluating the metric because accuracy is defined as “the degree of agreement of individual or average measurements with an accepted reference value or level” (Herrmann 2007, p. 29). However, as there is no reference value for the measurement of the privacy paradox so far, it is not possible to apply this evaluation criterion for the PPM.

Basic concept and calculation

Using the requirements previously identified, we develop the PPM. To establish the PPM, we use the concept of the privacy calculus, which is useful to explain consumers’ intention to disclose any information (Keith et al. 2014). According to the privacy calculus, consumers weigh the perceived risks of a decision that may involve a privacy threat against the perceived benefits that result from the information disclosure (Dinev and Hart 2006; Laufer and Wolfe 1977; Sheng et al. 2008). Consequently, consumers accept a loss of their privacy as long as the benefits outweigh the imminent risks (Sheng et al. 2008). Moreover, the theory of reasoned action (TRA) implies that actual behavior matches the intention to disclose information (Fishbein and Ajzen 1975). However, consumers do not always act rationally according to their privacy calculus (Acquisti and Grossklags 2004; Norberg et al. 2007). Consequently, a contradiction exists between consumers’ privacy intentions and behaviors (Keith et al. 2012, 2013; Norberg et al. 2007). This phenomenon is called the privacy paradox (Bélanger and Crossler 2011; Smith et al. 2011) and can be identified if consumers use a digital service, although their intentions imply not using it (and vice versa). However, the case for using a service paradoxically constitutes the privacy relevant part because it involves a violation of consumers’ privacy, whereas a paradoxical non-usage only implies a loss of utility for consumers. Figure 1 depicts this difference between intention and actual behavior in the form of rational and paradoxical service usage and non-usage. Consumers are classified in one of the four segments according to their modeled privacy calculus. This classification shows consumers’ privacy intention with respect to the information disclosure. Because the perceived benefit and the perceived risk are not measured on directly comparable scales, the classification in the four segments is necessary. 
Depending on the consumer's classification, it can be determined whether the consumer behaves rationally or paradoxically.
Fig. 1

Illustration of rational and paradoxical consumer privacy behavior

Figure 1 shows the two different types of paradoxical and rational behavior. The top left of the figure indicates low perceived risk and high perceived benefit. Consequently, service usage seems rational (R_U), whereas non-usage is paradoxical (P_N). In contrast, the segment in the bottom right is characterized by a high perceived risk and a low perceived benefit. Thus, P_U represents consumers using a service even though they recognize only a relatively small benefit and perceive the risk as relatively high, whereas consumers classified in the segment R_N rationally refrain from using the digital service. However, in some segments, the PPM is not applicable because no clear conclusion can be drawn about rational or paradoxical behavior; these segments are marked NA_U1, NA_U2, NA_N1, and NA_N2. In such cases, weighing the consumer's service benefit against the attitude toward the service's risk does not yield an incontestable conclusion.

The PPM determines the percentage of consumers who behave paradoxically according to the modeled privacy calculus. Therefore, we first model the service benefit_i and the service risk_i for consumers i (i = 1, …, n). The service benefit_i ∈ [service benefit_min, service benefit_max] ∀ i, with service benefit_min < service benefit_max, is composed of several dimensions (see Formula 1).
$$ {service\ benefit}_i=\sum \limits_{j=1}^m{benefit\ weight}_{ij}\ast {service\ benefit\ dimension}_{ij} $$

These benefit dimensions can, for example, be taken from technology acceptance models, such as hedonic motivation or perceived usefulness, because these models define constructs that predict the behavioral intention to use a technology (Venkatesh et al. 2003, 2012). The constructs j (j = 1, …, m) affecting the consumer's service benefit_i are service and context dependent (Venkatesh et al. 2012). Each consumer i rates each construct j with a personal assessment service benefit dimension_ij ∈ [service benefit_min, service benefit_max] ∀ i, j. To reflect differences in the importance of the chosen constructs, they are weighted with the construct- and consumer-specific factor benefit weight_ij ∈ [0, 1] with \( {\sum}_{j=1}^m{benefit\ weight}_{ij}=1\ \forall i \).

A digital service may require many different types of information, such as identity, credit card information, and location (Lioudakis et al. 2007). Consequently, consumers' service risk_i ∈ [service risk_min, service risk_max] ∀ i, with service risk_min < service risk_max, is also service and context dependent. The service can demand k (k = 1, …, p) different permissions that are assessed by each consumer i with a permission-dependent valuation permission_ik ∈ [service risk_min, service risk_max] ∀ i, k, and that can be weighted using the permission-specific factor \( {risk\ weight}_{ik}\in \left[0,1\right] \) with \( {\sum}_{k=1}^p{risk\ weight}_{ik}=1\ \forall i \).
$$ {service\ risk}_i=\sum \limits_{k=1}^p{risk\ weight}_{ik}\ast {permission}_{ik} $$
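As a sketch, Formulas 1 and 2 are plain weighted sums. The following Python fragment illustrates them; the dimension names, weights, ratings, and the 1-to-5 scale are invented for illustration and are not from the paper.

```python
# Illustrative computation of Formulas 1 and 2: weighted sums of
# per-dimension assessments on an assumed 1-to-5 rating scale.

def weighted_score(weights, ratings):
    """Weighted sum of per-dimension ratings; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * r for w, r in zip(weights, ratings))

# Consumer i rates m = 3 benefit dimensions (e.g., perceived usefulness,
# hedonic motivation, time saving).
service_benefit_i = weighted_score([0.5, 0.3, 0.2], [4, 5, 2])  # 3.9

# The service demands p = 2 permissions (e.g., location, contacts), each
# assessed for perceived risk on the same scale.
service_risk_i = weighted_score([0.6, 0.4], [5, 3])  # 4.2
```

Any rating scale works as long as the same bounds [min, max] are used for all dimensions of one construct.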

Given the paradox, which states that the intention to disclose private data is not necessarily a predictor of actual use, we need information on whether a consumer actually uses the service. Therefore, we introduce the usage indicator \( {usage}_i=\left\{\begin{array}{c}0,\kern0.5em for\ non- usage\\ {}1,\kern0.5em for\ usage\end{array}\right. \).

To distinguish among the four segments illustrated in Fig. 1 that stand for the different intentions to disclose private data, threshold values \( \widehat{benefit\ threshold} \) and \( \widehat{risk\ threshold} \) on the x- and y-axes are required. Therefore, we denote Benefit as the distribution of consumers' service benefit_i. f is a function that maps the distribution Benefit to a scalar \( \widehat{benefit\ threshold} \) on the interval [service benefit_min, service benefit_max]. f could, for example, be the mean or any quantile such as the median. The same applies analogously to \( \widehat{risk\ threshold} \), based on the distribution Risk of consumers' service risk_i.
$$ \widehat{benefit\ threshold}=f(Benefit) $$
$$ \widehat{risk\ threshold}=f(Risk) $$
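The threshold function f can be sketched in a few lines of Python; mean and median are the two choices of f mentioned above, and the five sample values per distribution are invented for illustration.

```python
# f maps the empirical Benefit (or Risk) distribution to a scalar
# threshold; the sample values are illustrative only.
from statistics import mean, median

benefits = [3.9, 2.1, 4.4, 1.8, 3.0]
risks = [4.2, 2.5, 3.8, 1.5, 3.0]

benefit_threshold = median(benefits)  # f = median -> 3.0
risk_threshold = mean(risks)          # f = mean   -> 3.0
```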
Finally, we determine the rational (R_U and R_N) and paradoxical (P_U and P_N) behavior, as well as the non-assessable segments NA_U and NA_N, using the classification a_i (see Formula 5). This classification relates the actual behavior of consumer i to his or her perceived benefit and risk of the service. Formula 5 represents the same classification that is graphically displayed as matrices in Fig. 1. As an example of this classification, a consumer who uses a digital service, has a service benefit_i above the \( \widehat{benefit\ threshold} \), and perceives a service risk_i at or below the \( \widehat{risk\ threshold} \) is segmented into R_U, which represents rational service usage. The reason is that the consumer values the benefit of the digital service more than the risk.
$$ {a}_i=\left\{\begin{array}{c}{R}_U,\kern0.5em {usage}_i=1\wedge {service\ benefit}_i>\widehat{benefit\ threshold}\wedge {service\ risk}_i\le \widehat{risk\ threshold}\\ {}{P}_U,\kern0.5em {usage}_i=1\wedge {service\ benefit}_i\le \widehat{benefit\ threshold}\wedge {service\ risk}_i>\widehat{risk\ threshold}\\ {}{NA}_{U1},\kern0.5em {usage}_i=1\wedge {service\ benefit}_i>\widehat{benefit\ threshold}\wedge {service\ risk}_i>\widehat{risk\ threshold}\\ {}{NA}_{U2},\kern0.5em {usage}_i=1\wedge {service\ benefit}_i\le \widehat{benefit\ threshold}\wedge {service\ risk}_i\le \widehat{risk\ threshold}\\ {}{P}_N,\kern0.5em {usage}_i=0\wedge {service\ benefit}_i>\widehat{benefit\ threshold}\wedge {service\ risk}_i\le \widehat{risk\ threshold}\\ {}{R}_N,\kern0.5em {usage}_i=0\wedge {service\ benefit}_i\le \widehat{benefit\ threshold}\wedge {service\ risk}_i>\widehat{risk\ threshold}\\ {}{NA}_{N1},\kern0.5em {usage}_i=0\wedge {service\ benefit}_i>\widehat{benefit\ threshold}\wedge {service\ risk}_i>\widehat{risk\ threshold}\\ {}{NA}_{N2},\kern0.5em {usage}_i=0\wedge {service\ benefit}_i\le \widehat{benefit\ threshold}\wedge {service\ risk}_i\le \widehat{risk\ threshold}\end{array}\right. $$
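Formula 5 can be transcribed almost literally into code. The following Python sketch returns the segment for one consumer; the function name and label strings are our own shorthand, not from the paper.

```python
# Near-literal transcription of Formula 5: each consumer is mapped to one
# of the eight segments from the usage indicator, service benefit, and
# service risk relative to the two thresholds.

def classify(usage, benefit, risk, benefit_thr, risk_thr):
    high_benefit = benefit > benefit_thr
    high_risk = risk > risk_thr
    if usage == 1:
        if high_benefit and not high_risk:
            return "R_U"   # rational usage
        if not high_benefit and high_risk:
            return "P_U"   # paradoxical usage
        return "NA_U1" if high_risk else "NA_U2"  # not assessable
    if high_benefit and not high_risk:
        return "P_N"       # paradoxical non-usage
    if not high_benefit and high_risk:
        return "R_N"       # rational non-usage
    return "NA_N1" if high_risk else "NA_N2"      # not assessable
```

For example, with both thresholds at 3.0, a user with benefit 4.0 and risk 2.0 falls into R_U, while a non-user with the same assessments falls into P_N.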
We depict the detailed subdivision into segments for a better understanding of the basic idea of the PPM, although the segments R_U, NA_U1, NA_U2, R_N, NA_N1, and NA_N2 are not strictly necessary to calculate the PPM. Because different types of R, P, and NA exist for service usage and non-usage, two calculation bases also exist for the PPM. PPM_U ∈ [0, 1] represents the percentage of service users behaving paradoxically (see Formula 6), that is, the percentage of consumers using a service although such actual behavior is not rational. This phenomenon results in unwarranted data disclosure on the consumer's side and, thus, represents the more important metric regarding privacy. \( \overset{\sim }{P_U} \), \( \overset{\sim }{R_U} \), \( \overset{\sim }{NA_{U1}} \), and \( \overset{\sim }{NA_{U2}} \) represent the numbers of elements of the classification a_i with the attribute P_U, R_U, NA_U1, or NA_U2, respectively.
$$ {PPM}_U=\frac{\overset{\sim }{P_U}}{\overset{\sim }{P_U}+\overset{\sim }{R_U}+\overset{\sim }{NA_{U1}}+\overset{\sim }{NA_{U2}}} $$
PPM_N ∈ [0, 1] represents the percentage of non-users behaving paradoxically (see Formula 7). This metric shows the other side of paradoxical behavior, that is, the percentage of consumers who do not use a service even though they should according to the privacy calculus. \( \overset{\sim }{P_N} \), \( \overset{\sim }{R_N} \), \( \overset{\sim }{NA_{N1}} \), and \( \overset{\sim }{NA_{N2}} \) represent the numbers of elements of the classification a_i with the attribute P_N, R_N, NA_N1, or NA_N2, respectively.
$$ {PPM}_N=\frac{\overset{\sim }{P_N}}{\overset{\sim }{P_N}+\overset{\sim }{R_N}+\overset{\sim }{NA_{N1}}+\overset{\sim }{NA_{N2}}} $$
PPM_U and PPM_N can also be integrated into a single number. PPM ∈ [0, 1] describes the share of all consumers who behave paradoxically (see Formula 8). α ∈ [0, 1] is a weighting factor for PPM_U and PPM_N, which allows giving more weight to either paradoxical usage or paradoxical non-usage in the calculation of the PPM. At the extremes, α = 1 reduces the PPM to PPM_U, whereas α = 0 means that only PPM_N is considered.
$$ PPM=\frac{\alpha \overset{\sim }{P_U}+\left(1-\alpha \right)\overset{\sim }{P_N}}{\alpha \left(\overset{\sim }{P_U}+\overset{\sim }{R_U}+\overset{\sim }{NA_{U1}}+\overset{\sim }{NA_{U2}}\right)+\left(1-\alpha \right)\left(\overset{\sim }{P_N}+\overset{\sim }{R_N}+\overset{\sim }{NA_{N1}}+\overset{\sim }{NA_{N2}}\right)} $$
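Formulas 6 to 8 then reduce to counting segment labels. A minimal Python sketch, where the label strings, sample list, and default α = 0.5 are illustrative assumptions:

```python
# PPM_U over users, PPM_N over non-users, and the alpha-weighted overall
# PPM (Formulas 6-8). Counter tallies the segment sizes, i.e., the tilde
# counts in the text.
from collections import Counter

def ppm(labels, alpha=0.5):
    c = Counter(labels)
    users = c["P_U"] + c["R_U"] + c["NA_U1"] + c["NA_U2"]
    non_users = c["P_N"] + c["R_N"] + c["NA_N1"] + c["NA_N2"]
    ppm_u = c["P_U"] / users if users else 0.0
    ppm_n = c["P_N"] / non_users if non_users else 0.0
    denom = alpha * users + (1 - alpha) * non_users
    overall = (alpha * c["P_U"] + (1 - alpha) * c["P_N"]) / denom if denom else 0.0
    return ppm_u, ppm_n, overall

# Four users (one paradoxical) and four non-users (one paradoxical):
labels = ["P_U", "R_U", "R_U", "NA_U1", "P_N", "R_N", "NA_N1", "NA_N2"]
ppm_u, ppm_n, overall = ppm(labels)  # each 0.25
```

With α = 1 the overall value equals PPM_U, and with α = 0 it equals PPM_N, matching the extremes described above.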

In summary, the PPM represents the privacy paradox metric. It models individual consumers’ privacy calculus, classifies their service usage or non-service usage as rational or paradoxical, and aggregates multiple consumers’ intentions and behavior to a single measure. Thereby, higher values of the PPM indicate more paradoxical behavior.

Practical application of the privacy paradox metric

Mobile apps are a special type of digital service that we use to illustrate the application of the PPM. Consumers' intention to use a service can be determined by querying them on the service's perceived benefits and risks. The usage or non-usage of the service and, with that, the disclosure of private data enable conclusions about real behavior. The application context of mobile apps is particularly suitable for determining the PPM because the installation of a specific app indicates consumers' willingness, or lack thereof, to release their data. App permissions, which provide access to private data on a smartphone (Egelman et al. 2013), need to be approved by consumers in the app store when installing the app (Keith et al. 2013). Thus, consumers can typically avoid the privacy invasion only by not installing the app (Egelman et al. 2013). With that knowledge, it is possible to draw conclusions about the gap between consumers' intention to disclose private data and their real behavior. Because perceived benefits and perceived risks are not readily observable, consumers need to be asked for these measures in a survey. In August 2015, we conducted a survey with 715 participants from the 150 largest universities in Germany. In the following section, we first present a concrete operationalization of the variables necessary for the PPM in the context of mobile apps. Subsequently, the results of the survey and the PPM are presented.

Determination of the privacy paradox metric in mobile apps

The survey questionnaire consists of four parts: app benefit, app installation, app risk, and demographic data. The full German questionnaire is available from the authors upon request. The determination of the PPM in mobile apps follows the formulas in the “Basic Concept and Calculation” section. Before we discuss the operationalization of the PPM constructs in the context of mobile apps, we define requirements for the app selection process.

For this practical application of the PPM, the aim is to survey app users on their perceived benefits and risks associated with apps and to compare this perception with actual installation behavior. To gain comparability across participants and limit the length of the questionnaire, we focus on five app categories. Further, to limit the length of the questionnaire, we decided to query benefits not at the app level but at the app-category level. This requires app categories in which the benefits of different apps can reasonably be assumed to be homogeneous and the apps replaceable. Thus, we avoid app categories subject to network effects, as present for online social networks or communication apps, for example, because this would violate the homogeneity assumption. Additionally, the consumers' usage decision depends on the monetary costs of an app. To keep the survey simple and to preserve the homogeneity of the apps, we consider only free apps and, thus, only app categories in which free apps are common. To ensure some degree of representativeness of the app category selection, we recruited 20 test subjects and analyzed their installed apps. As a result, the following five app categories were selected for the survey: navigation, note, radio, picture editor, and running. The questionnaire contains about ten popular examples for each app category and the option to enter additional apps.

To determine the consumer's service benefit_i in a non-organizational context, we use m = 2 constructs of the extended unified theory of acceptance and use of technology (UTAUT2), namely, hedonic motivation and performance expectancy. “Hedonic motivation is defined as the fun or pleasure derived from using a technology” (Venkatesh et al. 2012, p. 161). It plays an important role and is a clear predictor of the intention to use a technology (Venkatesh et al. 2012). “Performance expectancy is defined as the degree to which using a technology will provide benefits to consumers in performing certain activities” (Venkatesh et al. 2012, p. 159). Prior research found that performance expectancy is the main driver of the intention to use a technology (Venkatesh et al. 2012). In summary, both constructs are important drivers in explaining the intentions of consumers (Venkatesh et al. 2012). We operationalize hedonic motivation and performance expectancy with survey items adapted from the UTAUT2 (Venkatesh et al. 2012). Table 2 lists how items of the UTAUT2 are adjusted to measure the service benefit_i for each consumer i. As in the UTAUT2, all items are measured using a seven-point Likert scale, with anchors of 1 (“strongly disagree”) and 7 (“strongly agree”).
Table 2

Example for determining the benefit of a service using survey items (Venkatesh et al. 2012)

Hedonic motivation:
- Using mobile apps is fun.
- Using mobile apps is enjoyable.
- Using mobile apps is very entertaining.

Performance expectancy:
- I find mobile apps useful in my daily life.
- Using mobile apps helps me accomplish things more quickly.
- Using mobile apps increases my productivity.

We capture the hedonic motivation and performance expectancy items of the UTAUT2 to determine consumers' service benefit_i for each of the five app categories. Each participant was asked each question for each of the five app categories. We query a consumer's service benefit_i at the level of app categories and not for single apps, as querying each app individually would increase the complexity and length of the survey. To assure equivalence between the questionnaire in German and the original English version, we conduct a standard translation and back-translation procedure (Brislin 1970). To operationalize the two constructs and consumers' service benefit_i, we use the average of the three 7-point Likert items for each construct. Thus, the minimum app benefit value is service benefit_min = 1 and the maximum value is service benefit_max = 7. For reasons of simplicity, we use equal weights for both benefit dimensions for all participants, although the basic idea of the model permits differing weights as well. Weights could differ, for example, between utilitarian and hedonic apps and according to what users expect from them: some respondents ascribe more hedonic value or greater usefulness to certain apps than others do. Formula (1) can be adapted to our survey context as follows:
$$ {service\ benefit}_i=\frac{1}{2}\ast {service\ benefit\ dimension}_{i1}+\frac{1}{2}\ast {service\ benefit\ dimension}_{i2} $$
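As a sketch, the benefit for one participant and one app category can be computed as follows. The function name and the example ratings are ours; the calculation (mean of three Likert items per dimension, then an equal-weight average of the two dimensions) follows the formula above.

```python
import statistics

def service_benefit(hedonic_items, performance_items):
    """Service benefit of consumer i: equal-weight mean of the two UTAUT2
    dimensions, each averaged over its three 7-point Likert items."""
    hedonic_motivation = statistics.mean(hedonic_items)          # dimension i1
    performance_expectancy = statistics.mean(performance_items)  # dimension i2
    return 0.5 * hedonic_motivation + 0.5 * performance_expectancy

# Hypothetical responses of one participant for one app category:
print(service_benefit(hedonic_items=[6, 5, 6], performance_items=[4, 4, 5]))
```

By construction, the result stays within the stated bounds of service benefit_min = 1 and service benefit_max = 7.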
To calculate the privacy paradox, we also need consumers' attitude toward the service risk_i. Therefore, it is necessary to identify the types of private data that consumers must disclose when they want to use the service. The questionnaire queried which apps a respondent has installed (a yes/no question for about ten popular examples for each app category, with the option to list further installed apps). For each of these apps, the Google Play Store publicly provides information on the permissions the respective app requests. Thus, for each of the five app categories under investigation, we know which apps an individual respondent has installed and which permissions each of these apps requests. Every requested permission k must be asked about in the survey to determine the respective consumers' privacy concerns. Table 3 presents the corresponding permissions for the mobile app context (p = 12). The permission groups listed in the Google Play Store are taken as a basis for ascertaining the requested permissions of the mobile apps. We use the permissions of the apps to determine the risk because the question of whether a particular app has been installed accurately and unambiguously determines whether the consumer is taking the risk. With other scales for privacy concerns (e.g., based on self-reported perceived risk), the risk would not be as clearly observable, and the data for determining the PPM would be diluted with the statements of consumers who are subject to paradoxical behavior. Analyzing the permissions requested by apps that a participant actually installed anchors the calculation in observed behavior, not reported perception.
Table 3

Example for determining the risk of a service based on survey items

I think it is critical when mobile apps access my …

- Device and app history (read sensitive log data, retrieve system internal state, retrieve running apps)
- Phone (directly call phone numbers, write call log, read call log, reroute outgoing calls)
- Identity (find accounts on the device, add or remove accounts)
- Photos/media/files (read the contents of your USB storage, modify or delete the contents of your USB storage)
- Contacts (read and modify your contacts)
- Wi-Fi connection information (view Wi-Fi connections)
- Calendar (read calendar events plus confidential information, add or modify calendar events, and send email to guests without owners' knowledge)
- Location (approximate location (network-based), precise location (GPS and network-based), access extra location provider commands)
- Microphone (record audio)
- Camera (take pictures and videos)
- SMS (receive text messages, send text messages)
- Device ID and call information (read phone status and identity)

Given the seven-point Likert scale, service risk_min is 1 and service risk_max is 7. We also use equal weights for all permissions for all participants for reasons of simplicity in this exemplary application, although certain permissions can carry more weight in a consumer's usage decision than others. To determine the risk of an app, only the permissions actually requested by the app are considered. With that, we ensure that apps requesting more permissions than others are considered more critical. This is implemented in the following calculation by assigning the value 0 to permissions that the app does not request.
$$ {service\ risk}_i=\sum \limits_{k=1}^{12}\frac{1}{12}\ast {permission}_{ik} $$

Further, to make the results comparable, service risk_i is scaled to the interval between 1 and 7.
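The risk computation can be sketched as follows. Function and variable names are ours, and because the paper does not spell out the exact rescaling to [1, 7], the linear map below is an assumption; the equal-weight sum with non-requested permissions entering as 0 follows Formula (10).

```python
def service_risk(concern, requested, p=12, risk_min=1.0, risk_max=7.0):
    """Service risk of consumer i: equal-weight average over all p permission
    groups of the consumer's concern ratings (1-7), where a permission group
    the app does not request enters with the value 0. The raw value is then
    rescaled linearly to [risk_min, risk_max] (an assumption; the paper only
    states that the result is scaled to between 1 and 7)."""
    raw = sum(rating if group in requested else 0.0
              for group, rating in concern.items()) / p
    return risk_min + (risk_max - risk_min) * raw / risk_max

# Hypothetical consumer rating all 12 permission groups with 6; the app requests 3 groups.
concern = {f"group_{k}": 6 for k in range(12)}
print(round(service_risk(concern, requested={"group_0", "group_1", "group_2"}), 4))  # → 2.2857
```

An app requesting no permissions thus yields the minimum risk of 1, while an app requesting all twelve groups from a maximally concerned consumer yields the maximum of 7.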

To determine consumers' real behavior, the consumer's service usage usage_i needs to be collected through questions on the mobile apps that each consumer i has installed. Thus, the usage formula can be detailed: \( {usage}_i=\left\{\begin{array}{c}0,\kern0.5em for\ no\ app\ installation\\ {}1,\kern0.5em for\ app\ installation\end{array}\right. \).

Subsequently, we calculate the \( \widehat{benefit\ threshold} \) and \( \widehat{risk\ threshold} \). We use the median to transform the distributions Benefit and Risk into the scalars \( \widehat{benefit\ threshold} \) and \( \widehat{risk\ threshold} \). In doing so, we separate the higher half from the lower half of the data and divide the survey results into two parts of approximately the same size. For the app category navigation, for example, our survey yields \( \widehat{benefit\ threshold}=4.17 \) and \( \widehat{risk\ threshold}=3.75 \).

Knowing all input parameters and the classification_i, it is possible to categorize consumers into the different types of paradoxical (P_U and P_N) and rational (R_U and R_N) behavior on the one side and the NAs on the other side. Finally, we can calculate PPM_U, PPM_N, and PPM. In our practical application, we use α = 0.5 as the weighting factor, as we weight PPM_U and PPM_N equally.
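The classification step can be sketched as follows. The quadrant-to-segment mapping is our reading of the paper's segmentation (rational usage for high benefit and low risk, paradoxical usage for low benefit and high risk, not assignable when benefit and risk fall on the same side of their thresholds, with non-usage mirrored), so treat it as an illustration rather than the authors' exact rule; the thresholds are the navigation values reported above.

```python
def classify(benefit, risk, usage, benefit_threshold, risk_threshold):
    """Assign a consumer to a segment of classification_i (labels as in the paper;
    the quadrant mapping itself is our assumption)."""
    high_benefit = benefit > benefit_threshold
    high_risk = risk > risk_threshold
    if usage:  # usage_i = 1
        if high_benefit and not high_risk:
            return "R_U"   # rational usage
        if not high_benefit and high_risk:
            return "P_U"   # paradoxical usage
        return "NA_U1" if high_benefit else "NA_U2"  # not assignable
    # usage_i = 0
    if not high_benefit and high_risk:
        return "R_N"       # rational non-usage
    if high_benefit and not high_risk:
        return "P_N"       # paradoxical non-usage
    return "NA_N1" if high_benefit else "NA_N2"

# Median thresholds for navigation apps from the survey; a user with low perceived
# benefit and high perceived risk is classified as paradoxical:
print(classify(benefit=3.5, risk=5.0, usage=1,
               benefit_threshold=4.17, risk_threshold=3.75))  # → P_U
```

Counting consumers per returned label yields the segment sizes that feed Formulas (6) to (8).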

Results of the survey on mobile apps

Characterization of the sample

By distributing the questionnaire to students and university employees at the 150 largest universities in Germany, we were able to recruit 715 participants. Because our participants are on average 24 years old (ranging from 18 to 65 years of age), we cannot claim that the results are representative of the entire population. However, the point is not to obtain representative measures of the PPM but to demonstrate the application of the metric. Our sample is highly educated (57% higher education entrance qualification, 26% bachelor's degree, 8% master's degree, 9% other) and predominantly female (60%). Most of the participants (60%) are familiar with a smartphone and have used one for two years or longer; of these, 23% have used smartphones for longer than five years. The frequency of new app installations is distributed in descending order: 45% of participants install less than one new app a month, 31% install one app a month, and 24% install more than one app a month. Only 8% of participants have no navigation app installed, whereas 63% have no radio app. The maximum average number of installed apps within an app category is 1.12 for navigation, and the minimum is 0.48 for radio apps. Thus, the app categories navigation and radio represent the two extremes of most and fewest apps within an app category.

Assessment of app benefit

Table 4 shows the mean benefit values and the standard deviation for the five app categories, distinguished between the two factors and in total. We calculate factor scores as the arithmetic mean of responses to ensure comparability across the app categories. Based on the mean values of performance expectancy and hedonic motivation, the five app categories can be divided into three groups: predominantly performance apps (navigation and note), predominantly hedonic apps (radio and picture editing), and balanced apps (running).
Table 4

Results of the EFA for the app benefits of the app categories (performance expectancy and hedonic motivation per app category; table values not reproduced here)

Assessment of app risk

In our survey, we asked all 715 respondents about their attitude toward app permissions (see Table 3). Thus, we can rank the app permissions by perceived risk on a scale from 1 (lowest) to 7 (highest). The data show that respondents perceive the app permissions Phone (mean = 6.13, SD = 1.34), Contacts (mean = 6.07, SD = 1.39), and Identity (mean = 6.02, SD = 1.37) as particularly critical, while Location (mean = 5.36, SD = 1.76), Device and app history (mean = 5.32, SD = 1.70), and Wi-Fi connection information (mean = 4.93, SD = 1.88) are the least critical.

Table 5 shows the mean, standard deviation, and the minimum and maximum values of service risk_i by app category, which consider the permissions requested by an app (see Formula 10). This result illustrates that navigation has the highest mean value regarding both risk and benefit. Further uni-dimensional results for app benefit and app risk can be found in Appendix 2.
Table 5

Distribution characteristics of app risk by app category (mean, standard deviation, minimum, and maximum of service risk_i per app category; table values not reproduced here)

Results of the privacy paradox metric in mobile apps

Although consumers not using a beneficial service only miss the added value, the usage of critical services is privacy-relevant \( \left(\overset{\sim }{P_U}\right) \) because private data are disclosed. Consequently, we focus on the more privacy-relevant case of service usage (usage_i = 1) in the following section. Using the surveyed data, we can instantiate the metric presented in the “Basic Concept and Calculation” section. Figure 2 presents an exemplary distribution of consumers for navigation apps in the case of service usage (usage_i = 1).
Fig. 2

Distribution of participants (n = 658) in segments for navigation apps in the case of service usage (usage_i = 1)

These values represent the basis for the calculation of the PPM in the case of service usage (PPM_U). For instance, the PPM_U for navigation apps can be calculated as presented in Formula (11):
$$ {PPM}_U=\frac{\overset{\sim }{P_U}}{\overset{\sim }{P_U}+\overset{\sim }{R_U}+\overset{\sim }{NA_{U1}}+\overset{\sim }{NA_{U2}}}=\frac{289}{289+58+261+50}=43.92\% $$
The analogous results of PPM_U for all app categories are shown in Fig. 3. Here, n represents the subsample of participants who have installed an app of the respective app category.
Fig. 3

Distribution of participants in segments and results of PPM_U for all app categories

Table 6 additionally shows the results of the PPM in the case of non-service usage, PPM_N, and the integrated metric PPM for α = 0.5, i.e., an equal weighting of service usage and non-usage.
Table 6

Results of PPM_U, PPM_N, and PPM for all app categories (table values not reproduced here)

As seen in Fig. 3, in this example the sum of \( \overset{\sim }{NA_{U1}} \) and \( \overset{\sim }{NA_{U2}} \) has a similar magnitude across all app categories. In contrast, the proportion of rational versus paradoxical behavior shows significant differences. Consumers using radio and picture editing apps (hedonic apps) show a relatively low PPM_U compared with navigation and note apps (performance apps). For the latter, many consumers use an app even though they perceive a minor benefit and a significant risk. This insight can be relevant for different interest groups, such as app providers, app stores, consumer protection organizations, and app users. App providers can use the results of the PPM_U to prevent serious consequences such as image damage or consumer migration. These consequences can arise when privacy-concerned consumers become aware of their paradoxical privacy behavior because of incidents such as leaked data misuse. App stores might use privacy as a competitive factor by applying the PPM_U to create consumer awareness of their paradoxical behavior. Consumer protection organizations can apply the PPM_U to draw attention to the privacy paradox and protect consumers. Finally, the PPM_U reminds consumers themselves of their possible misconduct regarding privacy. With this knowledge, consumers can scrutinize their behavior and protect themselves from unwanted data disclosure.

Evaluation of the privacy paradox metric

The evaluation of design artifacts and design theories is a central part of design science research (Hevner et al. 2004; Peffers et al. 2007). In this paper, the evaluation demonstrates the utility, quality, and efficacy of the PPM using an elaborated evaluation method (Hevner et al. 2004). We evaluate the PPM against the requirements compiled in the “Requirements for the Privacy Paradox Metric” section to ensure the rigor of the research and to demonstrate the metric's utility in real situations.

Quantifiability
To calculate the PPM, consumer data are required. In “Determination of the Privacy Paradox Metric in Mobile Apps” section, we present the possibility of quantifying the input parameters and demonstrate how the variables can be calculated in detail in the context of mobile apps. Additionally, we define the calculation rules to determine the PPM resulting in a percentage. Thus, the PPM meets the requirement by quantifying the input parameters and the result.

Precision
By specifying the components of the PPM and defining its calculation rules (see “Basic Concept and Calculation” section), we ensure its precision during determination and that the measurements are taken under prescribed conditions. This also ensures the repeatability of the PPM calculation (see “Results of the Privacy Paradox Metric in Mobile Apps” section).

Comparability
The result of the PPM is a standardized percentage value, which is easy to compare. A “percentage simply converts a proportion to terms of per-hundred units” (Herrmann 2007, p. 33).

Obtainability
The obtainability of the data depends on the digital service and the context. In the case of mobile apps, all data can simply be collected by conducting a consumer survey. Real consumer behavior concerning data disclosure can also be identified because consumers are only able to install mobile apps if they accept the permissions and, consequently, release their data.

Interpretability
Because the PPM is a percentage, the metric is easy to interpret. PPM_U represents the percentage of service users behaving paradoxically, PPM_N the percentage of non-users behaving paradoxically, and the integrated PPM the share of all consumers who behave paradoxically.

Usefulness
The information provided by the PPM offers several advantages for consumers, companies offering digital services, ICT platform providers, and consumer protection organizations, such as raising awareness of data privacy, more sensitive data disclosure, and improvements in consumer services. In the context of mobile apps, the interest groups are app users, app providers, app stores, and consumer protection organizations. The usefulness of the PPM is discussed in detail in the “Introduction” section. The implications of the PPM results are presented in the “Discussion and Conclusion” section.

Economy
Economic value is strongly application-dependent because varying costs and benefits can result from the data collection. As the PPM is currently the only metric measuring consumers' privacy paradox, there is no alternative against which its economy of cost and benefit could be benchmarked. Therefore, future research should consider these economic aspects when extending the PPM or defining new metrics that measure the privacy paradox.

To summarize, we emphasize that the PPM meets all requirements in the context of mobile apps. However, we cannot generalize that the PPM fulfills all requirements in the context of other digital services. Some requirements are context-sensitive and must be examined before adapting the PPM to other fields of application. Given that the metric presented in this study is the first quantification of the privacy paradox, no other privacy paradox metric exists that could outperform the PPM on these requirements.

Discussion and conclusion

The privacy paradox is well known in the literature; however, to date, no approach has measured its extent. To better support the investigation of the privacy paradox, we designed the PPM. This metric uses the theoretical basis of the privacy calculus and the observation of real consumer behavior to determine whether a consumer behaves paradoxically.

A metric is a human-created artifact. We chose a design science research approach and present the metric as a design science artifact to clearly highlight this artificial nature, to explicitly specify the general requirements we see for such a metric, and to present it both for use and as a benchmark reference for metrics that may be designed in the future. We posit that the PPM is a generalizable metric, applied here to mobile apps as an example of digital services. As such, it contributes to a nascent design theory on quantifying the privacy paradox. Appendix 1 further details this perspective by discussing the PPM in terms of the components of a design theory as suggested by Gregor and Jones (2007).

The PPM provides several important insights and implications for research and for different interest groups, including service consumers, companies offering digital services, ICT platform providers such as app stores, and consumer protection organizations. For research, the privacy paradox often represents a major limitation in empirical studies of consumers' privacy intentions. Accordingly, the PPM is a tool that may help validate respective research findings; it is thereby the first approach to identify and quantify deviations between consumers' privacy intention and behavior. In practice, service consumers could benefit from the PPM if it were implemented, for instance, as a smartphone application that monitors the installation and use of other apps. In this way, the PPM could provide transparency about an individual consumer's privacy paradox, which might save consumers from careless disclosure of data and thus from unwanted consequences regarding data privacy. Customers typically do not realize privacy invasions at the point of data disclosure but rather once their consequences become apparent. At that point, not only consumers sustain damage but also the company offering the respective service, as consumers might be dissatisfied, leave the company, or generate negative word of mouth, for instance. That is, companies offering digital services can use the PPM to identify careless consumer decisions toward data disclosure and manage the risk related to such decisions. More concretely, based on the PPM, companies might decide to provide warnings at the point of possible data disclosure and suggest alternative digital services that better fit the consumer's privacy intentions. The data to identify the privacy intention can, for example, be collected during a field study to determine the PPM in general, to make statements about specific customer groups, or during use of the digital service.
With the privacy intention known, paradoxical behavior can be detected whenever actual behavior does not fit this intention. Accordingly, ICT service providers, such as app stores, might use the PPM to enhance their attractiveness by providing privacy-customized warnings, suggestions, sorting, or filtering based on the PPM. For example, they can query each customer's privacy intention upon first use of the ICT platform to detect the gap between intention and behavior before data are disclosed. Consumer protection organizations might, for instance, take the PPM as a basis for further empirical studies that increase public awareness of the risks related to the privacy paradox. Beside these stakeholders, society itself can benefit from awareness of the privacy paradox when aiming at “understanding, anticipating, and proposing solutions for potential future negative consequences of ICT” (Lynne and Mentzer 2014).

Our research is subject to limitations that require further investigation. First, the PPM is based on the binary segmentation of consumers into non-service usage (usage_i = 0) and service usage (usage_i = 1) on the one hand and on their classification into the four quadrants formed by the thresholds \( \widehat{benefit\ threshold} \) and \( \widehat{risk\ threshold} \) on the other hand. Therefore, consumers classified at the edges of these segments could belong to adjacent ones if they provided slightly different survey responses. Thus, the result of calculating the PPM depends on the specification of the exact boundary values. In this paper, we provide examples, such as using the median for \( \widehat{benefit\ threshold} \) and \( \widehat{risk\ threshold} \), but there are no definite guidelines. Future research might explore more fine-grained classifications and identify and evaluate alternative divisions. Second, the evaluation of the PPM regarding the seven requirements is based on a single expository instantiation for mobile apps. Future research might apply the PPM to other digital services; thereby, its evaluation might be strengthened and its boundaries tested. Third, the expository instantiation of the PPM uses a few simplifications. These include aggregating apps into categories and assuming homogeneity within a category. Further, for simplicity, we used equal weights for the two benefit dimensions and the twelve app permissions for all participants, although the basic idea of the model permits differing weights as well. Additionally, the survey participants were not representative of the entire population. Finally, we applied the PPM in a research project; to date, there has been no application in an industry setting. For further evaluation, particularly regarding usefulness, a practical application in an industry context would be beneficial.

Overall, we presented the PPM, a privacy paradox metric for digital services, as a design artifact that enables the assessment of consumers' privacy paradox. We followed the design science research methodology of Peffers et al. (2007) to develop the metric. Based on the context of the problem and the theoretical background, we identified metric requirements and presented the basic idea, form, and functions of the PPM. Furthermore, we demonstrated its practical applicability in the context of mobile apps as an example of digital services and evaluated the metric in terms of quantifiability, precision, comparability, obtainability, interpretability, usefulness, and economy. We hope that this quantitative perspective on the privacy paradox contributes to improvements in the disclosure and use of private data.

Supplementary material

12525_2018_303_MOESM1_ESM.pdf (ESM 1, PDF, 109 kb)


  1. Abdelzaher, T., Anokwa, Y., Boda, P., Burke, J., Estrin, D., Guibas, L., ... Reich, J. (2007). Mobiscopes for human spaces. IEEE Pervasive Computing, 6(2), 20–29.
  2. Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the 5th ACM Conference on Electronic Commerce (pp. 21–29).
  3. Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the Facebook. In Proceedings of the 6th Workshop on Privacy Enhancing Technologies (pp. 36–58).
  4. Acquisti, A., & Grossklags, J. (2004). Privacy attitudes and privacy behavior. In L. J. Camp & S. Lewis (Eds.), Economics of information security (pp. 165–178). Boston: Kluwer Academic Publishers.
  5. Acquisti, A., & Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Security and Privacy, 3(1), 26–33.
  6. Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.
  7. Alt, R., Militzer-Horstmann, C., & Zimmermann, H.-D. (2015). Editorial 25/2: electronic markets and privacy. Electronic Markets, 25(2), 87–90.
  8. Becker, M., Lehrig, S., & Becker, S. (2015). Systematically deriving quality metrics for cloud computing systems. In Proceedings of the 6th ACM/SPEC International Conference on Performance Engineering (pp. 169–174). New York, USA.
  9. Bélanger, F., & Crossler, R. E. (2011). Privacy in the digital age: a review of information privacy research in information systems. MIS Quarterly, 35(4), 1017–1042.
  10. Bélanger, F., Hiller, J. S., & Smith, W. J. (2002). Trustworthiness in electronic commerce: the role of privacy, security, and site attributes. The Journal of Strategic Information Systems, 11(3–4), 245–270.
  11. Berendt, B., Günther, O., & Spiekermann, S. (2005). Privacy in e-commerce: stated preferences vs. actual behavior. Communications of the ACM, 48(4), 101–106.
  12. Böhme, R., & Freiling, F. C. (2008). On metrics and measurements. In I. Eusgeld (Ed.), Lecture notes in computer science: Vol. 4909. Dependability metrics. Advanced lectures (pp. 7–13). Berlin: Springer.
  13. Bouwers, E., van Deursen, A., & Visser, J. (2013). Evaluating usefulness of software metrics: An industrial experience report. In Proceedings of the 35th International Conference on Software Engineering (pp. 921–930).
  14. Brislin, R. W. (1970). Back-translation for cross-cultural research. Journal of Cross-Cultural Psychology, 1(3), 185–216.
  15. Buchanan, T., Paine, C., Joinson, A. N., & Reips, U. (2007). Development of measures of online privacy concern and protection for use on the internet. Journal of the American Society for Information Science and Technology, 58(2), 157–165.
  16. Buck, C., Horbel, C., Germelmann, C. C., & Eymann, T. (2014). The unconscious app consumer. In Proceedings of the 22nd European Conference on Information Systems (ECIS 2014), Tel Aviv, June 9–11, 2014.
  17. Chellappa, R. K., & Sin, R. G. (2005). Personalization versus privacy: an empirical examination of the online consumer’s dilemma. Information Technology and Management, 6(2–3), 181–202.
  18. Cho, H., Rivera-Sánchez, M., & Lim, S. S. (2009). A multinational study on online privacy: global concerns and local responses. New Media & Society, 11(3), 395–416.
  19. Culnan, M. J., & Armstrong, P. K. (1999). Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation. Organization Science, 10(1), 104–115.
  20. Culnan, M. J., & Bies, R. J. (2003). Consumer privacy: balancing economic and justice considerations. Journal of Social Issues, 59(2), 323–342.
  21. Cunningham, S. M. (1967). The major dimensions of perceived risk. In D. F. Cox (Ed.), Risk taking and information handling in consumer behavior (pp. 82–111). Cambridge: Harvard University Press.
  22. Degirmenci, K., Guhr, N., & Breitner, M. (2013). Mobile applications and access to personal information: a discussion of users’ privacy concerns. In Proceedings of the 34th International Conference on Information Systems (ICIS 2013), Milan, December 15–18, 2013.
  23. Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61–80.
  24. Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., & Colautti, C. (2006). Privacy calculus model in e-commerce: a study of Italy and the United States. European Journal of Information Systems, 15(4), 389–402.
  25. Egelman, S., Felt, A. P., & Wagner, D. (2013). Choice architecture and smartphone privacy: There’s a price for that. In R. Böhme (Ed.), The economics of information security and privacy (pp. 211–236). Heidelberg: Springer.
  26. Erl, T., Puttini, R., & Mahmood, Z. (2013). Cloud computing: concepts, technology and architecture. Upper Saddle River, NJ: Prentice Hall.
  27. Even, A., & Shankaranarayanan, G. (2007). Utility-driven assessment of data quality. ACM SIGMIS Database, 38(2), 75–93.
  28. Felt, A. P., Ha, E., Egelman, S., Haney, A., Chin, E., & Wagner, D. (2012). Android permissions: user attention, comprehension, and behavior. In Proceedings of the 8th Symposium on Usable Privacy and Security (SOUPS 2012), Washington, DC, July 11–13, 2012.
  29. Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention and behavior: An introduction to theory and research. Reading, MA: Addison-Wesley.
  30. Graupner, E., Melcher, F., Demers, D., & Maedche, A. (2015). Customers’ intention to use digital services in retail banking: an information processing perspective. Proceedings of the 23rd European Conference on Information Systems (ECIS 2015), Münster, May 26–29, 2015.Google Scholar
  31. Gregor, S. (2006). The nature of theory in information systems. MIS Quarterly, 30(3), 611–642.CrossRefGoogle Scholar
  32. Gregor, S., & Hevner, A. R. (2013). Positioning and presenting design science research for maximum impact. MIS Quarterly, 37(2), 337–356.CrossRefGoogle Scholar
  33. Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8(5), 312–335.CrossRefGoogle Scholar
  34. Hashem, I. A. T., Yaqoob, I., Anuar, N. B., Mokhtar, S., Gani, A., & Khan, S. U. (2015). The rise of “big data” on cloud computing: review and open research issues. Information Systems, 47, 98–115. Scholar
  35. Hauff, S., Veit, D., & Tuunainen, V. (2015). Towards a taxonomy of perceived consequences of privacy-invasive practices. Proceedings of the 23rd European Conference on Information Systems (ECIS 2015), Münster, May 26–29, 2015.Google Scholar
  36. Hauser, J., & Katz, G. (1998). Metrics: you are what you measure! European Management Journal, 16(5), 517–528.CrossRefGoogle Scholar
  37. Hawkey, K., & Inkpen, K. M. (2006). Keeping up appearances: Understanding the dimensions of incidental information privacy. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: ACM Press.
  38. Heimbach, I., Gottschlich, J., & Hinz, O. (2015). The value of user’s facebook profile data for product recommendation generation. Electronic Markets, 25(2), 125–138. Scholar
  39. Herrmann, D. S. (2007). Complete guide to security and privacy metrics: Measuring regulatory compliance, operational resilience, and ROI. Boca Raton, FL: Auerbach Publications.CrossRefGoogle Scholar
  40. Hevner, A. R., March, S. T., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28(1), 75–105.CrossRefGoogle Scholar
  41. Horbach, M., & Horbach, M. (Eds.). (2013). Informatik 2013: Informatik angepasst an Mensch, Organisation und Umwelt. Koblenz: Bonner Köllen Verlag.Google Scholar
  42. Hui, K.-L., Tan, B. C. Y., & Goh, C.-Y. (2006). Online information disclosure: motivators and measurements. ACM Transactions on Internet Technology (TOIT), 6(4), 415–441. Scholar
  43. Jensen, C., Potts, C., & Jensen, C. (2005). Privacy practices of internet users: self-reports versus observed behavior. International Journal of Human-Computer Studies, 63(1), 203–227.CrossRefGoogle Scholar
  44. Kaiser, M., Klier, M., & Heinrich, B. (2007). How to measure data quality? A metric-based approach. Proceedings of the 28th International Conference on Information Systems (ICIS 2007), Montreal, December 9–12, 2007.Google Scholar
  45. Keith, M. J., Thompson, S. C., Hale, J., & Greer, C. (2012). Examining the rationality of information disclosure through mobile devices. Proceedings of the 33rd International Conference on Information Systems (ICIS 2012), Orlando, December 16–19, 2012.Google Scholar
  46. Keith, M. J., Thompson, S. C., Hale, J., Lowry, P. B., & Greer, C. (2013). Information disclosure on mobile devices: re-examining privacy calculus with actual user behavior. International Journal of Human-Computer Studies, 71(12), 1163–1173.CrossRefGoogle Scholar
  47. Keith, M. J., Babb, J. S., & Lowry, P. B. (2014). A longitudinal study of information privacy on mobile devices. In Proceedings of the 47th Hawaii International Conference on System Sciences (pp. 3149–3158).Google Scholar
  48. Kotler, P., & Armstrong, G. M. (2010). Principles of marketing. Upper Saddle River, NY: Pearson Prentice Hall.Google Scholar
  49. Krasnova, H., & Veltri, N. F. (2010). Privacy calculus on social networking sites: explorative evidence from Germany and USA. Proceedings of the 43rd Hawaii International Conference on System Sciences (HICSS 2010), January 5–8, 2010.Google Scholar
  50. Krasnova, H., Günther, O., Spiekermann, S., & Koroleva, K. (2009). Privacy concerns and identity in online social networks. Identity in the Information Society, 2(1), 39–63.CrossRefGoogle Scholar
  51. Kumaraguru, P., & Cranor, L. F. (2005). Privacy indexes: A survey of Westin’s studies. Technical Report, CMUISRI-05-138, Carnegie Mellon University, Institute of Software Research.Google Scholar
  52. Laufer, R. S., & Wolfe, M. (1977). Privacy as a concept and a social issue: a multidimensional developmental theory. Journal of Social Issues, 33(3), 22–42.CrossRefGoogle Scholar
  53. Liggesmeyer, P. (2009). Software-Qualität: Testen, Analysieren und Verifizieren von Software. Heidelberg: Spektrum Akademischer Verlag.CrossRefGoogle Scholar
  54. Linkov, I., Welle, P., Loney, D., Tkachuk, A., Canis, L., Kim, J. B., & Bridges, T. (2011). Use of multicriteria decision analysis to support weight of evidence evaluation. Risk Analysis, 31(8), 1211–1225. Scholar
  55. Lioudakis, G. V., Koutsoloukas, E. A., Dellas, N. L., Tselikas, N., Kapellaki, S., Prezerakos, G. N.,. ... Venieris, I. S. (2007). A middleware architecture for privacy protection. Computer Networks, 51(16), 4679–4696.
  56. Markus, M. L., & Mentzer, K. (2014). Foresight for a responsible future with ICT. Information Systems Frontiers, 16.
  57. Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004). Internet users’ information privacy concerns (IUIPC): the construct, the scale, and a causal model. Information Systems Research, 15(4), 336–355.
  58. Mason, R. O. (1986). Four ethical issues of the information age. MIS Quarterly, 10(1), 5–12.
  59. Merriam Webster. (2017). Definition of metric. Retrieved from
  60. Min, J., & Kim, B. (2015). How are people enticed to disclose personal information despite privacy concerns in social network sites? The calculus between benefit and cost. Journal of the Association for Information Science and Technology, 66(4), 839–857.
  61. Nissenbaum, H. (1997). Toward an approach to privacy in public: Challenges of information technology. Ethics & Behavior, 7(3), 207–219.
  62. Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126.
  63. Offermann, P., Blom, S., Schönherr, M., & Bub, U. (2010). Artifact types in information systems design science – a literature review. In D. Hutchison, T. Kanade, J. Kittler, J. M. Kleinberg, F. Mattern, J. C. Mitchell, ... S. Aier (Eds.), Global perspectives on design science research (pp. 77–92). Heidelberg: Springer.
  64. Palmer, J. W. (2002). Web site usability, design, and performance metrics. Information Systems Research, 13(2), 151–167.
  65. Peffers, K., Tuunanen, T., Rothenberger, M. A., & Chatterjee, S. (2007). A design science research methodology for information systems research. Journal of Management Information Systems, 24(3), 45–77.
  66. Roeber, B., Rehse, O., Knorrek, R., & Thomsen, B. (2015). Personal data: how context shapes consumers’ data sharing with organizations from various sectors. Electronic Markets, 25(2), 95–108.
  67. Schreiner, M., & Hess, T. (2015). Why are consumers willing to pay for privacy? An application of the privacy-freemium model to media companies. Proceedings of the 23rd European Conference on Information Systems (ECIS 2015), Münster, May 26–29, 2015.
  68. Sheng, H., Nah, F. F.-H., & Siau, K. (2008). An experimental study on ubiquitous commerce adoption: impact of personalization and privacy concerns. Journal of the Association for Information Systems, 9(6), 15.
  69. Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: measuring individuals’ concerns about organizational practices. MIS Quarterly, 20(2), 167–196.
  70. Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: an interdisciplinary review. MIS Quarterly, 35(4), 989–1016.
  71. Son, J.-Y., & Kim, S. S. (2008). Internet users’ information privacy-protective responses: a taxonomy and a nomological model. MIS Quarterly, 32(3), 503–529.
  72. Spiekermann, S., Grossklags, J., & Berendt, B. (2001). E-privacy in 2nd generation e-commerce: privacy preferences versus actual behavior. Proceedings of the 3rd ACM Conference on Electronic Commerce.
  73. Spiekermann, S., Acquisti, A., Böhme, R., & Hui, K.-L. (2015). The challenges of personal data markets and privacy. Electronic Markets, 25(2), 161–167.
  74. Stone, E. F., Gueutal, H. G., Gardner, D. G., & McClure, S. (1983). A field experiment comparing information: privacy values, beliefs, and attitudes across several types of organizations. Journal of Applied Psychology, 68(3), 459.
  75. Stutzman, F., Gross, R., & Acquisti, A. (2013). Silent listeners: the evolution of privacy and disclosure on Facebook. The Journal of Privacy and Confidentiality, 4(2), 7–41.
  76. Tene, O., & Polonetsky, J. (2012). Privacy in the age of big data: a time for big decisions. Stanford Law Review Online, 64, 63–69.
  77. van Slyke, C., Shim, J. T., Johnson, R., & Jiang, J. J. (2006). Concern for information privacy and online consumer purchasing. Journal of the Association for Information Systems, 7(6), 415–444.
  78. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: toward a unified view. MIS Quarterly, 27(3), 425–478.
  79. Venkatesh, V., Thong, J. Y. L., & Xu, X. (2012). Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178.
  80. Wallmüller, E. (2001). Software-Qualitätsmanagement in der Praxis: Software-Qualität durch Führung und Verbesserung von Software-Prozessen. München: Hanser.
  81. Wei, X., Gomez, L., Neamtiu, I., & Faloutsos, M. (2012). Malicious android applications in the enterprise: What do they do and how do we fix it? In Proceedings of the 28th International Conference on Data Engineering Workshops (pp. 251–254).
  82. Xu, H., Teo, H.-H., Tan, B. C. Y., & Agarwal, R. (2009). The role of push-pull technology in privacy calculus: the case of location-based services. Journal of Management Information Systems, 26(3), 135–174.
  83. Zhan, J., & Rajamani, V. (2008). The economics of privacy-privacy: People, policy and technology. Proceedings of the 2nd International Conference on Information Security and Assurance.
  84. Zhou, T. (2013). Examining continuous usage of location-based services from the perspective of perceived justice. Information Systems Frontiers, 15, 141–150.
  85. Zukowski, T., & Brown, I. (2007). Examining the influence of demographic factors on internet users’ information privacy concerns. Proceedings of the 2007 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries.

Copyright information

© Institute of Applied Informatics at University of Leipzig 2018

Authors and Affiliations

  1. FIM Research Center, Augsburg, Germany
  2. Fraunhofer FIT - Project Group Business and Information Systems Engineering, Augsburg, Germany
