Consider three illustrative privacy issues online:

  1. Through ‘Sponsored Stories,’ Facebook users who clicked on ‘like’ buttons had pictures of themselves with an endorsement sent to their friends in what looked like sponsored advertising (Kravets 2012).

  2. The travel site Orbitz tracks how users arrived at their site in order to prioritize search results: if a user arrived at Orbitz from a competitor’s site, Orbitz may prioritize results based on price (Mattioli 2012). Similarly, Facebook mines users’ browser history in order to target advertising.

  3. Verizon offers a service—Precision Market Insights—to business customers to mine Verizon’s customer call and web browsing information in order to map where people are located and the types of services they purchase and use (Hill 2012). In an aptly titled article, “Verizon Very Excited That It Can Track Everything Phone Users Do And Sell That To Whomever Is Interested,” Kashmir Hill outlines the service Verizon offers to businesses to track their potential customers: “we [Verizon] understand what our customers’ daily activity stream is…,” and Verizon sells that activity stream to their commercial customers.

In each case, individuals willingly divulged information—clicked like, visited a travel site, watched a basketball game in a stadium—yet held different privacy expectations within the different contexts. For example, location information is expected to be used and tracked from a travel website (Martin and Shilton 2015); yet it is a surprise when information is used to track movement to and from a basketball game. Individuals share preferences with some friends—but not all. Users’ different norms and expectations across contexts have been a source of frustration to firms and academics alike.

To explain variances in privacy expectations, previous work relies on a static, universal definition of privacy expectations and measures differences in individuals’ concerns, attitudes, or valuations of privacy, as illustrated in Table 1. In privacy scholarship, the access-view of privacy suggests that individuals have a reasonable expectation of privacy so long as they and their information are inaccessible or hidden (Warren and Brandeis 1890; Elgesem 1999; Persson and Hansson 2003; Schoeman 1984; Posner 1981). Online, the access-view would categorize the act of sharing information as necessarily giving up any expectation of privacy. When individuals use a phone, watch a basketball game, or click ‘like,’ they are seen as not having privacy expectations because all of the information was accessible. The question then becomes, ‘why did the users divulge the information at all?’

Table 1 Ethical implications of privacy approaches

Alternatively, the control-view of privacy (Westin 1967; Alder et al. 2007; Margulis 1977; Altman 1975; Moor 1997) suggests that relinquishing control of information to another party renders the individual without any reasonable expectation of privacy. Online, the control-view of privacy is regulated through adequate notice and choice in Fair Information Practices (FIPs; Bennett 1992; Ashworth and Free 2006; Peslak 2005; Culnan and Armstrong 1999; Bowie and Jamal 2006). FIPs allow for the contemporaneous disclosure of information and respect of privacy norms while online. [Footnote 1] Although popular, notice and choice statements may be immaterial—nonfactors—in assessments of the appropriateness or inappropriateness of the information transmitted within a particular context. In fact, each of the three examples was argued to comply with the written privacy notices, and users agreed to the notice upon engaging with the service; yet all three examples caused privacy advocates to bring lawsuits or provided the impetus for articles exposing the firms’ behavior. In other words, individuals, employees, users, and consumers make judgments about privacy expectations and violations regardless of the notice and choice policy in many situations.

Recent work on privacy suggests that privacy norms can be viewed as mutually beneficial and sustainable agreements within a community (Martin 2012) or as context-dependent norms (Nissenbaum 2004, 2009). These social contracts are the unstated agreements that individuals and groups make in contexts, communities, and relationships. Studies also substantiate the theory: 71 % of respondents would disclose personal information within an established relationship (Louis Harris and Associates and Westin 1997; Culnan and Bies 2003), and individuals within a particular community, such as teams or young adults, develop substantive privacy norms not easily recognized or understood by outsiders (Martin 2012; Turow et al. 2009). In other words, individuals give access to information within a particular context with an understanding of the privacy rules that govern that context.

Understanding the factors that drive mutually beneficial and sustainable privacy norms within communities is important to firms in order to best meet the privacy expectations of stakeholders such as consumers, users, and employees. [Footnote 2] Not only does meeting consumer privacy expectations increase purchase intentions and consumers’ likelihood to transact with a firm (Cases et al. 2010; Eastlick et al. 2006), but meeting consumer privacy expectations also increases trust in a firm (McCole et al. 2010), while violating privacy expectations leads to adverse consumer reactions (Miyazaki 2009). Importantly for business ethicists, privacy violations are experienced as individual harms (Calo 2011) and as unfair acts (Ashworth and Free 2006).

While privacy as a social contract—a mutually beneficial agreement within a community about how information is used and shared—has been introduced theoretically (Culnan and Bies 2003) and empirically (Martin 2012), the full impact on firms of an alternative framework for respecting the privacy expectations of users, consumers, and employees has not been examined. Importantly for researchers and firms, questions remain about how to identify microsocial contract norms about privacy and what is taken into consideration in forming those privacy norms.

This paper further develops a social contract approach to generating, acknowledging, and protecting privacy norms within specific contexts (Martin 2012). The goal of this paper is to examine how information norms develop through a social contract narrative, to reframe possible privacy violations of business given the social contract approach to privacy, and to critically examine the role of business as a contractor in developing privacy norms. The social contract approach “need not—and seldom does—eliminate all questions from a moral quandary. But it can provide logical vantage points from which to view an ethical quandary and, in turn, point towards a solution” (Donaldson and Dunfee 2003, p. 115).

Understanding the underpinnings of social contract privacy norms will allow researchers and practitioners to identify the factors driving privacy expectations. Based on this narrative, individuals within a given community discriminately share information with a particular set of obligations in mind as to who has access to the information and how it will be used. In other words, rather than giving away privacy, individuals discriminately share information within a particular community and with norms governing the use of their information. Most importantly for business and business ethics, privacy as a social contract shifts the focus from gaining the consent of the user, individual, employee, or consumer to the responsibilities of the firm as a contractor to maintain a mutually beneficial and sustainable solution. The beginning of this move can be seen in online sites and applications, such as diaspora*, TOR, DuckDuckGo, and YikYak, which place understanding and meeting the privacy expectations of users as part of their value proposition.

This paper proceeds as follows.

  • First, the social contract approach to privacy is explored by connecting privacy scholarship with existing social contract theory within business ethics—namely, Integrative Social Contracts Theory (ISCT).

  • Second, I examine the social contract narrative specifically around privacy; this social contract construct grounds microsocial contract privacy norms as the natural outgrowth of individuals living in a community. The narrative offered here suggests that individuals have an interest in discriminately sharing information within a particular community and helps explain the factors that contractors take into consideration in forming privacy expectations.

  • Third, online privacy violations are redescribed given the social contract approach to privacy to better understand how seemingly disparate privacy violations (Solove 2006) are related through a social contract approach to privacy.

  • Fourth, I discuss the implications of the social contract approach to privacy and the social contract narrative for alternative theories of privacy, which prove neither descriptively valid nor prescriptively useful.

  • Finally, I critically examine the role of business as a contractor in developing privacy norms and outline implications of a social contract approach to privacy on management research and practice in the implications and conclusion.

Privacy as a Social Contract

While a social contract approach to privacy has been suggested generally, here I examine what privacy as a social contract would entail within business ethics and ISCT before developing the social contract narrative.

Previous Links Between Privacy and Social Contract Theory

A growing body of theoretical scholarship has focused on privacy as contextually defined, where privacy norms are defined and examined within a specific set of relationships, situations, or contexts (Nissenbaum 2004, 2009; Solove 2006; Martin 2012; Stutzman and Hartzog 2012; Moor 1997; Jiang et al. 2002). Within these contextually defined privacy approaches, what is and is not private is dependent on relationships, actors, information, and context (Nissenbaum 2004, 2009; Solove 2006; Grimmelmann 2010; Tufekci 2008a, b; Sloan and Warner 2013). The rules used to develop privacy norms vary across contexts; therefore violations of privacy occur when these negotiated, context-dependent rules are broken. [Footnote 3]

Contextually dependent or relationship-dependent approaches to privacy, where privacy rules are negotiated and evolve within particular contexts or relationships, mirror a social contract approach to norms (Martin 2012). For example, privacy as contextual integrity suggests that privacy is respected when an information exchange meets the privacy norms of a context or a community of actors. These norms include not only the type of information expected, but also who will be able to see and use the information as well as the transmission principles associated with the information (Nissenbaum 2009). Similarly, social contract theory suggests that behavior, such as economic transactions or exchanges of information (Martin 2012), that resides within a community and whose effects reside within that community should be governed by the community’s locally negotiated norms.

Social Contract Theory in Business Ethics

Within business ethics, the conversation around social contract theory centers on Donaldson and Dunfee’s ISCT (1994, 1995, 1999) and Heugens et al.’s Contractualist Business Ethics (2006). [Footnote 4] While research has focused less on the application of ISCT and more on its philosophical underpinnings (Heugens et al. 2006, p. 729), ISCT has previously been used to explore particular ethical issues (Dunfee 2006, p. 313), including financial reporting and governance (Campbell et al. 2003), marketing (Dunfee et al. 1999), lying (Ross and Robertson 2000), deviance in organizations (Warren 2003), marketing credit to college students (Lucas 2001), and Internet adoption in the Arab world (Loch et al. 2003).

Of particular relevance to privacy, ISCT delineates two types of agreements, as cogently described by Donaldson and Dunfee (1999). First, a macrosocial contract sets up the space for individuals to develop rules of engagement—including privacy norms—within a particular community. Local communities are more than simply two-party relationships. A community is a “self-defined, self-circumscribed group of people who interact in the context of shared tasks, values, or goals and who are capable of establishing norms of ethical behavior for themselves” (Donaldson and Dunfee 1999, p. 262). Marriages, friends, teams, work groups, organizations, and organization–stakeholder relationships develop privacy norms particular to their community.

Second, contractors create and negotiate microsocial contracts within the community in order to resolve issues and place constraints on behavior. For example, Verizon’s collection of phone record data—the metadata of every phone call, including the caller, recipient, phone number, duration, and (possibly) GPS location—would fall within a microsocial contract between contractors (users) and Verizon, and Verizon would be expected to respect those privacy expectations with their customers regardless of the substance of the notice in the user agreement. Similarly, Facebook users have expectations as to who sees their information and how it is used (Martin 2011) regardless of the privacy notices—including how user information is manipulated for experiments (Albergotti 2014b). Empirically, respondents within a particular community have a better understanding of the privacy norms than outsiders (Martin 2012).

ISCT allows for locally negotiated microsocial contracts as well as the universal principles that transcend communities. The communities are afforded the moral free space to generate community-specific moral rules consistent with their members’ preferences and experiences (Donaldson and Dunfee 1999, p. 83; Dunfee 2006, p. 315). However, these communities must abide by procedural hypernorms of consent—usually manifest through the right of contractors to have voice and to exit. Microsocial contracts are legitimate only if the agreements conform to the procedural hypernorms of consent, voice, and exit, and microsocial contracts around privacy norms bind contractors only if the agreements are legitimate (Donaldson and Dunfee 1999). [Footnote 5]

Allowing privacy rules to vary based on the community or relationship mirrors expectations of privacy in the world. Similar to contractualist business ethics’ impact on global commerce in explaining how and why norms may vary across global contexts (Donaldson and Dunfee 1994; Van Oosterhout and Heugens 2009), the social contract approach to privacy explains how and why norms may vary across communities of actors, with important implications for research and practice.

Social Contract Narrative for Privacy

An important next step in exploring a social contract approach to privacy is the social contract narrative. The narrative can justify the moral rightness of a principle, explain the social and institutional fabric of a society (e.g., Nozick 1974), or explain the emergence, persistence, or stability of an extant social contract (Heugens et al. 2006). Here, a midlevel social contract narrative is used to explain and analyze the dynamic process of privacy norm generation within particular communities. [Footnote 6] Table 2 illustrates the social contract narrative applied here.

Table 2 Social contract narrative for expectations of privacy

The first step in walking through a social contract narrative is to specify an initial position. This position is prior to any agreement between parties and provides the setting for reasonable contracting, where individuals are assumed to have (1) an initial state and (2) behavioral tendencies. This first step provides the setting to create an agreement and asks not only what privacy norms contractors would agree to but also what contractors take into consideration. For firms and business ethicists, the output of this narrative will provide key facets of the microsocial contracts about privacy and the factors that contractors—such as users, consumers, and employees—take into consideration in developing privacy norms.

Initial Position

For an initial state, one would need to imagine a world where individuals have no communication or interaction with others and are in a state where information can easily remain inaccessible. Individuals in this initial state would live and work by themselves and maintain their living environment independently. Privacy, in such a world, only requires that individuals keep a solitary existence and not give access to their information to anyone. In this position, the individual is able to maintain privacy by remaining alone and hidden. This initial state would constitute a scattering of recluses.

In fact, this initial state remains a theme throughout privacy scholarship in that individuals continue to have an interest in being inaccessible to others by remaining isolated both physically and psychologically. The right to be left alone (Warren and Brandeis 1890) preserves liberty and autonomy as individuals are free to “develop personalities, goals, ideas, and the right to determine to whom their thoughts, emotions, sentiments, and tangible products are communicated” (Bloustein 1964, p. 18). Such a state of solitary inaccessibility corresponds to defining privacy as the ability to restrict access to personal information (e.g., Allen 1988) or as protection from information gathering (Tavani and Moor 2001). Privacy as restricted access prevents people from knowing certain things and implies that entering the public sphere requires giving up a measure of privacy (Alfino and Mayes 2006). According to the restricted-access view of privacy in this original state, individuals either share information and make it public or do not share information and keep it private.

Behavioral Tendencies

Such a state of inaccessibility is not sustainable as we may minimally assume individuals have a behavioral tendency to form relationships and coordinate activities (Dennett 1995; De Waal and De Waal 1997). In other words, we do not have the behavioral tendencies to live as a scattering of recluses. These tendencies are so strong and integral to being human that a state of perfect inaccessibility—a completely solitary existence where a person and their information are kept inaccessible from others—is considered an extreme form of punishment today: solitary confinement (Tufekci 2008a, b). Defining privacy as a state of inaccessibility is neither practical nor desirable and, ironically, renders privacy a form of punishment.

As individuals naturally come together to form relationships, they share information. Human beings enjoy the freedom to converse and trade information about one another and have an interest in collecting information as well as sharing information. Throughout privacy scholarship, a need to share information for intimacy (Elgesem 1996, p. 51), in order to have relationships (Fried 1968), and to converse and trade information (Singleton 1998) pervades justifications for privacy norms. Because the original state of inaccessibility is inefficient for economic and social actors (Posner 1981), information sharing becomes necessary for relationships. [Footnote 7]

Furthermore, discriminately sharing information affords people the important power to determine both how close they are to others and the nature of their relationships. Information sharing is not only necessary to form relationships and trade, but discriminately sharing allows individuals to differentiate between relationships. Maintaining more than one relationship becomes more complicated as individuals interact with different types of people from different contexts or communities. Individuals share different types and amounts of information in order to negotiate the boundary conditions of relationships (Samarajiva 1997). “The sort of relationship that people have to one another involves…a conception of the kind and degree of knowledge concerning one another which it is appropriate for them to have” (Rachels 1975, p. 294). Different relationships require different information-sharing rules, and controlling who has access to personal information is necessary for friendship, intimacy, and trust (Fried 1968) and preserves important human relationships (Nissenbaum 2004). As noted by technology scholar James Moor, “different people may be given different levels of access for different kinds of information at different times” (1997, p. 414).

Outcome: Framework for Privacy as a Social Contract

The social contract narrative illustrates the natural evolution of social contract norms around privacy. Based on this narrative, individuals within a given community discriminately share information with a particular set of obligations in mind as to who has access to the information and how it will be used (Nissenbaum 2004, 2009; Martin 2012; Sloan and Warner 2014). Based on the social contract narrative, a framework for microsocial contract privacy norms centers on (1) the type of information, (2) who has access to information, and (3) how the information is used within a given community as explored below.

First, an ideal sphere lies around every individual where trespasses can be seen as an insult to one’s honor (Simmel 1906, p. 321). Protecting that space can help create a background for a self-creative enterprise (Bennett 1992). Privacy law scholar Julie Cohen refers to this inviolate space as the privacy of the home that affords “freedom of movement that is both literal and metaphorical” (Cohen 2008, p. 195). This tension between the need to maintain the ideal sphere around ourselves and the need to disclose information for relationships and communities is sustained through negotiated norms around the type of information. Users on a site such as Facebook or diaspora* have an expectation about the type of information collected—such as GPS or browsing history or demographics.

Importantly, people retain the desire to limit who has access to information. In other words, information known to one person is not necessarily meant for all people. Sharing is not all or nothing but ‘optimal’ depending on the maturity and scope of the relationship and the role of the individual (Brin 1999). Determining who receives which piece of information keeps people from being “misrepresented and judged out of context” (Rosen 2001, p. 21). Trying out different jokes, behaviors, or personas with friends helps people to develop as individuals; but those same jokes, behaviors, or personas could be damaging with a different population. Individuals are constantly deciding how to present themselves at varying personal and social levels through agreements about confidentiality (Stutzman and Hartzog 2012), while retaining a desire for seclusion and a fear of intrusion (Bambauer 2012). Online, individuals need to discriminately share information within a relationship without fear of these behaviors or information being broadcast broadly—or sold to data aggregators or retained for years. In casual language, individuals talk about expectations of confidentiality to signify the rules about which actors can know particular information.

Finally, when individuals do reveal information to an actor, rules and obligations govern not only who else should receive the information but also how the information is used (Hartzog 2011). These social contracts around what, to whom, and for what purpose information flows are the governing rules about privacy for a given community. The purpose(s) of the community within which the information is shared dictates the valid uses of the information gathered or disclosed. Tracking GPS location data is valid when the application provides directions or tracks a user’s cycling route, but not when the application simulates a flashlight. When people attempt to assign property rights to control information, they attempt to control how information is later used.

These facets of privacy norms—the what, who, and how—can be seen as working in concert within a given relationship. Within a community or context, for every given set of data, there exists a rule about who should be privy to that information and the purpose for that information. Similarly, for every given set of individuals, there exists a set of information that is expected to be shared and why. Key to these agreements is how the main components work together (see Nissenbaum 2009). Within privacy as a social contract, “who, what, and how” would identify a particular micro privacy norm in a community.
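
To make the structure of these microsocial contract norms concrete, the three facets can be modeled as a simple data structure with a conformance check. The sketch below is purely illustrative; the names (PrivacyNorm, InformationFlow, conforms_to) and the values are hypothetical, and the encoding flattens the negotiated, context-dependent character of actual norms into simple set membership:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyNorm:
    """One microsocial contract norm within a community (hypothetical encoding)."""
    community: str                 # the contracting community
    info_type: str                 # WHAT information the norm covers
    allowed_recipients: frozenset  # WHO may receive the information
    allowed_uses: frozenset        # HOW the information may be used

@dataclass(frozen=True)
class InformationFlow:
    """One actual disclosure or transmission of information."""
    community: str
    info_type: str
    recipient: str
    use: str

def conforms_to(flow: InformationFlow, norm: PrivacyNorm) -> bool:
    """A flow respects the norm only when all three facets match at once."""
    return (flow.community == norm.community
            and flow.info_type == norm.info_type
            and flow.recipient in norm.allowed_recipients
            and flow.use in norm.allowed_uses)

# Example: GPS data is expected by a route-tracking app for directions,
# but not when passed to a data aggregator for advertising.
norm = PrivacyNorm(
    community="route-app users",
    info_type="gps_location",
    allowed_recipients=frozenset({"route_app"}),
    allowed_uses=frozenset({"directions", "route_history"}),
)

ok = InformationFlow("route-app users", "gps_location", "route_app", "directions")
bad = InformationFlow("route-app users", "gps_location", "data_aggregator", "ad_targeting")
assert conforms_to(ok, norm)
assert not conforms_to(bad, norm)
```

The design point mirrors the argument above: the facets work only in concert, so a flow that satisfies two facets but departs from the third still breaches the norm.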

From an original and unsustainable state of inaccessibility, individuals have a need to discriminately share information in order to socialize, create relationships, form groups, and trade. Individuals have a desire—and a reasonable expectation—to be able to live within communities while maintaining a sense of self. Just as communities acknowledge freedom of movement alongside protection from assault, individuals and society have an interest in interacting in a community through sharing information while preserving space to develop themselves, their relationships, and their communities.

(Re)conceptualizing Privacy Online

The social contract approach used here is a multilevel, contextually rich framework that allows specific contractors within a contracting community the moral free space to develop authentic and legitimate privacy norms and expectations. The social contract narrative is an important step toward understanding the factors individuals take into consideration when negotiating microsocial contract privacy norms. Alternative approaches to privacy have been attractive because respecting and violating privacy are clearly defined and easy to measure—privacy is violated when information is either not controlled or no longer inaccessible. Privacy as a social contract offers a more nuanced, context-dependent understanding of privacy while not venturing into the territory of relativism. To explain, common privacy violations are redescribed below given the social contract approach to privacy and outlined in Tables 3 and 4.

Table 3 Reframing privacy violations
Table 4 Analyzing privacy violations online using examples

Reframing Privacy Violations

Violation #1: Procedural Hypernorms

First, hypernorms can be violated by not adequately addressing the procedural and structural requirements for a legitimate social contract. Microsocial contracts rely upon procedural norms of adequate voice, exit, and informed consent (Dunfee 2006), and the current online focus on adequate notice and choice seeks to uphold minimal precepts of social contract’s procedural norms of exit, consent, and voice. Online privacy notices, authentic consent, and an ability to switch websites would address the procedural hypernorms required in the macrosocial contract.

For example, researchers continually find violations of procedural hypernorms online. Notices are unrealistically time consuming (McDonald and Cranor 2008) and not always targeted toward consumers (Cranor et al. 2014). Empirical studies have shown that notices are difficult if not impossible for users to find (Leon et al. 2012) and include misleading information (Leon et al. 2010). Respondents misunderstand notices to the point of being misled by icons and notices (Ur et al. 2012). And respondents have been found to assume their privacy expectations are included in the notice (Martin 2014) or that the advertising icon does more to protect their privacy than it does in actuality (Leon et al. 2012).

Privacy as a social contract would suggest that informed consent and the contractors’ rights of exit and voice are important, but not the only, tactics to respect privacy expectations. The procedural norms of consent, exit, and voice are required for the micro-privacy social contracts to be legitimate and to bind the members of the community. [Footnote 8] However, much of the proverbial ‘heavy lifting’ around privacy expectations is done within the community in identifying and negotiating context-specific privacy norms around who, what, and why information is shared.

Violation #2: Microsocial Contracts

In addition, a privacy violation also occurs when information is tracked, disseminated, or used against the agreement of the actors within the community—a breach of the microsocial contract. Given the framework of micro privacy norms above, privacy violations occur when the recipient of information—an organization, a user, or the primary website—changes who is included in receiving information, what information is shared, or how the information is used. Each facet is illustrated in turn; a short sketch classifying such breaches follows the three subsections below.

Change What Information is Shared

Individuals retain a desire to keep certain information inaccessible even within defined relationships, yet new pieces of information become available with advances in technology. With regard to online surveillance, GPS data is now available from mobile devices and tracked in addition to a user’s IP address or a unique user identifier. A study of 101 popular applications found that 47 transmitted phone location and 56 transmitted a unique phone identifier to a third-party data aggregator (Thurm and Kane 2010). In addition, websites can identify and capture how individuals travel to a website, where they click on a page, and where they travel after the visit, in addition to purchases and searches while on the site. For example, Facebook began collecting and using user browsing history—users’ online activities outside the context of Facebook—in order to target advertising (Albergotti 2014a). A recent study found that 31 % of applications gather information outside their purpose and without a valid use (“Backgrounder” 2014). Collecting new information within an existing relationship may constitute a privacy violation.

Change Who Receives the Information

Individuals regularly give access to information to some people or some organizations while keeping the same information from others. For example, Facebook’s Beacon program took information about an individual’s browsing and buying habits with an online retailer, such as Amazon.com, and sent alerts automatically to a new group of individuals—the Facebook user’s friends. The information disclosed to Amazon.com (and others) was leaked to Facebook friends, thereby changing the actors who received the information. When the fitness application Moves (https://www.moves-app.com) was acquired by Facebook, a new actor (Facebook) suddenly had access to the app’s user information—much to the users’ surprise (Wagner 2014). Similarly, tagging photographs online allows new individuals to know about offline activities: by posting a picture and linking it to a subject’s name, offline activities are suddenly available to individuals not present at the event. Users do not relinquish information to an undefined group of actors. Rather, individuals knowingly disclose information to a particular set of actors within a community.

Change How the Information is Used

Individuals have an interest in how their information is used within a community, and a line of scholarship has evolved to equate privacy with the degree of control over personal information. While problems abound with conceptualizing privacy as solely an intellectual property issue (Bambauer 2012), the underlying premise that individuals have an interest in how their information is used is sound and remains a strong focus both in privacy framed as a property right and in the FIPs prevalent in business.

For example, information given to a medical professional is to be used for medical diagnosis or for furthering the medical field through research. If the medical professional were to sell that information to a pharmaceutical company for marketing purposes or use that information to sell the patient a car, the professional would breach the terms of use within the social contract. Online, a user’s travel history may be known to a website such as Orbitz and can be used to analyze how individuals came to find Orbitz for future Orbitz marketing or advertisements. However, using an individual’s online history to change query results uses the known information in a novel way. Research has shown users have privacy expectations around both the type of information accessed and how the information is used, whether with mobile apps (Shilton and Martin 2013) or online (Martin 2014). Further, when respondents are shown the information that has been collected and aggregated about them online, respondents care about the scope of use of even innocuous information (Cranor et al. 2014).

An infamous example of the misuse of legitimately acquired information is the use of Facebook users’ data and the users themselves in an experiment (Albergotti 2014b; Meyer 2014). Facebook manipulated the newsfeeds of 700,000 users to be more positive or negative and then measured the effect on users’ subsequent postings. The postings were in the hands of a valid actor (Facebook and the recipients of the post), but Facebook used the information in a novel way thereby violating the microsocial contract in the Facebook community around the expected use of information.
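
As promised above, the three kinds of microsocial contract breach can be summarized in a short, standalone sketch. All names and values here are hypothetical illustrations rather than a method proposed in the privacy literature; the point is only that a breach is identified by which facet of the agreed norm an observed flow departs from:

```python
def classify_breaches(agreed: dict, actual: dict) -> list:
    """Report which facets of a microsocial contract an observed flow breaches.

    `agreed` encodes a community's negotiated norm; `actual` encodes one
    observed information flow. An empty list means no breach.
    """
    breaches = []
    if actual["info_type"] not in agreed["info_types"]:
        breaches.append("what: information collected beyond the agreement")
    if actual["recipient"] not in agreed["recipients"]:
        breaches.append("who: information reached an actor outside the agreement")
    if actual["use"] not in agreed["uses"]:
        breaches.append("how: information put to an unexpected use")
    return breaches

# Beacon-style example: purchase data disclosed to a retailer leaks to friends.
agreed = {
    "info_types": {"purchase_history"},
    "recipients": {"retailer"},
    "uses": {"order_fulfillment"},
}
actual = {
    "info_type": "purchase_history",
    "recipient": "friends_feed",
    "use": "social_advertising",
}
print(classify_breaches(agreed, actual))  # breaches both 'who' and 'how'
```

On this reading, the Beacon, Moves, and newsfeed-experiment cases differ only in which facet of the agreement was changed, not in kind.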

Violation #3: Community or Contextual Integrity (Nissenbaum 2009)

Social contract theory suggests a third level of privacy violations concerning the integrity of the boundaries of the contracting community and its moral free space. In other words, viewing privacy norms as a social contract highlights the moral importance of protecting the boundaries of the context in Nissenbaum’s Privacy as Contextual Integrity (2004, 2009), or the moral free space of the communities. Within social contract theory, society has an obligation not to develop and impose substantive norms on the moral free space of the contractors. If outsiders to a contracting community make substantive demands on the content and flow of information, such outsiders breach the integrity of that moral free space. In fact, such a privacy intrusion or violation is also referred to as a violation of decisional privacy (Allen 1999) or passive privacy (Floridi 2006), where the interference in autonomy is considered a privacy violation. Broad regulations aimed at too high a level may impose a standardized set of privacy norms across communities. For example, the browser-level Do Not Track designation may not apply to particular contexts and would interfere with the ability to develop microsocial contracts within particular communities, as the sketch below illustrates.
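
To see why such a designation is context-blind: Do Not Track is transmitted as a single HTTP header carrying the same value to every site visited. A minimal sketch using Python’s requests library (the URLs are placeholders; the request is prepared rather than sent):

```python
import requests

# The browser attaches one identical bit to every request, regardless of the
# community being visited: a bank, a game, and a news site all receive the
# same "DNT: 1" header, with no way to express per-context privacy norms.
for site in ["https://bank.example.com", "https://game.example.com", "https://news.example.com"]:
    req = requests.Request("GET", site, headers={"DNT": "1"}).prepare()
    print(req.url, req.headers["DNT"])
```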

Discussion

Privacy as a social contract constitutes a shift from viewing sharing information online as dispositive of relinquishing reasonable expectations of privacy to viewing sharing information online as a necessary part of strong community and individual autonomy. As such, the use of a social contract approach to privacy sheds light on weaknesses in the traditional restricted access and control definitions of privacy and also extends the important work within privacy on privacy as contextual integrity (Nissenbaum 2009).

Correcting Previous Views of Privacy

The access-view of privacy, where privacy is maintained only by remaining inaccessible to others, requires individuals to relinquish privacy when interacting within their community. Yet research has shown users have privacy expectations around both the type of information revealed and how the information is used, whether online (Martin 2014) or when using mobile apps (Martin and Shilton 2015). Respondents care about the scope of use of even innocuous information online (Leon et al. 2013), view tracking and online behavioral advertising as creepy (Ur et al. 2012), and wish not to be tracked when online (McDonald and Cranor 2010). Privacy as a social contract allows for the fact that individuals disclose information without relinquishing privacy.

Privacy as a social contract provides guidance post-disclosure and allows for the interest of individuals both to share information and to hold privacy expectations around how information is used and who has access within a community. For example, Facebook tracking users’ web browser history, experimenting with users’ newsfeeds, and gaining access to user data of an acquired application all concerned previously disclosed information that was, for the access-view of privacy, considered ‘public.’ Facebook’s violations are not captured by the access-view of privacy but are explained by privacy as a social contract as breaches of microsocial contract norms, as explored in Table 3 above.

The control-view of privacy, most often operationalized through adequate notification and consumer choice, assumes that individuals maintain control over their information by reading a privacy notice and choosing the website whose privacy practices most closely match their preferences. Yet considerable agreement exists that notice and choice has failed to govern privacy effectively online (Martin 2013; Nissenbaum 2011; Calo 2012; Solove 2013). Consumers fall victim to becoming a ‘captive audience’ without functional opt-out mechanisms, thereby making notice and choice less meaningful (Popescu and Barah 2013). Perhaps most damning, 91 % of respondents feel as though they have lost control of their data (Madden et al. 2014). In fact, the infamous Facebook experiment conformed to the broad statements in Facebook’s privacy policy (Elder 2014).

Not only are the access-view and control-view of privacy lacking in descriptive validity, these views may also guide firms in the wrong direction when attempting to meet the privacy expectations of users. Currently, the only affirmative responsibility of firms online is adequate notification (Calo 2012; Beales and Eisenach 2013). Firms online are not responsible for their specific privacy practices—only for communicating those practices to consumers. In focusing on disclosure as the main responsibility of the firm, firms become free to implement questionable privacy practices so long as the practices are accurately reported. However, the social contract narrative suggests that individuals have an interest in discriminately sharing information with limits as to who knows it and how it is used, thus changing how managers and management researchers would frame privacy violations and judge privacy expectations.

Extending Privacy as Contextual Integrity

The social contract approach to privacy also extends context-dependent theories of privacy, such as privacy as contextual integrity (Nissenbaum 2009). Privacy as a social contract offers a mechanism to judge privacy norms and, in doing so, addresses charges of relativism endemic to contextually dependent theories of privacy. First, locally negotiated social contracts are always beholden to procedural universal principles to remain legitimate (Van Oosterhout et al. 2006). Therefore, microsocial contract privacy norms must also abide by universal and thin second-order norms such as the rights of consent, voice, and exit (Donaldson and Dunfee 1995; Dunfee 2006; Heugens et al. 2006). As such, contracting has an internal morality without the need for external substantive guidance—at least for some (Van Oosterhout et al. 2006). [Footnote 9]

In addition, locally negotiated privacy norms, i.e., microsocial contracts, can be analyzed through both actual and hypothetical social contracts to address “norms of decency, etiquette, sociability, convention, and morality” (Nissenbaum 2004; see also Tavani 2008; Dunfee 2006). While privacy as contextual integrity (Nissenbaum 2004, 2009) focuses on actual negotiated privacy norms, a social contract approach adds a possible additional layer of analysis in the form of the hypothetical social contract, which would carry moral weight. We could ask: what privacy norms would reasonable individuals agree to, given minimal social contract standards of consent, voice, and exit?

These hypothetical microsocial contracts should leverage existing empirical work on privacy expectations, interests, and preferences. For example, we can ask, “Would users of Facebook expect their browsing histories to be used for targeted advertising?” regardless of what practices were communicated in the privacy policy (Albergotti 2014a). Considering that 80 % of respondents are concerned about third parties accessing the data they share (Madden et al. 2014), we can presume that Facebook users would be concerned with third parties accessing their data.

Finally, locally negotiated privacy norms must meet the interests of the contractors to discriminately share information, as illustrated within the narrative above. Similarly, Helen Nissenbaum highlights the important purpose of the community in guiding appropriate privacy norms. Nissenbaum further suggests judging privacy norms based on, first, the promotion of goods and values within the context and, second, meeting “fundamental social, political, and moral values” (2009, p. 128). Within social contract theory, the criterion of mutually beneficial and sustainable local norms (Van Oosterhout et al. 2006) also suggests that fit with the community’s goals or purpose is an important factor to consider in judging privacy norms. Similarly, one of the two key assumptions in Donaldson and Dunfee’s (1999) construction of the macrosocial contract and the moral free space within a community is the need for a moral fabric supportive of (1) efficiency and (2) the preexisting core values of the community.

Implications and Conclusion

In relying on notice and choice to assuage privacy concerns, a firm’s only role in respecting privacy expectations online has been to ensure a user was adequately notified and the consent of the user was acquired. This gives firms the perverse incentive to construct elaborately vague privacy notices, left unread and misunderstood by users, merely to gain users’ consent. With individuals and consumers rendering privacy judgments regardless of the explicit privacy notices, the prominent tool available for businesses to manage privacy expectations is rendered ineffective.

Within a social contract approach to privacy, the focus shifts from firms gaining consent to the role and responsibilities of businesses as contractors in communities. From the narrative above, rules around discriminately sharing information take into consideration the possible benefits to the individual (such as better relationships, trading for goods and services, employment, etc.) as well as the benefits to the contracting community (such as a banking system, a functioning workplace, a credit system, a marketplace, etc.) while also balancing the expected harms. Understanding this privacy analysis will help firms better meet the privacy expectations of their stakeholders. Questions remain, however, for researchers and firms about how to identify microsocial contract norms about privacy and what is taken into consideration in forming them.

The implications for privacy research and practice based on social contract concepts are examined below and outlined in Table 5.

Table 5 Implications to research and practice

Implications for Research

Both the restricted-access and control approaches may be considered universal principles or ‘strong’ definitions of privacy, where the definition of what it means to respect privacy—remaining inaccessible or adhering to notice statements—is universally known and applicable. This is problematic in that performing research on privacy becomes an exercise in testing an individual’s belief in a predefined and arbitrary conception of privacy. For example, it has become almost cliché to declare that young adults have diminished or no privacy expectations; yet, when examined closely, young adults are found to have privacy norms that differ from older adults while retaining strong expectations of privacy (Hoofnagle et al. 2010). Similarly, individuals who do not agree with the analyst’s definition of privacy are presumed to not find privacy important (e.g., Acquisti and Grossklags 2005) or to be unethical (Winter et al. 2004). Instead, researchers and organizations should ask, “What are the privacy expectations of users, customers, or employees in this situation?” rather than, “Do users, customers, or employees have any reasonable expectation of privacy here?”

Scholarship that operationalizes the relinquishing of privacy as the moment users provide information misses the expectations consumers retain even once they provide information—even innocuous information (Leon et al. 2013). In other words, researchers will observe a respondent who is willing to purchase something online and equate that behavior with a demonstration that he or she is less concerned about privacy. Yet when asked, as the studies above show, respondents who go online still hold expectations of privacy.

A social contract approach would be particularly well suited to the stakeholders and issues of organizations and managers. However, little empirical work has been done to test a social contract approach to privacy, since social contract approaches, in general, remain empirically challenged (Dunfee 2006; Glac and Kim 2009; Van Oosterhout et al. 2006; Soule 2002). Allowing for locally defined norms renders contextual approaches to privacy difficult to test empirically. The identification of the relevant community and local authentic norms is “partially if not entirely” an empirical task (Husted 1999). Additional inductive research to identify the particular privacy norms within a community or context would help organizations meet the privacy expectations of users, employees, and customers.

Implications for Practice

Responsibility of Firms

Current approaches to online privacy place the onus on the consumer to understand and acknowledge the privacy notices or to choose wisely where and when they give access to their information. In other words, the responsibility for the handoff of information is placed primarily on the consumer. Once privacy is viewed as the social contract between parties about the type and flow of information within a given community, privacy becomes attached primarily to a relationship rather than to a piece of data or location.

In the case of privacy online, the relationship between the website and the user becomes critical to upholding privacy expectations. All contractors—users and organizations—have a right and an obligation as both the recipient of information and as the disseminator of information to abide by the particular privacy norms within that community or to voice objection. Primary websites have the knowledge, access, and incentives to become more responsible regarding their users’ overall privacy experience online.

As noted by Dunfee et al. (1999, p. 32) and Van Oosterhout and Heugens (2009, p. 731), merely enjoying the benefits of the community, engaging in transactions within the community, and reaping the benefits of the structure offered by the microsocial contracts within the community entails a reciprocal obligation to uphold and develop the authentic norms of the community. Firms reaping the benefits of the information disclosures of users, consumers, and employees have an obligation to respect the privacy norms within their community. For example, Facebook partners with many retail, gaming, search, and news sites to allow a Facebook login on these third-party sites. However, Facebook negotiated that these partners are not permitted to transfer any information to ad networks or data brokers based on their Facebook users’ login. In addition, Facebook also uses technology to detect attempts to scrape, or copy, their members’ profiles, thereby taking responsibility to manage their users’ online experience. However, Facebook’s purchase of the fitness app Moves, and its attempt to access Moves’ user data (Wagner 2014), calls into question whether Facebook prioritizes the role and responsibility of the website’s relationship with users or, instead, prioritizes Facebook’s needs.

“Anything Goes” Fallacy (Nissenbaum 2004)

According to the narrative offered, the decision to share information is not dispositive of relinquishing a reasonable expectation of privacy. Instead, individuals have an interest in discriminately sharing information. For privacy research, more work is needed to examine the privacy expectations users retain over disclosed information. Both the traditional control and restricted-access approaches to privacy treat the act of sharing information as dispositive of relinquishing an expectation of privacy: individuals either share information and lose a right to privacy or do not share information and retain a reasonable expectation of privacy. The narrative offered here suggests shifting the conversation to view individuals as always having an interest in discriminately sharing information. The question for firms becomes how to support individuals discriminately sharing information within a particular context or community. For example, selling behavioral information may be appropriate for retail websites but not for financial services, as MasterCard and Visa learned when they approached companies about selling personalized information (Steel 2011).

For privacy as a social contract, no area exists where “anything goes” (Nissenbaum 2004). Any community has prevailing privacy norms and associated reasonable expectations of privacy that are the product of either explicit or implicit negotiations. Rather than create the false possibility of a region where anything goes online, a social contract approach to privacy suggests that information is always governed by the norms of a particular community.

Privacy as a social contract—or a mutually beneficial agreement within a community about how information is used and shared—suggests that tactics to address online privacy expectations should be dependent on the context of the exchange. This diverges from tactics that seek to address privacy issues online as if privacy concerns and expectations are uniform. For example, a banking website will have different privacy norms from a retail website. Similarly, a gaming website might have more in common with a social networking site than a retail site. The purpose of the website will influence the privacy expectations for the users and empirical studies may be required to identify the microsocial contract norms around privacy—as has been called for in scholarship (Dunfee 2006).

Privacy as a Competitive Advantage

The development of mutually beneficial privacy norms by contractors is a competitive advantage within communities. In order to keep people actively participating in relationships and trade within a particular community, privacy rules develop around who is privy to which piece of information and the obligations associated with knowing that piece of information. Sociologist Schwartz notes that privacy rules are necessary within any stable social system and suggests that privacy agreements be viewed as an index of solidarity (1968). In other words, strong privacy norms make strong communities.

The larger community also benefits from individuals retaining ‘a backstage’ or a private self (Goffman 1959; Nissenbaum 2004) while also sharing information. [Footnote 10] Communities—including those of a firm—benefit when websites and users, husbands and wives, work groups, or teams develop their particular privacy expectations and norms. In fact, “part of what makes a society a good place to live is the extent to which it allows people freedom from intrusiveness of others” (Regan 2011). As Priscilla Regan notes, “on a societal level, people require a measure of understanding of how they relate to others that permits the development of a sense of self and connectedness to others within the society of which they are a part.” Without rules governing how information should move within a given community or relationship, individuals withdraw (Schwartz 1968). This approach is Deweyan in acknowledging that both individuals and society benefit from particular protection of privacy rather than positioning the interests of parties as opposing forces (see also Nissenbaum 2004; Regan 2011; Solove 2006).

On a smaller scale, this competitive advantage can be seen in the introduction of privacy-aware products and services. For example, DuckDuckGo is “the search engine that doesn’t track you” (www.duckduckgo.com). Diaspora* (www.diasporafoundation.org) is a decentralized social network that differentiates based on freedom and privacy: users access diaspora* through user-supported servers (or pods), using pseudonyms, and with full rights over the use of their data. [Footnote 11] In addition, Whisper, an app that allows users to share thoughts anonymously, was caught tracking users (Dwoskin 2014); yet a competitor, Secrets, noted that their business model does not include developing relationships with media outlets and therefore lacks the incentive to monetize tracking of users. Similarly, Snapchat attempted to distance themselves from other social media services by not using native advertising: instead, ads are “compartmentalized” and not based on collected user data (Shields 2014). Table 3 includes the examples of the products and services in the market seeking to responsibly contract in their community by engineering privacy into their product—as has been called for in research (Mayer and Narayanan 2013) and public policy (Ohlhausen 2014).

Limitations and Concerns

Because privacy norms may be locally defined within a particular community, charges of relativism are endemic to a social contract approach. The lack of substantive principles to guide the development of local norms leads some to find a lack of moral authority (Wempe 2005; Soule 2002; Dunfee 2006) and to fear “morally rogue agreements” (Soule 2002). Locally developed privacy norms can be perceived as losing moral authority because the norms are tied to practice or convention (Nissenbaum 2004). Van Oosterhout et al. (2006) refer to this assumption as the ‘contractualist fallacy,’ or the “erroneous assumption that the contractualist argumentative structure uniquely determines a single set of action-guiding norms” (p. 522).

However, other approaches to social contracts do not view contracting as “a morally neutral idea” (Van Oosterhout et al. 2006, p. 528). In fact, the social contract narrative illustrates what Van Oosterhout et al. (2006) refer to as ‘the internal morality of contracting’ by walking through a precontractual state of nature without cooperation and demonstrating how cooperation works. Therefore, according to Nissenbaum (2004), Van Oosterhout et al. (2006), and Van Oosterhout and Heugens (2009), substantive privacy principles are not needed in order for the approach to have moral gravity. The internal morality defines a moral threshold for microsocial contracting that enables us to filter out contracts and practices incompatible with the moral import of contracting (Van Oosterhout et al. 2006). Instead, contractualists “focus on the reasonable and normative foundations of contractual schemes” (Van Oosterhout et al. 2006, p. 521). The goal of the contractualist endeavor is not to identify the single right answer, but to identify legitimate and authentic agreements.

In addition, the demarcation where one community starts and another stops is not always clear. In fact, a social contract approach to privacy introduces the possibility of conflicting norms of privacy and overlapping communities, similar to other social contract theories. Overlapping spaces and conflicting norms/duties are endemic limitations for social contract approaches (Phillips and Johnson-Cramer 2006). Future research on privacy as a social contract would need to take such overlapping communities into consideration.

The evolution of thick privacy norms may be seen as a problem for some, since a social contract approach to privacy leads to an increase in stability and a tendency toward the status quo. Social contract approaches can be viewed as lacking a mechanism for revising micro norms (Phillips and Johnson-Cramer 2006) or, as Nissenbaum notes in reference to privacy as contextual integrity, as having a tendency toward conservatism (Nissenbaum 2004). Changes are initially resisted because an “entrenched normative framework represents a settled rationale” (Nissenbaum 2004, p. 127).

Yet for Michael Walzer, agreements “change over time as a result of internal tension and external example; hence they are always subject to dispute” (Walzer 1994, p. 27). In fact, others see the dynamism of social contract approaches as an asset rather than a hindrance and position the norm of forgiveness as critical to sustainable solutions (Van Oosterhout et al. 2006). Most clearly, Daniel Dennett suggests an evolutionary story in which a mutation arises: “instead of persisting in the myopically selfish policies of mutual defection and distrust that had reigned heretofore, these particular lucky competitors hit upon a new idea: cooperation for mutual benefit” (Dennett 1995, p. 454). All social contract theorists “agree in seeing morality to be, in one way or another, an emergent product of a major innovation in perspective.” Where Rawls sees a stable agreement that cannot be upset in the form of reflective equilibrium, such stability creates problems for Phillips and Johnson-Cramer in their analysis of ISCT within business ethics (2006); Dennett never commits to such stability and speaks instead of an evolutionary process. Importantly, both assumptions of stability and dynamism are possible within the arguments herein; however, Dennett’s assumptions about human behavioral tendencies are more in line with the social contract narrative above.

Conclusion

This paper examined how privacy norms develop through a social contract narrative in order to reframe possible privacy violations given the social contract approach to privacy and to critically examine the role of business as a contractor in developing privacy norms. These social contracts are important to understand if firms are going to adequately manage the privacy expectations of stakeholders. Most importantly, focusing on the microsocial contracts around privacy expectations shifts the responsibility of firms from adequately notifying and gaining the consent of individuals to the responsibilities of the firm as a contractor to maintain a mutually beneficial and sustainable solution. The social contract approach to privacy has important practical implications for firms struggling to identify the privacy expectations of stakeholders.