Apple’s refusal to decrypt the phone of a known terrorist, requested by the FBI and ordered by a court, provides a particularly fruitful case to examine the tension between two core values: national security and individual rights. It also allows for an analysis of how such differences may be resolved. Although this article focuses on one major corporation (Apple) and one major device (the iPhone), similar issues arise when other means of communication are armed with strong encryption by other leading high-tech corporations (such as Google, WhatsApp, Facebook)—and major places where large amounts of information are stored (especially in ‘clouds’).

The article first provides a brief overview of the relevant historical developments (Part I). It then introduces an approach, based on liberal communitarian ethics, that is applied in this article to the issues at hand. The article next shows that the core values involved are reflected in the Constitution and in laws enacted by Congress. The legal analyses proceed by reviewing the challenges that Apple has raised in response to the government’s request and the court’s order—and responses that have been made to Apple’s positions (Part II).

The last part of the article shows that in this case, as in many others, laws and the ethical values that support them may well not suffice to resolve the relevant differences between the high-tech corporations and their many supporters and the government. Hence, ethical considerations will have to be drawn upon to find a way out of the current stalemate, which has major implications for national security and public safety (both referred to from here on as ‘security’) as well as privacy.

Part I: A Brief History

Until the Snowden revelations, American high-tech corporations showed limited interest in developing and marketing high-power encryption software. (Some form of encryption has existed for decades; some argue since the Ancient Egyptians. And of course there has been an interest in privacy protection at least since the twelfth century. However, these encryptions were not strong, and developing high-power ones—and making them a major marketing feature—was not pursued by Apple and other high-tech corporations.) After these revelations, many customers—especially overseas—became very concerned about their privacy. Some nations, such as Germany, India, and Brazil, considered building their own networks (Taylor et al. 2016). American high-tech companies viewed these developments as highly threatening to their business. Apple led the response by developing new and powerful encryption, one that only the sender and the receiver can decrypt. Customers were warned that if they lost their password, they would lose access to whatever they encrypted, because Apple could not come to their rescue.

The FBI noted that the development of this kind of very powerful end-to-end encryption (I refer to it from here on as ultimate encryption or UE) would leave the government in the dark. FBI Director James Comey, for example, has warned that “encryption threatens to lead all of us to a very dark place,” as the “recent default encryption settings and encrypted devices and networks,” that is, UE, “will have very serious consequences for law enforcement and national security agencies at all levels” (Comey 2014). And “It’s the equivalent of a closet that can’t be opened. A safe that can’t be cracked” (Comey 2014). British PM David Cameron has asked, “do we want to allow a means of communication between people which even in extremis, with a signed warrant from the home secretary personally, that we cannot read? … My answer to that question is: ‘No we must not’” (Hope 2015).

Initially, the White House did not take a stand on this issue, hoping that the FBI and the tech companies would work out their differences through amicable negotiations. Congress, not prone to action in the first place, did not take any action either.

The political atmosphere changed to some extent after the ISIS attacks in Paris that killed 130 people (after which “reading” a phone abandoned by a terrorist helped to catch him and avoid more attacks), and after the government found a phone used by Syed Rizwan Farook, a terrorist who, along with his wife, killed fourteen people in San Bernardino on December 2, 2015. This phone had the UE provided by Apple, and the FBI, at the time unable to “read” it, asked Apple to help it decrypt the phone (Lichtblau and Benner 2016). When Apple demurred, the FBI turned to the courts, which ordered Apple to comply with the FBI request (Lichtblau and Benner 2016). Apple refused and the FBI asked the court to force Apple to comply.

An intensive public debate followed between the supporters of Apple (major parts of the media, law professors, and public intellectuals) and a smaller number of supporters of the FBI. After holding back, the President stated in a speech on March 11, 2016 that never allowing government access to someone’s smartphone would be equivalent to “fetishizing our phones above every other value” and that it would not “strike the balance that we have lived with for 200, 300 years” (Sydell 2016). Reliable sources report that ISIS switched to using highly encrypted phones, and one must assume that human traffickers, drug lords, and spies around the world are following. In the US, the strength of Apple’s encryption is leading criminals to switch from burner phones to iPhones (Levine 2016).

On March 28, 2016, the FBI announced that it was able to “read” the dead terrorist’s phone and the case became moot. However, the ‘key’ the FBI used in this case cannot be used for other kinds of phones or to break the encryption of other communication means or that of information storage systems. Indeed, Apple announced that it would double down by working to increase its iCloud encryption. Other high-tech corporations are following suit and providing UE for their products.

In an attempt to resolve the matter, Senators Richard Burr (R-NC) and Dianne Feinstein (D-CA) issued a draft bill that would require “‘intelligible information or data’ or the ‘technical means to get it’” to be provided if given a court order (Welna 2016).

[T]he ‘Compliance with Court Orders Act of 2016’…[is] a nine-page piece of legislation that would require people to comply with any authorized court order for data—and if that data is ‘unintelligible,’ the legislation would demand that it be rendered ‘intelligible.’ In other words, the bill would make illegal the sort of user-controlled encryption that’s in every modern iPhone, in all billion devices that run Whatsapp’s messaging service, and in dozens of other tech products. (Greenberg 2016).

Meanwhile, Apple hired a high-profile lobbyist for its Washington office (Romm 2016), and trade groups that represent Apple as well as other tech companies have already started lobbying Congress, citing concerns regarding privacy and the effect on business (Kang 2016). Thus, the issue regarding the ‘warrant-free zone’ created by encryption is very much alive. Nobody on either side disagrees with the observation that the way this issue will be resolved will have major implications for national security (especially terrorism), public safety (especially crime), privacy, personal security (e.g., protection from identity theft), and the business interests of the corporations involved.

Part II: Liberal Communitarian Ethics and Relevant Laws

A Liberal Communitarian Approach and the US Constitution

Communitarianism is a social philosophy centered around the values of the common good, just as libertarianism is centered around the value of liberty. Although the number of scholars who are communitarians is very small, and the school is little known, in effect its values are extolled by the Old and New Testaments, the ancient Greeks, Confucius, the Catholic Church, and democratic socialists, among others. Strong communitarians are concerned almost exclusively with the common good and are willing to ignore individual rights if they believe this would serve the common good.

In contrast, liberal communitarian ethics—an approach I have been credited with helping to develop—takes as its starting point that each community, each nation, faces two fully legitimate ethical claims—national security and individual rights—and that neither can be maximized nor fully reconciled, as there is an inevitable tension between these two claims (Etzioni 1996). It thus follows that some balance must be worked out between these conflicting claims. That is, the liberal communitarian model assumes from the outset that the nation is committed to both individual rights and the advancement of the common good, and that neither should be assumed to a priori trump the other.

The Constitution, especially the Fourth Amendment, provides a stellar expression of a liberal communitarian ethics. It does not state that the government may not “search” phones, homes, papers, or persons; it merely bans “unreasonable” searches. In seeking to determine where to draw the communitarian balance in this area, the Court very often draws on the Fourth Amendment. This Amendment captures the basic thesis of the liberal communitarian way of thinking well. By banning only unreasonable searches and seizures (U.S. Const. amend. IV), the Fourth Amendment recognizes, on the face of it, a category of reasonable searches, which turn out typically to be those that promote public safety and do not require a warrant or probable cause. That is, the very text speaks of two sides, and hence, of a balance—in sharp contrast to the First Amendment, which states “Congress shall make no law… abridging the freedom of speech….” (U.S. Const. amend. I) (emphasis added). (True, even this Amendment does not provide an absolute right; however, it is much less hedged than the Fourth Amendment.)

Moreover, the Constitution provides a mechanism for determining what searches are reasonable: the courts. What the courts consider reasonable changes as conditions change. For instance, after a rash of skyjackings in 1972, the courts deemed legal the newly introduced screening gates in airports, which search millions of travelers each day. These gates stopped skyjacking in 1973. The courts, as a rule, do not use the term “common good” but refer to the “public interest.” Although they have given different rationales for authorizing a considerable variety of searches—many even without a warrant—they seem to follow an ethical concept: namely, that if the privacy intrusion is small and the gain to the public interest is high, searches should be allowed.

A review of Supreme Court rulings shows that the Court has a broad understanding of public safety, which allows diverse intrusions into the realm of individual rights to serve this common good (Etzioni 2015). The most basic element of public safety is upholding law and order, and the deterrence and prevention of crime. A second element of public safety relates to preventing accidental death and injury. Thus, the Court allowed suspicionless, random drug and alcohol testing of train engineers in the wake of a series of train accidents (Skinner v. Ry. Labor Execs. Ass’n 1989), as well as random sobriety checkpoints on highways to prevent deadly car accidents resulting from drunk driving (Mich. Dep’t of State Police v. Sitz 1990). A third element of public safety is the promotion of public health. Thus, the Court held that the public interest in eradicating the smallpox disease justified compulsory vaccination programs (Jacobson v. Massachusetts 1905) despite the resulting intrusion on privacy, and held that search warrants for Occupational Safety and Health Act (OSHA) inspections do not require “probable cause in the criminal law sense” (Marshall v. Barlow’s, Inc. 1978). In short, there are ample precedents to hold that when the common good, in particular public safety and national security, is concerned, individual rights can be curbed, especially if the intrusion is small and the gain to the public interest is significant.

True, not all cases run this way. For example, in United States v. Jones, the Supreme Court overturned the conviction of a drug trafficker, ruling that police could not use a GPS device without a warrant to surveil suspects (United States v. Jones 2012). In Kyllo v. United States, a case involving marijuana being grown under heat lights in the petitioner’s garage, the Supreme Court ruled that thermal imaging of a suspect’s residence requires a warrant. Thus, information from a warrantless scan of the residence could not be used in obtaining a warrant to search the premises, which is what had occurred (Kyllo v. United States 2001). In Arizona v. Hicks, while police were searching an apartment for a shooter, they collected serial numbers of expensive stereo equipment, which they thought was stolen. Although they turned out to be correct, the Supreme Court ruled that police violated the Fourth Amendment because they lacked probable cause (Arizona v. Hicks 1987). All I am arguing is that the Constitution, especially the Fourth Amendment, far from being a one-way ticket for the protection of individual rights, is also concerned with the common good; hence, if the courts rule, as they did in this case, that security should take precedence, this is far from unusual or surprising.

In short, from a sheer Fourth Amendment viewpoint, it seems Apple should comply with the lower court order, although of course it is well within its legal rights if it seeks to appeal the ruling of the lower court. Indeed, it would at first seem that the government has a particularly strong case. The reason is that, often, the evidence that someone is a terrorist—and hence the court is asked to allow the government to surveil that person—is based on suspicions, informers’ tips, circumstantial evidence, and other such tenuous evidence, and the person to be surveilled will have his privacy invaded. However, in this case there is no doubt whatsoever that the phone was used by Syed Rizwan Farook, the San Bernardino terrorist—and because he is dead, he has very diminished privacy rights. (Moreover, the phone was owned by the San Bernardino County Public Health Department, which was happy to grant permission to search it.) In short, this seems to be an unusually clear-cut case in which the value of security should trump the remaining privacy rights of a dead person, a known terrorist. This is, though, not the way Apple, other high-tech corporations, and their supporters see it. They first of all note that the government is using this case to set a precedent for searching millions of other phones. (More about this below.) Second, they argue that even in this case, there are strong legal arguments to deny the government’s request. Before these arguments are outlined and discussed, a brief digression is in order on the difference between legal and ethical points, which underlies much of what follows.

Legal or Ethical?

The article focuses on ethics, but many of the sources it draws on concern legal matters. Hence, a few lines on the way these two realms are related. Ethics is sometimes used to refer to a branch of philosophy that clarifies the foundations for moral judgments and the way moral differences are sorted out. However, a very large number of scholars and others treat ethics and morality as synonymous, for instance, when one asks ‘is this behavior moral’ or ‘is this behavior ethical?’ This article follows this way of applying the term ethical.

The moral values of a society are not self-enforcing. They rely on two very different modes of engendering compliance. One is informal social controls, which draw on people’s quest for approval and acceptance by others, on peer pressure, and on the voices of clergy and secular moral leaders. The other draws on the state’s means of coercion. When enforcement relies on the state—on fines and prison terms—we are in the legal realm. Both realms deal with moral values, with ethics, but one relies on social means to promote compliance and the other on force, if need be.

Security and Privacy Combined?

Apple rejects the main underlying ethical assumptions detailed here so far. It argues that if it weakened the encryption software so that the government could surveil phones (put in a ‘backdoor’), many millions of people all across the world would not just lose their privacy but also have their security endangered. Apple states that “at stake is the data security of hundreds of millions of law-abiding people,” meaning it sees itself as protecting not just Americans but iPhone users around the world (Williams 2016). This, Apple holds, is because other governments and criminals would come in through the same back door. That is, Apple rejects the very legal and ethical way this and other issues have been framed—as a tension between the common good and individual rights—and the ensuing question of which values should take precedence in a given conflict. Apple argues that it is out to protect both core values. For this reason, Apple repeatedly refers to the FBI request for Apple to develop a key able to unlock encrypted phones as “dangerous.”

Tim Cook, the CEO of Apple, spells out the dangers people face if the government’s demands were to be heeded and the protection Apple provides were to be weakened by introducing a backdoor into the software. ‘Bad actors’ could bring down power grids, cause people dependent on medical devices to suffer heart attacks, and learn the locations of people’s children (Gibbs 2016). Apple’s vice president of software engineering, Craig Federighi, raised similar concerns:

…the threat to our personal information is just the tip of the iceberg. Your phone is more than a personal device. In today’s mobile, networked world, it’s part of the security perimeter that protects your family and coworkers. Our nation’s vital infrastructure—such as power grids and transportation hubs—becomes more vulnerable when individual devices get hacked. Criminals and terrorists who want to infiltrate systems and disrupt sensitive networks may start their attacks through access to just one person’s smartphone (Federighi 2016).

“People of the World”?

Ethicists differ on whether one has the same obligations to all people or higher ones to members of one’s community or nation. American law, though, clearly grants Americans a much higher level of rights than foreigners. The Constitution is said to stop at the water’s edge. The issue differs with regard to different rights. As far as privacy is concerned, most nations assume that foreigners have much lower privacy rights than their own citizens; that is, it is legal and ethical to spy on foreigners. It is widely recognized that spying is something nations have done to each other since the beginning of history. (When the US was found to be listening to the Chancellor of Germany Angela Merkel’s phone calls, it was embarrassing, but it was not considered illegal.) As far as Americans are concerned, we already saw that the Fourth Amendment defined under what conditions they can versus cannot be subject to surveillance.

Supporters of Apple argue that the government should instead compel phone owners to divulge their passwords. However, this is hard to do with terrorists who commit suicide or are shot dead. As for others, the government often needs to keep them under surveillance before tipping them off that they are suspects. Hence it needs access to phones, pursuant to court orders, without the access being disclosed to the phone owners. (For the same reason, corporate arguments that companies should be free to alert customers that the government has asked to review their communications—that is, to lift ‘gag orders’—are not compatible with elementary police procedures. Moreover, such alerts are highly damaging to public safety and national security, for obvious reasons.)

Setting a Precedent for Foreign Governments

Apple argues that what the US government is asking for amounts to introducing a backdoor into all phones, and that if it complies, a precedent will be set that will force Apple to comply with similar requests made by foreign governments. In his article in The Guardian, Spencer Ackerman writes that “[a]uthoritarian governments including Russia and China will demand greater access to mobile data should Apple lose a watershed encryption case brought by the FBI, leading technology analysts, privacy experts and legislators have warned” (2016). Senator Ron Wyden of Oregon, a leading legislator on privacy and tech issues, “warned the FBI to step back from the brink or risk setting a precedent for authoritarian countries” (Ackerman 2016).

China, however, has never shown that it hesitates to proceed when it cannot find an American precedent. The very idea that precedent matters is a legalistic, Anglo-Saxon notion. Moreover, the lack of a precedent does not cause hesitation even on the part of American corporations; they figure that establishing a precedent will merely cost them some legal fees (Etzioni 2016).

Furthermore, should the US stop doing what Congress and the courts consider important according to American values because the Chinese may use it as a talking point? For instance, should the US stop censoring child pornography just because China may point to this American limit on the right to free speech, and use it as an excuse to further limit the free speech rights of their citizens? Should the US stop prohibiting medical and food marketers from making unproven claims—just because China may insist it can prevent Chinese corporations from making statements about things China frowns upon?

Most telling, detailed data reveals that Apple often complies with requests to provide information about its clients to the Chinese government, before and after the introduction of UE. According to District Attorney Cyrus Vance, Jr.,

…before iOS 8, between July and December of 2014, China made 31 requests to Apple for account data out of 39 accounts. And Apple provided in response to the Chinese requests 32 percent of the time…Between January and June of 2015, the next six months, now this is after iOS 8 has come in, China made 24 requests for 85 accounts, and Apple responded to 29 percent of those requests.… And then July to December 2015, 32 requests out [of] many thousands of accounts…Apple responded to 53 percent (Privacy and Security in the Digital Age 2016).

Furthermore, Apple is already in effect accommodating Chinese censorship. It designed its News App so that iPhone users can access Apple News abroad—but not in mainland China. According to CNN reporters, in Hong Kong, using a local mobile network, there were no issues with the app. In mainland China, however, stories that had been previously downloaded could not be retrieved, and instead there was an error message (Yan and King 2016).

Losing Control of the Key

High-tech corporations and their supporters are concerned that if a key were created, the software would be stolen or leaked. Cook warned that

[i]n the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession…The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control (Cook 2016).

In response, I suggested on March 7, 2016 that Apple (and other high-tech corporations) leave the encryption software as it is—not introduce a vulnerability or a backdoor—but develop a key to unlock phones, a key they would keep. Thus, once a court orders that a given phone must be unlocked, the FBI would bring it to Apple (or Google or whatever other high-tech corporation is involved)—and they would unlock the phones they produced and turn over to the FBI any information that is found—but not the key (Communitarian Observations, March 7, 2016). (To apply the same idea to phones still in the hands of bad actors requires considerable additional collaboration between the FBI and the high-tech corporations, but the same principle could be applied.)
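To make the core distinction of this proposal concrete—the unlock key never leaves the company; only the decrypted data does—the logic can be sketched as a toy program. This is purely illustrative: the XOR ‘cipher’ is not real cryptography, and all names (Manufacturer, provision_device, and so on) are hypothetical, not Apple’s actual design.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data with a SHA-256-derived keystream.
    # Symmetric, so the same call encrypts and decrypts. NOT real crypto.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

class Manufacturer:
    """Holds escrowed device keys in-house; never discloses them."""
    def __init__(self):
        self._escrow = {}  # device_id -> key, kept on the premises

    def provision_device(self, device_id: str) -> bytes:
        key = secrets.token_bytes(32)
        self._escrow[device_id] = key
        return key  # installed on the device at manufacture

    def unlock_for_court_order(self, device_id: str, ciphertext: bytes,
                               court_order: bool) -> bytes:
        # Returns the plaintext, never the key, and only under a court order.
        if not court_order:
            raise PermissionError("no court order: device stays locked")
        return keystream_xor(self._escrow[device_id], ciphertext)

# A device encrypts its data with its own key:
maker = Manufacturer()
device_key = maker.provision_device("phone-001")
ciphertext = keystream_xor(device_key, b"contacts and messages")

# Given a court order, the manufacturer hands over data, not the key:
plaintext = maker.unlock_for_court_order("phone-001", ciphertext,
                                         court_order=True)
```

The design point is that, unlike a backdoor, nothing is weakened in the encryption itself; the risk is concentrated in how well the escrow store is guarded—precisely the concern the experts quoted below raise.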

Several IT experts commented on this suggestion. Many thought that although Apple has the technical capability to create a key, the real issue would be keeping it secure. Steve Bellovin from Columbia University’s department of computer science responded that “a key can be readily available or it can be secure, it can’t be both.” According to Phillip Schrodt, a senior research scientist, “…the problem is not the technology, it is people getting careless about how they use the technology.” David Bantz, Chief Information Architect for the University of Alaska system, noted that “NYC and [the] FBI have hundreds of phones they want to unlock. That would entail a process involving many people and loading the OS on many phones. That makes it possible maybe even likely that one of those people entrusted with that power is coerced or bribed or is clumsy enough to put it in the hands of criminals.” (Communitarian Observations, March 18, 2016). I was surprised to hear during a meeting on May 11, 2016 at the Council on Foreign Relations (a rare one, on the record) District Attorney Vance informing the audience that until September 2014 his office was able to routinely send phones to Apple; Apple would open them and send back the information within a day or two (Privacy and Security in the Digital Age 2016). The reason Apple stopped, Vance implied, was that in September 2014, it started advertising that it was the only company that sold phones whose encryption could not be broken (Privacy and Security in the Digital Age 2016). It seems that concern for profits, a fully legitimate concern, played a key role in Apple’s sudden refusal to cooperate with law enforcement and national security authorities.

In response to the repeated claim by high-tech corporations that there is no way such a key can be kept secure, even if it never left their premises and was protected by their own high-powered encryption, I note that Coca-Cola kept its formula secret for many decades; that leaks of secrets from the FBI over the last 25 years have been very rare; and that if the key were ‘leaked,’ high-tech corporations could modify their encryption software by patching it, as they often do, and develop new keys. In effect this is what Apple sought to do when it learned that the FBI had found a way to unlock the iPhone. Most importantly, I agree with Vance, who argued that one must weigh “the risk of maintaining the ability to open a phone by the company…versus…the consequence to law enforcement of not being able to access those phones” (Privacy and Security in the Digital Age 2016). The answer seems self-evident.

Undue Burden?

Even if one agrees that the government-defined security needs trump those identified by Apple, one must still address the question: To what extent can the government burden a private corporation by forcing it to contribute to the common good? The government bases its case on the All Writs Act. Part of the Judiciary Act passed in 1789, the All Writs Act states that federal courts “may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law” (Kerr 2016). As Orin Kerr discusses, “[t]he historical idea of the AWA is that when a federal court has jurisdiction over a matter, the court has the power to issue additional orders (“writs,” in the ancient language of the English common law) to assist the court in adjudicating the matter properly.” (2016)

Apple claims the FBI’s request is excessively burdensome and cannot be justified under this statute (Grossman 2016). The issue of burden comes into play in United States v. New York Telephone Co., a case between the FBI and a telephone company in which the FBI wanted to place pen registers on telephones in an attempt to catch illegal gamblers (Kerr 2016). The Supreme Court ruled in favor of the FBI and, although leaving much of what is permissible under the All Writs Act unclear, stated that “[we] agree that the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed” (Kerr 2016). Apple uses this to argue that the order it faces is too burdensome. According to the FBI, Apple’s burden is also entirely self-created.

By Apple’s own reckoning, the corporation, which grosses hundreds of billions of dollars per year, would need to set aside as few as six of its 100,000 employees for perhaps as little as 2 weeks. This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that government cannot search them, even with a warrant (Government’s Reply Brief 2016, p. 2). Gordon Crovitz noted in the Wall Street Journal that “[t]he company told the court it would take six to 10 staffers 2–4 weeks to develop the necessary software. That’s less than the annual cost of one engineer, perhaps $200,000.” (2016).

Setting a Precedent

Some argue that the FBI deliberately chose a case that favors it in the public eye, because it involves the phone of a known terrorist, but that the FBI aims to set a precedent for all other phones. Tim Cook said “I think they picked a case to pursue that they felt they had the strongest possibility of winning” (Gibbs 2016). Apple is not the only party to express this sentiment. “‘This wasn’t a fluke they picked this one,’ said James Lewis, a cybersecurity expert at the Center for Strategic and International Studies who advises technology companies and the government on encryption. ‘They’re always strategic in how they do these things’” (Yadron et al. 2016). This may well indeed be true. There already have been reports that the NYPD alone has 175 phones it needs help with in order to be able to ‘read’ them (Benner and Apuzzo 2016). It thus seems reasonable to discuss the matter without reference to any particular phone. However, while deliberating the issue in this general way makes it more difficult for the FBI to ‘try’ the issue in the court of public opinion, it does not change the basic question of whether the lower court was right that these kinds of searches are reasonable, and hence in line with the Constitution, and whether a corporation can refuse to comply or set up its software in ways that make compliance impossible.

Free Speech

Apple argues that the FBI is seeking to violate its First Amendment rights (Tracy 2016), because courts have recognized code as a form of speech (Tsukayama 2016). If Apple were to comply with the order, it would be required to write code in order to override the security feature which deletes the iPhone’s data after ten unsuccessful password attempts. According to Apple, this would result in “compelled speech” (Tsukayama 2016).

However, no right is absolute. Famously, one cannot shout “fire” in a crowded theater, presumably because such a shout may cost many lives. Apple also argues that the government might be able to prevent some speech but not compel Apple to say what it refuses to say, which is what would be called for if Apple were required to write the code needed to open the terrorist’s phone. However, Congress and the courts often allow such a requirement when public safety is involved. For instance, they require warning labels on cigarette packages and content labeling on foods and medications, among many other requirements.

All said and done, as I see it, on the one hand, the arguments Apple has raised are not strong. Moreover, even if they were very strong, the court has ruled on their merits and found against Apple. Apple is not above the law, and hence should comply. I agree with the court that the risk warrant-free zones pose to public security is very high. And, I would add, the risks to the privacy and personal security of Apple’s customers can be limited by keeping the keys “in house” and protected by UE. On the other hand, I recognize that Apple has a right to appeal the decision of the lower court to higher courts and to lobby Congress to pass a new law, aligned with its viewpoint. For Apple to admit that its encryption is not flawless might well cause it significant loss of business. The question hence arises whether there are ethical grounds, rather than legal ones, on which one can call on Apple to accept this loss for the common good.

Part III: A Matter of Corporate Social Responsibility

Whatever the legal grounds for seeking the cooperation of high-tech corporations in enhancing the common good—in particular, security—there are several reasons one should also spell out the ethical reasons these corporations should cooperate rather than fight (Schwartz and Carroll 2003, pp. 507–508). These include the following: (a) Legal processes tend to be confrontational and polarizing. In court, each side tends to state its position in the most one-sided, emotive ways possible, and the confrontation often spills over into the media, leading to further polarization. This was clearly the case in the 2016 confrontation between the FBI and Apple. The FBI charged Apple with being “corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government” (Government’s Reply Brief 2016, p. 2). Apple claimed that the government seeks “a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of millions” (Apple, Inc.’s Motion, p. 2). Ethical appeals are more open to shared deliberation and consensus building, although they too can deteriorate into conflict. (b) If a corporation responds to ethical appeals, its staff, shareholders, and public supporters are likely to feel ennobled; they agree to do what serves the public good. If compelled by law, they are likely to feel coerced and alienated. These differences, in turn, are associated with higher versus lower levels of compliance. (c) Ethical appeals do not close the door to legal measures, and hence are best tried first. (d) Appealing to higher courts is going to take years, and Congress is also slow to act, if it acts at all. Meanwhile, terrorists and criminals are increasingly using high-power encryption.

On what ethical grounds could the government appeal to high-tech corporations to help enhance security? I next explore several grounds for Corporate Social Responsibility on which one might base a call to high-tech corporations to cooperate with the government in security matters, even if doing so entails nontrivial costs.

Being Responsible is Good for Business

Many ethicists argue that for corporations to conduct themselves ethically is good for business in the narrow sense of the term, that is, good for profit. For example, Pava and Krausz discuss how linking social responsibility to financial performance gives a corporation a motive to act in a socially responsible way.

IBM’s new principles reflect the CEO’s belief that social responsibility, at least in the area of employee and community relations, causes better financial performance. By linking its human resource policies to shareholder value, IBM provides a strong and noncontroversial motivation for social responsibility actions. IBM’s experience demonstrates that under certain prescribed circumstances, CSR programs must be evaluated in view of the effects on financial performance (Pava and Krausz 1997, p. 346).

Of course, good ethics can be good business, but that is beside the point, because an action motivated by profit loses its ethical credibility. Ivar Kolstad provides an interesting aside on the subject of good ethics being good business, noting that “[i]f the argument holds, and CSR [corporate social responsibility] and profits go together in a systematic way, it does not really matter whether the corporation treats advancing one or the other as its ultimate goal. If all good things go together, corporate executives never have to face dilemmas or make trade-offs between social and profitability objectives” (Kolstad 2007, p. 138). That is to say, the entire concept of corporate responsibility would be more or less a nonissue if it were the most profitable course one hundred percent of the time.

The problem is that good ethics is not always good business. Empirical evidence does not support the claim that corporate social responsibility and profit are always tied together (Kolstad 2007, p. 143). As Kolstad discusses, Milton Friedman’s view that a company is responsible only for increasing its profits does not hold up under ethical analysis: “An ethical theory based entirely on self-interest…leaves out an essential component of any reasonable ethical theory” (2007, p. 142). What is more, acts of corporate responsibility that are chosen for the sake of increasing profits are not, at their core, responsible acts, but profit-minded ones (Kolstad 2007, p. 144). Kolstad concludes by asserting that “[t]he correct way of approaching the issue of CSR, is to first ask what a company is responsible for, and then implement those responsibilities, whether they increase profits or not” (2007, p. 144). This observation is particularly germane in the case under study because there is little reason to doubt that if high-tech corporations cooperated more with the government, they might well lose some market share, especially among bad actors and people strongly concerned about privacy, and to foreign competitors. The question hence stands: on what specific ethical grounds can one urge corporations to sacrifice some measure of profit for the common good?

Kant, Stakeholders, Social Contract, and Corporate Privilege

Godwyns Agbude et al. provide a Kantian response, one not based on the profit motive. In their discussion, they bring up the following formulation of Kant’s categorical imperative: “Act only according to that maxim by which you can at the same time will that it should become a universal law” (2015, p. 6). A world in which all corporations would grant priority to protecting their market share and profit over national security and public safety is one in which high-tech shareholders, employees, and CEOs would not want to live.

Other approaches are based on the fact that corporations do not exist in some world unto themselves, but within society and a state. Hence, they should be expected to contribute to the context that makes them possible. Thus, Marvin Brown writes:

I want to suggest that business is not a sport. Unlike sports, all the participants in the market have not agreed to suspend civic norms. Quite the opposite, people depend on them. The market place is different than the sports arena. Business transactions should be guided by civic norms, including the norms of reciprocity and integrity. Furthermore, because the norms do not change from the public square to the market place, nor should citizens lose their civic rights and responsibilities when they go to work. Business, in other words, should not operate according to its own rules, but according to the norms of civil society (2006, p. 13).

High-tech corporations are called upon to work with the government to develop ways to make phones as secure as possible—for instance, by not introducing a backdoor—but also to find ways to respond to national security needs as defined by courts and Congress rather than by high-tech CEOs (for instance, as I suggested above, by developing keys and keeping them on their premises). Such cooperation used to be commonplace during the Cold War and well beyond; Western Union, AT&T, and Verizon cooperated with the government to enhance public safety and national security for many decades (Regan 2014, p. 37).

Social contract arguments add similar considerations when they posit that individuals have consented, either explicitly or tacitly, to surrender some of their freedoms and submit to the authority of the ruler or magistrate (or to the decision of a majority) in exchange for protection of their remaining rights (Sacconi 2006; Argandona 1998, p. 1094). One also notes that limited liability, at the very foundation of all modern corporations (it allows shareholders to limit their risks and is an essential condition for accumulating the large amounts of capital corporations need), is a privilege granted to investors by legislators. It is reasonable to assume that society can, in return, seek some reasonable contributions to the common good.

Stakeholder Theory

A similar idea is reflected in stakeholder theory (Argandona 1998). It holds that a corporation belongs not only to the shareholders—whose economic interests are often treated as trumping all other considerations—but to all who have invested in the corporation. The right to participate in the governance of a corporation should be shared by all stakeholder groups rather than by shareholders alone. This argument accepts the moral legitimacy of the claim that shareholders have this right because they have invested capital in the corporation, but extends the claim to include all investors, which can include employees, creditors, suppliers, customers, and communities.

The concept of investment involves giving up immediate benefits in order to secure a better return in the future. The investment itself does not necessarily need to involve capital; although creditors may invest in a corporation by providing start-up capital, communities may invest in other ways, such as granting exemptions to corporations. Although employees receive compensation for their work, they also invest their loyalty and dedication, with the future return being continued employment. The investments made by stakeholder groups such as these result in shared interests among them, namely that the corporation will remain viable so as to continue to benefit the parties in the future.

Shareholders’ rights are ultimately based on a conception of fairness: society recognizes that shareholders are provided with no compensation for the use of their assets at the point of investment and that their compensation lies in a future flow of dividends and appreciation of share prices which are expected but not explicitly guaranteed. Hence, the investors have a right to ensure that the tree they helped plant will be properly cultivated so it will bear fruit, hopefully increasing its value. From a moral viewpoint, this concept of fairness applies to all stakeholders and not merely to shareholders (Etzioni 2014).

The application of stakeholder theory to the issues at hand relies on treating the nation as a community (indeed, nations are often defined as a community invested in a state). First of all, Apple and other high-tech corporations benefit from many ‘investments’ by the government, such as tax privileges granted to high-tech corporations (accelerated deductions for R&D outlays; lower taxes on income from capital than on income from labor; the training of their staff in colleges and universities, many of which are supported by taxpayer dollars; among other benefits). The most important of these is the security provided to Apple workers and their families, Apple’s customers, and all other fellow citizens. There are hence ethical reasons to call on Apple to balance its very legitimate concerns regarding the privacy of its customers, and its commitments to its shareholders and its own bottom line, with its responsibility as an American citizen. As I see it, Apple should again keep keys in house and unlock those phones for which courts have ruled that the government provided sufficient evidence that they may be searched. These keys can of course themselves be protected by the high-power encryption Apple developed.

Rights and Responsibilities

Corporations would not exist without society granting them a special privilege: that of limited liability. Without this provision, it would be impossible for corporations to amass the large amounts of capital modern enterprises require. It stands to reason that, in exchange, society can demand that corporations pay back by absorbing some losses if, as a result, security is enhanced.

Moreover, communitarians have pointed out that rights presume responsibility (Etzioni 1996). Thus, as far as individuals are concerned, the right to be tried before a jury of one’s peers means little if the peers do not see serving on the jury as their responsibility. The right to free speech will not be sustained if people do not realize that they need to accept listening to offensive speech. And the right to life will not be secured if those who seek it do not assume the responsibility to pay for national defense and public safety.

Corporations often claim that they should have the same rights as individuals. For instance, they argue that their commercials and product labeling should not be regulated because such regulations violate their right to free speech. Corporations have also been granted due process rights and Fourth Amendment protections by the Supreme Court (Stephens 2013). Society hence should expect that they also assume responsibility, like individual members of the community, to contribute to the common good.