1 Preamble

On March 11, 2011 I was in Mexico, sitting on a veranda overlooking the beautiful blue Pacific Ocean, when I received an e-mail from my son asking me whether or not there would be a core-melt accident following the earthquake in Japan. My gut reaction was that the Japanese “probably had it handled” given their history of large earthquakes and experience with seismic design. Several hours later I received another e-mail from a former graduate student who had come from Japan to study nuclear engineering at Berkeley. The e-mail read in part, “Everything you taught us about core melt accidents in your class is happening in Japan,” and even more alarming, “Why aren’t they telling us the truth; my friends and I know what is happening and it is worse than they say?”

I scrambled to get CNN and was at once shocked and appalled. Shocked because it felt like everything I had been researching and lecturing on regarding reactor safety was no longer an abstract concept, and appalled by what I was hearing from “the experts”, mostly colleagues from universities around the country. I began to ask myself, “What happened to the lessons of NUREG 1150, where this same station blackout (SBO) scenario was quantified and determined to be the dominant contributor to risk, and where the steps needed to prevent or mitigate such an accident were detailed?” And from my colleagues, I could feel a state of hubris… a patriarchal know-it-all attitude of defensiveness: “It can’t happen here…” even before they actually knew what happened there! This was painful for me.

In the aftermath of the accident, the focus has been mostly on the machine and how to support the machine with other machines… better protection from severe external events, strengthening back-up power supplies (from diesel generators to longer battery life), strengthening the regulations and controlling hydrogen, among others. But what about the people involved? When Professor Ahn asked me to give a lecture at this summer school, and I began to look more closely at what happened, I decided to focus more on the people than on the machine. I began my lecture by saying, “Most of the other lecturers will be talking about the machine, so I am going to do something very different: I am going to talk about the people… and this means you! And some of you may get angry and argue with me, and say that I don’t know what I am talking about, and some of you may be astounded and say, wow, why didn’t I think of that, or yes, this feels right, and want to know more. In either case, I will consider the talk a success!” And so to you, the reader of this chapter… whether you agree or disagree, my goal is to make you think beyond the machine… to think about the people involved… and what all this means for the future of nuclear energy. I hope this chapter is a first step toward making explicit the implicit assumptions, values and beliefs we hold regarding the nuclear endeavor… and toward recognizing that this was as much a people accident as a machine accident.

2 Introduction

The disastrous events that took place at the Fukushima-Daiichi Nuclear Power Station beginning on March 11, 2011 have raised questions about Japan’s ability to both assess and manage the risks of core-damage accidents and radiological releases at nuclear power plants, especially when initiated by severe natural phenomena. Moreover, the accident has raised serious doubts about whether those in authority had adequate emergency plans and were prepared to manage such an accident while it was in progress. An article in the New York Times [1] raised serious ethical issues regarding the massive public relations campaign in Japan that resulted in “the widespread adoption of the belief—called the ‘safety myth’—that Japan’s nuclear power plants were absolutely safe.” The nuclear establishment’s embrace of this “safety myth” apparently led to a state of hubris both in its regard for safety and risk and in its duties and obligations to the public in the days following the accident. Taken together, these questions and doubts, and this state of hubris, have undermined public confidence in nuclear power, as witnessed by the unprecedented and growing anti-nuclear sentiment in Japan, and perhaps worldwide.

In this chapter, I will explore the role of cultural conditioning with respect to ethics, risk and safety culture (see for example [2]), an aspect of risk analysis that has received little or no attention in the past. I believe that cultural conditioning has implications for understanding what happened at the Fukushima Daiichi Nuclear Power Station and that such an understanding might help prevent the next Fukushima Daiichi from happening. Moreover, I will argue that when cultural conditioning, which underlies a society’s values, assumptions and beliefs, is inapposite to safety culture, the former will undermine the latter.

This chapter revolves around the following three inter-related questions:

  1. What would it take to improve the quality of risk analysis and emergency planning so that this terrible accident and the subsequent loss of public confidence can be avoided in the future?

  2. Can a risk analysis paradigm be developed that incorporates the cultural conditioning of people and organizations responsible for nuclear energy?

  3. Can a global safety culture be developed while still preserving the societal culture of host nations?

3 Preliminaries

Risk can be defined as: (a) the possibility of loss or injury, (b) a dangerous element or factor or (c) the chance or likelihood of loss. These definitions connote both a negative event or state, as in loss, injury or danger (hazard or consequence), and a possibility or likelihood (probability or frequency) of that negative event or state occurring. In contrast, safe or safety can be defined as: free from harm or risk, secure from threat of danger, harm or loss, affording safety from danger. Being safe has an absolute quality to it; one is either safe or not. On the other hand, there is a spectrum of risk, depending on both the severity of the consequences and the probability of their occurrence.

Paul Slovic and his colleagues argue that [3]:

Risk in the modern world is confronted and dealt with in three fundamental ways. Risk as feelings refers to our fast, instinctive and intuitive reactions to danger. Risk as analysis brings logic, reason, and scientific deliberation to bear on hazard management. When our ancient instincts and our modern scientific analysis clash, we become painfully aware of a third reality—risk as politics.

Risk as feelings is an aspect of risk that gives rise to the subject of risk perception, while Risk as analysis gives rise to the subject of risk assessment and management. To those who study risk perception, emotion and affect (good or bad feelings towards an external stimulus) are essential ingredients in risk management. To those who are expert in risk assessment and management, risk as feelings is “irrational,” violating the “rational” or normative rules of decision theory (for example, cost/benefit analysis). The same arguments take place regarding ethics and technological risk [4]. The rationalist believes that ethics is objective, and hence emotions have to be eliminated from moral reasoning. The subjectivist believes that ethics is based on emotions (subjective), and so believes there cannot be objective moral truths. When the emotional and the cognitive aspects of consciousness are held separately, psychologists call these two views “dual process” theory [5]. Moral Emotions is a term that is being used in an attempt to give equal weight to these two human processes [6].

Risk as analysis is a means of addressing the following questions:

  1. What are the risks imposed by technology and natural phenomena on society and the environment?

  2. Are these risks acceptable?

  3. What are the options for reducing these risks?

  4. On what basis should we choose among these options?

Risk assessment is concerned with Question #1 and risk management is concerned with Questions #2–4. Risk assessment focuses on the factual—a quantification of the “undesirable consequences” of technology and severe natural phenomena. In doing so, it treats technology like a machine. For the purposes of this chapter, it is important to recognize that risk assessment does not model the individuals or organizations that are responsible for designing, constructing, operating or regulating these machines. Nor does risk assessment consider the individual or individuals that perform the analysis in the first place. When risk assessment does consider individuals (mainly control room personnel), it quantifies “human error” and, as such, human error rates become failure rates, in essence treating people (the operators) just like machines or parts of a machine.

Risk Management originates at the intersection of the factual and the axiological—the study of the nature, types, and criteria of values (good or bad; right or wrong), of ethical codes (principles or standards that express the values) and of moral acts (behaviors of people that play out in real time). The question of acceptable risk straddles the domains of risk as analysis and risk as feelings, and is at the crux of risk as politics. Moreover, risk management, like risk assessment, only deals with the machines; it does not deal with the individuals and organizations that are responsible for the machines. Organizational factors are usually considered separately, if at all. And last, but not least, risk as analysis does not consider that humans are emotional, mental, physical and spiritual beings, not machines.

The current culture of risk analysis in the West derives from Utilitarianism, the ethical theory based on the philosophy of Jeremy Bentham and John Stuart Mill. Utilitarianism’s underlying principle is to achieve the greatest good for the greatest number. Indeed, risk, which is traditionally defined as the “expected value of an undesirable consequence,” and economic determinism as manifest in risk/cost/benefit analysis, are direct descendants of Utilitarianism. The greatest good is usually measured in monetary terms and is interpreted as “…and at the least cost.” This leads to what the philosopher Charles Taylor calls the primacy of “instrumental reason”, the “kind of rationality we draw on when we calculate the most economical application of a means to a given end [7].” Taylor goes on to say that instrumental reason in terms of cost-benefit analysis uses efficiency as the measure of success, narrowing our choices, and excluding decisions we might make on other (moral) grounds.
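
To make the utilitarian lineage concrete, the traditional “expected value of an undesirable consequence” definition can be written as follows (a sketch in my own notation, not the exact form of the risk equation referenced later in Appendix B):

$$ R \;=\; \sum_{i} f_i \, c_i $$

where $f_i$ is the annual frequency of accident scenario $i$ and $c_i$ is its consequence. Risk/cost/benefit analysis then weighs a proposed reduction in $R$ against the cost of achieving it, which is precisely Taylor’s instrumental reason at work.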

Risk analysis can best be understood by a decomposition in terms of initiating events; systems, structures and component fault trees; event trees; containment release categories; environmental and compartmental fate and transport models; dose-response models; and incremental costs and benefits, all indicative of this linear reductionist approach. All accident sequences are assumed to be independent of one another, and the results are deterministic in that there is a “causal” relationship between each input and output. Risk assessment, therefore, is reduced to a search for “causal links” or “causal chains” verified by “objective” experimental processes, i.e. by quantifying the behavior of various elements of the system (e.g. pumps, valves, etc.) in terms of failure rate data, dose-response parameters, etc. The behavior of the system elements is then integrated so as to quantify the behavior of the system as a whole. Hence this linear paradigm gives rise to the current culture of risk-analysis itself.
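
To make this decomposition concrete, here is a minimal sketch in Python of how a single station-blackout-like sequence might be quantified by chaining an initiating-event frequency through fault-tree and event-tree logic. All numbers, names and the sequence itself are illustrative placeholders of my own, not data from NUREG-1150 or any actual PRA:

```python
# Minimal sketch of the linear, reductionist PRA decomposition described
# above. All failure probabilities and frequencies are invented placeholders.

def and_gate(*p_fail):
    """Fault-tree AND gate: the gate fails only if ALL inputs fail.
    Inputs are assumed independent, as in the linear paradigm."""
    prob = 1.0
    for p in p_fail:
        prob *= p
    return prob

# Component-level failure probabilities (placeholders).
p_diesel  = 1e-2   # one emergency diesel generator fails to start/run
p_battery = 1e-3   # DC batteries exhausted before power is restored

# System level: on-site AC power is lost only if both diesels fail.
p_onsite_ac_fails = and_gate(p_diesel, p_diesel)

# Event-tree level: initiating-event frequency times branch probabilities.
f_loop      = 1e-2                        # loss of offsite power, per year
f_sbo       = f_loop * p_onsite_ac_fails  # station blackout frequency
f_core_melt = f_sbo * p_battery           # DC power also lost -> core melt

consequence = 1.0e3                       # notional consequence units
risk = f_core_melt * consequence          # expected consequence per year

print(f"SBO frequency:       {f_sbo:.2e} per reactor-year")
print(f"Core-melt frequency: {f_core_melt:.2e} per reactor-year")
print(f"Sequence risk:       {risk:.2e} consequence-units per year")
```

Note how every number enters as a fixed, independent failure rate; this is exactly the treatment that, when extended to control room personnel, turns people into parts of the machine.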

The discussion above about ethics and risk has to do with “safety culture” in particular, and individual and societal culture in general, subjects that risk as analysis does not speak to. In the next section of this chapter, I will explore the effects of twenty-five hundred years of cultural conditioning in the East (e.g. China, India, Japan and Korea) and in the West (Europe and the United States), and its relationship with the concept of safety culture. I believe such an understanding is required for relating the two cultures. The arguments presented in this chapter are based on the following basic premises:

  • First, culture can be defined as the integrated pattern of human behavior that includes thought, speech, action and artifacts, and that depends on the human capacity for learning and transmitting knowledge to succeeding generations.

  • Second, culture gives rise to a society’s values, assumptions and beliefs. Hence culture is concerned with the act of developing the intellectual and moral faculties, especially by education.

  • Third, culture itself arises out of a context or paradigm that defines an individual’s or a society’s cultural conditioning. Hence an individual’s or a society’s values, ethics and morality are contextually or paradigmatically dependent.

  • Fourth, for the most part, societal conditioning and the context or paradigm from which it arises are implicit, i.e. cultural conditioning resides in the unconscious (emotive) and sub-conscious (mental). The conscious aspects of cultural conditioning, those that are cognitive, reside in the mental.

  • Fifth, safety culture is “designed”, as an overlay to achieve a specific goal, within a larger societal culture that “develops organically”. Hence safety culture is affected by the larger culture, usually in an implicit way.

  • Sixth, when the societal culture runs counter to the demands of safety culture, and is left implicit, it can shift from underlying to undermining.

  • Seventh, approaches to quantifying and managing the risk of core-melt accidents before they occur, as well as approaches for emergency preparedness and response should an accident occur, are based on the “safety culture” of the individuals and the organizations/institutions that comprise the nuclear establishment.

  • Eighth and last, in order to explore the safety culture of a host nation with respect to nuclear power, it is essential to understand the context or paradigm out of which cultural conditioning, and hence its ethics and technology arise.

4 Historical Perspective on Culture and Technology

As I look back over the development of human consciousness in general, and ethics and morality in particular, two great ages or eras stand out. And we, as a society, are embarking on a third.

The first is the period between 800 and 200 BCE, dubbed the Axial Age by the philosopher Karl Jaspers [8]. Jaspers argued that during the Axial Age “the spiritual foundations of humanity were laid simultaneously and independently… And these are the foundations upon which humanity still subsists today”. These foundations were laid by individuals within the framework of a changing social environment, and they had a profound influence on future philosophy (based in logic and reason) and religion (based in revelation). These Axial Age individuals include Socrates, Plato and Aristotle in the West, the prophets in the Middle East, and Confucius, Lao-Tzu and the Buddha in the East.

As noted by Huston Smith [9], compassion and wisdom are the hallmarks of the Axial Age. Paradigmatically, this Age is pre-egoic, i.e. operating before the rise of individualism and liberal values, and is marked by “collectivism”, wherein nomadic peoples came together to form tribes, villages and towns, and by the “physical”, wherein technology supported physical labor, from weapons for hand-to-hand combat to hand tools for agriculture and beasts of burden. When taken in their entirety, the wisdom traditions (including Judaism, Christianity and Islam) give us the three Virtues in the West: Humility, Charity and Veracity, and the three Poisons in the East: Greed, Hatred and Delusion. Virtues are what we aspire to; poisons are to be avoided. Smith describes the Virtues as follows:

  • Humility: The deeper meaning of humility is to treat yourself fully as one, but not more than one.

  • Charity: If you treat yourself fully as one, you have an obligation to make sure your fellow human beings are treated fully as one.

  • Veracity: Huston Smith calls it “seeing the world in its suchness”, which means the ability to drop our “subjective” lens and see the world “as objectively” as possible.

As I argue throughout this chapter, veracity presents the biggest challenge of all, because the paradigms that give rise to our cultural conditioning lie in the unconscious and sub-conscious; they are implicit in all of our actions and not always, if ever, made explicit. To make this point clear, consider the fundamental canons of the National Society of Professional Engineers’ Code of Ethics.

Engineers, in the fulfillment of their professional duties, shall:

  1. Hold paramount the safety, health, and welfare of the public.

  2. Perform services only in areas of their competence.

  3. Issue public statements only in an objective and truthful manner.

  4. Act for each employer or client as faithful agents or trustees.

  5. Avoid deceptive acts.

  6. Conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession.

The first Canon is basically a general statement of Charity, the second Canon is a specific statement of Humility, Canons three, four and five are specific statements of Veracity, and the sixth and final Canon is a combination of all three Virtues. These Canons have been developed over the past 100 years or so, and to the best of my knowledge their time-honored origin has never been articulated, but rather has been carried in the collective unconscious of society over the millennia.

The second great era centers on the Enlightenment (eighteenth century Europe), sandwiched between the Age of Reason (seventeenth century Europe) and the social movement termed Modernity (nineteenth century Europe and the United States), all of which gave rise to the Industrial Revolution. It began with Descartes and Newton, and it is marked by a paradigmatic shift from the physical to the mental (cogito ergo sum), and from the collective to the individual (from the pre-egoic to the egoic). It focuses on a priori universal laws, whether they are natural, physical or moral. It is an age that gave rise to major achievements in moral philosophy and ethical theory; among the more germane to the engineering profession are Rights Ethics (Locke), Duty Ethics (Kant) and Utilitarianism (Bentham and Mill).

The Enlightenment also marks the divergence between Eastern and Western cultural values; the paradigmatic shifts from the collective to the individual and from the physical to the mental did not take place in the East to the extent they took place in the West. I must emphasize that this discussion is not about intelligence. It is about a context that enabled Western society to replace physical labor with machines, based on new quantitative analysis and replicated empirical data, i.e. the development of the “scientific method.”

This paradigmatic shift is best exemplified by the development of science and technology and how it influenced the Industrial Age. From one perspective, David S. Landes describes in great detail why the Industrial Revolution first occurred in Europe and not elsewhere [10]. To quote Landes:

To be sure, in Europe as elsewhere, science and technology had their ups and downs, areas of strength and weakness, centers shifting with the accidents of politics and personal genius. But if I had to single out the critical, distinctively European sources of success, I would emphasize three considerations: (1) the growing autonomy of intellectual inquiry, (2) the development of unity in disunity in the form of a common implicitly adversarial method, that is, the creation of a language of proof, recognized, used, and understood across national and cultural boundaries; and (3) the invention of invention, that is the routinization of research and its diffusion.

Regarding autonomy, Landes also describes why, within Europe, the Industrial Revolution took place first in Britain. Here too, quoting Landes:

Britain, moreover, was not just any nation… Remember that the salient characteristics of such a society is the ability to transform itself and adapt to new things and ways, so that the content of “modern” and “industrial” is always changing. One key area of change: the increasing freedom and security of the people. To this day, ironically, the British term themselves subjects of the crown, although they have long—longer than anywhere—been citizens.

Although originating within the Greek and Roman Empires, and associated with freedom, it was during the European Enlightenment that people transitioned from being subjects of a king or queen to being citizens of a city and later, a nation. Such status carried with it rights (such as the ability to participate in the political process) as well as responsibilities (such as military service). Citizenship is the mark of the individual, and the hallmark of the European Renaissance, the very essence of the egoic period.

We might also ask why the Industrial Revolution did not occur in the East, particularly in Japan. Here I refer to both David Landes [11] and Jared Diamond [12]. While each Asian country had its own unique set of circumstances in terms of natural resources, climate, geography, and the socio-political environment, many suffered from what Diamond calls “cultural isolationism” rather than embracing “cultural diffusion,” the latter a necessary ingredient for scientific and technological advancement. Beginning in 1633 and lasting until the Meiji Restoration in 1867–1868, Japan closed its doors to the outside world. In the words of Landes [13]:

Japan had had enough of discovery and innovation, enough fire and blood. The aim now: freeze the social order, fix relations of social and political hierarchy, prevent disagreement and conflict.

The net result of cultural isolationism during this nearly 250-year period is what I would call the “not invented here” syndrome. For Japan in particular, the culture of today regarding Fukushima, as described by the Chairman of the Independent Commission reporting to the Diet of Japan, is also the culture of yesterday: “reflexive obedience, reluctance to question authority, devotion to ‘sticking with the program’, groupism (collectivism) and insularity” [14].

As said, the Industrial Revolution, a product of the Enlightenment, is an age wherein physical labor is replaced by mental analysis resulting in man-made machines that are conceived, built and operated from a (Newtonian-Cartesian) world-view or paradigm based on three premises:

  • Reductionism: The world can be understood by reducing complicated systems to their parts.

  • Determinism: The world consists of causal links or chains; output is proportional to input.

  • Objectivism: The world obeys universal laws; the results of observations are independent of the observer, which, taken together with the first two premises, yields these universal laws of nature.

This world-view has served Western society well by providing a particular lens through which to view physical reality. It results in a fragmented world with distinct parts or boundaries. Studying these fragments has produced much of the technological world we know today. It is important to stress that in this paradigm, it is assumed that there is good data, that the system has a fixed boundary and that second-order (nonlinear) effects can be neglected. One has only to look at a complex machine such as an automobile to see that each system, from the engine to the CD player, is researched, designed, developed and manufactured separately—and yet they all fit marvelously together as planned. It is hard to imagine understanding a physical world that is not amenable to such fragmentation. And as long as the roadway is free of ice, the automobile and the driver behave as expected!

These two eras have now taken Society (both East and West) into a third, which is still in the process of being defined. It is sometimes called the Post-Modern or Post-Industrial era. It may have begun with a new understanding of the physical world (quantum mechanics and relativity), the biological world (the double-helix and cloning), the political world (the nuclear weapons race and the space race) or the social-media world (the Internet and the Information Age). It is neither pre-egoic nor egoic, neither physical nor mental; it appears to be trans-egoic and emotional. I will explore this later in the chapter.

It is often said that society’s ability to develop new technologies (biotechnology, information technology, nanotechnology and nuclear technology) has far outstripped its ability to deal with their impacts (both intended and unintended consequences). I believe, in part, it is the unconscious grip of the Newtonian/Cartesian Enlightenment world-view that has the United States paralyzed with respect to high-level radioactive waste disposal, for example, in much the same way as the pre-egoic, Axial Age world-view (primarily echoes of Shintoism coupled with elements of Buddhism, Confucianism and Taoism) has Japan paralyzed with respect to safety culture, in light of the events at Fukushima. I believe that the way to resolve these dilemmas is to make these implicit world-views explicit.

5 Safety Culture, Ethics and Risk

As said above, culture is concerned with: (1) the act of developing the intellectual and moral faculties, especially by education, and (2) the integrated pattern of human behavior that includes thought, speech, action and artifacts, and that depends on man’s capacity for learning and transmitting knowledge to succeeding generations.

With respect to safety culture in Japan, Reuters News Service, in a July 4, 2013 article entitled, “Japan says building nuclear safety culture will take a long time,” begins with the statement:

Japan’s nuclear regulator said on Thursday that elevating safety culture to international standards will “take a long time”, (just) days before new rules come into effect to avoid a repeat of the Fukushima nuclear disaster in March 2011.

The article quotes the new Japanese Nuclear Regulation Authority Chairman as stating:

The new regulations include extremely stringent requirements that the operators would not be able to endure if they don’t change their culture. We will need a long time to change this culture, but day-to-day efforts to meet those tough standards will in the end lead to improvement in the safety culture.

As described below, the difficulty in meeting these international standards cannot be overemphasized. The accidents at the Three Mile Island (1979) and Chernobyl (1986) nuclear power plants brought renewed international focus on the importance of a strong safety culture in the design, construction and operation of nuclear power plants. Indeed, the International Atomic Energy Agency (IAEA) published a report [15] aimed at providing guidance to member states in their efforts to provide a sound safety culture for their (nuclear) organizations. The Foreword to this report states:

The concept of safety culture was first introduced by the International Safety Advisory Group (INSAG-4), formed by the IAEA. In their report [16] they maintained that the establishment of a safety culture within an organization is one of the fundamental management principles necessary for the safe operation of a nuclear facility. The definition recognized that safety culture is both structural and attitudinal in nature and relates to the organization and its style, as well as attitudes, approaches and the commitment of individuals (emphasis mine) at all levels in the organization.

The IAEA report goes to considerable length to describe the general concept of culture. Two important points made in the IAEA report are worth noting here. First, the nature of culture is very complex, and second, there is no right or wrong culture. Regarding the first point, culture is deep (not a superficial phenomenon), it is broad (it impacts virtually all aspects of life), and it is stable (it provides meaning and makes life predictable). Hence it is very difficult to change. And with respect to the second point, there is no better or worse culture, except in relation to what a group or organization is trying to do. As an illustration of the second point, the operators at Fukushima were attempting to manage multiple core-melt accidents at once, but were looking for collective solutions from higher authorities when individual actions were required. As I will argue throughout this paper, it is this latter point that may have contributed to the accident at Fukushima, and it is the former point that will make elevating safety culture to international standards a very difficult and prolonged task in Japan.

As also noted in the IAEA report, the levels of culture go from the very visible (explicit) to the tacit and invisible (implicit). The report describes three levels of culture, Artifacts and Behavior (explicit), Espoused Values (strategies, goals and philosophies—which can be elicited) and Basic Assumptions (unconsciously held and usually tacit). Of particular interest in understanding any culture, are the fundamental beliefs that are so taken for granted that most people in a cultural group subscribe to them, but not in a conscious way, i.e. they are implicit.

As to a more precise and succinct definition of safety culture, the IAEA report cites the U.S. Nuclear Regulatory Commission’s Policy Statement on the Conduct of Nuclear Power Plant Operations [2], which defines safety culture as:

The necessary full attention to safety matters and the personal dedication and accountability of all individuals (emphasis mine) engaged in any activity which has a bearing on the safety of nuclear power plants. A strong safety culture is one that has a strong safety-first focus.

The recently published U.S. Nuclear Regulatory Commission Safety Culture Policy Statement (U.S. NRC 2012) [17] expands the focus to all regulated entities and defines safety culture as follows:

Nuclear safety culture is the core values and behaviors resulting from a collective commitment by leaders and individuals (emphasis mine) to emphasize safety over competing goals to ensure protection of people and the environment.

Both the IAEA and the U.S. NRC emphasize that safety culture rests with individuals and leaders in any organization. The notion of the individual as opposed to the collective stems from the European Enlightenment, a cultural shift that took place in the eighteenth century: individual rights, individual duties and individual responsibilities that are essential to a strong safety culture, and which may be incongruent with the societal culture of Japan as articulated by the Commission Chairman cited above.

6 Uncertainty and Safety Philosophy

Perhaps, former United States Secretary of Defense, Donald Rumsfeld said it best [18]:

Reports that say something hasn’t happened are always interesting to me, as we know, there are known knowns. There are things we know we know. We also know there are known unknowns. That is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know, we don’t know.

Although the popular press and the late-night pundits found much humor in these statements, it is in fact just a “truth table” regarding our knowledge about the state of the world: what is knowable about the world and what is not, and our degree of knowledge about each. In terms of the initiating events at Fukushima, the fact that earthquakes originating in subduction zones cause large tsunamis has been known (a known-known) for some time. On the other hand, the return frequency and magnitude of such events is a known-unknown; and so a safety philosophy is developed to account for the unknown.

Technically, a safety philosophy can account for two types of uncertainty: aleatory (random variations and chance outcomes in the physical world) and epistemic (lack of knowledge about the physical world) [19]. It is important to distinguish between random variations and chance outcomes, and lack of knowledge. More research can reduce epistemic uncertainty, however, aleatory uncertainty can only be estimated better, but not reduced with more research. In either case, the annual probabilities and the consequences can be expressed as probability distribution functions. The typical approach for evaluating the risk when consequences and probabilities are expressed as distributions in the risk equation shown in Appendix B is the use of Monte Carlo simulation. When these two types of uncertainty are included, the risk itself might also be quantified as a cumulative probability distribution function.
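
To illustrate, here is a minimal Monte Carlo sketch in Python of how such distributions might be propagated through a frequency-times-consequence risk measure. The lognormal forms and all parameter values are arbitrary placeholders of my own, not the specific notation of Appendix B:

```python
# Minimal Monte Carlo sketch: propagate uncertain frequency and consequence
# through risk = frequency x consequence. All parameters are placeholders.
import math
import random

random.seed(42)
N = 100_000

samples = []
for _ in range(N):
    # Epistemic uncertainty about the annual frequency of the event,
    # represented here (for illustration) as a lognormal distribution.
    freq = random.lognormvariate(math.log(1e-4), 1.0)
    # Aleatory variability in the consequence, given that the event occurs.
    cons = random.lognormvariate(math.log(1e3), 0.5)
    samples.append(freq * cons)

# The result is not a single number but a distribution over risk,
# summarized here by its mean and a few percentiles.
samples.sort()
mean_risk = sum(samples) / N
p05, p50, p95 = (samples[int(q * N)] for q in (0.05, 0.50, 0.95))
print(f"mean risk: {mean_risk:.3g} consequence-units/yr")
print(f"5th/50th/95th percentiles: {p05:.3g} / {p50:.3g} / {p95:.3g}")
```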

To cope with aleatory and epistemic uncertainty, a safety philosophy called Defense-in-Depth was developed at the inception of the nuclear age and is still in effect today. While there is no formal definition of Defense-in-Depth, examples of it are found at the nuclear power plant level, at the structural, system and component (SSC) level, and at the phenomenological level. Moreover, where phenomenological uncertainties exist, safety margins are included, leaving a large separation between estimates of capacity and load.
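
One way to picture such a safety margin is as the separation between a “load” distribution and a “capacity” distribution. The sketch below estimates the probability that load exceeds capacity by simulation; the tsunami-versus-protection framing and every number in it are invented for illustration only:

```python
# Sketch: a safety margin as separation between load and capacity.
# All distribution parameters are invented placeholders.
import math
import random

random.seed(0)
N = 200_000

exceed = sum(
    # "Load": tsunami run-up height at the site (median 5 m, illustrative).
    random.lognormvariate(math.log(5.0), 0.6)
    # "Capacity": height of protection the plant can withstand (median 12 m).
    > random.lognormvariate(math.log(12.0), 0.2)
    for _ in range(N)
)
print(f"P(load > capacity) approx. {exceed / N:.2e} per event")
```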

In reality, there is also indeterminacy (e.g. a unique initiating event leading to accident sequences that may take many paths) and a high level of ambiguity (i.e. non-unique, alternative or multiple legitimate interpretations based on identical observations or data assessments). Ambiguity may come from differences in interpreting factual statements about the world or from differences in applying normative rules to evaluate the state of the world. Finally, there is the realm of the unknown-unknown.

7 Reflections on Fukushima Daiichi

And what of the unknown-unknown, e.g. how will people (the operators, the authorities and the general public) react when confronted with an accident of the scope of Fukushima? A recent National Public Radio interview [20] included the following statements:

The Japanese decision-making process, of group decision-making and not individual decision-making, might have been a hindrance for dealing with a situation like this… It’s hard to know, but the timeframe demands of making decisions like this, that are multi-billion-dollar decisions, would be difficult in the Japanese culture to do as promptly as maybe it would be done here.

And later on:

One critical decision was whether to pump seawater into the reactors. That would certainly ruin them, but it could also keep them cool and prevent meltdowns. It appears that the engineers on site hesitated for some hours before they went ahead and did that.

And yet another example had to do with containment venting… the operators had to wait several hours while the request for permission to vent went all the way up to the Prime Minister for approval [21]. Much has also been written about the withholding of information regarding radioactive material dispersion and radiation dose data (see for example, [22]), as well as ignoring new geotechnical data regarding the return frequency of large earthquakes and tsunamis (see for example, [23]).

Taken at face value, these news reports lead me to conclude that individual and societal cultural conditioning was at play, and that this cultural conditioning was inapposite to the safety culture required for the conduct of operations at a nuclear power plant undergoing such a severe event. As said, the embodiment of our cultural conditioning resides as much in the unconscious and sub-conscious domains as it does in the conscious domain, i.e. we are largely unaware of our motivations and, oftentimes, our intentions.

One aspect of cultural conditioning has to do with responsibility and authority. In some cases, decisions can be made in advance and operators carry them out in conformance with plant procedures and severe accident management guidelines. This would be their responsibility. However, when operators are faced with situations beyond the scope of procedures and guidelines, decisions should be made at the level appropriate to the act. That is, operators should be given the authority to make decisions appropriate to the act they need to perform. Today’s military model calls for just this (see for example, [24]). On-the-scene commanders at all levels have the ability and responsibility to make decisions when confronted with dynamic environments, as opposed to historical or conventional military operations, where centralized headquarters in the field made almost all decisions. In some cases very low-level personnel can and are expected to make decisions in response to unexpected circumstances, whether to mitigate unexpected risks or to exploit unanticipated opportunities (see for example, [25]).

As discussed above, cultural conditioning in the East is based on 2,500 years of collective, pre-egoic traditions. Cultural conditioning in the West has its roots in the egoic, stressing individual responsibility and authority. Each underlies the safety culture in its respective domain.

8 Where Do We Go from Here?

As stated in the introduction, this chapter has revolved around three questions:

  1. What would it take to improve the quality of risk analysis and emergency planning so that this terrible accident and the subsequent loss of public confidence can be avoided in the future?

  2. Can a risk analysis paradigm be developed that incorporates the cultural conditioning of people and organizations responsible for nuclear energy?

  3. Can a global safety culture be developed while still preserving the societal culture of host nations?

In Appendix C, I describe the Station Blackout scenario as quantified in NUREG-1150 for Unit 2 of the Peach Bottom Nuclear Power Plant, a General Electric boiling water reactor (BWR-4) unit of 1,065 MWe capacity housed in a Mark I containment. This report, published in 1991 [26], was an updated version of the same analysis published in 1975 [27]. This nuclear reactor is basically the same as the nuclear reactor systems at Fukushima Daiichi, Units 1–4. The dominant internally and externally initiated accident sequences leading to core-melt for Peach Bottom in NUREG-1150 consist of three station-blackout scenarios, where the timing of two of them matches the sequence of events at Fukushima Daiichi (the spent-fuel pools notwithstanding). And yet, despite the robustness of the analysis, the diesel generators at Fukushima Daiichi were not adequately protected from a large tsunami, in spite of warnings to the contrary, as discussed above.

We might conclude that the risk-as-analysis paradigm described in Appendix B works well when the system under consideration has adequate historical or actuarial data on failure rates, and empirical data on public health and environmental impacts. Moreover, the system must be fairly well defined, must have (assumed) fixed or rigid boundaries, and second-order or nonlinear effects must be (assumed) small. In terms of a nuclear power plant, as long as the plant functions within its design basis, or accidents occur within its design-basis envelope, we might call this “safe”.

Because challenges to public health and safety resulting from beyond-design-basis events violate these assumptions, I believe a new paradigm for risk and ethical decision-making is required. And this brings me to the complex domain. Hence it is useful to describe here some of the basic differences between the science and technology of the Industrial and Post-Industrial Ages. The key distinction we draw is between systems that are “complicated” and systems that are “complex”.

The paradigm within which Industrial Age technologies are understood is based on an Enlightenment worldview. As said, this worldview is atomistic (reductionism), deterministic (cause and effect) and objectivistic (universal laws). In other words, the laws governing the behavior of these complicated systems can be:

  • Understood by studying the behavior of their component parts,

  • Deduced from cause and effect (a search for causal links or chains), and

  • Determined independent of the observer, that is, only deduced from “objective” empirical observations.

The context within which our Post-Industrial Age Technologies and their underlying science are understood is based on a nonlinear worldview. This worldview gives rise to complex systems that are characterized by at least one of the following [28]:

  • Holistic/emergent—the system has properties that are exhibited only by the whole and hence cannot be described in terms of its parts,

  • Chaotic—small changes in input often lead to large changes in output and/or there may be many possible outputs for a given input, and

  • Subjective—some aspects of the system may only be described subjectively.

It is often said that for complex systems, “the whole is greater than the sum of its parts”. What this means is that there is an emergent quality (sometimes called an emergent property) that is not exhibited by the parts alone. Examples include electric power transmission grids, the disposal of high-level radioactive waste, and the response of social systems to severe natural phenomena. I believe that the new issues regarding national and international security also fall into this category. In each case, the system is simultaneously a whole and a part of a larger whole, a characteristic of complex systems.

It should be made crystal clear that the impacts of human activities on both society and the environment (from the development of the steam engine to the development of the jet engine) have always been complex. In the past, however, the only undesirable consequences of an Industrial Age technology, such as a nuclear power plant, that were considered in a PRA were geographically local (public health effects out to one mile or 25 miles) or were observable in “real” time (a hydrogen explosion). This gave the impression that the current risk paradigm is accurate, because locality and observability were two characteristics of the impact. This lens is changing, and yet our practices are still based on the same paradigm. Today, a nuclear power plant accident has “global” impacts (an accident at one plant affects the operation of all plants) and manifests very quickly (e.g. loss of public confidence worldwide). In the case of disposal of radioactive waste, the undesirable consequences are almost imperceptible (e.g. the migration of high-level radioactive waste takes place over geological timescales, i.e. millennia). Moreover, these impacts may be temporally persistent and/or irreversible (e.g. the degradation of public welfare due to nuclear proliferation).

Thus, as a result of the complexity inherent in Post-Industrial Age Technology, societal and environmental impacts are no longer geographically local, nor perceptible in real time, nor reversible. Rather, complexity can produce impacts that are geographically global (a malicious human act), imperceptible in time either manifesting very quickly (on the Internet) or very slowly (high level radioactive waste disposal), or irreversible (release of radioactivity due to a core-melt accident). We are like the driver of a modern automobile, cruising along on the Interstate (in a linear world), and now suddenly, we are faced with “black ice”!

The impacts we have described above lead to unprecedented ethical issues as reflected in the three questions above. Moreover, questions such as: “What constitutes an acceptable risk and why?” take on new meaning in the face of challenges to the ecology of life. There is a growing belief, as noted by Donald Rumsfeld’s quote above, that not only is the future unknown, it is unknowable. Moreover, because these complex ethical issues are arising so much faster than ever before, and because there has been little time to develop normative processes for decision-making, there is even greater ambiguity. The unknown-unknown looms large in the domain of Risk as feelings.

What we are pointing to, for lack of a better description, is a Cultural Risk Analysis. This would entail making explicit the implicit cultural conditioning of individuals and organizations/institutions, and their relationship to the society in which they abide. Such a Cultural Risk Analysis would illuminate cases where the underlying societal culture runs counter to the demands of the safety culture required for, say, nuclear power. If aspects of the societal culture are left implicit, they will not just underlie the safety culture, they will undermine it. If made explicit, it becomes possible for the safety culture to be designed and constructed in a way that accounts for, accommodates or even overcomes the conflicts between the two cultures.

Such a Cultural Risk Analysis would then require an analysis of cultural conditioning, much the same way we analyze the machine. This would mean understanding how underlying assumptions, values and beliefs come from culturally defined sources and not “objective facts”. However, there is one major difference; people are “complex” emotional, mental, physical and spiritual human beings. Humans are not “complicated” machines, and so are not amenable to a linear reductionist approach.

Human beings have emergent properties, namely feelings and thoughts that do not reside in any one part of the body. Humans may respond differently to the same stimulus on any given day. And there are no “closed form” analytical solutions to describe human behavior; it is, for the most part, subjective. Coincident with the development of these new complex technologies, there has been growing empirical evidence that in the realm of human decision-making the emotional precedes the cognitive [29], and that motivation and intention derive from the unconscious-emotive and subconscious-mental [30]. These findings have found their way into such fields as Behavioral Economics [31] and Risk Perception [32], among others (an extensive literature review can be found in [33]). And a number of consulting companies have developed analytical methods in an attempt to quantify the “Risk Culture” of business organizations. In this case, the focus is on comparing the “self-interest” of individual employees with the corporate interest.

Developing a framework for a Cultural Risk Analysis, i.e. to carry out a cultural analysis, requires a paradigmatic shift in human consciousness similar to the one that took place in the Enlightenment. And this will be extremely difficult because it is a shift requiring both the rational (cognition) and the emotional (feeling). It will require both risk-as-analysis and risk-as-feelings; it will require both moral reasoning and emotional morals. As any good engineer knows (at least those who have taken my class), a redundant and diverse system has order-of-magnitude higher reliability if the system is built of “AND” gates rather than “OR” gates.
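
A quick numeric check of that last claim, with invented failure probabilities: in fault-tree terms, an “AND” gate means the system fails only if all of its diverse elements fail, while an “OR” gate means any single failure defeats it.

```python
# Quick numeric check of the AND- vs OR-gate reliability claim above.
# Failure probabilities are illustrative placeholders.
p_analysis = 1e-2   # chance that risk-as-analysis alone misses a problem
p_feelings = 1e-2   # chance that risk-as-feelings alone misses it

# AND gate in the fault tree: the combined system fails only if BOTH
# elements fail (they are redundant and diverse).
p_fail_and = p_analysis * p_feelings                  # 1e-4

# OR gate: the combined system fails if EITHER element fails.
p_fail_or = 1 - (1 - p_analysis) * (1 - p_feelings)   # ~2e-2

print(f"AND-gate failure probability: {p_fail_and:.1e}")
print(f"OR-gate failure probability:  {p_fail_or:.1e}")
# The AND-gate combination is orders of magnitude more reliable here,
# which is the sense in which analysis AND feelings beat either alone.
```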

Perhaps Thomas Kuhn [34] said it best, “…that is why a law that cannot even be demonstrated to one group of scientists may occasionally seem intuitive to others. Equally, it is why, before they can hope to communicate fully, one group or the other must experience the conversion we have been calling a paradigm shift.” And, “Just because it (a paradigm shift) is a transition between incommensurables, the transition between competing paradigms cannot be made a step at a time, forced by logic and neutral experience. Like the gestalt switch, it must occur all at once (though not necessarily in an instant) or not at all.”