The pictures of the nuclear disaster in Fukushima are in our minds and are updated daily. People from around the world feel compassion for the Japanese, who have had to cope with a triple disaster: earthquake, tsunami, and nuclear accident. At the moment of writing this piece, it is far from clear how the last of this apocalyptic triad will end. In the meantime, the debate about nuclear energy has taken an unexpected turn. Over the last few years, a consensus had been growing that nuclear energy would be the solution for generating energy without CO2 emissions. The probability of an accident was said to be negligible. Now that an accident has occurred, however, many people wonder whether nuclear energy really is a wise option (cf., e.g., Macilwain 2011). Germany immediately shut down several nuclear reactors, and the German Green Party achieved unprecedented results in local elections due to its anti-nuclear position.

Nevertheless, there seems to be one constant factor in the debate about nuclear energy: proponents call opponents badly informed, emotional, and irrational, using these notions more or less as synonyms. Such rhetoric denigrates opponents and hinders a real debate about nuclear energy. It is also simply wrong to equate emotions with irrationality: emotions can be a source of practical rationality. I will argue that rather than being an obstacle to a meaningful debate about nuclear energy, emotions can be an important source of ethical insight that should be taken seriously.

Often when a new technology is introduced, a typical pattern can be observed: society is alarmed and worried about its risky aspects, whereas experts assure the public that the risks are negligible. Policy makers typically respond in one of two ways: either they ignore the emotions of the public, or they take those emotions as a reason to prohibit or restrict a technology, as is the case with genetic modification in many European countries. Let me call these responses the technocratic pitfall and the populist pitfall, respectively. Experts and policy makers emphasize that a dialogue with the public is impossible, as the public is supposedly ill-informed and so emotional about certain risks that it is immune to rational, objective, scientific information. This pattern has occurred with nuclear energy, cloning, genetic modification, carbon capture and storage, and vaccination, to mention just a few of many hotly debated, controversial technological developments. Stalemates such as these may seem unavoidable, at least as long as we take it for granted that emotions are irrational and impenetrable by rational information. However, there are developments in the psychological and philosophical study of emotions that can shed an entirely new light on these issues.

It seems to be a platitude that reason and emotion are opposite faculties: reason provides us with objective, rational information about the world, while emotion provides us with basic survival mechanisms but no reliable knowledge about the outside world. In psychology, this view is called dual process theory (cf., e.g., Epstein 1994; Sloman 1996; Haidt 2001). This is by and large the picture endorsed by many scholars who study emotional responses to risk (Loewenstein et al. 2001; Slovic et al. 2002, 2004; Sunstein 2005).

However, recent developments in emotion scholarship challenge this dichotomy between reason and emotion. The dominant approach in emotion research these days is a so-called cognitive theory of emotions (e.g., Scherer 1984; Frijda 1987; Lazarus 1991). According to this approach, emotions are a form or source of cognition and knowledge. Moral emotions are judgments of value (Solomon 1993; Nussbaum 2001). Emotions are taken to be affective and cognitive at the same time (Roberts 2003; Zagzebski 2003). For example, feeling guilty involves affective states, namely feeling the “pangs of guilt,” but it also involves a belief, namely that one did something wrong. These two aspects are inseparable; they are two sides of the same coin (Roeser 2011).

The neuropsychologist Antonio Damasio (1994) has famously shown that without emotions, we cannot be practically rational. People with damage to the amygdala lose their capacity to have emotions. Even though their IQs are unaffected, they are incapable of making concrete practical and moral judgments. According to Damasio, emotions are “somatic markers” with which we perceive morally and practically salient aspects of the world. This also holds for risk: the so-called Iowa gambling task, which Damasio and his colleagues developed, shows that amygdala patients are willing to take huge risks that people without such damage would find unacceptable.

These ideas can shed completely new light on the role of emotions in debates about risky technologies. Rather than being opposed to rationality and hence inherently misleading, emotions might be an invaluable source of wisdom when it comes to assessing the moral acceptability of risk (Roeser 2006, 2010; Kahan 2008). Emotions such as sympathy, empathy, and compassion can point out unfair distributions of risks and benefits. Indignation and resentment can point to moral transgressions such as involuntary risk impositions. Experts might feel responsible for and worried about the technologies they develop. Fear might point to concern about unforeseen negative consequences of a technology. Disgust might point to the ambiguous moral status of, for example, clones and human–animal hybrids.

That risk is more than a quantitative, scientific notion has long been acknowledged by social scientists, psychologists, and philosophers alike (e.g., Krimsky and Golding 1992; Fischhoff et al. 1981; Shrader-Frechette 1991; Hansson 2004; Asveld and Roeser 2009). Risk is more than the probability of an unwanted effect, to be assessed through cost–benefit analysis, as conventional, technocratic approaches take it to be.
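To make explicit what the conventional approach formalizes, here is a minimal sketch in standard notation (my illustration, not a formula taken from the cited literature): risk is quantified as the probability of an unwanted effect times its magnitude, and a technology is deemed acceptable when expected benefits outweigh expected costs,

\[ R = p \cdot H, \qquad \text{accept the technology iff } \sum_i p_i B_i - \sum_j q_j C_j > 0, \]

where the \(p_i\) and \(q_j\) are probabilities and the \(B_i\) and \(C_j\) magnitudes of benefits and harms. What such a formula leaves unexpressed is telling: how benefits and harms are distributed across people, whether the risk is imposed voluntarily, and how incommensurable harms are to be weighed against each other.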

Many technologies are developed to improve human well-being. We largely owe our contemporary standard of living to technologies: a high degree of sanitation and many possibilities for travel, transportation, and communication. Unfortunately, however, most technologies also entail a chance of negative side effects or risks, such as pollution, accidents, and more superficial relationships between people. These positive and negative aspects of technologies need to be assessed partially in quantitative terms, for example by measuring the speed of an airplane, its harmful emissions per kilometer, or the probability of a crash. But they also involve moral considerations: How do we value the efficiency of a technology as opposed to its possible disadvantages? What kinds of disadvantages are important to measure at all, and how should they be balanced against each other? Which is worse, a technology with an average risk of one dead person per year or a technology with an average of five severely handicapped people per year? What is the value of a human life? How are the risks and benefits distributed across society? Even though these questions require descriptive information, that information does not by itself constitute answers to the moral questions concerning risky technologies. These questions require moral reflection.
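To make the weighing problem behind one of these questions concrete, consider a deliberately simplified calculation (the numbers and the weight \(w\) are illustrative assumptions, not data): let technology A cause an expected one death per year and technology B an expected five severe handicaps per year. If a death is weighted as \(w\) times as bad as a severe handicap, then A is worse than B exactly when

\[ 1 \cdot w > 5 \cdot 1, \quad \text{that is, } w > 5. \]

The arithmetic is trivial; everything hinges on \(w\), and no measurement can supply its value. Fixing \(w\) is itself a moral judgment, which is precisely the point of the questions above.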

Risk concerns the well-being of human beings and involves ethical considerations such as fairness, equity, and autonomy. There is a strong consensus among risk scholars that such considerations, including justice, should be included in a risk assessment (Shrader-Frechette 1991; contributions to Asveld and Roeser 2009). Interestingly, these considerations do figure in the risk perceptions of laypeople (Slovic 2000). Apparently, the pre-theoretical connotations that people attach to risk include ethical considerations that are excluded from the quantitatively oriented approach to risk used by experts. Social scientists, however, struggle with the fact that the risk perceptions of laypeople are largely based on emotions, as this seems to undermine the idea that laypeople might employ an alternative, legitimate rationality. Yet emotions are not a threat to rationality. That would only follow on the model of dual process theory, and as we have seen, emotion scholars provide us with an alternative framework. Hence, the fact that laypeople are emotional about risks does not show that they are irrational; rather, their emotions might be the very ground on which they are capable of including ethical considerations in their risk assessments.

Moral reflection benefits from emotions such as sympathy, compassion, and feelings of responsibility. For example, we only start to grasp the moral impact of the disaster in Fukushima when we see pictures and hear stories of people who have to be evacuated, small children who are tested for radiation contamination, and safety workers who are taken to the hospital with burn injuries, and when we register the uncertainty about how it will all end. The moral meaning of a disaster like this only starts to become clear if we are emotionally engaged with the people who have to undergo the consequences.

The probability of a nuclear disaster might be small, but if one occurs, the consequences are enormous. Beyond the supposedly low probability of a nuclear disaster, then, it is important to focus on its possible consequences. Is a meltdown of a modern reactor less disastrous than the meltdown of the reactor in Chernobyl? How reliable are the safety barriers? On the other hand, we should not forget the hazards involved in mining and burning coal, nor the general environmental and health effects of CO2 emissions. Are there acceptable alternatives to nuclear energy if people are unwilling to reduce their energy consumption?
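A stylized illustration with invented numbers, not actual reactor statistics, shows why the combination of low probability and enormous consequence resists purely expected-value reasoning: a probability of \(10^{-6}\) per year of a disaster costing \(10^{6}\) lives has the same expected value as a certain loss of one life per year, since

\[ 10^{-6} \times 10^{6} = 1 = 1 \times 1, \]

yet many people judge these two prospects very differently. The moral assessment of catastrophic risk is apparently not exhausted by the product of probability and consequence.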

These considerations illustrate the complexity and intricacy of the quantitative and moral considerations involved in contemplating the acceptability of nuclear energy. Debates about nuclear energy have to be based on sound information, but they also involve normative considerations. Experts have part of the required information, namely the quantitative data, but they have no privileged access to the moral considerations that are necessary for assessing the acceptability of a risky technology such as nuclear energy. The emotions of laypeople can offer an important perspective here.

Fukushima has sparked a new debate about nuclear energy. It should provide an opportunity for a more sophisticated debate than has so far been the case. We should avoid the stereotypical rhetoric according to which experts are rational and objective while the public is emotional and irrational. This picture is empirically false, and it prevents a fruitful debate from taking place. A fruitful debate about nuclear energy should do justice to quantitative, empirical information as well as to emotional, moral considerations. Such an approach allows for a different way of dealing with risk emotions in public debates by avoiding both the technocratic pitfall and the populist pitfall. It allows for what I would like to call an “emotional deliberation approach to risk.” It gives the public a genuine voice: their emotions and concerns are appreciated, listened to, and discussed, rather than ignored (technocratic pitfall) or taken as a given that makes discussion impossible (populist pitfall). By discussing the concerns underlying emotions, justified concerns can be distinguished from concerns that are morally or empirically unjustified. On this approach, debates about risky technologies include the emotions and moral concerns that have to be taken into account in order to reach a well-grounded ethical assessment. At the same time, this approach can help overcome the gap between experts and laypeople that occurs repeatedly in debates about risky technologies. It can create circumstances in which all involved parties respect each other, are willing to listen to each other, and are prepared to give and take in a debate. This is necessary in order to broach such a complex technological–ethical issue as whether and under what circumstances nuclear energy might be morally acceptable.