Introduction

The title of this paper (‘emotional engineers’) might strike the reader as an oxymoron: Engineers are normally seen as the archetype of rational and quantitatively oriented people. However, in this paper I will argue that engineers should use their emotions in order to develop morally responsible technologies. This requires a new understanding of the competencies of engineers: they should not be unemotional calculators; quite the opposite, they should work to cultivate their moral emotions and sensitivity, in order to be optimally engaged in morally responsible engineering.

Values in Technology

A common view amongst engineers is that technology is value-neutral and that engineering is a predominantly mathematical, quantitative discipline. However, various scholars from different backgrounds have argued that technological design is not value-neutral. The way a technology is designed determines its possibilities, which can, for better or for worse, have consequences for human well-being.

Scholars in the field of science and technology studies (STS; Winner 1980) and in continental philosophy of technology (Verbeek 2005; Ihde 1990) have shown how the way technological products are designed shapes our behavior.

In the fields of behavioral economics and law, Thaler and Sunstein (2008) have argued that our choices are largely shaped by the design of the products and infrastructures we use. This is similar to the phenomenon of framing in the communication of statistical information (cf. Tversky and Kahneman 1974). For example, depending on where healthy and unhealthy snacks are placed in a cafeteria, people will be more likely to buy one rather than the other. Thaler and Sunstein argue that there is no neutral design; however a product is designed, that design will to a significant degree influence our behavior. Hence, it is better to intentionally design products and infrastructures in such a way that they lead us to responsible behavior, rather than to make arbitrary or unconscious design choices that might lead to suboptimal or even irresponsible behavior.

Recently, scholars in analytical philosophy of technology have also developed an account of how to include moral values in the development of technologies in order to arrive at morally better designs. This account is called ‘value sensitive design’ (Van den Hoven 2007). On this account, design teams should incorporate moral values and stakeholder values into the technologies they develop in an iterative process (Friedman 2004; Zwart et al. 2006).

It is curious how scholars from such diverse backgrounds as decision theory, STS and continental and analytical philosophy of technology have independently come to very similar conclusions, often without even mentioning the parallel developments in other disciplines and discourses. Hopefully, this will change in the future through more interdisciplinary research, so that scholars from different disciplines can draw on each other’s insights and developments. In any case, the fact that these different disciplines arrive at similar insights from very different perspectives makes their shared point even more urgent: technological design embodies values that shape our behavior. Rather than leaving it to happenstance which values get built in, we should intentionally include values that improve our behavior.

If moral decision making were left to managers or policy makers, it would take place after a product has already been developed. Noëmi Manders-Huits and Michael Zimmer (2009) have argued that it would be useful to have a so-called ‘values-advocate’ in a design team to make sure that values get due attention; this could be a moral philosopher or a social scientist. However, I believe that it might not be feasible to have somebody with such a background on each and every design team. Engineers themselves should also be trained to be aware of moral values and to explicitly take them into consideration in the design process (cf. Van der Burg and Van Gorp 2005). Rather than delegating moral reflection to ‘moral experts’, engineers should cultivate their own moral expertise. They have a key moral responsibility in the design process of risky technologies, as they have the technical expertise and are at the cradle of new developments. Engineers can reduce the risks of a technological product by developing a different design.

Risk and value sensitive design can be seen as two sides of the same coin. With value sensitive design we try to diminish the potentially negative effects of risky technologies. Engineers can influence the possible risks and benefits more directly than anybody else. However, technological risks and benefits are not merely a technical matter but also involve ethical aspects. This requires a capacity to be aware of moral saliences. I have argued elsewhere that emotions are an indispensable source of insight into ethical aspects of risk. In the remainder of this paper, I will argue that this means that engineers should also include emotional reflection into their work.

Risk, Values and Emotions

Engineers, policy makers and other risk experts generally define risk as the product of probabilities and unwanted consequences. Examples of unwanted consequences are the number of deaths or injuries, or the degree of pollution. In risk analysis and risk management, these experts use cost-benefit analysis to weigh the possible advantages of a technology against its possible disadvantages.
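In schematic form (my own rendering for clarity; the notation is not taken from the literature cited here), this standard definition amounts to

\[ \text{Risk} = P \times C, \]

where \(P\) is the probability of an unwanted consequence and \(C\) its magnitude, for instance the expected number of fatalities. A cost-benefit analysis then judges a technology acceptable, roughly, when its expected benefits exceed this expected cost.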

Many social scientists and philosophers who work in the field of risk argue that cost-benefit analysis and the definition of risk as a product of probabilities and unwanted consequences are not sufficient to determine whether a risk is acceptable. They also emphasize the importance of further considerations, such as whether a risk is taken voluntarily, how risks and benefits are distributed within a population, and which alternatives to a technology are available. Furthermore, a high probability of a small effect might be more acceptable than a small probability of a large effect, even though the product of probability and effect is approximately equal in both cases. Defenders of such an approach argue that all risk judgments involve evaluative aspects. Even the standard definition of risk involves an evaluative judgment as to what counts as an unwanted consequence (Fischhoff, Lichtenstein et al. 1981; Jasanoff 1993; Shrader-Frechette 1991; Krimsky and Golding 1992; Slovic 2000; Jaeger, Renn et al. 2001). Hence, risk is not only a quantitative notion; it also involves ethical considerations which conventional methods for risk assessment insufficiently take into account. These ethical considerations do play a role in the risk perceptions of laypeople (Slovic 2000). Laypeople thus have a richer understanding of risk than experts, one that is needed for a complete moral evaluation of risks (Slovic 2000; Roeser 2007).
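A simple hypothetical illustration (the numbers are invented for the sake of the argument and do not come from the cited studies) shows why equal products can nonetheless differ morally:

\[ \underbrace{0.1 \times 2}_{\text{frequent, small accident}} \;=\; \underbrace{0.0001 \times 2000}_{\text{rare catastrophe}} \;=\; 0.2 \ \text{expected fatalities per year}. \]

On the standard definition both technologies carry the same risk, yet the rare catastrophe may reasonably be judged less acceptable, for example because it could destroy an entire community at once or because its burdens are harder to distribute fairly.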

Empirical research by Paul Slovic and others shows that emotions are a major determinant in the risk perceptions of laypeople (Alhakami and Slovic 1994; Slovic 1999; Finucane et al. 2000; Slovic 2002). Slovic holds that emotion and reason can interact and that we should take the emotions of the public seriously since they convey meaning; still, he sees analytic methods as the final arbiter in estimating risks (Slovic et al. 2004). Other scholars go even further and argue that emotions should be excluded from decision making about risk (Sunstein 2005), that they should at most be accepted as an unfortunate fact of life (Loewenstein, Weber et al. 2001, p. 281), or that they should be used instrumentally, in order to create acceptance for a technology (De Hollander and Hanemaaijer 2003). This interpretation of risk-emotions threatens to undermine the earlier rehabilitation of the risk perceptions of laypeople.

The theoretical framework that most scholars who work on risk and emotion endorse is Dual Process Theory (e.g. Slovic 2002). According to Dual Process Theory, there are two distinct systems with which we apprehend reality. System 1 is unconscious, fast, intuitive and emotional while system 2 is conscious, slow, analytical and rational (Epstein 1994; Sloman 1996; Sloman 2002; Stanovich and West 2002). Rational beliefs are supposed to be an afterthought to our immediate emotional responses (cf. Zajonc 1984; Haidt 2001).

The danger of this approach is that emotions can be discarded as irrational, subjective states. However, there are emotions that by their very nature transcend the two systems postulated by Dual Process Theory (Roeser 2009). These are emotions that involve a high degree of reflectivity and narrativity, such as emotional responses to fictional characters or to people and events that are far away (for more nuanced views on the relation between reason and emotions in philosophy, cf. e.g. de Sousa 1987; Greenspan 1988; Solomon 1993; Stocker and Hegeman 1996; Goldie 2000; Nussbaum 2001; Roberts 2003).

Many emotions are spontaneous responses to what is nearby, but sympathetic emotions, for example, can lead us to extend our ‘circle of concern’, as Nussbaum (2001) phrases it. If we think about the suffering that other people might undergo as victims of a disaster, we usually feel touched and shocked. This realization involves moral emotions: emotions that are reflective, justifiable and based on reasons. Hence, such moral emotions fit neatly into neither system 1 nor system 2. We need moral emotions in order to be aware of the moral aspects of risky technologies (Roeser 2006b). For example, by caring about certain things we are able to perceive evaluative aspects of the world of which we would otherwise not be aware (Little 1995; Blum 1994). Purely rational reflection cannot provide us with the imaginative power that we need to envisage future scenarios, to take part in other people’s perspectives and to evaluate their destinies.

Hence, the fact that the risk perceptions of laypeople involve emotions does not make them suspect; to the contrary. We need moral emotions in order to have well-grounded insights into whether a technological risk is morally acceptable or not. For example, enthusiasm for a technology can point to benefits to our well-being, whereas fear and worry can indicate that a technology is a threat to our well-being; sympathy and empathy can give us insight into the fair distribution of risks and benefits, and indignation can indicate violations of autonomy by technological risks that are imposed on us against our will (Roeser 2006b). Of course these emotions are not infallible: they can bias us towards what is close by. But all our cognitive capacities are fallible, and we cannot do without them. We need emotions for well-grounded moral evaluations of risk. Emotions can themselves be a source of critical reflection about our risk-emotions (Roeser 2010). Such an approach can provide a richer account of the importance of emotions in ethical reflection about risk than Dual Process Theory (Roeser 2009). Rather than being biases that threaten objectivity and rationality in thinking about acceptable risks, emotions contribute to a correct understanding of the moral acceptability of a hazard.

Risk-Emotions of Engineers

Engineers are often considered the archetype of people who perform their work in a rational and quantitative way. They exemplify the idea, also endorsed by many Dual Process Theorists, that computational intelligence is superior to other human capacities for processing information, such as intuition and emotion.

However, there are scholars who challenge this computational ideal of intelligence (Dreyfus 1992). Some authors emphasize the importance of narrative intelligence (Mateas and Sengers 2003), others that of emotional intelligence (Goleman 1995). These are forms of intelligence that go beyond deductive reasoning and analytical, logical thinking, and they play an essential role in our practical rationality. People who lack these capacities have difficulties making practical and moral judgments (Damasio 1994).

Several authors emphasize that emotions are needed for moral conduct by business managers (Simon 1987; Mumby and Putnam 1992; Gaudine and Thorne 2001; Klein 2002; Lurie 2004). We can extend this idea to other professionals and, more specifically for the purpose of this paper, to engineers. We need engineers with a sufficiently developed emotional sensitivity, as this will give them access to morally important aspects of the technologies they design.

It might be objected that we should leave the moral decision making about risky technologies to policy makers. However, as I have argued earlier, that would be a missed opportunity; it might mean that we try to constrain a technology when it is already too late. A more fruitful approach is to let engineers explicitly and intentionally include moral reflection in the design process of risky technologies. Given my claim that emotions are a necessary source of moral reflection about such technologies, this means that the emotions of engineers should play a role in their design.

All this means that when educating and recruiting engineers, the emphasis should not lie solely on ‘analytical’ or ‘hard’ skills, as has traditionally been the case, but also on ‘emotional’ or ‘soft’ skills. Currently, many technical universities include compulsory ethics courses in their curricula (cf. Zandvoort et al. 2000; also cf. the ABET criteria that require ethics courses in engineering curricula in the United States). This is an important step in the right direction. However, the emphasis in such courses is still mainly on argumentative and reasoning skills. In addition, engineering education should also include the development of sympathetic and emotional skills. This could be done through role-playing games, in which the imaginative and emotional capacities of engineering students can be trained in a safe setting. An additional trajectory would be to include literature courses and other parts of a liberal arts education in the curriculum of engineering programs (cf. Nussbaum 1997, who argues for this in a broad way, not specifically concerning engineering education).

So far I have sketched why we need to emphasize the emotional capacities of engineers, and how this could be achieved. In the next section I will discuss how we can implement emotional reflection in the engineering design process.

Including Emotions in the Design Process

As I argued in the previous sections, we need engineers who take their emotional responses seriously, as emotions are helpful in assessing the moral values involved in technologies. This will enable engineers to play an important role in reflecting on morally responsible technological design. The importance of the emotions of engineers has so far not been mentioned by the scholars who emphasize values in design and whom I discussed in the section on values in technology above. Like scholars who work on risk, many scholars who work on values in design see reason as the privileged faculty of critical moral deliberation and regard emotions as a threat to rational decision making. At most they acknowledge that engineers should take into account the wants and desires of customers. However, wants and desires are not necessarily emotions, and they are not necessarily grounded in moral considerations. My alternative account of emotions in risk perception also applies to the design of risky technologies. Emotions should play a key role in risk perception and in value sensitive design. Emotions sensitize us to the complex ethical considerations that are involved in becoming aware of the risks of technologies as much as in deliberating about how to diminish these risks in designing technologies. Emotions and scientific methods should be in good balance when engineers think about risks. Where science can inform them about magnitudes, emotions inform them about moral saliences. Both kinds of information are indispensable if engineers want to make well-grounded judgments about acceptable risks.

Experts often accuse the public of being overly frightened of new technologies because they lack the relevant knowledge and are therefore basing their reactions on supposedly irrational feelings. Interestingly, nanotechnology gives rise to greater worries amongst experts than amongst the public (Scheufele, Corley et al. 2007). Of course, this is partially due to the fact that most laypeople have never heard of nanotechnology. However, given the newness of nanotechnology, we can assume that the experts are more knowledgeable than the public about it and its concomitant risks. Apparently, their fears can be attributed to a rational understanding of the risks involved in nanotechnology. Indeed, fear can point to a source of danger to our well-being (Green 1992; Roberts 2003; Roeser 2009).

Engineers should use these worries in the design of their research and technologies, e.g. by building barriers to prevent certain hazards from occurring, or by applying a precautionary approach, meaning that technologies whose consequences are hard to predict should first be investigated in a safe setting. If experts are worried about the safety of the products they develop, this should be taken seriously as a warning sign that calls for a precautionary approach. Experts should communicate their emotional-ethical concerns about technological risks and benefits to the public in addition to supplying quantitative information.

Fear about unpredictable consequences concerns situations in which even the experts do not know exactly what the implications of a technology might be. However, even if the consequences of a technology are fairly well known, there can be remaining emotional-ethical concerns that should be taken seriously. For example, emotions such as sympathy help to reveal ethical considerations such as justice and autonomy in decisions about acceptable risk (Roeser 2006b). By focusing merely on, for example, annual fatalities, as conventional approaches to risk analysis do, we might overlook other morally relevant considerations that can be revealed through emotions. Emotions about risks can be based on reasonable concerns, for example regarding justice, fairness and autonomy. These concerns should be taken seriously by engineers when they reflect on the risky aspects of the technologies they design.

In the design process there should be a discussion phase in which the emotional and ethical concerns of the engineers and of stakeholders are made explicit, thereby facilitating ethical reflection about possible risks and about how to avoid or diminish these risks. Several methods have been developed to enable reflection about technology, for example scenarios that describe situations in which the use of a technology gives rise to moral considerations (cf. e.g. Boenink et al. 2010). These methods involve narratives that directly engage people’s imaginative and empathetic capacities. To the extent that this is not already the case, these methods could be further developed to explicitly encourage emotional engagement and emotional reflection.

An objection might be that the emotions of different people are too divergent to play such an important role. To this I reply that the emotional responses of people can of course differ, but disagreement is nearly always a part of collective decision making, whether or not emotions are included. We should accept the possibly diverging emotions of people and discuss the concerns that lie behind them. Considering diverging emotions and views enables more balanced judgments. Our emotions are not infallible; just like other sources of knowledge, emotions can be mistaken. We should critically assess our emotions, and in doing so we should take into account other emotions, both our own and those of other people. Emotions can be a source of ethical reflection (Lacewing 2005). For example, an emotion such as sympathy can correct egoistic emotions (Roeser 2010).

Emotion and Responsibility in Design

Let me end my discussion by elaborating on the role emotions can play in thinking about the moral responsibility of engineers. Several authors emphasize the importance of emotions such as shame, guilt, resentment and blame for the understanding or ascription of responsibility (cf. Wallace 1994; Schoeman 1987; Eisenberg 2000). These emotions work retrospectively and negatively, by condemning failed responsibility (McGraw 1987); they can be connected with backward-looking responsibility. On the other hand, sympathy, empathy and compassion can make us aware of our responsibility in a forward-looking sense (for the distinction between these two kinds of responsibility, cf. Nihlén Fahlquist 2008). They make us aware of actions we can perform in order to help improve the situations of others. This is confirmed by empirical research by Paul Slovic, who shows that we get ‘numbed by numbers’ and statistics concerning disasters. Emotions let us see what matters; they help to motivate us. In concrete situations where emotions are aroused, people are capable of being directly involved, and indifference becomes less likely (Slovic 2010).

Backward-looking responsibility and its concomitant emotions are important, as they let people critically reflect on what they have done in the past and how they could have done things better. Ultimately, this should lead to enhanced emotional sensitivity concerning forward-looking responsibility. Forward-looking responsibility and the emotions that are involved with it are especially important in the context of the moral responsibility of engineers in the design of technology, as design is concerned with things that are yet to come.

There is a temptation to try to codify the responsibility of professionals in clear rules that provide infallible guidelines. However, as various moral philosophers have argued, practical reality is so complex, and every situation so unique, that moral insights cannot be codified and subsumed under simple rules. Rather than applying clear-cut rules, we need context-sensitive insights (Prichard 1912; Ewing 1929; Broad 1951 [1930]; Ross 1967 [1930]; Dancy 2004). Context-sensitive insights require moral emotions (Damasio 1994; Roeser 2006a). It can be argued that in the case of the design of risky technologies, context-sensitivity is even more important, as risky technologies can lead to new and unpredictable situations that escape codifiable rules.

This connects well with recent developments in thinking about responsibility as a virtue (cf. Williams 2008). Virtue ethicists emphasize that virtuous moral agents need their capacity for moral insight (practical wisdom or ‘phronesis’) to make context-sensitive moral judgments in complex, real-life situations. A virtuous person is somebody whose character is developed in such a way that she steers a wise middle ground between extreme responses. According to some virtue ethicists, this requires that the virtuous person has well-developed emotions (Roberts 2003; Döring and Feger 2010; Roberts 2010). A person who is responsible in this virtue sense is aware of the different normative claims that rest on her and makes the right decision. She is responsive to her responsibilities and ready to act accordingly.

Jessica Nihlén Fahlquist (2010) has argued that this can mean that a professional sees that she has to transcend the formal responsibility she has been assigned by her job description or her official role in the organization she works for. According to Nihlén Fahlquist, such an approach can avoid the so-called ‘problem of many hands’. This problem means that in complex projects that involve the contributions of many different professionals, things can go wrong and serious accidents can happen although nobody acted in a clearly reckless way. Rather, some people made small mistakes that would by themselves have been insignificant. However, due to an unfortunate coincidence, these mistakes result in a major accident, because several barriers have failed, as every individual relied on the expectation that the others would do a good job. A famous example is the accident with the Herald of Free Enterprise, where numerous insignificant mistakes led to the capsizing of a ferry and the death of nearly two hundred people. The same pattern can be seen in other major accidents as well. Nihlén Fahlquist argues that if professionals just follow a minimal conception of responsibility, this might easily lead to gaps in responsibility distributions, because real-life situations are much more complex than can possibly be foreseen. However, if people see their responsibility less formally and instead act from virtue, they will extend their responsibility beyond their formally assigned role. Nihlén Fahlquist connects this with insights from the ethics of care. The ethics of care stresses the importance of caring for the needs of concrete persons, rather than merely obeying abstract rules. People who act from an attitude of responsibility as the virtue of care will check whether things work as they are supposed to, even if this goes beyond their own task. They will take extra action if they realize that nobody feels responsible for a situation that has not been foreseen and has not been formalized in a distribution of tasks. This will likely result in an environment where accidents cannot happen as easily, because more people double-check whether things are going well rather than just doing what they have been told to do. It will also entice people to come up with creative solutions to new situations.

This connects well with what I have said before about the moral emotions of engineers. Moral emotions make engineers sensitive to moral issues arising from the technologies they develop. Emotions let us get involved with situations; they help us transcend a detached, abstract attitude that could lead to indifference to morally problematic aspects of technologies. This is especially true in the design of risky technologies, where there might be consequences of which we do not know whether and when they will manifest themselves, or which are unforeseen or difficult to quantify. A formalistic approach to responsibility can easily lead to negligence or to the idea that ‘others are responsible’.

This is also nicely illustrated by a case study (described in Van der Burg and Van Gorp 2005). The design team of a new trailer was aware that the trailer could be designed in a safer way, but since the client had not asked for that, they did not explore the alternative. However, the client, not being a technical expert, was not even aware that there was a safer alternative. Hence, the engineers should have taken a proactive attitude and brought this option up with the client. Van der Burg and Van Gorp use a virtue-ethical approach to argue that engineers should use their imaginative capacities, for example by empathizing with possible victims of a suboptimally safe trailer, in order to arrive at such a more active appreciation of their moral responsibility in designing risky technologies.

All this shows how engineers can take on stronger responsibilities if they cultivate the imaginative and emotional capacities that various virtue ethicists also emphasize. By not only explicitly allowing but even requiring emotional considerations in the engineering arena, engineers will feel involved, responsible and ready to take action. This will lead to morally better designs and to more humane technologies.

Conclusion

In this paper I have argued that in order to have engineers who are morally sensitive to the ethical aspects of their work, we need engineers who have well-developed emotional capacities. Engineers who are trained in using their empathy and sympathy can imagine themselves in different roles, for example in the role of victims of risky technologies. This enables them to realize that they should go beyond their formally defined role, and to be motivated accordingly. This means that:

  1. we need to include emotional-ethical reflection and deliberation in the design process of risky technologies; and

  2. we have to revise our curricula for engineering education, by including courses that enhance the emotional and imaginative capacities of future engineers.

This will enable engineers to live up to the moral responsibilities that are inherent in their work.