In a famous psychological experiment, viewers watched a one-minute film in which three people in white clothes and three people in black clothes pass basketballs to each other (Simons and Chabris 1999). Each team has one basketball, and they move around in a tight area while they pass it among themselves. The task for the viewer is to count the number of passes the team in white manages to complete. They are supposed to ignore the movements and passes of the team in black. The most focused viewers manage to come up with the right number, which is 15 passes. Then they are asked whether they noticed anything else during the film. Some say that they glimpsed a black figure moving through the playing area. When watching the film for a second time, everyone can see a person dressed up as a gorilla walking slowly into the middle of the area. The gorilla stops and bangs its chest, before it slowly moves on and disappears out of view. When focusing on the counting, people tend to be completely blind to the gorilla. All their attention is on the basketball being passed from one player in white to the next and on keeping track of the number of passes. When they have the chance to watch the film a second time, without the counting task, they see the gorilla clearly and distinctly. Most are deeply amazed that they were unable to see it the first time. The gorilla is big, takes up considerable space, and moves quite slowly. It is not a tiny mouse running across the area. How is it possible to be blind to the gorilla?

When I first came across the gorilla film, I was struck by how it could be used to invite reflections on the ways people can be blind to ethical aspects of their behaviour. Those aspects can be obvious and visible to others who have a more distanced view but hidden from those who are closest to them. “Spot the gorilla” became my opening invitation when I facilitated ethics work in organisations (Kvalnes 2006). I facilitated dilemma training and ethical reflections in both the private and public sectors. The initiative to invite my colleagues and me would often come in the aftermath of ethical scandals in an organisation or industry. A contract to drill for oil in Iran had been secured through corruption. A financial institution sold high-risk products to unassuming clients. Top management in a company hired close friends or romantic relations to executive positions. Decision-makers systematically pursued self-interest at the expense of their employers’ or clients’ interests. One pattern in these cases was that the decision-makers often failed to notice the unethical nature of their behaviour. For others looking at the processes from a distance, it was obvious that the initiatives were unethical. Those who were directly and personally involved were so focused on other tasks that they were blind to the unethical nature of their activities.

I also wonder about ethical blindness when I encounter unethical suggestions and initiatives in my professional work. When I am invited to talk about ethics in organisations or at industry conferences, I tend to say yes if it does not conflict with other engagements. On two occasions, I have faced unethical suggestions after turning down such invitations because I had already agreed to talk about ethics in another setting at the specified time. These were the two responses I received when saying no on those occasions:

  • What if we pay you extra to come to our organisation instead? Would you reconsider?

  • Do you mind if we put your name in the program, and then on the day of the conference we can say that unfortunately you were unable to give the talk due to illness?

The first organiser offered to pay me extra for breaking a promise to come to one organisation to talk about ethics so that I would visit their organisation to talk about ethics instead. The second invited me to be part of a lie to the participants at their conference. On both occasions, the suggestions were impulsive. No deliberate thought processes preceded them. The initiators were applying what Kahneman called System 1 decision-making, the quick and intuitive way, rather than the analytic and slow System 2 method (Kahneman 2013). In the conversations that I had with them immediately afterwards, it became clear that they were initially blind to the unethical aspect of their suggestions but were able to spot it when slowing down and giving proper attention to it. Then they could also appreciate the paradoxical dimension of their impulsive suggestions: using unethical means to secure a contribution about ethics in their organisation.

We can interpret unethical behaviour as an indication of character flaws in the decision-maker. Here we have an immoral person, someone acting from dubious moral standards. The invisible gorilla phenomenon provides an alternative perspective. People may be so eager to solve a problem that they quickly identify and articulate a solution, without considering the ethical aspects. Once they slow down and watch the film a second time, so to speak, they can see the unethical nature of their initial suggestion and retract it. This is what happened on both occasions described above. When the initiators had some time to realise what they had suggested, they saw the situation differently and realised that they had come up with an unethical proposal.

The gorilla experiment serves as a reminder of why it can be fruitful to take a philosophical view of not only specific incidents and situations but also of one’s practices and habits more broadly. Socrates invited his fellow citizens of Athens to take a step back and critically reflect on their priorities and ways of living. He warned against becoming trapped in negative and destructive patterns. By adopting some distance to one’s everyday activities, it would be possible to reconsider and rearrange them to be in better harmony with what one considers meaningful and worthwhile. In a hectic and nonreflective life, a person can end up following patterns that do not conform to what one truly values and holds dearest.

Researchers refer to the phenomenon illustrated by the invisible gorilla experiment as inattentional blindness (Simons and Chabris 1999; Mack 2003; Kreitz et al. 2016). The subcategory I introduce here can be called ethical inattentional blindness. It most obviously includes situations where people are blind to the unethical aspects of their and others’ behaviour. It is also possible to be blind to ethically positive aspects of one’s or other people’s behaviour. A person might perform an ethically good deed but not think of it in ethical terms because it comes naturally to them or has become a habit.

Organisations can prepare for inattentional blindness among employees and stakeholders by building a communication climate where it is normal to spot and speak out about the gorillas. We can learn from the experiment that everyone has a limited capacity to notice important aspects of what is going on in the workplace. High-tempo and complex tasks increase the probability that we will be blind to significant elements in the projects on which we are working. Each individual notices and dwells on different aspects of a situation. A communication climate can facilitate conversations about the nuances and details that different colleagues see. It is likely that we turn up for work wearing what we may call routine glasses. Work becomes routine, and we tend to inspect the work environment and the activities happening there through the same familiar lens. The point is relevant far beyond ethics: Wearing our routine glasses, we may fail to see new developments, opportunities, obstacles, and so on. Colleagues in the team may have changed and become more or less reliable than they used to be. What used to be a stable development in the right direction could imperceptibly change from day to day, and now we are heading in the wrong direction. Research on inattentional blindness tells us that we can be slow to notice changes in the organisation because we are studying the world through the same lenses as always, or through what I have suggested we call routine glasses.

Newcomers in an organisation can be a phenomenal resource when it comes to overcoming or countering inattentional blindness. They enter the workplace with a fresh pair of eyes and can see important things that the more experienced people fail to notice. Here is an example: A newly recruited engineer at the local offices of the Norwegian Road Authority in Bodø, a town in the north of the country, participated in a gorilla experiment I conducted at the beginning of a workshop on communication climate. The participants and I dwelled on the ways people who are new to an organisation can spot the gorillas that are invisible to more experienced colleagues. The new engineer explained to the group that he had set eyes on something that his more experienced colleagues appeared to be unaware of or to be ignoring. At the end of each workday, he would manoeuvre his car out of the parking area outside the offices and head towards the entrance to the main road. When he reached the point where he was going to enter the main road, he would look left and right to check for traffic. However, he could not see whether any cars were coming from the right. Why? A big sign blocked his view. The sign said “Welcome to the Norwegian Road Authority.” This newcomer brought attention to the fact that his employer, responsible for road safety in Norway, had put up a sign that created a dangerous situation whenever a driver attempted to enter the main road outside its local offices in Bodø.

When the newcomer had finished speaking, his colleagues looked at each other in disbelief. Was this really the case? One of the more experienced colleagues confirmed that the newcomer was right. He remembered that the sign had caused uproar when it had been put in place some years ago. Everyone agreed that it was a threat to traffic safety and that it had to be removed. Then the days, weeks, months, and years went by, and people became used to the sign. One engineer said that everyone who worked there and used the parking area knew that a driver had to lower his or her head to the dashboard of the car and look beneath the sign to check for traffic from the right. Now, everyone agreed that this was unacceptable and that the sign should be removed as soon as possible.

Kahneman (2013) noted the double nature of inattentional blindness. Not only are we blind to the gorilla, but our surprise in spotting it upon reviewing events means that we are also blind to our inability to see important aspects of what is in front of us. We can be blind to the obvious, and we are also blind to our blindness. The example from the Norwegian Road Authority illustrates this point. When the newcomer brought attention to the dangerous sign, his colleagues became aware that they had gradually become blind to it, and their surprise in rediscovering the sign caused a realisation that they had been blind to their own tendency to overlook phenomena that are right in front of them.

Research on inattentional blindness indicates that we rarely see what we are looking at, unless our attention is directed at it (Mack 2003). A closely related obstacle to rational decision-making is the tendency we have to seek information that confirms our beliefs and to ignore or distort information that contradicts them, a phenomenon called confirmation bias (Nickerson 1998). Once we have made up our minds about something, we tend to interpret new and incoming information in ways that confirm our hypotheses or beliefs and disregard open and available evidence suggesting that we should revise them. We selectively seek information that is consistent with our prior beliefs. Confirmation bias is a pervasive phenomenon that affects the way people monitor information and the way they protect and strengthen their beliefs about a wide range of issues. An example in present times is the way so-called climate sceptics dismiss growing scientific evidence that shows human-caused climate change and rising temperatures are affecting living systems, leading to dramatic ecological change and destruction (Zhou and Shen 2022).

Organisations can counter the negative consequences of confirmation bias by developing a communication climate where it is normal to challenge and question each other’s beliefs and habits. Confirmation bias affects decision-making in a range of professional contexts, and there is a need for friction to neutralise it. Researchers can have exaggerated confidence in their initial hypotheses and become blind to disconfirming evidence. Schoolteachers can have fixed beliefs about the levels of performance they can expect from their pupils and miss important developments in their learning processes. People who work with recruitment in organisations can form first impressions of candidates and fail to take in information that gives them reason to reconsider those initial beliefs.

In terms of societal relevance, confirmation bias in police work deserves particular attention. Police investigators can be under pressure to find out who committed a particular crime. Once they have identified a suspect and become convinced they have found the culprit, they can become blind to information pointing towards other possible scenarios in which the suspect is innocent. Confirmation bias can also affect prosecutors, judges, and jury members and lead to wrongful convictions (Rassin et al. 2010).

How can the police counter the tendency to become blind to information disconfirming their hypotheses? One experienced police investigator has told me that officers in his unit have become increasingly aware of confirmation bias and other cognitive traps that can threaten the quality of their investigative work. One countermeasure has been to operate with several hypotheses simultaneously at the beginning of an investigation rather than just one. By waiting before they narrow down the number of hypotheses, they can gather and consider a broader set of evidence. Another initiative meant to neutralise confirmation bias is to split the work between colleagues so that the critical evaluation of one hypothesis in light of available information is not conducted by the investigator who initiated it. Research indicates that it makes sense to distribute responsibility in this manner. Presumptions of guilt can affect the questioning style and treatment of information (Hill et al. 2008). One scenario study found that police officers who had decided to apprehend a suspect chose more guilt-presumptive interrogation questions than those who had not partaken in the decision to apprehend. They also rated the suspects as less trustworthy (Lidén et al. 2018).

In this chapter, we have seen that inattentional blindness and confirmation bias can trigger detrimental decision-making and behaviour in the workplace. The invisible gorilla experiment vividly demonstrates why organisations depend on a well-functioning communication climate as a platform for making rational and responsible decisions. Inattentional blindness can cause us to make decisions that are irrational and unethical in light of what we want to achieve and the way we want to proceed. We rarely see what is in front of our eyes unless our attention is directed at it. Due to ethical inattentional blindness, we can become involved in ethical misbehaviour in our quest to reach some economic or strategic goal. We can learn from research in this area that if we do not attend actively and consciously to ethical aspects of what we are currently doing, we are likely to become blind to them. Responsible leadership entails asking questions such as these: Is what we are doing fair and reasonable? Are we sufficiently responsive to stakeholder concerns? Are we behaving in an honest and transparent manner? Such questions can bring ethical aspects to the forefront of our attention.