“After takeoff from Kristiansand on our way to Oslo, we experienced a brake pressure leak that caused some shaking in the plane. We tried the standard procedures to neutralize it, with no effect. Then we tested other options, and found that the shaking stopped when we put on the brakes. The co-pilot and I agreed that of course we would release the brakes before landing. Now we had found an unconventional solution to an immediate problem, and would switch back to the normal non-deployment of the brakes when preparing for landing. Usually, when there is something out of the ordinary that we need to remember, we create a reminder, by taping a piece of paper to the window in the cockpit, or something odd like that. This time we did not do that, since we thought it was unnecessary. Checking that the brakes were off would turn up not only once, but twice in the checklist procedures before landing, so to our minds, there was no risk at all that we would forget to release the brakes. The flight continued, and we did the first check. I answered automatically that brakes are off, without actually thinking and taking them off. Then later, for the second time, we did a checklist procedure, and again I answered as I always do, that brakes are off. The result was that we landed with the brakes on, and it was a very rough and unpleasant experience for the passengers and the staff onboard. The tires exploded, and the plane came to a halt across the runway, and not parallel to it, as it should have. Nobody got seriously injured, but it was a shocking experience for everybody, not easy to shake off and forget” (Gimmestad, 2016).

Jarle Gimmestad is an experienced former pilot who now works as a safety consultant in industry, healthcare, and travel. His own story about landing with the brakes on serves as evidence that pilots, like the rest of us, are prone to making mistakes. He also uses it as an invitation to participants in seminars about safety to open up about their own professional failures and mistakes, lowering the threshold for doing so. Once the former pilot has admitted a mistake, it is easier for others to do the same. The conversation can begin about human errors and the ways in which to deal with them.

The introduction to this book included another Gimmestad narrative, about the driver of the pushback tractor who made the pilot aware of dripping from the wing, and who persisted with his feedback, even after the pilot had signaled a stop to the conversation. It illustrates the strong emphasis on teamwork in aviation. Even the lowest ranked employee has a responsibility to intervene in a situation where he or she senses that something is wrong. It is also the responsibility of the highest ranked employee to take such interventions seriously.

The main sources of data for the current chapter are extensive interviews with Gimmestad about safety in aviation. We first met in 2009, when I was writing a book in Norwegian about fallibility at work (Kvalnes, 2010), and have remained in contact since then. The relationship has gone beyond that of researcher and informant, in that we have taught seminars and given conference presentations together, combining theoretical and practical, experience-based input about fallibility at work. The interview method has been one where we talk extensively about narratives and cases, I write them down, get feedback from him about the content, and rewrite the text accordingly. The primary theoretical input in this chapter is a barrier model to structure thinking and activity connected to safety (Reason, 1990). It has applicability beyond aviation and safety. Organizations can use it to (a) create awareness, (b) conduct analysis, and (c) prepare for action in settings where errors can lead to unwelcome outcomes.

1 Inattentional Blindness

Safety in aviation has improved in recent decades because of a shared realization that pilots are fallible beings. There has been a shift in attitude, from seeing pilots as extraordinary, infallible individuals who could be trusted to bring the plane safely to its destination, to understanding air travel as depending on teamwork, where all the individuals involved depend on feedback and support from others. The realization that each individual is fallible and depends upon others to intervene when he or she appears to make a mistake has caused a breakthrough in safety practices (Helmreich & Davies, 2004; Stoop & Kahan, 2005). This development has been noted in healthcare, where the aviation approach has inspired similar practices of coping with fallibility (Kao & Thomas, 2008; Pronovost et al., 2009; Wilf-Miron, Aviram, Benyamini, & Lewenhoff, 2003). Strategies for learning from mistakes in healthcare are explored further in the next chapter.

Personal narratives about mistakes are a rich source for learning (Bledow, Bister, Carette, & Kühnel, 2017; Rami & Gould, 2016). Jarle Gimmestad shares a range of stories from his own time in the cockpit with his audiences. A story about the aftermath of the brake incident, and how it was handled in his organization, generates further learning points. Two aspects stand out, one regarding knowledge, and another regarding perception. First, his bosses were pondering what to do with Gimmestad after the event, and ended up sending him on a three-day course in how brakes function, thus indicating that what he had been lacking on that dramatic day was basic brake knowledge. They reduced the problem to something concrete and tangible that could be fixed by introducing the pilot to new knowledge. From a philosophical perspective, this can be seen as a contemporary version of Socrates’ idea that for a person to do the good, it is enough that he knows the good. As an explanation of Gimmestad’s mistake, it seems rather weak and unconvincing. It is unlikely that he forgot to release the brakes because he did not know how they function, and that he would have acted differently had that knowledge been in his possession at the time of the event. Sending Gimmestad on that course appears to originate from a misunderstanding of the causes of his conduct, a simplistic and technical response to a complex set of challenges connected to fallibility and the interaction between human beings and technology.

Second, the words most emphasized by Gimmestad’s main boss in the conversation after the event were that he trusted that there would be no repetition of that particular kind of mistake. “I am sure that you will never again land with the brakes on in your pilot career.” He has turned out to be right about that, but in hindsight, Gimmestad believes that his boss’s words made him exaggerate his attention to the brakes, at the expense of other and equally important aspects of the situation before, during, and after a flight (Gimmestad, 2016).

When a person is encouraged to focus on one particular aspect of a complex situation, it can lead to blindness to other significant aspects, as documented in studies in perception psychology (Mack, 2003; Simons & Chabris, 1999). When you tell a pilot, or a professional in another setting, that they are not likely to make that particular mistake again, it can create a strong motivation to make your words come true. That in itself can trigger aspect blindness, since it draws the professional’s attention to one particular aspect of the situation, much as in the gorilla experiment (Simons & Chabris, 1999) mentioned in Chap. 2. Gimmestad says that the period after the dramatic landing was one where he was particularly attentive to the brakes, and made himself vulnerable to overlooking other important matters in the cockpit. That might have been the time in his career when the safety of flying with him was at its lowest.

Inattentional blindness is a phenomenon that poses a threat to safety, and to the success of other collaborative processes. On their own, individuals have a limited ability to perceive what goes on around them, and depend upon colleagues to intervene when they are blind to significant aspects of their work environment. As noted earlier, the experience of being blind to something that is right in front of their eyes comes as a considerable surprise to participants in experimental studies. It can generate a realization that we depend to a high degree on input from other people’s perspectives in order to get a rich and adequate understanding of what goes on in our work environment. The next section focuses on a model central to systematic efforts in aviation to counter the pervasive threat of inattentional blindness. It is a model that can be adopted in other organizational settings to create awareness and readiness for action in situations where people make mistakes.

2 A Barrier Model

Over the years, reflection on practice has strengthened safety in aviation. A combination of practical and academic contributions has highlighted the need for precise and direct communication, and has driven a development from a heroic and individualistic approach to a more collective one, where teamwork is essential. Theoretical contributions from Reason (1990) have been central to this development, first through the establishment of a vocabulary to distinguish between different kinds of error, and second through his so-called Swiss Cheese Model for dealing adequately with error (Reason, 1990). Both of these conceptual sources have relevance beyond aviation, as they can be useful in analyses of fallibility and error outside the safety domain.

Reason distinguishes between execution errors and planning errors. With the former, the plan is fine, but the execution is faulty, while with the latter things go wrong from the start, since the plan is inadequate for the task ahead. Furthermore, he distinguishes between two kinds of execution errors, and calls them slips and lapses. Slips are actions not carried out as intended or planned, as when a person gets the digits wrong on a phone or when dialing in a frequency. There can be “Freudian slips”, when a person intends to say one thing, but inadvertently ends up saying something revealing about his or her real attitudes or thoughts. The idea is good, but not the execution. Lapses are missed actions and omissions, as when somebody fails to do something because of a lapse of memory or attention, or because they have forgotten something. Gimmestad’s landing with the brakes on is an example of a lapse (Reason, 1990).

A student presented another example of a lapse to me at a seminar at the Norwegian Police University College. The agent was a police officer who was an expert at rapidly disarming people who point a gun at him. He had built up this expertise through thousands of repetitions in training. The police officer had asked colleagues and friends countless times to point a gun at him, and he wrestled it off them with amazing speed, repeatedly. When he encountered a real and dangerous situation, coming face to face with a gunman in a supermarket, things went well in the beginning. He used his impressive skill to quickly take the weapon out of the hands of the gunman, thus removing his ability to cause serious harm. Then, the policeman proceeded to hand the weapon back to the gunman, reinstating him in a position to cause harm. That was the movement automated through all the repetitions with colleagues and friends. He had grabbed the weapon, handed it back, grabbed the weapon, handed it back again, repeatedly. The police officer was saved through the intervention of a colleague, who was able to disarm the perplexed gunman a second time. A lesson from this example is that it matters how you frame the training situation, since every movement can become automated, even unwelcome ones like handing a weapon back to the person who initially had it in his or her hands.

Slips and lapses, then, are execution errors. In Reason’s vocabulary, they differ from mistakes, which are a type of error brought about by a faulty plan or intention. You make a planning error or mistake when you do something believing that it is the appropriate and correct thing to do, when in fact it is not. As discussed in the previous chapter, we can distinguish between active and passive mistakes, where an active mistake is to do something you should in fact not have done, while a passive mistake is to refrain from doing something you should in fact have done.

A common feature of slips, lapses, and mistakes is that they can start a chain of events that lead to some sort of accident or unfortunate outcome. Reason argues that systematic analyses of accidents need to take into account why the error has occurred. It is easy to start the blame game and point the finger at the person who has slipped, lapsed, or made a mistake, but a thorough understanding of the event at hand needs to clarify the systemic aspects. To what extent have the persons who erred received proper support, training, and guidance? To what extent can long working hours or other potentially stressful factors have contributed to the error? Questions like these are geared towards detecting the root causes of the event, and to keep at bay the understandable instinct to find a scapegoat.

Reason’s (1990) Swiss Cheese Model contains three main elements: Error, barriers, and accidents. The main idea is that an error sets in motion a chain of events that leads to an accident, unless there are barriers in place to stop it. Gimmestad started landing procedures with the brakes on, and although that lapse did not result in casualties, the resulting landing constitutes an accident. It could have been avoided if there had been barriers in place to stop the causal chain. Reason distinguishes between three kinds of barrier elements: Technology, procedures and rules, and human intervention. At the time when Gimmestad made the landing with the brakes on, there was no technology in place to prevent it from happening. There were procedures to make him and the co-pilot consider the brake issue, but these did not suffice to stop the chain of events either. Finally, the human element could have consisted in an intervention from the co-pilot, who could have challenged Gimmestad and been more alert to the brake issue. Today, a technological improvement is in place, making it impossible to repeat the mistake of landing with the brakes on. That improvement came about as an acknowledgement that these are the kinds of errors humans are likely to make, and that they cannot be eliminated through training or exercises in awareness.
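To make the structure of the model easier to hold in mind, here is a minimal sketch in Python of the chain-of-events idea. The layer names, the representation of an error as a text label, and the simple pass-or-stop behavior of each layer are illustrative assumptions of this sketch, not taken from Reason (1990).

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Barrier:
    """One layer in the barrier system: technology, procedures, or human intervention."""
    name: str
    stops_error: Callable[[str], bool]  # True if this layer catches the propagating error


def trace(error: str, barriers: List[Barrier]) -> str:
    """Follow an error through the layers; an accident occurs only if no layer stops it."""
    for barrier in barriers:
        if barrier.stops_error(error):
            return f"'{error}' stopped by {barrier.name}"
    return f"'{error}' passed every layer: accident"


# The brake lapse, with the three layers as they stood at the time: none of them held.
layers = [
    Barrier("technology (warning system)", lambda e: False),             # no such system existed then
    Barrier("procedures (pre-landing checklist)", lambda e: False),      # checklist answered on autopilot
    Barrier("human intervention (co-pilot challenge)", lambda e: False), # no challenge was made
]
print(trace("brakes left on", layers))  # -> "'brakes left on' passed every layer: accident"
```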

When a pilot makes a mistake, and the barriers are not sufficiently strong to halt the fatal causal chain it sets in motion, the bad outcome normally occurs quite rapidly, in a matter of seconds or minutes. In other settings, the time from the mistake to the unwanted result can be much longer. On September 8, 1989, Partnair Flight 394 crashed off the coast of Hirtshals in Denmark, and all fifty-five people on board died. The main cause of the crash was a mistake made three years earlier, when cheap, counterfeit aircraft parts were used instead of original ones to fix the tail of the aircraft. These parts were not of the required quality, and gradually wore out, leading to a collapse of the tail. The mistake of using low-quality parts set in motion a causal chain of events that ended in the fatal accident three years later (Report on the Convair 340 aircraft accident, 1993). Inspections of the aircraft could have functioned as barrier elements to stop it, but in this case, there were neither technological, procedural, nor human factors in place to prevent the crash from happening.

I have applied Reason’s model in offshore engineering settings, and asked experienced professionals to provide examples from their own work environment where a mistake can lead to an accident or unwanted event. One engineer said that if he made a mistake on the drawing board today, and nobody, including himself, noticed, it could set off a chain of events leading to a bad outcome in about three years, at the bottom of the ocean, where some components in a complex structure would not fit together or would not function properly. Even in that kind of work environment, there is a need for efficient barriers to stop the mistake from causing a negative outcome. Technology, procedures, or human intervention can serve to identify the mistake and break off the series of events that otherwise will lead to an unwelcome result. Three years provides more time for a barrier to work, but it might be that the crucial window for detecting the mistake and stopping it from causing trouble is quite short. If nobody notices anything or takes action in the beginning, there may be no further quality checks of the drawings. The production phase then begins with an undetected mistake on board.

In the engineering context, I inquired about whether people who detect mistakes and intervene receive applause in their work environment. One way to strengthen the barrier system can be to celebrate the instances where a person voices a concern and steps out of passivity. Depending on the size and importance of the project and the savings brought about through the intervention, the active person can receive minor or major hero treatment. The response from the engineering group was that the heroes in their work environment are not those who speak up in critical quality moments, but rather those who step in once an unwelcome event has occurred, at the bottom of the ocean or elsewhere. These are the people who do damage limitation, and are experts at fixing things that are already broken. Things look bleak, but then these exceptional professionals turn up to minimize the negativity. Reflections on this issue brought about a shared realization that the people who speak up earlier, to stop the unwelcome events from happening in the first place, also deserve positive attention in the organization.

The distinction between active and passive mistakes can also help explain reluctance to take an initiative and voice a concern. When you speak up, chances are that you are raising a false alarm, and that constitutes an active mistake, doing something that it turns out you should not have done. Keeping quiet in such situations might also turn out to be wrong, but it only constitutes a passive mistake, refraining from doing something you should have done, and you may get away with that more easily than with an active mistake. In organizations with a more or less acknowledged preference for passive mistakes over active mistakes, chances are that people opt to say nothing. Efforts to make it normal and appreciated to voice a concern need to build a tolerance for active mistakes in what people perceive to be critical quality moments.

Reason came up with the name Swiss Cheese Model to draw attention to a potential weakness in the barrier mentality he proposed. When people start to think about safety and prevention in barrier terms, they may end up judging the strength of the barrier system in terms of the number of layers it consists of. The more layers, the better. If you have a procedure consisting of safety checks at three different times, it appears to create better safety than if you only have one safety check in place. This way of thinking can create a false sense of safety, according to Reason. He proposes that we should compare each layer in the barrier system with a slice of Swiss cheese. What they have in common is a propensity to have large and small holes in them. If we are unlucky, the holes in the barriers are placed next to each other in a way that allows the negative chain of events to travel straight through. We may be content with the high number of layers, but then experience that a negative outcome occurs after all, because we have underestimated the size and positioning of the holes in each layer.

One of my students in a leadership and safety class gave the following example of how a higher number of barrier layers can cause less rather than more safety. She worked in a hospital unit where they sometimes treated dangerous patients, who needed to be checked for weapons and other dangerous objects when they entered and left the premises. It had been the responsibility of the police to check the patients when they went in or out of the hospital. In order to make sure that they came and left unarmed, a second round of checking, conducted by hospital staff, was introduced. The intention was to make the system twice as safe, but in reality, the new system led to lenient controls both by the police and by hospital staff, since both groups had in mind that another group would also check the patient for dangerous objects. The introduction of the second barrier level created a bigger hole in the existing one, and it also came with a hole itself.
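A back-of-the-envelope calculation can show why the extra layer is no guarantee. The numbers below are invented for illustration, and the assumption that the layers miss dangerous objects independently of each other is a simplification; in the hospital case, the layers in fact weakened each other.

```python
def pass_through_probability(miss_rates):
    """Chance that a dangerous object slips through every layer (the holes line up)."""
    p = 1.0
    for rate in miss_rates:
        p *= rate
    return p


one_strict_check = [0.05]          # a single careful check that misses 5% of objects
two_lenient_checks = [0.30, 0.30]  # two checks, each lenient because "the other will catch it"

print(pass_through_probability(one_strict_check))    # 0.05
print(pass_through_probability(two_lenient_checks))  # 0.09 -> more layers, yet more gets through
```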

The barrier model can also be useful in analyzing creative processes. As discussed in Chap. 2, effective development of new products and services depends on producing intelligent failures as quickly as possible. To persist with a proposal that really is not that good is a mistake that will lead to a big or small disaster later, unless there are barrier elements in place that cut off the causal chain of events. It takes courage to speak out against a proposal and claim that it should be scrapped.

We can redescribe in barrier terms the three psychological phenomena mentioned earlier as obstacles to detecting and speaking out about mistakes. (1) The sunk cost fallacy can create a weakness in the barrier if the people who are supposed to intervene and take action when they spot an error have invested heavily in the development of the idea from which the error originates. In order to intervene and stop the chain of events, they have to admit flaws in their own previous thinking and priorities. That makes them unreliable as contributors to the barrier system. Furthermore, awareness of (2) the bystander effect can counter an unwarranted trust in a barrier system based on numbers. We may think that we can strengthen the human dimension of the barrier system, and the likelihood that someone will intervene in critical situations, by increasing the number of people who are in a position to follow the processes and speak their minds. Research on the bystander effect indicates otherwise. The more people who are included as witnesses to the processes and invited to intervene, the less likely it is that one or some of them will actually do so, due to diffusion of responsibility and doubts about one’s own personal judgement. Finally, (3) according to research on the confirmation trap, we tend to favor evidence that supports our existing beliefs, and overlook information that gives us reasons to reconsider. The human, interventionist elements in a robust barrier system depend on people who are able to detect discrepancies and unexpected turns of events. One such discrepancy can be that an experienced professional, who usually does exceptionally good work, has an off-day and is about to put people at risk because of a misjudgement of a situation or a lapse in concentration. Knowledge and awareness of these three psychological phenomena, then, are important in designing an organizational climate where people take action when they spot what they perceive to be a mistake.

3 Beyond Hint and Hope

Human intervention is often the most challenging kind of barrier element to put in place. Technology and procedures are more concrete and tangible. Creating a work environment where it is normal to voice your concerns is not so straightforward. The essence of the human element in barriers is that people need to speak up when they witness something out of the ordinary, events that startle, surprise, or frighten them. It seems that aviation has managed to make it normal to do so, thus creating a safety culture that other professional disciplines can take inspiration from and learn from.

Speaking up when you sense that somebody has made an error or is about to do so can be particularly hard for a junior person towards a senior person in an organization. A newly employed person may be less prone to the aspect blindness mentioned earlier, and may see things that the veterans in the workplace are unaware of, but may also be unsure about whether it is a good and welcome thing to speak up. A pattern of communication that has been detected in aviation and in healthcare in such circumstances is what has been called hint and hope. A person who perceives that something is wrong, but is afraid of the consequences of intervening in the situation, may decide to give a hint about his or her observation, and hope that it will be sufficient to generate a positive response. Investigations into accidents in aviation and healthcare have documented a range of hint and hope responses. A nurse sees that the anesthetist is preparing to place a syringe in what she perceives to be the wrong shoulder of the patient. They are supposed to perform surgery on the left shoulder, and not the right one that the doctor is now getting ready to treat. The nurse is not completely confident in her judgement, and thus decides to hint rather than say outright that they are now focusing on the wrong shoulder. Then things happen very quickly, the doctors in charge do not understand the hint, and they cut open the wrong shoulder. In the investigation that takes place after the event, the nurse claims that she tried to tell the doctors about the emerging mistake, while they say that she did try to say something to them, but that the message was unclear.

It is understandable that people turn to hint and hope instead of addressing an issue in a more direct manner. The motivation for vague and indirect communication can be to protect both the sender and the receiver from unpleasantness.

A lot of us are taught that it is not polite to confront another person by directly stating a problem, opinion, or disagreement. Hinting and hoping is a communication strategy that courteous people are tempted to use to avoid confrontation, to preserve someone else’s sense of dignity or status, or to protect themselves from criticism and rejection. People hint and hope every day. (Gordon, Mendenhall, & O’Connor, 2012, p. 59)

When hint and hope works, it is an elegant form of communication, where you succeed in correcting a person’s behavior in other people’s presence, without anybody else noticing it. On other occasions, the hinting is a feeble and weak barrier that cannot stop a mistake from creating a horrible outcome. In the Tenerife disaster on March 27, 1977, where two Boeing 747 airplanes from Pan American and KLM crashed on the runway, killing 583 people, one of the pilots took off before having received clearance to do so. A recording of the conversation inside the KLM plane reveals that the flight engineer hints that the other plane may be in their way. “Is he not clear, that Pan American?” (Weick, 1990). The warning signal he provides to the pilot is not strong enough, so the pilot proceeds to take the plane onto its fatal journey. Here is an example of hint and hope as part of a weak barrier system. The pilot makes a mistake, and it starts a causal chain that ends with disaster, since no barriers are in place to prevent it from happening. A steadfast and persistent flight engineer or co-pilot could have made a difference, but neither of them dared to confront their senior, who was one of KLM’s highest ranked and most respected pilots. The pilot had recently provided the first officer with a qualification check to work in a Boeing 747, and that might have contributed to making the threshold for confronting him higher than normal. In his analysis of the accident, Weick (1990, p. 574) comments: “Perhaps influenced by his great prestige making it difficult to imagine an error of this magnitude on the part of such an expert pilot, both the co-pilot and the flight engineer made no further objections.”

I witnessed an interesting example of hint and hope during a seminar for leaders in a Norwegian city council. Before the seminar, the administrative leader told me that he wanted to say a few words of truth to the fifty or so participants. He said to me that he was disappointed with the collaboration between them. Individually, they were thinking solely about their own units, and not about what would be best for the city council as a whole. There was little solidarity among them. Now he had the opportunity to confront them and demand improvement.

The leader then took the podium and told the leaders a story about geese, about how they fly together and support each other. Whenever one goose struggles to keep the tempo during flight, two other geese will join it and help it to gain speed. Whenever the lead goose is exhausted from flying at the front, another goose will take over, and allow the leader to rest. The audience smiled politely at the story, and that was it. Afterward, I talked to the administrative leader, who was very pleased with himself. “Now I really gave them something to think about,” he said, indicating that he thought he had been sharp and direct in pointing out a lack of collaboration amongst the leaders. From my perspective, he had failed to address the issue properly. I doubt that any of the leaders noted a critical or challenging note in the story about the geese. It was another example of hint and hope, of fruitless communication based on a wish not to hurt or anger anybody. The incident can also be analyzed in terms of Reason’s barrier model. The administrative leader perceived that the city council leaders were on the wrong path with regard to collaboration and solidarity, and attempted to stop a chain of events ultimately leading to suboptimal use of public resources and worse service for the citizens. It was most likely an unsuccessful attempt, since he used hint and hope, rather than direct communication.

In aviation, there has been a quest to move beyond hint and hope, to more direct and unambiguous ways of communicating. The Tenerife disaster was a turning point, generating activities to improve the quality of feedback amongst employees, under the heading of Crew Resource Management (CRM). Gordon et al. (2012, p. 59) convey how CRM encourages crews to focus on what is right rather than who is right, and thus draws attention to matters of fact rather than to opposing views and rivalry amongst colleagues about who has the most appropriate understanding of the situation. Personal prestige can stand in the way of clarification of the situation at hand, since it makes people hold on to their own beliefs, even beyond the point where they have obtained strong reasons to revise them. CRM is all about challenging each other in a respectful manner, with a constructive intention. The person who is expressing a concern should be specific about the content, and timely, not hesitating to speak up at the moment when something appears to be wrong. CRM encourages crews to seek information, ask questions, and push for clarification of situations that appear ambiguous to them. In order to be effective, the human dimension of a barrier system depends on a wholehearted commitment to these principles of direct and unambiguous speech.

Flight engineer Morten Theiste conveys an experience where a pilot he was working with needed a reminder about his commitment to CRM (Theiste, 2017). This pilot had trouble with the autopilot in the aircraft on the second-to-last leg of the day. The device had disconnected several times. Even though the crew could reconnect it, the autopilot continued to disconnect. The pilot was looking forward to a short turnaround in Oslo before his last leg to home base in Copenhagen, but he had to report the autopilot problem to technical staff in Oslo. He considered it to be a minor issue, and thought that he could easily fly the aircraft manually home and have the Copenhagen technical staff look at the autopilot during the night stop.

“I was called out to meet this crew to check up the matter at the gate after landing. The aircraft had been emptied and was ready for boarding when I came to the gate. The captain explained the problem to me. I said that I needed to go back to the hangar to check the technical manuals about the specific logic behind the autopilot disconnect during the described circumstances. Sometimes an autopilot disconnect may indicate that something more is wrong than just the autopilot itself. When I explained this to the captain, he went totally mad, shouting at me, calling me different ugly names and said he needed the turnaround to be fast so that he could return home to his family. He did not need the autopilot to fly back to Copenhagen. The captain verbally abused me and made me almost speechless. After a while, I simply asked him:—Are you angry with me?” (Theiste, 2017).

This simple question got the pilot to see the situation more clearly, much like the pilot in the situation with the persistent driver of the pushback tractor in the introduction to this book. “I saw in his face that he suddenly was reminded of the CRM training he had been through on how to communicate to each other in the aviation industry. He then realized that he had been acting in an unprofessional manner and that it was a great thing that I took the safety of the passengers seriously and did not immediately release the aircraft” (Theiste, 2017).

An hour later, the aircraft was ready for takeoff, after a thorough investigation of the technical issue with the autopilot. The two professionals at the core of the episode had experienced a critical quality moment, a situation where the flight engineer could have succumbed to the pilot’s strong wish to ignore the technical problem and proceed immediately to takeoff. Verbal abuse from a senior person can easily lead to such a decision from a junior person. It is the kind of behavior that can weaken the will to speak up, and thus can pose a threat to the robustness of a barrier system. In this particular situation, the flight engineer stood his ground, and his reminder to the pilot about their common platform for communicating about safety was enough to defuse the tension and get the professionals back on track together.

4 Teamwork

One further narrative about barriers and safety illustrates how Reason’s model is relevant beyond aviation. It concerns pilot Gimmestad’s experience when he underwent laser eye surgery. The narrative also highlights the nuances between teamwork and individual expert effort. One surgeon and two nurses were in the operating room with him, and he was awake during the entire two-hour operation. One thing started to worry him as the operation proceeded, and that was the lack of talk around him. The operating room was quiet, with no conversation going on between the three people who were working on his eye. “I have learned that people who work together on complex tasks should talk with each other, to ensure that things are done in the right manner. In a cockpit, silence is a sign of potential danger. It can mean that something out of the ordinary is going on, and the persons involved are confused or uncertain about what to do.” (Gimmestad, 2016) When listening to conversations recorded in cockpits before plane crashes, one striking feature is that the people involved gradually speak less and less to each other. With this knowledge in mind, Gimmestad found the silence in the operating room disconcerting, and wondered why the surgeon and the two nurses were not speaking to each other.

The operation on Gimmestad’s eye went well, so the silence turned out to be a false alarm. Nevertheless, the pilot was curious about the lack of talk, and asked the surgeon about it afterward, explaining that a crucial feature of safety in his own profession was the conversations in the crew. “Who is your co-pilot during an operation?” he asked the surgeon. The response was that the surgeon did not have a person to talk to like that, and did not perceive that he needed one either. It appeared that the surgeon considered himself to be so skillful with his tools that he did not need people around who could correct or challenge him in critical situations. Gimmestad wondered why the nurses could not be involved as conversation partners during an operation, to ensure that things were done in the right order and that mishaps would be spotted and addressed. The surgeon dismissed that idea, claiming that the nurses were not on his level of expertise and experience. The pilot retorted that at least some nurses are experienced, and have participated in many complex operations, gaining knowledge about procedures and possible complications. “That may be true, but they will never be on my level,” answered the surgeon (Gimmestad, 2016).

No matter how brilliant the surgeon is in his work, it seems unlikely that he will go through his professional life without making errors that can have dramatic negative effects on patients. With the attitude he expressed in the conversation with Gimmestad, it appears that the barrier system to detect and confront his wrong moves is weak or even nonexistent. A slip, lapse, or mistake from this surgeon is likely to start a causal chain of events that will not stop until a patient has been injured. He seems to perceive himself as an infallible individual, who may need others for assistance and help to keep processes flowing, but not to critically evaluate his decisions and behavior as they happen.

I have discussed this story with experienced healthcare staff, who are critical of the surgeon’s apparently dismissive attitude towards the nurses’ possible role as dialogue partners during the operation, and towards the need for collaboration and feedback from colleagues. However, they say that one reason for the quiet that concerned Gimmestad can be that the surgeon was performing a high-precision operation, requiring intense personal concentration to be able to do things exactly right. During such a process, talk may be counterproductive. Those moments of deep concentration do not take up the full two-hour process, though, so they can only account for some of the silence the patient encountered.

It has become safer to travel by airplane after a shift from an individualistic to a more team-oriented approach, where it has become normal to challenge the decisions of the pilot, whom we no longer see as an infallible superman. Practitioners in healthcare and other parts of organizational life can learn from this development towards non-heroic professionalism. From time to time, stories of heroism still occur in aviation, none more dramatic than when captain Chesley B. Sullenberger on January 15, 2009, landed US Airways Flight 1549 on the Hudson River, after the plane had hit a flock of geese and lost power in both engines. In interviews, Sullenberger has reiterated that the successful landing and subsequent evacuation of the 155 people on board was a team effort, involving the entire crew. Nevertheless, he is the one who gets public attention and hero treatment. One particular detail in the transcript from the cockpit voice recorder indicates that Sullenberger’s collaborative mentality is real. His final remark to the co-pilot as they are approaching the water and getting ready for impact is “Got any ideas?” Here is an open invitation to the co-pilot to contribute, and not hold back any suggestions he might have about how to proceed from here. Those three words seem to express personal vulnerability, a realization that they are a team who are in this situation together, and need to draw on their collective resources to get out of it, irrespective of rank and position. Now is the moment to speak up. The co-pilot answers “Actually not”, right before impact (Brazy, 2009).

This chapter has presented narratives from aviation, and interpreted them in the light of theoretical approaches to fallibility at work. Research indicates that safety in aviation has improved, and three guiding insights appear to be at the core of this development:

  1. All pilots are fallible, including the most skillful and experienced among them.

  2. Professionals can be blind to important aspects of their work environment, and they are often blind to this kind of aspect blindness.

  3. Safety in aviation depends primarily on teamwork, and not on separate, individual efforts.

Implementation of these insights can happen with the aid of Reason’s Swiss Cheese Model. It offers concrete conceptual tools for handling human fallibility. Organizations can use it (a) to create awareness about the importance of voicing concerns, (b) to analyze and critically assess their current ability to deal with error, and (c) to get people to take action and voice a concern when they perceive that somebody has made a mistake. The model originated in aviation, but it can be useful in any setting where it is important to identify mistakes and stop them from causing bad outcomes. Barriers can be technological, as when an alarm goes off because somebody has forgotten to do things properly. They can also be procedural, in that people are trained to follow a particular checklist and are thus able to detect deviations from normal and correct procedures. Human intervention is the third type of barrier, and often the most fragile one, since it requires that people develop habits of speaking up, even when they are deeply uncomfortable about doing so. Hint and hope may be the least confrontational and most courteous strategy, but it is also one that is likely to fail. In professional settings, we can witness activities that, unbeknownst to the agents, seem destined to cause havoc, and we need to engage in the matter without hesitation in order to avoid the bad outcome. Doing that takes courage, and may require considerable training and preparation. In organizations, the barrier system will form a part of the culture, of the way things are normally done there. It is a particularly pressing responsibility for leaders to be aware of the strengths and weaknesses of the current barrier system, and to take steps to strengthen and improve it.