
9.1 Introduction

Two types of professional activity relate to the control of hazards, namely doing what is right and doing what is safe. These labels simplify a more complex picture of what is meant in these two areas. Doing what is safe refers to the human activity of safety control, which is largely governed by rules and procedures, with criteria for how things are done and what the outputs of the processes should be. Keeping within the boundaries of the formal system is achieved by compliance. However, formal control systems break down, can be ambiguous or wrong, and are in any case never complete in covering variations in the working environment; those at the sharp end are then left to resolve the control dilemmas in other ways [Footnote 1]. This is labelled here as doing what is right because it depends on knowing what the right thing to do is. It is similar to what Colas (1997) described as the positive contribution of human reliability:

(…) participation in building safety, understanding of situations, the unavoidable “operational adaptation” in areas the rules do not cover, good habits and the real situations to which they apply, depending on the specificities of those situations.

9.2 Doing What Is Right

For doing what is right there is a dependence upon professionalism, a particular kind of expertise for which a person can be trusted to perform both according to established high standards and ethically. Professional bodies, such as the Institution of Chemical Engineers, emphasise the importance of public trust in the professional. Professionalism carries a normative value system which is expected to operate at the micro level, in individual practices in the workplace. Membership of professional bodies depends upon adherence to accepted knowledge and competence standards and codes of practice.

Holtman (2011), a medical professional, says that expertise (specialised skills and knowledge) is the foundational element of professionalism. Specialised knowledge is often used to distinguish a profession from an occupation; an occupation is considered to rely more on craft skills, even though craft skills are also required by professionals. According to Vitale (2012, 2013), the professionalization of work has intellectualised occupations and transformed them into knowledge-based professions. This separation from craftsmanship is seen as inhibiting inter-professionalism through the creation of in- and out-groups. In the field of safety, Almklov et al. (2014) have suggested that the knowledge produced by scientists, which journals like Safety Science purport to communicate, may actually contribute to undermining local and system-specific personal expertise and hence the unwritten craft skills associated with craftsmanship. Top professionals managing high risks recognise the importance of these skills:

…thanks to your experience you will have the right reflexes and will act in a way in which you will find the right answer for situations that are extremely difficult. Professional Alpinist & Adventurer (Van Galen and Bellamy 2015).

This kind of knowledge is also called “tacit” knowledge, a term introduced by Polanyi (1958) to distinguish it from propositional knowledge: it cannot be communicated by language or mathematics. The person knows how to do something but cannot explain how it is done. Maslen (2014) has examined the building of safety knowledge among new engineers in the pipeline industry. She provides evidence that this kind of tacit knowledge associated with craft skills is important and not easily passed on; it has to be developed in the workplace through experience and mentoring.

The skills and expertise expected of the professional are called upon when there are uncertainties in how to respond. There are unforeseen risks and surprises in practice which are not immediately covered by a normative risk management system. This demands a certain amount of flexibility which may be difficult to define in a system focused only on doing what is safe.

9.3 Doing What Is Safe

The safety of any activity must be assured for it to be acceptable. The safety of an activity is like a piece cut from a cake with many layers, built up over time as contexts and knowledge change and develop, as shown in Fig. 9.1. This model was originally designed to provide a scenario-based approach to the inspection and auditing of chemical plants (Oh and Bellamy 2000; Bouchet 2001). Any activity cuts through many perspectives: the knowledge of hazards and how to control them, the management of the processes which ensure the integrity of the system, the human role at the different levels of control and, not shown, the environment of the system itself. The integrity of the technical system is at the core of the model. If this functioned without any further need for intervention there would be no requirement to manage it, but this is rarely if ever the case. Maintaining the integrity of the technical system, keeping within the boundaries of the formal systems (codes, standards, rules, regulations, procedures, best practices, safety margins, etc.), is defined here as doing what is safe. It is what the public might expect: to be protected in some definable way. Recently, new efforts have been made to grapple with the complex nature of the problem. For example, Le Coze (2013) provides a “sensitising model” covering multiple perspectives (technological, organisational, psychological, cultural, sociological, political) and emphasising their intertwined nature at micro-, meso- and macro-levels.

Fig. 9.1

A “piece of cake”: an activity associated with a hazardous technology contains the ingredients of the socio-technical system as understood and regulated

The purpose of choosing the words doing what is safe is also to link to the role of the safety professional. The professionalization of safety was originally associated with enforcing legislation directed at accident prevention and was not integrated with the work. Factory inspectors appointed for this purpose originated in the mid-19th century (Hale and Harvey 2012; Mohun 2013) and were the first safety professionals. Later, safety officers were appointed to ensure compliance in companies. Today the safety professional’s role has expanded beyond compliance in the evaluation and control of hazards. Regulation has developed, like the layers in Fig. 9.1, with an increasing focus on risk management. The safety professional’s role is much concerned with hazard identification, risk assessment and accident investigation when hazard control fails. Problem solving, advising, training and communicating also need to be considered, along with appropriate technical knowledge. Wybo and Wassenhove (2016) distilled four main activities of HSE professionals: regulatory compliance, design and operation of the HSE management system, risk and accident analysis, and emergency and crisis management. Although regulatory compliance is now regarded in the literature as being of low importance, compliance with laws, codes and standards is considered fundamental to risk management. In the Netherlands, investigated serious accident reports and major hazard loss of containment reports are analysed with the tool Storybuilder (Bellamy et al. 2013, 2014). The extensively analysed data indicate, however, that only around half the accidents are found to have legal breaches, whether occupational or loss of containment accidents (previously unpublished data) [Footnote 2]. Compliance does not mean accidents are always prevented.

9.4 Problems

Doing what is safe may contribute to causing accidents. The Three Mile Island (TMI) nuclear accident in 1979 (Kemeny 1979) is an example where following the procedures actually made things worse. TMI was a turning point for highlighting the importance of operator support (interface design, procedures, training) in handling complex technologies. After the Chernobyl accident in 1986, and the conclusion that operators had violated the rules of doing what is safe, the concept of Safety Culture was introduced (IAEA 1991). It was emphasised that safety as a collective attitude should be given number one priority, something that goes beyond procedure following and which requires a questioning attitude, stopping and thinking, and communicating with others. Later, however, design deficiencies were revealed suggesting that the Chernobyl plant design was at fault, as well as the arrangements for presenting important safety information to operators (IAEA 1992):

Certain actions by operators that were identified in INSAG-1 as violations of rules were in fact not violations (p.24).

The system of doing what is safe can always be used to blame operators rather than support them.

The reality is that professionals engaged in operating a system have multiple performance goals to achieve:

Resources will always be limited. So you need to think carefully, “when is enough, enough?” … You can optimize, and plan and prepare to perfection, but that means you never move. So you need to have some courage and say, okay let’s go, now we’re going to execute and do. Process manager, Petrochemical Industry (Van Galen and Bellamy 2015).

Ale (2005) says that courageous acts should be preceded by consulting “a morbid pessimist like a risk analyst”. While this could be formalised and made a matter of routine, there is still a need to push on. Besides that, not all risks can be foreseen. When there are many uncertainties, and so an incomplete model of knowledge and risk, and there are high pressures for day-to-day operations to continue, balancing safety and production becomes what one manager called “top-level sport”: being able to reduce the human error statistics in decision making without performance going down too much (Van Galen and Bellamy 2015). Another perspective on this is “mindfulness” (Weick and Sutcliffe 2007). Mindful organising is a collective behavioural capability to detect and correct errors and adapt to unexpected events. In doing what is right one would expect professionals to be mindful in their attention to detail and quality, such as checking their own work, to be alert to the possibility of making and correcting errors as part of successful recovery, and to be aware of possible biases in their thinking.

The unexpected is about not knowing. For example, Kletz (2009) in his book “What Went Wrong?” has a chapter entitled “I Did Not Know”. The Flixborough accident in the UK in 1974 (Department of Employment 1975) was an explosion at a chemical plant that killed 28 people. It is an example of a major hazard accident involving not knowing what you do not know: when a failed reactor had to be by-passed to get production going again, no-one thought the modification was anything other than a routine plumbing job. This resulted in the court of enquiry recommendation that the training of engineers should be more broadly based and that all engineers should learn at least the elements of other branches of engineering (Recommendation 210(ii), op. cit.). This was an exceptional situation. However, routine non-compliances may also occur where professionals work. Take pilots, for example. Apparently 97% of unstabilised approaches are continued to a landing, contrary to airline Standard Operating Procedures (Go-around safety forum 2013). The pilot may think that making a go-around instead of continuing the landing is more risky, or that the go-around criterion is overly stringent. It is argued from the failure data that non-stabilised approaches increase the number of accidents such as runway excursions, and that inadequate situational awareness of pilots is causing an inaccurate risk assessment. Here is a conflict: where to draw the line between compliance and professionalism?

Standard work methods are important for standard work, but when things are happening that are not standard and the organisation only is built around these standard procedures, you will lack people that have the possibility to think thoroughly. Production assistant, Steel-making industry (Van Galen and Bellamy 2015).

An example of this is given by the failure of workers to “drop your tools” (Weick 1996). Weick describes two separate incidents where firemen escaping wildland fires failed to drop their heavy tools, despite the fact that they needed to do so in order to escape the threat quickly. Failing to outrun the flames, many perished. This lack of flexibility is brought about by adaptation and learning. Conversely, there can be too much potential flexibility of behaviour when it has not been adequately constrained by procedures and learning. In a train derailment accident, lack of skills in handling tools and equipment was an underlying cause of a failure to identify defective railway points (RAIB 2009). In-house competence training did not include hands-on use of the tools or equipment on the infrastructure for which they were intended. Worn rails would have been needed to demonstrate a failed switch; instead, the trainer used a computer presentation.

Although knowledge, expertise and experience from doing may be beneficial in finding the right solution to risk problems under uncertainty, there are important flip sides to experience, such as overconfidence (Roberto 2002). The human perception of, and response to, uncertainty is the subject of psychological debate (Smithson 2008), with a literature beyond coverage here. Cognitive models explaining judgement and decision-making outcomes under uncertainty have received attention, such as Endsley (2000) on situation awareness, the dynamic relationship with cues from the environment that supports knowing what is going on around you. Another area concerns the limited availability of cognitive resources. For example, Hollnagel (2009) considers that when these cognitive resources are challenged, such as under time pressure, the result is an efficiency-thoroughness trade-off (ETTO) in performance. Another camp (Tversky and Kahneman 1973, 1974; Kahneman 2011) suggests that even when it is within a person’s capacity to reason correctly, they still use heuristics (mental shortcuts) and are subject to cognitive biases (tendencies to come to a wrong conclusion). When competent professional people make discretionary decisions they are not necessarily free of bias.

Another bias occurs when safety performance is measured in hindsight. Hindsight bias (the wisdom of hindsight) is a projection of new knowledge onto past events as though that knowledge were available at the time. Fischhoff (1975) suggests that hindsight bias undermines our ability to appreciate the nature of uncertainty and surprises and to learn from them to improve the system. Lessons learned derived from an analysis of accidents are a form of hindsight bias, usually towards a normative model of safety. They are intended to encourage people to think of safety in their own work context and are usually specified in terms of failures, e.g. see the ARIA [Footnote 3] database of technological accidents (BARPI [Footnote 4] n.d.) and the lessons learned from the eMARS [Footnote 5] major accidents database (European Commission n.d.). Little lessons-learned attention is given to successful recoveries (Resilience Success Consortium 2015). One aspect of biased thinking is that it can lead to depreciation of successful recoveries by attributing them to luck. In fact, professionals with a role in dealing with high risk tend not to show much interest in learning from success, and even suggest that this could be potentially dangerous because of certain heuristic traps (Van Galen and Bellamy 2015). Little if any lessons-learned attention was given to, for example, the fact that unarmed but aware and trained persons were able to thwart an armed terrorist on the Thalys train travelling from Amsterdam to Paris on 21 August 2015. After a brief interest in the heroism of these persons, who applied their professional knowledge and skills in an unexpected context, attention focused instead on the weaknesses of the system. Positive recoveries are rarely analysed for factoring into risk assessment models and are easily dismissed as luck or heroism, with a consequent absence of data that would enable their effects to be quantified. Take an example: the ditching of a passenger flight in the Hudson River in the US after a bird strike caused a complete loss of thrust. Successful recovery, it was concluded, mostly resulted from “a series of fortuitous circumstances” (NTSB 2010, p. 79).
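The quantification point can be made concrete with a minimal sketch. The numbers below are purely illustrative assumptions, not data from any of the cited studies or databases; they only show how a recovery probability, if such data were collected, would enter a simple risk estimate of the kind used in risk assessment models instead of being dismissed as luck.

```python
# Illustrative sketch only: all values are hypothetical assumptions, used to show how a
# successful-recovery probability would be credited in a simple risk estimate.

initiating_event_frequency = 1e-5   # hypothetical: critical deviations per operating hour
p_successful_recovery = 0.6         # hypothetical: fraction of deviations recovered by operators

# Without crediting recovery, every critical deviation is counted as leading to an accident.
accident_frequency_no_credit = initiating_event_frequency

# Crediting recovery, only the unrecovered deviations are counted as accidents.
accident_frequency_with_credit = initiating_event_frequency * (1 - p_successful_recovery)

print(f"Estimated accident frequency, recovery ignored:  {accident_frequency_no_credit:.1e} per hour")
print(f"Estimated accident frequency, recovery credited: {accident_frequency_with_credit:.1e} per hour")
```

Without data on successful recoveries, the second estimate cannot be made, which is the practical consequence of dismissing recoveries as luck or heroism.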

9.5 Conclusion and Solutions

Doing what is safe, and the normative safety management behind it, is about constraining human performance. Human beings can follow such a system, but it only deals with foreseen risks. In this system the human is regarded as a component that could deviate from the norms, and the system is then considered vulnerable to errors and violations wherever there is human interaction.

The human potential for safety rests with the professional in doing what is right. What is required here is an uncertainty management system that allows human beings the flexibility to adapt to and deal with unforeseen risks as they arise, the unexpected events and surprises that can occur. It is through developing the professional as part of an uncertainty management system that the human potential for safe performance can be enhanced. This has only recently begun to be recognised. In Fig. 9.1 a new layer was added, the human factor, explicitly including resilience and safety culture. It is this layer that forms the important part of the uncertainty management system.

A number of problems have been identified:

  • Treating safety as if it were only about keeping to norms, without including other aspects of safe behaviour associated with the management of uncertainty, misses the opportunity to develop and use professionals for thinking about and dealing with the unforeseen.

  • Knowledge and experience are important in uncertainty management, yet developing experience takes time; there is a need to acquire craft skills and tacit knowledge, which enhance the capacity to deal with the unexpected, but this is not the type of knowledge that can be written down; it has to involve doing.

  • When faced with uncertainty, it is a challenge to optimise human cognitive functions while also having to deal with resource constraints (like limited time, tools and information), conflicting goals (like safety constraints versus the need to push on) and cognitive biases (such as overconfidence).

Some solutions to these problems might be found in changing the way we think about safety. Already there is more emphasis on integrating doing what is safe with doing what is right. Jørgensen (2015) says that accident research shows that safety must be integrated with the whole enterprise and function on all levels of management. It is necessary to make safety a part of professionalism without making it a separate issue; it should be an integrated part of the right way to do things, aligned with the mission of the organisation. This means that safety is not picked out as an independent, separate item. Good operators will be attentive to everything and have the judgement and decision-making skills for dealing with uncertainties.

However, the necessary experience for certain jobs may take years to develop. Researchers are recognising the value of finding ways to communicate the kind of knowledge that develops from experience. For example, Podgórski (2015) examines the use of tacit knowledge in occupational safety and health management systems in the context of knowledge management. He finds that methods for transferring knowledge rooted in the working context are not well studied, although there are some promising solutions, like hazard awareness apprenticeships in the field and the use of virtual realities. Maslen (2014) identifies that, for new engineers, learning from experience can be facilitated by actual participation early on the job, being given responsibilities like charge of a project instead of just observing how things are done.

Professionals should also have good tools. Ideally the tools of hazard control, unlike the heavy tools of the firemen mentioned earlier (Weick 1996), should help to buy time in situations of threat and uncertainty rather than limit it. Time provides the opportunity to gather multiple perspectives and expertise, reducing the uncertainty of reaching a successful outcome and helping to decide on actions when things deviate. In making decisions in the face of uncertainty, professionals managing high risk have pointed out the advantages of thinking and deciding together, of sharing observations, knowledge and experience, and of having a group of opinions to find the right path through the uncertainties (Van Galen and Bellamy 2015). Even that may be subject to biases such as groupthink (Janis 1972); biases are always there, compromising successful outcomes. Learning about biases is a process lacking in training resources, although there are recommendations for debiasing strategies in the literature, such as simulation, thinking about thinking, or forcing consideration of alternatives (Croskerry 2003). Professionals should be aware of biases and how to mitigate their adverse effects, as well as how to exploit their benefits as efficient strategies, for example when time is limited.

The role of safety professionals is very much in the normative camp. They are the watchmen and enforcers when the risks are foreseen. But safety professionals cannot deal with the uncertainties when they lack the specialist professional skills and experience of the subject expert. They cannot be surrogates for mindfulness in this respect. An important role, however, could be as a devil’s advocate or “morbid pessimist”, or at least to facilitate the acquisition of such a resource.

The process of generating increasing mindfulness is, as Weick and Sutcliffe (2007) point out, not a matter of more equipment, more training or more of the old strategies. The thinking in this chapter leads to the conclusion that it is more about how to:

  • Recognise and develop uncertainty management systems for handling unforeseen risks.

  • Integrate safety with the other goals of the system in professional training, identifying how multiple goals can be achieved through doing what is right and not just by doing what is safe.

  • Learn to recognise and reduce uncertainty and bias, which requires developing resources to support this and making use of connections to people with other perspectives. Uncertainty management has an organising aspect here, in enabling this connectivity and communication.

  • Improve and speed up the acquisition of tacit knowledge through new possibilities for apprenticeship, such as to virtual masters, and through organisational solutions like giving new recruits early opportunities to be responsible for work.

  • Improve safety through an increased awareness of successes in past recoveries where there were uncertainties associated with changes, deviations or near misses, using lessons learned about successes as part of investigation and as a way of learning how to better anticipate the unknown.