Introduction—Overview of Unsafe Practices

There is no doubt that clinicians are human and will make mistakes—even the highly trained and highly skilled members of cardiac surgical teams. Although the morbidity and mortality of cardiac surgery have decreased over the past decade [1], adverse events still occur in 12 % of cardiac surgery patients compared with only 3 % of non-cardiac patients [2]. Over half of these adverse events are preventable [2]. As in the landmark Harvard Medical Practice Study [3, 4], preventable adverse events in cardiac surgery include those related to not doing the right thing (suboptimal care) and those related to attempting to do the right thing but doing it wrong (human error). If we are to improve patient safety in our cardiovascular operating rooms (CVOR), we must understand and address each of these types of preventable adverse events.

Doing the Right Thing

Knowing is not enough; we must apply. Willing is not enough; we must do.

Johann Wolfgang von Goethe, 1749–1832

Doing the right thing, in the context of this review, refers to the application of evidence-based best practices to every patient, every time. Highly dedicated experts in all walks of medicine produce volumes of evidence-based guidelines and consensus statements to guide patient care. Unfortunately, as found by McGlynn et al. [5], only half of patients receive the care recommended by these guidelines. Among coronary artery bypass patients, only 67 % received all of the recommended discharge interventions [6]. In 2007, the Society of Cardiovascular Anesthesiologists (SCA) and the Society of Thoracic Surgeons (STS) published a set of guidelines on blood conservation in cardiac surgery [7]. Three years later, a survey of 1,061 institutions (primarily in the US) found that 78 % of anesthesiologists and 67 % of perfusionists had read all or part of the guidelines, but only 26 % of respondents reported that one or more practice changes had been made in response [8]. A multicenter audit of coronary artery bypass surgery in Denmark in 2004 found substantial differences in transfusion rates that were not related to differences in patient risk factors, but to local practice [9]. Similarly, strict management of hyperglycemia carries a level 1 recommendation in numerous guidelines and consensus statements; however, many cardiac surgical teams have been slow to implement glucose management protocols. We all now accept that the surgeon must scrub and don sterile gown and gloves, but many anesthesiologists have not yet adopted the parallel concept of strict hygiene when placing central lines (hand hygiene, sterile gown and gloves, full barrier draping, chlorhexidine) [10].

These issues are not unique to cardiac surgery, or to a few outlier physicians, but are well known across the medical profession. A review of this problem by Dr. Reinertsen of the Institute for Healthcare Improvement (IHI) begins with a quote from Johann Wolfgang von Goethe, “Knowing is not enough; we must apply” [11]. In that review, Dr. Reinertsen identified barriers to the implementation of best practices and guidelines that are inherent in the culture of physician autonomy: “we can’t agree on all the science, protocols stifle innovation, guidelines expose us to legal risk, it’s cookbook medicine, and guidelines remove our professional duties.” While Dr. Reinertsen effectively deconstructs these barriers, most physicians find changing their practice to be very difficult for reasons that are often not articulated. Each of us learned our skill, art, and best practices in a setting of daily feedback and recalibration by expert teachers; changing our practice on the basis of a single CME event or by reading guidelines in a vacuum is difficult, especially when some “experts” voice their disagreement with published guidelines. The legal definition of acceptable practice versus malpractice rests on generally accepted local practice, and the quality of our care is judged in the context of the accepted practice of our local community. Early adopters may therefore find their practice out of line with accepted practice in their community. Because it is challenging for individual physicians to continually change and update their practice based on published guidelines, we need to provide mechanisms whereby guidelines can be reviewed, discussed, and implemented easily within local communities.

This approach has been successfully undertaken in numerous settings, but not without considerable effort and attention to detail. The American Heart Association (AHA) has produced guidelines for optimal cardiovascular care since the 1980s. However, a review of discharge data for acute myocardial infarction patients in 2000 found that only 85 % of patients received aspirin at discharge, and only 69 % received beta-blockers. In response, the AHA instituted its “Get With The Guidelines” program, which included ways to embed guideline-based care goals directly into bedside care [12, 13]. This comprehensive program included preliminary planning, identification of hospitals to receive intensive training, provision of tools such as standardized order sets (including the overriding of discharge orders that did not meet the guidelines), use of monitoring tools with feedback, and debriefings. Support systems for enrolled hospitals included monthly team reports, visits, phone calls, and data summaries—no physician or care team was left to implement the guidelines in isolation. No similar national initiative has been implemented in cardiac surgery per se, but examples exist of successful application of best practices; many were effective because of participation in a cardiac surgery quality improvement collaborative rather than through the efforts of an individual physician.

The earliest, and the model for subsequent collaboratives, was the Northern New England Cardiovascular Disease Study Group (NNECDSG) [14]. Initiated in the era before widespread societal guidelines, the NNECDSG brought together five cardiac surgical groups to share patient demographic, process, and outcome data in order to improve standardization, share learning, and continually improve outcomes. The sites conducted round-robin visits, face-to-face conferences, and reviews of outcome data and, through this model, have reduced overall mortality, mortality among women, and re-exploration for hemorrhage [14–17].

The Virginia Cardiac Surgery Quality Initiative (VCSQI) model has been similarly successful. This voluntary consortium of 17 hospitals and 13 cardiac surgical practices performs over 99 % of Virginia’s open-heart procedures [18]. The collaborative model includes the following elements and functions: (1) establishing clinical priorities by identifying the best opportunities for improvement; (2) provision of a dashboard, a set of indicators for tracking particular improvement initiatives; (3) periodic reporting via blinded, de-identified reports; (4) identifying best practices and performers who consistently meet or miss threshold values; (5) performing systematic reviews of the medical literature to provide scientific evidence of best practice care processes; (6) protocols to define the data collected and to test the significance of changes in key quality indicators. These elements are provided confidentially, with project management established by the board. Implementation of atrial fibrillation prevention guidelines by the VCSQI hospitals reduced the incidence of postoperative AF to 10.8 % at the best practice sites, compared with a 20 % rate nationwide [18].

Implementation of guidelines can also be accomplished within a single institution, but typically through an institution-wide initiative rather than the efforts of a single individual. Institution of a blood conservation protocol based on the STS/SCA transfusion guidelines in a single hospital in Brazil increased the application of best practices such as the use of epsilon-aminocaproic acid and reduced the blood transfusion rate from 67 % pre-implementation to 40 % post-implementation. This reduction in transfusions was associated with significant reductions in the rates of acute renal failure, infection, septic shock, and readmission [19]. Similarly, implementation of a comprehensive blood conservation program reduced overall blood product use by 40 % in a community cardiac surgery program [20]. Finally, in the VCSQI consortium, implementation of the transfusion guidelines significantly reduced intra- and postoperative blood transfusion rates, which was associated with significant decreases in pneumonia, prolonged ventilation, renal failure, and operative mortality [21]. A total of 14,259 patients were included in the analysis (7,059 pre-implementation, 7,200 post-implementation), with a median reduction in hospital costs of ~$4,000 per patient.
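To put the per-patient figure in perspective, a rough, back-of-the-envelope estimate (an illustration only, assuming the reported median reduction of ~$4,000 approximates the average saving per post-implementation patient, which the study does not state directly) suggests aggregate savings across the post-implementation cohort on the order of

$$7{,}200\ \text{patients} \times \$4{,}000/\text{patient} \approx \$29\ \text{million}.$$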

These data clearly demonstrate that implementation of evidence-based best practices in cardiac surgery saves lives and reduces costs, but also that changing routines and engrained local practices is difficult and time-consuming. Our academic community has become proficient at the extensive review and analysis required to continually update evidence-based best practices, but has ignored the science of how to implement the knowledge gained. Improving patient safety in cardiac surgery will now require developing more robust methods to drive the rapid implementation of ever-evolving best practices across the continuum of cardiac surgery.

Doing the Right Thing the Right Way

To err is human; to forgive, divine.

Alexander Pope, 1688–1744

Even when we know the right thing to do, and are willing to do it for every patient, success may elude us. James Reason, and Alexander Pope before him, recognized that humans will forever make mistakes. The fast, effortless, parallel thinking that allows us to do our daily job in the CVOR is possible only because of our long-term, subconscious, learned knowledge base [22]. Although fast and seemingly efficient, applying that learned knowledge carries a trade-off: the errors inherent in the similarity matching (“I’ve seen this before, and it looked like tamponade”) and frequency gambling (“I have had 50 cases of hypotension due to decreased blood volume and only 1 due to anaphylaxis”) that rapid, automated thinking requires [22]. These hard-wired performance limitations make us all equally susceptible to error traps—situations that set the performer up to fail, regardless of intelligence or determination. Improving patient safety in the CVOR requires careful and systematic identification of error traps, along with system and teamwork changes to eliminate them or to prevent our human errors from reaching our patients.

Teamwork and Communication Failures

When an adverse event occurs, the underlying errors are most often not the result of poor technical skill, but of teamwork failures, most often involving communication. In one review of litigated surgical outcomes, communication errors were responsible in 87 % of the cases in which an indemnity payment resulted [23]. In another review of litigated surgical outcomes, communication failures occurred equally in the pre-, intra-, and postoperative periods; occurred frequently during handoffs or patient transfers; and often involved status asymmetry or ambiguity about responsibility [24]. Status asymmetry refers to a hierarchical structure and is particularly prevalent among operative teams, with junior or non-physician members hesitant to speak up and appear to undermine a physician’s authority. In a study of three pediatric cardiac surgical centers, between 30 and 40 % of team members agreed that they would find it difficult to speak up if they saw a patient safety issue or to disagree with an attending surgeon [25]. A more complete review of the significance of teamwork skills is provided by Weller and Boyd [26] in this edition and in a recent AHA Scientific Statement [27••].

Disruptions, Distractions

The propensity for humans to make errors, whether technical or non-technical, is worsened by distractions and disruptions of our daily routine. The likelihood of missing a critical step or overlooking a critical change in patient condition increases if a distracting conversation is occurring or when our attention is diverted [28]. Wiegmann et al. [29] found that disruptions in surgical flow occur on average 11 times per cardiac surgical case and are associated with technical errors; Palmer et al. [30] demonstrated over 100 flow disruptions per cardiac case. These flow disruptions, often viewed as minor events, impact patient outcomes. de Leval et al. [31], after studying outcomes of 243 arterial switch operations, reported that minor failures—events not expected to harm a patient—occur frequently and, as they accumulate, increase the likelihood that a major event—one that does harm a patient—will occur. Increases in minor events are associated with increased mortality [31]. The authors describe how a cardiac surgeon, having been interrupted twice by the floor manager to discuss the next day’s cases, nearly went on bypass without heparin being given [32]. These disruptions in surgical flow and distractions can be as simple as a piece of equipment needing to be retrieved from the surgical core, or a beeper or phone ringing. We view these as minor annoyances, but they are intrinsically tied to our ability to focus and perform—in short, “little things matter” [33•, 34].

An often-ignored source of distraction is the sheer noise in the cardiac operating room. OR traffic, non-pertinent conversations, alarms, and even music can produce noise levels that exceed standards set for worker safety. Observational studies have found that increased noise levels are associated with surgical site infections [35] and with more technical errors in simulated laparoscopic surgery [36]. Alarms are intended to alert clinicians to potentially dangerous conditions, but up to 90 % of all alarms are false positives [37]. In addition, they occur at a rate of up to 1.2 times per minute (359 per cardiac surgery case) [38], which inevitably leads to “alarm fatigue” and to ignoring the one alarm that is a true positive. Music, while it can reduce stress and improve the performance of some OR staff [39], can also impair the ability of staff to communicate with each other [40]. Some have suggested that cardiac surgical operating rooms adopt the “sterile cockpit” concept of aviation, in which, during takeoff and landing, no conversation other than that pertaining to the critical task at hand is permitted. While this has great appeal, Wadhera et al. [41] have demonstrated that the mental workload of each CVOR subteam varies considerably throughout the case, leading to casual conversation among the surgical team just when the anesthesiologist needs absolute quiet (induction/intubation), and vice versa.
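As a rough consistency check on the alarm statistics above (an illustrative calculation only, assuming the two reported figures describe the same cases), 359 alarms per case at roughly 1.2 alarms per minute implies

$$\frac{359\ \text{alarms/case}}{1.2\ \text{alarms/min}} \approx 300\ \text{min} \approx 5\ \text{h},$$

that is, alarms sounding throughout essentially the entire duration of a typical cardiac surgical case.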

Unsafe Acts: Errors and Violations

Dr. James Reason has long studied the events and acts that can lead to adverse outcomes. These “unsafe acts” can be subconscious errors: skill-based errors, in which well-rehearsed and often-performed sequences of tasks go awry because of distraction, or rule-based errors, in which a rule of “If this, then do that” is either applied in the wrong situation or applied erroneously [28]. Conscious errors can occur when we have no routine sequence to employ and no rule that seems to fit, and we are required to utilize the “slow, limited, and laborious application of conscious attention” [22]. Perhaps the most distressing types of unsafe acts are violations: acts done with full knowledge that the wrong thing is being done—a violation of best practices or of established policy. These violations occur for personal reasons, such as ignoring hand hygiene to save time or avoid inconvenience, and because the consequences (infection) are remote in time from the violation. Most distressing of all is the fact that we violate best practices because the likely adverse event will happen to someone else (the patient).

However, it is important to distinguish between violations committed for personal reasons and those committed because there is no other way to get the job done. The first type requires accountability, i.e., that the violating individual be held to the established standard; the second type requires system-wide evaluation and correction rather than a focus on an individual. In the Keystone Project, the CLABSI checklist mandated a full barrier drape when placing a central line; at that time, however, virtually all central line kits contained only partial barrier drapes [42]. Similarly, the guideline recommending bisoprolol for the prevention of atrial fibrillation in cardiac surgery patients [43] cannot be followed if the hospital formulary does not carry IV bisoprolol.

Physical Environment, Tools, and Technology

The physical environment in which cardiac surgery occurs includes not only the tools and technologies that we use to do our work, but also the physical placement of these tools, their alarms and displays, and the characteristics of lighting, noise, vibration, temperature, physical layout and available space, and air quality. Arguably, no place in the health care system incorporates more tools and technologies into one space, and no place uses more of them at the same time. Little work has been done to identify the optimal configuration and interaction of equipment in the operating room. Rather, objects have been added piecemeal to an already crowded room with little thought to integration. As a result, we have simply built larger and larger operating rooms, crowded with wires, cords, and display screens, creating what has been termed “the spaghetti syndrome” [44].

Pennathur et al. [45], using data collected in the LENS study, investigated the role of technology in creating hazards and offered a framework to explain and categorize these errors. Technology can be problematic with regard to its design and function, how the organization interacts with the equipment through purchasing or training, and how the device is incorporated into the physical environment. Poor design and function, or poor interaction between technologies, adds to the providers’ cognitive task load; the more we struggle with our equipment, the more our attention is diverted from the patient.

In a review of hazards in cardiac surgery, Martinez et al. [46] identified four ways machines cause harm: (1) misuse (poor training or negligence), (2) the inherent risks of using the device, (3) poor maintenance and upkeep, and (4) poor machine design. A human factors analysis of modern-day CPB pumps found that information displays suffered from placement, legibility, and format problems; components were poorly integrated into the machine; and alarms were too loud or too quiet [47]. Alarms, designed to alert the team to an abnormal condition, occur at an “alarming” rate of 359 per case, or 1.2 per minute [38], and 90 % of these alarms are false positives [37]. This quickly leads to alarm fatigue and increases the noise level, which has been implicated in disruptions and even surgical site infections [35]. Information systems (monitors, electronic medical record systems, alarms) are not integrated and do not provide the comprehensive physiologic and disease model that is optimal for patient safety [48]. To date, no human factors analysis has been performed on the anesthesia delivery system, but such an analysis would likely identify a myriad of ergonomic and information display issues.

Safety Culture

Deficits in safety culture—those collective behaviors and values that allow a system to identify and mitigate hazards to patient safety—have been implicated in adverse outcomes in cardiac surgery [46]. The absence of robust quality assurance systems contributed to excess mortality in pediatric cardiac surgery programs in Bristol (UK) [49, 50] and Winnipeg (Canada) [51, 52]. As alluded to earlier, a rigid hierarchical culture in the CVOR can lead to status asymmetry and disruptive behavior, with the result that team members are reluctant to speak up or challenge authority in the face of recognized hazards [25, 53, 54]. Disruptive behavior is common, reported by nearly three-quarters of physicians and nurses [55], and cuts across all disciplines [56]. Respondents to one survey reported witnessing disruptive behavior in 75 % of surgeons, 64 % of anesthesiologists, and 59 % of nurses, and more than 80 % reported loss of concentration, communication failures, and poor patient safety as a result of disruptive behavior [56]. A culture that permits disruptive behavior among healthcare workers directly impacts patient safety.

The hero culture common to cardiac surgery—the self-sacrificing team that goes beyond the point of exhaustion to save a patient—may actually be detrimental to patient safety [27••]. Fatigue is not only permitted, it is encouraged. Numerous studies have found that sleep deprivation increases risks to patients [57] and to clinicians [58].

Improving Patient Safety in Cardiac Surgery: A Work Systems Approach

The enumeration above of the extensive hazards in cardiac surgery may create a sense of despondency—there appears to be so much to be done on so many levels. It is worth reiterating that the overall mortality among cardiac surgery patients is 2–3 % and has continually decreased over the past decades [1, 59]. Yet this cannot be enough—we must dedicate ourselves daily to identifying and eliminating preventable harm. The science of patient safety is relatively new, and much more of the available literature in cardiac surgery relates to the identification of hazards than to proven methods for improving patient safety. As eloquently summarized by Wiegmann et al. [60], patient safety programs must take a comprehensive work systems approach to correcting the factors that negatively impact the safety of our cardiac surgery patients.

The Systems Engineering Initiative for Patient Safety (SEIPS) model, first proposed by Carayon et al. [61] and recently revised [62], is based on the engineering discipline of human factors, which emphasizes the interaction between people and their environment (Fig. 1). As elaborated by the authors:

Fig. 1 SEIPS model of work systems and patient safety. Reprinted from [61], with permission from BMJ Publishing Group Ltd.

According to the work system model, a person (who can be a caregiver or the patient) performs a range of tasks using various tools and technologies. The performance of these tasks occurs within a certain physical environment and under specific organizational conditions.

Persons

Teamwork and Communication

This review will not go into depth on interventions designed to enhance teamwork and communication. These have been covered extensively elsewhere [27••] and are reviewed in depth by Weller and Boyd [26] in this publication. However, the data supporting teamwork training and those supporting the use of briefings and debriefings, checklists, and conflict resolution cannot be overemphasized [53, 63–67••, 68•–75••, 76–78].

Personal Readiness, Violations

As noted above, fatigue has been linked to needle sticks and motor vehicle accidents, and a comprehensive review by the Institute of Medicine led to a call for limiting resident call hours [79]. The Accreditation Council for Graduate Medical Education (ACGME) responded by instituting resident call hour limitations, but initially only for interns [80]. These issues have not been fully addressed, and there is some evidence that fatigue does not significantly impact patient outcomes in cardiac surgery [81–83]. Suggestions have been made to incorporate rest breaks into the workday, particularly when task stress and mental workload are high [60].

Violations of established policy are widespread within our medical system and, as with hand hygiene, have proven difficult to eradicate [84]. A bundled approach of checklists, process changes, and audits has been effective in reducing violations surrounding central lines as well as subsequent infection rates [10, 85]. Part of the solution to hand hygiene and other violations will be mutual accountability—holding each other to the highest standards in the interest of patient safety [86, 87]. Much of the work required to eliminate personal violations will need to be done by hospital and team leadership.

Physical Environment, Tools and Technologies

It is clear from the review of the hazards associated with tools and technologies as well as the physical environment that interventions to mitigate these hazards must include the system as a whole. The science of human factors engineering has begun investigating how these complicated technologies are incorporated into the OR [88], but this science is in its infancy.

We have much to learn about the ideal space and layout. While architects suggest that the ideal OR is >600 square feet, there are admittedly few data to indicate what is best. The guiding principles for optimal OR design, as summarized by Killen [89], are as follows: (1) standardize the location of the head of the table and the handedness of the room; (2) provide adequate space for staff to move around and for equipment; (3) maintain focus on the patient; (4) ensure that all staff have a line of sight to the patient at all times; (5) use technology to help workflow. Solving issues such as the “spaghetti syndrome” [44] of wires and cables and the increased microbial counts associated with frequent door openings [90] will take a coordinated and concerted effort. Reduction of distraction, noise, and door openings by controlling traffic patterns has been recommended by the Association of Perioperative Registered Nurses [91].

Reduction of noise by adopting a “sterile cockpit” has been suggested, but Wadhera et al. [41] have demonstrated that, unlike in aviation, each CVOR subteam has a different cognitive workload at different stages of the operation. They propose the adoption of structured conversations for various stages, but this concept has not been tested. Coordination of information technologies is critical and is receiving increasing attention [48, 92]. Data such as those provided by the RIPCHORD study group [30] provide a framework and methodology that could be used to understand traffic patterns in operating rooms and to determine the optimal positioning of booms and equipment. As each of these problems (noise, alarms, equipment position, and personnel traffic) is studied, high-fidelity simulation offers an opportunity to evaluate possible solutions.

Organizational Conditions (Safety Culture)

A culture of safety and trust is at the heart of improving patient safety and quality [93]. Leadership must focus on eliminating a culture of blame and shame and replacing it with a “just” culture, one that addresses the system issues responsible for human error but also requires personal accountability [94]. The implementation of quality improvement and patient safety programs must originate at the very top of the institutional hierarchy [95] and be visible to, and interact closely with, the frontline staff (adopt-a-unit) [42, 96, 97]. Examples of successful implementation of QI programs in cardiac surgery include Total Quality Management (physician-led implementation of checklists, mandated multidisciplinary conferences, and M&M meetings focused on correction, not blame) [98], the “Proven Care” program (consistent implementation of an evidence-based care package) [99], a process-oriented multidisciplinary approach to cardiac surgery (POMA) [100], and an operational excellence method derived from the Toyota Production System [101] (see Table 1).

Table 1 The ten-step process for promoting a culture of safety [61]

A culture of patient safety must address disruptive behavior, which is common in our healthcare systems, particularly in the hierarchical culture of the cardiac operating room; disruptive behavior directly impacts patient outcomes [56, 102, 103]. The Joint Commission has set clear standards to deal with disruptive behavior, beginning with the hospital leadership [104, 105]. Structured teamwork training can provide effective conflict resolution strategies and reduce disruptive behavior [106]. Few data exist at present to support implementation of one type of corrective program over another; additional research will be needed.

Most of the literature regarding “safety culture” is based at the institutional or individual unit/team level, and there have been few efforts to develop multidisciplinary teams to explore and test ways to optimize team performance. Our societies have worked together for years to develop multidisciplinary guidelines for technical aspects of care (e.g., the transfusion guidelines in cardiac surgery), but few formal mechanisms exist for multidisciplinary teams to develop and promote guidelines for teamwork and communication. The FOCUS (Flawless Operative Cardiac Unified Systems) patient safety initiative, launched and promoted by the Society of Cardiovascular Anesthesiologists Foundation, has brought together representatives of the American Society of ExtraCorporeal Technology (AmSECT), the Association of Perioperative Registered Nurses (AORN), the Human Factors and Ergonomics Society (HFES), and numerous cardiac surgeons to work together to better understand how and why errors occur, and to develop robust methods to optimize team performance. The initial FOCUS research effort was led by Dr. Pronovost at Johns Hopkins and has resulted in a deeper understanding of good practices and hazards in cardiac surgery [45, 46, 107–110]. The multidisciplinary FOCUS team, under the aegis of the American Heart Association, developed and published a scientific statement on patient safety in the cardiac operating room [27••]. Improving patient safety in the cardiac operating room depends on robust research by multidisciplinary teams to identify team vulnerabilities and to develop means to optimize team performance.

Proven Interventions

Although we have taken a work systems approach to improving patient care, we recognize that successful interventions will affect multiple aspects of the care process. Use of team training and implementation of briefings and debriefings will improve teamwork and communication, reduce equipment failures, and improve the safety climate. Addressing poor equipment design will directly impact patient safety, but can also eliminate alarm fatigue and reduce the cognitive demands on clinicians. Eliminating a culture of blame and establishing a just culture will permit identification of system processes to improve patient safety while also establishing accountability for personal violations.

Conclusions

Our current literature is rich in research into the potential hazards in cardiac surgery; fewer data are available on interventions that improve patient safety. Clearly, further research is needed, but there are key processes that can be implemented immediately to improve patient safety in the cardiac operating room:

  • Initiate and maintain team training sessions

  • Institute briefings and debriefings for every cardiac surgical case

  • Institute use of checklists and cognitive aids, particularly for crisis situations

  • Establish a routine to identify and eliminate system defects

  • Establish robust and effective leadership regarding safety culture

These approaches can significantly improve patient safety in the cardiac operating room: it is what we want and what our patients deserve.