Concepts and Models of Safety, Resilience, and Reliability


Abstract

Approaches to safety have often treated the “human” factor in an organisation or operation as a major contributor to unwanted outcomes. Viewing the human as a problem leads to responses that try to exert more control over people. While such responses may make intuitive sense, research suggests that this view may not be valid: an enormous number of other factors, many of them beyond the control of the human at the sharp end, lie behind both success and the occasional failure. This chapter begins with a review of normal accident theory, before drawing on complexity science to examine the problematic nature of the notion of “human error”. It then discusses one of the prominent problems associated with complexity: safety drift. Lastly, it considers various proposed solutions (e.g. resilience engineering) by which a healthcare system can manage complexity and perhaps reduce the potential for harm to patients.

Keywords

Normal accident theory · Complexity · Drift · Resilience · High reliability · Safety


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

School of Humanities, Languages and Social Science, Safety Science Innovation Lab, Griffith University, Brisbane, Australia
