Improving Clinical Performance by Analyzing Surgical Skills and Operative Errors

  • Katherine L. Forsyth
  • Anne-Lise D’Angelo
  • Elaine M. Cohen
  • Carla M. Pugh
Chapter

Abstract

Operative errors play a major role in the safety and quality of care for surgical patients. This chapter examines operative errors in relation to skills performance, first by reviewing previously developed and commonly used surgical skills assessments. Observation-based methods such as the Objective Structured Assessment of Technical Skill (OSATS), performance checklists, and global rating scales are discussed alongside technology-based performance measures, including motion analysis and attention and physiologic stress monitoring. Second, the chapter reviews error analysis methods used in aviation, mining, and anesthesia, together with current approaches to operative error analysis that employ closed malpractice claims, video-recorded surgical procedures, and simulation. The chapter concludes with a discussion of future directions for skill performance and assessment and the role the surgical community must play in achieving optimal levels of safety and quality in the operating room.
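
To make the technology-based measures mentioned above more concrete, the following minimal sketch (not taken from the chapter) illustrates how motion-analysis metrics such as path length, idle time, and working volume might be computed from tracked instrument-tip coordinates. The function name, the sampling setup, and the idle-speed threshold are illustrative assumptions, not the instruments or parameters used in the cited studies.

    # Illustrative sketch only: common motion-analysis metrics computed from
    # hypothetical tool-tip coordinates sampled at a fixed rate. The idle-speed
    # threshold and units are assumptions chosen for demonstration.
    import numpy as np

    def motion_metrics(positions: np.ndarray, sample_rate_hz: float,
                       idle_speed_mm_s: float = 5.0) -> dict:
        """positions: (N, 3) array of tool-tip coordinates in millimetres."""
        dt = 1.0 / sample_rate_hz
        steps = np.diff(positions, axis=0)               # per-sample displacement
        step_lengths = np.linalg.norm(steps, axis=1)     # mm moved each sample
        speeds = step_lengths / dt                       # instantaneous speed, mm/s

        path_length = step_lengths.sum()                             # total distance travelled
        idle_time = np.count_nonzero(speeds < idle_speed_mm_s) * dt  # time spent nearly still
        extent = positions.max(axis=0) - positions.min(axis=0)
        working_volume = float(np.prod(extent))                      # bounding-box volume, mm^3

        return {"path_length_mm": float(path_length),
                "idle_time_s": float(idle_time),
                "working_volume_mm3": working_volume}

    # Example with synthetic data: 60 s of tracking at 30 Hz.
    rng = np.random.default_rng(0)
    track = np.cumsum(rng.normal(scale=0.5, size=(1800, 3)), axis=0)
    print(motion_metrics(track, sample_rate_hz=30.0))

In practice, such thresholds and summary statistics would need to be calibrated against expert benchmarks before being used to distinguish skill levels.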

Keywords

Operative error · Surgical performance · Error analysis · Operative assessment · Surgical skills assessment · Patient safety

Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Katherine L. Forsyth (1)
  • Anne-Lise D’Angelo (2)
  • Elaine M. Cohen (3)
  • Carla M. Pugh (4)

  1. Department of Industrial and Systems Engineering, University of Wisconsin School of Medicine and Public Health, Madison, USA
  2. Department of Surgery, University of Wisconsin Hospitals and Clinics, Madison, USA
  3. Department of Surgery, University of Wisconsin School of Medicine and Public Health, Madison, USA
  4. Department of Surgery, University of Wisconsin–Madison, Madison, USA