Article

Surgical Endoscopy And Other Interventional Techniques, Volume 17, Issue 4, pp 580-585

Evaluating minimally invasive surgery training using low-cost mechanical simulations

  • G.L. Adrales
  • U.B. Chu
  • D.B. Witzke
  • M.B. Donnelly
  • D. Hoskins
  • M.J. Mastrangelo
  • A. Gandsas
  • A.E. Park

All authors affiliated with the Center for Minimally Invasive Surgery, University of Kentucky, Lexington, KY 40536-0298, USA.

Abstract

Background: The goal of this study was to develop, test, and validate the efficacy of inexpensive mechanical minimally invasive surgery (MIS) model simulations for training faculty, residents, and medical students. We sought to demonstrate that trained and experienced MIS surgeon raters could reliably rate the MIS skills acquired during these simulations.

Methods: We developed three renewable models that represent difficult or challenging segments of laparoscopic procedures: laparoscopic appendectomy (LA), laparoscopic cholecystectomy (LC), and laparoscopic inguinal hernia repair (LH). We videotaped 10 students, 12 surgical residents, and 1 surgeon receiving training on each of the models and again during their posttraining evaluation session. Five MIS surgeons then assessed the evaluation session performance. For each simulation, we asked them to rate overall competence (COM) and four skills: clinical judgment (respect for tissue) (CJ), dexterity (economy of movement) (DEX), serial/simultaneous complexity (SSC), and spatial orientation (SO). We computed intraclass correlation (ICC) coefficients to determine the extent of agreement (i.e., reliability) among ratings.

Results: We obtained ICC values of 0.74, 0.84, and 0.81 for COM ratings on LH, LC, and LA, respectively. We also obtained the following ICC values for the same three models: CJ, 0.75, 0.83, and 0.89; DEX, 0.88, 0.86, and 0.89; SSC, 0.82, 0.82, and 0.82; and SO, 0.86, 0.86, and 0.87, respectively.

Conclusions: We obtained very high reliability of performance ratings for competence and surgical skills using a mechanical simulator. Faculty evaluations of residents in the operating room are typically much less reliable. In contrast, when faculty members observe residents in a controlled, standardized environment, their ratings can be very reliable.
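The reliability analysis rests on the intraclass correlation coefficient (ICC), which measures agreement among multiple raters scoring the same subjects. The abstract does not state which ICC form the authors used; as an illustrative sketch only (not the authors' code), the following computes ICC(2,1) — a two-way random-effects, absolute-agreement, single-rater form — from an n-subjects by k-raters score matrix via the standard two-way ANOVA mean squares:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is an (n_subjects x n_raters) array of scores, e.g. the
    five surgeons' competence ratings of each trainee's performance.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means

    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Under this form, systematic leniency or severity of an individual rater lowers the coefficient, which is appropriate when, as here, the claim is that different surgeon raters assign interchangeable scores; a consistency-only form such as ICC(3,1) would ignore such rater offsets.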