A systematic review of performance assessment tools for laparoscopic cholecystectomy

  • Review
  • Published in Surgical Endoscopy

Abstract

Background

Multiple tools are available to assess clinical performance of laparoscopic cholecystectomy (LC), but there are no guidelines on how best to implement and interpret them in educational settings. The purpose of this systematic review was to identify and critically appraise LC assessment tools and their measurement properties, in order to make recommendations for their implementation in surgical training.

Methods

A systematic search (1989–2013) was conducted in MEDLINE, Embase, Scopus, Cochrane, and grey literature sources. Evidence for validity (content, response process, internal structure, relations to other variables, and consequences) and the conditions in which the evidence was obtained were evaluated.
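Where studies report evidence along these five sources, a simple per-tool tally makes the appraisal explicit. Below is a minimal sketch in Python; the class, field names, and example values are illustrative assumptions, not the authors' actual data-extraction form.

```python
# A minimal sketch (assumed structure, not the review's extraction form) for
# recording which of the five sources of validity evidence are reported for a tool.
from dataclasses import dataclass

@dataclass
class ValidityEvidence:
    tool: str
    content: bool = False                        # e.g. expert-based item development
    response_process: bool = False               # e.g. rater training, scoring rules
    internal_structure: bool = False             # e.g. inter-rater reliability, G coefficient
    relations_to_other_variables: bool = False   # e.g. scores versus level of experience
    consequences: bool = False                   # e.g. pass/fail or remediation decisions

    def sources_reported(self) -> int:
        """Count how many of the five evidence sources have any reported support."""
        return sum([self.content, self.response_process, self.internal_structure,
                    self.relations_to_other_variables, self.consequences])

# Hypothetical entry: a tool with inter-rater reliability data and a demonstrated
# relationship between scores and surgical experience, but nothing else reported.
example = ValidityEvidence("Example LC rating tool",
                           internal_structure=True,
                           relations_to_other_variables=True)
print(example.tool, "reports", example.sources_reported(), "of 5 evidence sources")
```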

Results

A total of 54 articles were included in the qualitative synthesis. Fifteen technical skills and two non-technical skills assessment tools were identified. The 17 tools were applied to video-recorded procedures (nine tools, 53 %), direct observation in the operating room (five tools, 29 %), or both (three tools, 18 %). Inter-rater reliability was reported for 14 tools (82 %), and a Generalizability Theory coefficient for one. Nine tools (53 %) had evidence for validity based on clinical experience, and 11 (65 %) compared scores with other assessments. Consequences of scores, educational impact, applications to residency training, and how raters were trained were not clearly reported. No study mentioned cost.
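To illustrate the kind of statistic behind the inter-rater reliability figures above, the sketch below computes percent agreement and Cohen's kappa for two hypothetical raters scoring ten performances on a 5-point scale. The scores are invented, and the review does not state which coefficient each study used.

```python
# Illustrative only: inter-rater agreement for two hypothetical raters.
from collections import Counter

rater_a = [3, 4, 4, 2, 5, 3, 4, 1, 2, 5]   # hypothetical 5-point global ratings
rater_b = [3, 4, 3, 2, 5, 3, 4, 2, 2, 5]
n = len(rater_a)

# Observed agreement: proportion of performances scored identically.
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement, from each rater's marginal distribution of scores.
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))

# Cohen's kappa corrects observed agreement for agreement expected by chance.
kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"observed agreement = {p_observed:.2f}, kappa = {kappa:.2f}")
```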

Conclusions

The most commonly reported sources of validity evidence were inter-rater reliability and relationships to other variables. Consequences of assessments and rater training were not clearly reported. These findings and the available validity evidence should be taken into consideration when deciding how to select and implement a tool for assessing performance of LC, and especially how to interpret its results.

Disclosures

The Steinberg-Bernstein Centre for Minimally Invasive Surgery and Innovation received an unrestricted educational grant from Covidien Canada. Y. Watanabe, E. Bilgic, E. Lebedeva, K.M. McKendy, L.S. Feldman, G.M. Fried, and M.C. Vassiliou have no relevant conflicts of interest or financial ties to disclose.

Author information

Corresponding author

Correspondence to Yusuke Watanabe.

Appendix: MEDLINE search strategy

1. exp cholecystectomy/or cholecystectomy, laparoscopic/
2. exp professional competence/or exp clinical competence/
3. (Task Performance and Analysis).mp. [mp = title, abstract, original title, name of substance word, subject heading word, keyword heading word, protocol supplementary concept, rare disease supplementary concept, unique identifier]
4. exp study characteristics/or exp evaluation studies/
5. (Internship and Residency).mp. [mp = title, abstract, original title, name of substance word, subject heading word, keyword heading word, protocol supplementary concept, rare disease supplementary concept, unique identifier]
6. 1 and 2 and 3
7. exp Curriculum/ed, mt, sn [Education, Methods, Statistics & Numerical Data]
8. exp validation studies/
9. exp Laparoscopy/ed, mt, st [Education, Methods, Standards]
10. 1 and 2 and 3 and 8
11. 1 and 2 and 8
12. limit 1 to systematic reviews
13. 2 and 12
14. 4 and 8
15. 1 and 2 and 14
16. 3 and 15
17. exp self concept/or self-assessment/
18. exp methods/or exp observation/or exp research design/
19. (decision making and clinical competence$ and skill$).ab.
20. 1 and 2 and 18 and 19
21. Educational Measurement/and "Internship and Residency"/
22. 1 and 2 and 21
23. 1 and 9 and 17
24. 1 and 2 and 5
25. evaluation studies/or evaluation studies as topic/or program evaluation/or validation studies as topic/or Intervention Studies/or (effectiveness or (pre- adj5 post-)).ti,ab. or (program* adj3 evaluat*).ti,ab. or intervention*.ti,ab.
26. 1 and 2 and 25
27. exp Clinical Trial/or double-blind method/or (clinical trial* or randomized controlled trial or multicenter study).pt. or exp Clinical Trials as Topic/or ((randomi?ed adj7 trial*) or (controlled adj3 trial*) or (clinical adj2 trial*) or ((single or doubl* or tripl* or treb*) and (blind* or mask*))).ti,ab.
28. 1 and 2 and 27
29. (exp methods/or exp observation/or exp research design/) and #1.mp. and #9.mp. [mp = title, abstract, original title, name of substance word, subject heading word, keyword heading word, protocol supplementary concept, rare disease supplementary concept, unique identifier]
30. 1 and 17 and 29
31. 17 not patients.mp. [mp = title, abstract, original title, name of substance word, subject heading word, keyword heading word, protocol supplementary concept, rare disease supplementary concept, unique identifier]
32. 1 and 9 and 31
33. (Task Performance and Analysis).tw
34. 1 and 2 and 33
35. 1 and 2 and 4 and 33
36. 1 and 2 and 4 and 5
37. (self concept or self-assessment).tw
38. ((self concept or self-assessment) not patients).tw
39. 1 and 2 and 4 and 5 and 38
40. 1 and 2 and 38
41. 1 and 33 and 38
42. 1 and 2 and 5 and 7
43. curriculum.tw
44. 1 and 5 and 43
45. validation studies.tw
46. 1 and 2 and 29 and 45
47. 1 and 5 and 18
48. 1 and 5 and 25
49. 1 and 5 and 27
50. 1 and 17 and 25
51. 1 and 21 and 25
52. 1 and 5 and 9 and 21 and 25
53. decision$making.tw
54. (Educational adj2 assessment).tw
55. (General surgery adj2 training).mp
56. (objective adj2 assessment).mp
57. Non$technical skill$.mp
58. ((performance adj2 assessment) or (performance adj2 evaluation)).tw
59. (surgical adj2 assessment tool$).mp
60. (surgical adj2 skill$).mp
61. Technical error$.tw
62. (Resident adj2 evaluation).tw
63. (simulator adj2 training).tw
64. ((mental adj2 training) and (mental adj2 practice)).tw
65. (motor adj2 skill$).tw
66. (Intraoperative adj2 performance).tw
67. human error$.tw
68. direct observation.tw
69. (acquisition adj2 skil$).tw
70. or/53-69
71. 1 and 70
72. feedback.tw
73. expert testimony.tw
74. Confidence Intervals.tw
75. video recording.tw
76. operating rooms.tw
77. simulation.tw
78. or/72-77
79. 1 and 70 and 78
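Each numbered line above is a retrieval set, and lines such as "or/53-69" or "1 and 70" combine earlier sets with Boolean operators. The sketch below mirrors that logic with Python sets over hypothetical record IDs; the IDs and the toy contents of lines 1, 53, and 54 are assumptions for illustration only.

```python
# Illustrative only: how numbered Ovid search lines combine as set operations.
line = {
    1:  {"r01", "r02", "r03", "r04", "r05"},   # cholecystectomy records (toy data)
    53: {"r02", "r09"},                        # decision$making.tw (toy data)
    54: {"r03", "r07"},                        # (Educational adj2 assessment).tw (toy data)
    # lines 55-69 would be further keyword sets
}

# Line 70: "or/53-69" is the union of sets 53-69 (only those defined in this toy example).
line[70] = set().union(*(line[k] for k in range(53, 70) if k in line))

# Line 71: "1 and 70" is the intersection of set 1 with set 70.
line[71] = line[1] & line[70]

print("line 70:", sorted(line[70]))   # ['r02', 'r03', 'r07', 'r09']
print("line 71:", sorted(line[71]))   # ['r02', 'r03']
```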


Cite this article

Watanabe, Y., Bilgic, E., Lebedeva, E. et al. A systematic review of performance assessment tools for laparoscopic cholecystectomy. Surg Endosc 30, 832–844 (2016). https://doi.org/10.1007/s00464-015-4285-8
