Surgical Endoscopy

Volume 27, Issue 5, pp 1468–1477

Is motion analysis a valid tool for assessing laparoscopic skill?

  • John D. Mason
  • James Ansell
  • Neil Warren
  • Jared Torkington



The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment.


Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for the assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies, and the evidence was graded according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and grades of recommendation.


Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25.0 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.3 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.3 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity.


Evidence exists to validate motion analysis for use in laparoscopic skills assessment. The validated parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.


Keywords: Quality control · Surgical technical education



J. Ansell is currently funded by a Royal College of Surgeons England research fellowship grant.


Authors John D. Mason, James Ansell, Neil Warren and Jared Torkington have no conflicts of interest or financial ties to disclose.



Copyright information

© Springer Science+Business Media New York 2012

Authors and Affiliations

  • John D. Mason (1)
  • James Ansell (2)
  • Neil Warren (3)
  • Jared Torkington (4)

  1. Cardiff University School of Medicine, Welsh Institute of Minimal Access Therapy (WIMAT), Cardiff Medicentre, Cardiff, UK
  2. Royal College of Surgeons of England Research Fellow, Welsh Institute of Minimal Access Therapy (WIMAT), Cardiff Medicentre, Cardiff, UK
  3. Postgraduate Deanery Wales, Welsh Institute of Minimal Access Therapy (WIMAT), Cardiff Medicentre, Cardiff, UK
  4. Department of Colorectal Surgery, University Hospital of Wales, Cardiff, UK
