Evaluation of contactless human–machine interface for robotic surgical training

  • Original Article
  • International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Teleoperated robotic systems are now routinely used for specific interventions. Because manipulating such systems requires dedicated training, the benefits of robotic training courses are widely acknowledged by the surgical community. However, robotic surgical simulators remain expensive and require a dedicated human–machine interface.

Methods

We present a low-cost contactless optical sensor, the Leap Motion, as a novel control device for manipulating the RAVEN-II robot. We compare peg manipulations during a training task performed with this contactless interface and with a contact-based device, the electro-mechanical Sigma.7. Two complementary analyses quantitatively assess the performance of each control method: a metric-based comparison and a novel unsupervised spatiotemporal trajectory clustering.
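
The abstract does not detail the clustering algorithm itself. As a rough, illustrative sketch only, the code below assumes a dynamic time warping (DTW) distance between 3-D tool trajectories followed by average-linkage hierarchical clustering, which is one plausible way to realise an unsupervised spatiotemporal trajectory clustering; the function names, parameters and synthetic data are assumptions, not the authors' implementation.

    # Illustrative only: one plausible realisation of unsupervised spatiotemporal
    # trajectory clustering, assuming a DTW distance between 3-D tool trajectories
    # and average-linkage agglomerative clustering. Not the authors' implementation.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def dtw_distance(a, b):
        """Dynamic time warping distance between two trajectories of shape (T, 3)."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])  # local Euclidean cost
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    def cluster_trajectories(trajectories, n_clusters=2):
        """Group tool trajectories by motion signature using pairwise DTW distances."""
        k = len(trajectories)
        dist = np.zeros((k, k))
        for i in range(k):
            for j in range(i + 1, k):
                dist[i, j] = dist[j, i] = dtw_distance(trajectories[i], trajectories[j])
        # Hierarchical clustering on the condensed pairwise-distance matrix.
        z = linkage(squareform(dist), method="average")
        return fcluster(z, t=n_clusters, criterion="maxclust")

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic stand-ins for recorded tool paths: smooth vs. jittery random walks.
        smooth = [np.cumsum(rng.normal(0, 0.01, (150, 3)), axis=0) for _ in range(4)]
        jerky = [np.cumsum(rng.normal(0, 0.05, (150, 3)), axis=0) for _ in range(4)]
        print(cluster_trajectories(smooth + jerky, n_clusters=2))  # one label per trajectory

A DTW-based distance is a natural choice for this kind of comparison because trajectories recorded with different interfaces differ in duration and timing, which a sample-by-sample Euclidean comparison could not accommodate.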

Results

We show that contactless control does not yet offer the same manipulability as the contact-based interface. Whereas part of the metric-based evaluation rates the mechanical control above the contactless one, the unsupervised spatiotemporal clustering of the surgical tool trajectories reveals a motion signature specific to each human–machine interface.

Conclusions

Although the current implementation of contactless control does not outperform manipulation with a high-standard mechanical interface, we demonstrate that complete control of the surgical instruments with the optical sensor is feasible. The proposed method tracks the trainee’s hands finely enough to execute dexterous laparoscopic training gestures. This work is promising for the development of future human–machine interfaces dedicated to robotic surgical training systems.
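
To make the control principle concrete, the following minimal sketch maps tracked hand positions to incremental surgical-tool commands with motion scaling, a clutch and simple smoothing. The constants, function names and simulated hand samples are hypothetical placeholders under assumed parameters; they do not reproduce the Leap Motion / RAVEN-II integration evaluated in the paper.

    # Hypothetical sketch only: hand-tracking samples mapped to incremental tool
    # commands with motion scaling, a clutch and simple exponential smoothing.
    # The hand samples below are simulated, not read from a real sensor.
    import numpy as np

    MOTION_SCALE = 0.3   # assumed hand-to-tool motion down-scaling
    ALPHA = 0.2          # assumed smoothing factor for noisy optical tracking

    def teleop_step(tool_pos, hand_pos, prev_hand_pos, smoothed_delta, clutch):
        """Map one hand-tracking sample to an incremental tool motion."""
        if not clutch or prev_hand_pos is None:
            # Clutch released (or first sample): freeze the tool, re-reference the hand.
            return tool_pos, hand_pos, np.zeros(3)
        raw_delta = (hand_pos - prev_hand_pos) * MOTION_SCALE
        smoothed_delta = ALPHA * raw_delta + (1.0 - ALPHA) * smoothed_delta
        return tool_pos + smoothed_delta, hand_pos, smoothed_delta

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        tool, prev_hand, sdelta = np.zeros(3), None, np.zeros(3)
        for t in range(100):  # simulated noisy hand motion along +x
            hand = np.array([0.001 * t, 0.0, 0.0]) + rng.normal(0.0, 1e-4, 3)
            tool, prev_hand, sdelta = teleop_step(tool, hand, prev_hand, sdelta, clutch=True)
        print("final tool position:", tool)

The clutch and the motion down-scaling mirror common teleoperation practice: they decouple the limited, jittery hand-tracking workspace from the finer instrument workspace.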


Author information

Corresponding author

Correspondence to Fabien Despinoy.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

This work was supported in part by the French ANR within the Investissements d’Avenir Program (Labex CAMI, ANR-11-LABX0004); by the Equipex ROBOTEX Program (ANR-10-EQPX-44-01); and by the Région Languedoc-Roussillon.


Cite this article

Despinoy, F., Zemiti, N., Forestier, G. et al. Evaluation of contactless human–machine interface for robotic surgical training. Int J CARS 13, 13–24 (2018). https://doi.org/10.1007/s11548-017-1666-6
