Evaluation of contactless human–machine interface for robotic surgical training

  • Fabien Despinoy
  • Nabil Zemiti
  • Germain Forestier
  • Alonso Sánchez
  • Pierre Jannin
  • Philippe Poignet
Original Article

Abstract

Purpose

Teleoperated robotic systems are now routinely used for specific interventions. Since manipulating such systems requires dedicated training, the benefits of robotic training courses are already acknowledged by the community. However, robotic surgical simulators remain expensive and require a dedicated human–machine interface.

Methods

We present a low-cost contactless optical sensor, the Leap Motion, as a novel control device for manipulating the RAVEN-II robot. We compare peg manipulations during a training task performed with this contactless interface against the same task performed with a contact-based device, the electromechanical Sigma.7. We perform two complementary analyses to quantitatively assess the performance of each control method: a metric-based comparison and a novel unsupervised spatiotemporal trajectory clustering.
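
To make the setup concrete, the sketch below shows one plausible shape for such a contactless teleoperation loop: a clutched, scaled mapping from tracked palm displacements to instrument tip targets. It is a minimal sketch only; read_hand_pose, send_tool_target, and MOTION_SCALE are hypothetical placeholders standing in for the actual Leap Motion and RAVEN-II interfaces, which the abstract does not specify.

```python
import numpy as np

MOTION_SCALE = 0.4  # assumed master-to-slave scale factor (not from the paper)


def read_hand_pose():
    """Hypothetical tracker wrapper: palm position (mm) and grasp state."""
    raise NotImplementedError  # would query the optical sensor here


def send_tool_target(position_m, grasp):
    """Hypothetical robot wrapper: command the tool tip and gripper."""
    raise NotImplementedError  # would forward to the robot controller here


def teleoperation_step(ref_hand_mm, ref_tool_m):
    """One cycle of a relative (clutched) position mapping: the hand
    displacement since the clutch reference is scaled down so that small
    hand motions produce fine instrument motions."""
    palm_mm, grasp = read_hand_pose()
    delta_m = (np.asarray(palm_mm) - ref_hand_mm) / 1000.0  # mm -> m
    send_tool_target(ref_tool_m + MOTION_SCALE * delta_m, grasp)
```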

Results

We show that contactless control does not offer manipulability as good as that of the contact-based device. While part of the metric-based evaluation favours the mechanical control over the contactless one, the unsupervised spatiotemporal clustering of the surgical tool trajectories highlights a specific motion signature induced by each human–machine interface.
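
The abstract does not detail the clustering pipeline. As an illustration, the sketch below pairs a dependent multi-dimensional dynamic time warping (DTW) distance over tool-tip trajectories with average-linkage hierarchical clustering, one standard way to cluster spatiotemporal trajectories without labels; the distance, linkage, and cluster count are assumptions, not the authors' published settings.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform


def dtw_distance(a, b):
    """Dependent multi-dimensional DTW between two (T, 3) tool trajectories:
    Euclidean local cost with the classic cumulative-cost recursion."""
    n, m = len(a), len(b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]


def cluster_trajectories(trajectories, n_clusters=2):
    """Group trials by pairwise DTW distance; with two interfaces, the
    hypothesis is that trials separate along the interface that produced them."""
    k = len(trajectories)
    dist = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            dist[i, j] = dist[j, i] = dtw_distance(trajectories[i], trajectories[j])
    tree = linkage(squareform(dist), method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")
```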

Conclusions

Although the current implementation of contactless control does not outperform manipulation with a high-standard mechanical interface, we demonstrate that complete control of the surgical instruments through the optical sensor is feasible. The proposed method allows fine tracking of the trainee's hands in order to execute dexterous laparoscopic training gestures. This work is promising for the development of future human–machine interfaces dedicated to robotic surgical training systems.
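
Fine tracking from a markerless optical sensor implies smoothing noisy pose samples without adding perceptible lag. A standard tool for that trade-off is the 1€ filter of Casiez et al. (CHI 2012), sketched below for a single axis: its cutoff frequency adapts to hand speed, suppressing jitter at rest while keeping fast gestures responsive. Its use here, and these parameter values, are illustrative assumptions rather than settings reported in the abstract.

```python
import math


class OneEuroFilter:
    """Speed-adaptive low-pass filter for one signal axis: a low cutoff at
    rest removes jitter, and the cutoff rises with speed to limit lag."""

    def __init__(self, freq, min_cutoff=1.0, beta=0.01, d_cutoff=1.0):
        self.freq = freq              # sampling frequency (Hz)
        self.min_cutoff = min_cutoff  # cutoff at rest (Hz)
        self.beta = beta              # speed coefficient
        self.d_cutoff = d_cutoff      # cutoff for the derivative (Hz)
        self.x_prev = None
        self.dx_prev = 0.0

    def _alpha(self, cutoff):
        # Smoothing factor of a first-order low-pass at the given cutoff.
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.freq)

    def __call__(self, x):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # Estimate and filter the speed, then adapt the cutoff to it.
        dx = (x - self.x_prev) * self.freq
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```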

Keywords

Contactless teleoperation · Hand tracking · Human–machine interface · Robotic surgical training · Unsupervised trajectory analysis

Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© CARS 2017

Authors and Affiliations

  1. LTSI-INSERM, Université de Rennes 1, Rennes, France
  2. LIRMM-CNRS, Université de Montpellier, Montpellier, France
  3. MIPS, Université de Haute-Alsace, Mulhouse, France