Evaluation of contactless human–machine interface for robotic surgical training
Teleoperated robotic systems are now routinely used for specific interventions. Because manipulating such systems requires dedicated training, the benefits of robotic training courses are widely acknowledged by the community. However, robotic surgical simulators remain expensive and require a dedicated human–machine interface.
We present a low-cost contactless optical sensor, the Leap Motion, as a novel control device to manipulate the RAVEN-II robot. We compare peg manipulations during a training task with a contact-based device, the electro-mechanical Sigma.7. We perform two complementary analyses to quantitatively assess the performance of each control method: a metric-based comparison and a novel unsupervised spatiotemporal trajectory clustering.
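The unsupervised spatiotemporal trajectory clustering mentioned above can be illustrated with a minimal sketch: pairwise dynamic time warping (DTW) distances between tool-tip trajectories, followed by a simple threshold-based single-linkage grouping. This is an illustrative reconstruction, not the authors' implementation; the function names and the threshold are assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """DTW distance between two trajectories, given as arrays of
    shape (T, 3) of tool-tip positions sampled over time."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # classic DTW recursion: cheapest of match / insert / delete
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def cluster_trajectories(trajs, threshold):
    """Naive single-linkage grouping on the DTW distance matrix:
    trajectories closer than `threshold` share a cluster label."""
    n = len(trajs)
    labels = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if dtw_distance(trajs[i], trajs[j]) < threshold:
                old, new = labels[j], labels[i]
                labels = [new if lab == old else lab for lab in labels]
    return labels
```

With such a grouping, trajectories produced under the same interface would be expected to fall into the same cluster if the interface indeed leaves a motion signature.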
We show that contactless control does not offer manipulability as good as that of the contact-based device. While part of the metric-based evaluation rates the mechanical control above the contactless one, the unsupervised spatiotemporal clustering of the surgical tool motions highlights specific signatures induced by each human–machine interface.
Even if the current implementation of contactless control does not match manipulation with a high-standard mechanical interface, we demonstrate that complete control of the surgical instruments using the optical sensor is feasible. The proposed method finely tracks the trainee's hands so that dexterous laparoscopic training gestures can be executed. This work is promising for the development of future human–machine interfaces dedicated to robotic surgical training systems.
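The contactless control loop described above amounts to mapping tracked palm positions to tool-tip setpoints. The sketch below shows one plausible form of such a mapping, with motion down-scaling for precision and an exponential low-pass filter to suppress optical jitter; the class, its parameters, and the scale factor are illustrative assumptions, not the paper's actual controller or the Leap Motion SDK.

```python
class HandToToolMapper:
    """Hypothetical mapping from tracked palm positions (mm, sensor
    frame) to surgical tool-tip setpoints (mm, robot frame). Assumes
    an external tracker supplies palm positions each frame."""

    def __init__(self, motion_scale=0.3, smoothing=0.5):
        self.scale = motion_scale  # down-scale hand motion for precision
        self.alpha = smoothing     # exponential low-pass coefficient in (0, 1]
        self.prev_hand = None
        self.filtered = None
        self.tool = [0.0, 0.0, 0.0]

    def update(self, palm_xyz):
        # Low-pass filter the raw measurement to reduce tracking jitter
        if self.filtered is None:
            self.filtered = list(palm_xyz)
        else:
            self.filtered = [self.alpha * p + (1 - self.alpha) * f
                             for p, f in zip(palm_xyz, self.filtered)]
        # Incremental mapping: apply the scaled hand displacement,
        # so the tool follows relative motion rather than absolute pose
        if self.prev_hand is not None:
            self.tool = [t + self.scale * (c - p)
                         for t, c, p in zip(self.tool, self.filtered,
                                            self.prev_hand)]
        self.prev_hand = list(self.filtered)
        return self.tool
```

An incremental (relative) mapping of this kind also makes clutching natural: releasing and re-engaging tracking repositions the hand without moving the instrument.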
Keywords: Contactless teleoperation · Hand tracking · Human–machine interface · Robotic surgical training · Unsupervised trajectory analysis
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed consent was obtained from all individual participants included in the study.