
Medical & Biological Engineering & Computing, Volume 57, Issue 3, pp 601–614

Head-mounted interface for intuitive vision control and continuous surgical operation in a surgical robot system

  • Nhayoung Hong
  • Myungjoon Kim
  • Chiwon Lee
  • Sungwan Kim (corresponding author)
Original Article

Abstract

Although robot-assisted surgeries offer various advantages, the surgical workflow is interrupted whenever control is switched between the patient-side manipulators and the endoscopic robot arm, and reducing these interruptions could further improve efficiency. Therefore, in this study, a head-mounted master interface (HMI) that can be integrated into an existing surgical robot system and enables a continuous surgical operation flow through head motion is proposed. The proposed system comprises the HMI, a four-degrees-of-freedom endoscope control system, a simple three-dimensional endoscope, and a da Vinci Research Kit. Eight volunteers performed seven head movements, and the data collected from the HMI were used for support vector machine (SVM) classification. Ten-fold cross-validation was performed to optimize the classifier parameters. Based on the cross-validation results, an SVM classifier with a Gaussian kernel (σ = 0.85) was chosen, which achieved an accuracy of 92.28%. An endoscope control algorithm was then developed using the SVM classification result. A peg transfer task was conducted to assess the effect of the HMI on task completion time, and a paired t test showed that the completion time was reduced. The time delay of the system was measured to be 0.72 s.
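
To make the classification and evaluation pipeline summarized above concrete, the following is a minimal Python sketch, assuming a scikit-learn/SciPy toolchain: an RBF (Gaussian) kernel SVM with σ = 0.85 validated by ten-fold cross-validation, followed by a paired t test on task completion times. The feature matrix, labels, and timing arrays are placeholders, and the original study may have used different software and feature definitions.

```python
# Hypothetical sketch: Gaussian-kernel SVM with ten-fold cross-validation
# and a paired t test on task completion times. Data and variable names
# are placeholders; the paper's actual toolchain and features may differ.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from scipy.stats import ttest_rel

# Placeholder feature matrix (head-motion features from the HMI) and labels
# for the seven head movements collected from eight volunteers.
rng = np.random.default_rng(0)
X = rng.normal(size=(8 * 7 * 20, 6))      # assumed: 20 trials x 6 features
y = rng.integers(0, 7, size=X.shape[0])   # assumed: 7 movement classes

# Gaussian (RBF) kernel with sigma = 0.85; scikit-learn parameterizes the
# kernel as exp(-gamma * ||x - x'||^2), so gamma = 1 / (2 * sigma^2).
sigma = 0.85
clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * sigma ** 2))

# Ten-fold cross-validation accuracy, analogous to the reported 92.28%.
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.4f}")

# Paired t test on peg-transfer completion times (s) with and without the
# HMI; the values below are illustrative placeholders only.
time_without_hmi = np.array([112.0, 98.5, 105.3, 120.1, 99.8, 110.2, 103.7, 115.4])
time_with_hmi = np.array([101.2, 93.0, 99.8, 110.5, 95.1, 104.3, 97.6, 108.0])
t_stat, p_value = ttest_rel(time_without_hmi, time_with_hmi)
print(f"paired t test: t = {t_stat:.3f}, p = {p_value:.4f}")
```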

Graphical abstract

A head-mounted master interface (HMI), which can be integrated into an existing surgical robot system, was developed to allow a continuous surgical operation flow. The surgeon's head motion is detected by the proposed HMI and classified using a support vector machine to manipulate the endoscopic robot arm. A classification accuracy of 92.28% was achieved.
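
As a rough illustration of this classify-then-command flow, the sketch below maps head-movement classes to small four-degrees-of-freedom endoscope motion increments. The class names, the command tuple, and the step size are assumptions made for illustration; they are not the paper's actual control algorithm or interface.

```python
# Hypothetical mapping from classified head movements to 4-DOF endoscope
# motion increments (pan, tilt, insertion, roll). Class names and step size
# are illustrative assumptions only.
ENDOSCOPE_COMMANDS = {
    "turn_left":  (+1.0, 0.0, 0.0, 0.0),
    "turn_right": (-1.0, 0.0, 0.0, 0.0),
    "tilt_up":    (0.0, +1.0, 0.0, 0.0),
    "tilt_down":  (0.0, -1.0, 0.0, 0.0),
    "push_in":    (0.0, 0.0, +1.0, 0.0),
    "pull_out":   (0.0, 0.0, -1.0, 0.0),
    "neutral":    (0.0, 0.0, 0.0, 0.0),   # no endoscope motion
}

def endoscope_increment(predicted_class: str, step: float = 0.5):
    """Convert an SVM-predicted head movement into a scaled motion increment."""
    direction = ENDOSCOPE_COMMANDS.get(predicted_class, (0.0, 0.0, 0.0, 0.0))
    return tuple(step * axis for axis in direction)

# Example: a "tilt_up" classification yields (0.0, 0.5, 0.0, 0.0).
print(endoscope_increment("tilt_up"))
```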

Keywords

Minimally invasive surgical procedure · Endoscopic control interface · da Vinci Research Kit · Machine learning · Head movement

Abbreviations

HMI: Head-mounted master interface
dVRK: da Vinci Research Kit
SVM: Support vector machine
MIS: Minimally invasive surgery
DOFs: Degrees of freedom
MTM: Master tool manipulator
PSM: Patient-side manipulator
NMI: Novel master interface
iNMI: Improved novel master interface
ECS: Endoscope control system
HOTAS: Hands-on-throttle-and-stick
3D: Three-dimensional
CMOS: Complementary metal-oxide-semiconductor
DAQ: Data acquisition device
FLS: Fundamentals of laparoscopic surgery

Notes

Acknowledgments

The da Vinci Research Kit was donated by Intuitive Surgical, Inc. (Sunnyvale, CA, USA) in 2014.

Funding information

This work was supported by the National Research Foundation of Korea grant funded by the Korea Government (MSIP) (Grant No. 2017R1A2B2006163).

Copyright information

© International Federation for Medical and Biological Engineering 2018

Authors and Affiliations

  • Nhayoung Hong (1)
  • Myungjoon Kim (2)
  • Chiwon Lee (2)
  • Sungwan Kim (3, 4), corresponding author
  1. Interdisciplinary Program for Bioengineering, Graduate School, Seoul National University, Seoul, Republic of Korea
  2. Korea Electrotechnology Research Institute, Ansan, Republic of Korea
  3. Institute of Medical and Biological Engineering, Seoul National University, Seoul, Republic of Korea
  4. Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, Republic of Korea
