Abstract
Purpose
We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent.
Methods
The system consists of two endoscopes: a main endoscope that observes the surgical environment, and a supporting endoscope that captures the region hidden from view by surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden regions, and superimposed onto the main endoscope image so that the surgical instruments appear transparent. A prototype device was benchmarked for processing time and superimposition rendering error. It was then evaluated in a training environment with 22 participants performing a backhand needle driving task, with needle exit point error as the criterion. Lastly, we conducted an in vivo study.
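The transform-mask-composite pipeline described above can be sketched in a few lines. This is a minimal illustration under assumed simplifications (a single planar homography relating the two views, nearest-neighbor resampling, and a precomputed binary instrument mask), not the authors' implementation; all function and variable names are hypothetical.

```python
import numpy as np

def warp_homography(src, H, out_shape):
    """Inverse-warp `src` into the destination frame by homography H
    (H maps source coordinates to destination coordinates)."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    dst_pts = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).T
    src_pts = np.linalg.inv(H) @ dst_pts          # back-project each output pixel
    src_pts = src_pts / src_pts[2]                # perspective divide
    sx = np.round(src_pts[0]).astype(int).reshape(h, w)
    sy = np.round(src_pts[1]).astype(int).reshape(h, w)
    valid = (sx >= 0) & (sx < src.shape[1]) & (sy >= 0) & (sy < src.shape[0])
    out = np.zeros((h, w) + src.shape[2:], src.dtype)
    out[valid] = src[sy[valid], sx[valid]]        # nearest-neighbor sampling
    return out

def make_transparent(main_view, support_view, H, instrument_mask):
    """Replace the masked instrument region of the main view with the
    supporting view warped into the main view's frame."""
    warped = warp_homography(support_view, H, main_view.shape[:2])
    out = main_view.copy()
    out[instrument_mask] = warped[instrument_mask]
    return out

# Synthetic demo: grey main view, brighter supporting view, a rectangular
# "instrument" mask, and (for simplicity) an identity homography.
main = np.full((240, 320, 3), 80, np.uint8)
support = np.full((240, 320, 3), 200, np.uint8)
mask = np.zeros((240, 320), bool)
mask[60:180, 100:220] = True
result = make_transparent(main, support, np.eye(3), mask)
```

In the actual system the homography would be estimated from camera calibration and the relative pose of the two endoscopes, and the mask from instrument segmentation; both are given here as inputs.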
Results
In the benchmark, the mean processing time was 62.4 ms, below the latency reported as acceptable in remote surgery. The mean superimposition error was 1.4 mm. In the training environment, needle exit point error with the system decreased significantly for experts compared with the condition without the system; the change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis.
Conclusion
The benchmark suggests that the implemented system performed acceptably, and the evaluation in the training environment demonstrated improved surgical task outcomes for expert surgeons. We will conduct a more comprehensive in vivo study in the future.
Acknowledgments
This work was supported in part by the Grants for Excellent Graduate Schools program of the Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT), and by a Grant-in-Aid for Scientific Research from MEXT (No. 25220005).
Ethics declarations
Conflicts of interest
The authors declare that they have no conflict of interest.
Ethical standard
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. The in vivo experiment was conducted with the approval of the Ethics Committee of Kyushu University Hospital, Fukuoka, Japan. All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.
Informed consent
None.
Cite this article
Koreeda, Y., Kobayashi, Y., Ieiri, S. et al. Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions. Int J CARS 11, 1927–1936 (2016). https://doi.org/10.1007/s11548-016-1384-5