Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent.

Methods

The system consists of two endoscopes: a main endoscope that observes the surgical environment, and a supporting endoscope that captures the region hidden from view by the surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden region, and superimposed onto the main endoscope image so that the surgical instruments appear transparent. A prototype device was benchmarked for processing time and superimposition error. It was then evaluated in a training environment in which 22 participants performed a backhand needle-driving task, with needle exit point error as the criterion. Finally, we conducted an in vivo study.
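The abstract describes this pipeline only at a high level, so the following Python/OpenCV sketch is illustrative rather than the authors' implementation. It assumes a single planar homography H between the two endoscope views (a simplification of whatever calibrated inter-endoscope transform the prototype uses), a precomputed instrument mask, and hypothetical function names such as segment_instrument and virtually_transparent_overlay.

```python
import cv2
import numpy as np


def segment_instrument(main_img, lower_hsv, upper_hsv):
    """Crude colour-threshold segmentation of the instrument in the main view.

    A placeholder for whichever segmentation method the real prototype uses;
    lower_hsv/upper_hsv are assumed colour bounds for the instrument.
    """
    hsv = cv2.cvtColor(main_img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    # Close small holes so the overlay covers the instrument completely.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)


def virtually_transparent_overlay(main_img, support_img, H, instrument_mask):
    """Make the instrument appear transparent in the main endoscope image.

    main_img        -- current frame from the main endoscope (BGR)
    support_img     -- current frame from the supporting endoscope (BGR)
    H               -- assumed 3x3 homography mapping supporting-view pixels
                       onto the main-view image plane
    instrument_mask -- uint8 mask, non-zero where the instrument occludes the
                       scene in the main view
    """
    h, w = main_img.shape[:2]

    # 1. Transform the supporting view so it simulates the main endoscope's viewpoint.
    warped = cv2.warpPerspective(support_img, H, (w, h))

    # 2. Segment the transformed view to the shape of the hidden region and
    #    superimpose it onto the main image, so the instrument appears transparent.
    out = main_img.copy()
    out[instrument_mask > 0] = warped[instrument_mask > 0]
    return out
```

In the actual system the per-frame view transform would come from calibration between the two endoscopes rather than a fixed homography, and the overlay would run inside the real-time video pipeline.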

Results

In the benchmark, the mean processing time was 62.4 ms, which was below the latency reported as acceptable in remote surgery. The mean superimposition error was 1.4 mm. In the training environment, needle exit point error decreased significantly for experts when the system was used compared with the condition without it; the change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis.
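This page does not state how the superimposition error was measured. Purely as an illustration, one common way to quantify such an error is the mean Euclidean distance between matched landmarks in the superimposed overlay and the same landmarks observed directly, as in the hypothetical helper below (the function name and the assumption that coordinates are already expressed in millimetres are mine, not the paper's).

```python
import numpy as np


def mean_superimposition_error(overlay_pts_mm, reference_pts_mm):
    """Mean Euclidean distance (mm) between corresponding landmarks in the
    superimposed overlay and in the directly observed scene.

    Both arguments are (N, 2) arrays of matched landmark coordinates that
    have already been converted from pixels to millimetres.
    """
    diffs = np.asarray(overlay_pts_mm, dtype=float) - np.asarray(reference_pts_mm, dtype=float)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))
```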

Conclusion

The benchmark suggests that the implemented system performed acceptably, and the evaluation in the training environment demonstrated improved surgical task outcomes for expert surgeons. We will conduct a more comprehensive in vivo study in the future.

Acknowledgments

This work was supported in part by the Grants for Excellent Graduate Schools program of the Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT) and by a Grant-in-Aid for Scientific Research from MEXT (No. 25220005).

Author information

Corresponding author

Correspondence to Yuta Koreeda.

Ethics declarations

Conflicts of interest

The authors declare that they have no conflict of interest.

Ethical standard

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. The in vivo experiment was conducted with the approval of the Ethics Committee of Kyushu University Hospital, Fukuoka, Japan. All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.

Informed consent

None.

About this article

Cite this article

Koreeda, Y., Kobayashi, Y., Ieiri, S. et al. Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions. Int J CARS 11, 1927–1936 (2016). https://doi.org/10.1007/s11548-016-1384-5

Download citation

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11548-016-1384-5

Keywords

Navigation