
Surgical navigation system for brachytherapy based on mixed reality using a novel stereo registration method

  • Original Article
  • Published in Virtual Reality

Abstract

In this study, we present a novel mixed reality navigation system to facilitate brachytherapy, an effective treatment for cancer. The accuracy of needle positioning is a critical factor in brachytherapy that directly influences the treatment outcome. The developed system is intended to help doctors position needles more quickly and easily and to improve seed placement accuracy in brachytherapy surgery. Based on mixed reality and a multi-information fusion method, medical images and the preoperative plan were successfully fused with the real patient, allowing doctors to gain an intuitive understanding of the tumor. Image recognition and pose estimation were used to track the needle punctures in real time and to perform the registration process. After global registration using an iterative closest-point (ICP) algorithm with a pattern tracker, medical images and volume renderings of organs, needles and seeds were aligned with the patient. In a phantom experiment, the average needle location error was 0.961 mm with an angle error of 1.861°, and the needle insertion accuracy was 1.586 mm with an angle error of 2.429°. This article presents the design and validation of a surgical navigation system for thoracoabdominal brachytherapy based on mixed reality. The proposed system was validated through both phantom and animal experiments, and the results indicate that it achieves clinically acceptable accuracy and can aid doctors in performing surgery according to a visualized plan.
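To make the global registration step concrete, the following is a minimal Python sketch of a point-to-point ICP alignment of the kind the abstract describes, assuming two point clouds are available: one sampled from the CT-derived surface model and one acquired via the pattern tracker. The function names, tolerances, and the use of NumPy/SciPy are illustrative assumptions, not the authors' implementation.

```python
# Minimal point-to-point ICP sketch (illustrative assumption, not the
# authors' code). Aligns a "source" cloud (e.g., points acquired through
# the pattern tracker) to a "target" cloud (e.g., the surface extracted
# from the preoperative CT).
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, max_iter=50, tol=1e-6):
    """Iterative closest point: returns R, t aligning source to target."""
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)          # closest-point correspondences
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t                  # apply incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:        # converged: error has plateaued
            break
        prev_err = err
    return R_total, t_total
```

The reported accuracy figures plausibly correspond to the Euclidean distance between planned and actual needle tips and the angle between planned and actual needle axes; under that assumption about the metric definitions, they can be computed directly (continuing the sketch above):

```python
def needle_errors(tip_plan, tip_real, axis_plan, axis_real):
    """Tip location error (mm) and insertion angle error (degrees)."""
    loc_err = float(np.linalg.norm(tip_real - tip_plan))
    cos_a = np.dot(axis_plan, axis_real) / (
        np.linalg.norm(axis_plan) * np.linalg.norm(axis_real))
    ang_err = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
    return loc_err, ang_err
```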



Acknowledgements

This research was partially supported by the National Natural Science Foundation of China (Grant Nos. 51775368, 81871457, 51811530310 and 8167071354) and the Technology Planning Project of Guangdong Province, China (Grant No. 2017B020210004).

Author information


Corresponding author

Correspondence to Shan Jiang.

Ethics declarations

Conflict of interest

All authors declare that they have no conflicts of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

This article does not contain patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, Z., Jiang, S., Yang, Z. et al. Surgical navigation system for brachytherapy based on mixed reality using a novel stereo registration method. Virtual Reality 25, 975–984 (2021). https://doi.org/10.1007/s10055-021-00503-8


Keywords

Navigation