Robust augmented reality registration method for localization of solid organs’ tumors using CT-derived virtual biomechanical model and fluorescent fiducials
Augmented reality (AR) is the fusion of computer-generated and real-time images. AR can be used in surgery as a navigation tool, by creating a patient-specific virtual model through 3D software manipulation of DICOM imaging (e.g., a CT scan). The virtual model can be superimposed onto real-time images, enabling see-through visualization of internal anatomy and accurate localization of tumors. However, the 3D model is rigid and does not account for the deformation of inner structures. We present a concept of automated AR registration that remains valid while organs undergo deformation during surgical manipulation, based on finite element modeling (FEM) coupled with optical imaging of fluorescent surface fiducials.
Two 10 × 1 mm wires (pseudo-tumors) and six 10 × 0.9 mm fluorescent fiducials were placed in ex vivo porcine kidneys (n = 10). Biomechanical FEM-based models were generated from CT scans. The kidneys were deformed, and the shape changes were identified by tracking the fiducials with a near-infrared optical system. The changes were registered automatically with the virtual model, which was deformed accordingly. The accuracy of the predicted pseudo-tumor locations was evaluated against a CT scan acquired in the deformed state (ground truth). In vivo, fluorescent fiducials were inserted under ultrasound guidance into the kidney of one pig, followed by a CT scan. The FEM-based virtual model was superimposed on laparoscopic images by automatic registration of the fiducials.
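The fiducial-based registration step described above aligns the CT-derived model with the tracked fiducial positions before the FEM propagates surface motion inward. As a minimal sketch of that alignment (not the authors' pipeline; the function name and use of the Kabsch algorithm are illustrative assumptions), the rigid component of a matched-point registration can be computed as:

```python
import numpy as np

def register_fiducials(src, dst):
    """Rigid registration (rotation R, translation t) of matched fiducial
    point sets via the Kabsch algorithm, so that dst ≈ src @ R.T + t.

    src: (N, 3) fiducial coordinates in the CT model frame.
    dst: (N, 3) corresponding coordinates from optical tracking.
    Illustrative sketch only; a full pipeline would feed the residual
    surface displacements into the FEM to deform the model non-rigidly.
    """
    src_c = src - src.mean(axis=0)          # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With six fiducials, the rotation is heavily over-determined, which gives the least-squares fit some robustness to per-fiducial localization noise.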
Biomechanical models were successfully generated and accurately superimposed on the optical images. The mean distance between the tumor position estimated by biomechanical propagation and the scanned tumor (ground truth) was 0.84 ± 0.42 mm. All fiducials were successfully placed in the in vivo kidney and were well visualized in near-infrared mode, enabling accurate automatic registration of the virtual model on the laparoscopic images.
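The 0.84 ± 0.42 mm figure is a target registration error: the per-specimen Euclidean distance between predicted and ground-truth tumor positions, summarized as mean ± standard deviation. A minimal sketch of this metric (the function name is hypothetical, not from the paper):

```python
import numpy as np

def tre_stats(predicted, ground_truth):
    """Target registration error between predicted tumor positions
    (biomechanical propagation) and CT ground truth.

    predicted, ground_truth: (N, 3) arrays of matched positions.
    Returns (mean, sample standard deviation) in the input units (mm here).
    """
    d = np.linalg.norm(predicted - ground_truth, axis=1)  # per-case error
    return d.mean(), d.std(ddof=1)                        # mean ± SD
```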
Our preliminary experiments showed the potential of a biomechanical model with fluorescent fiducials to propagate the deformation of the surface of solid organs to their inner structures, including tumors, with good accuracy and robust automated tracking.
Keywords: Augmented reality · Automatic registration · Optical imaging · Finite element modeling · Solid organ tumor · Fluorescence-guided surgery · Fiducials
The authors are grateful to Christopher Burel, a professional in medical English proofreading, for his valuable help in revising the manuscript.
Compliance with ethical standards
SH Kong, N Haouchine, R Soares, A Klymchenko, B Andreiuk, B Marques, G Shabat, T Piechaud, M Diana, S Cotin, and J Marescaux have no conflicts of interest or financial ties to disclose.