A novel method for texture-mapping conoscopic surfaces for minimally invasive image-guided kidney surgery
Organ-level registration is critical to image-guided therapy in soft tissue. This is especially important in organs such as the kidney, which can move freely. We have developed a registration method that combines three-dimensional point locations from a holographic conoscope with a textured surface obtained endoscopically. Combining these two data sources lets us decide clearly which tissue each point arises from.
By localizing the conoscope’s laser dot in the endoscopic image space, we register the textured surface to the cloud of conoscopic points. This allows the cloud to be filtered so that only points arising from the kidney surface remain. Once a valid cloud is obtained, we can use standard surface registration techniques to perform the image-space to physical-space registration. Since our method uses two distinct data sources, we test for spatial accuracy and characterize temporal effects in phantoms and in ex vivo porcine and human kidneys. In addition, we use an industrial robot to provide controlled motion and positioning when characterizing temporal effects.
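The pipeline above (filter the conoscopic cloud, then run a standard rigid surface registration) rests on the closed-form least-squares rigid alignment used inside methods such as ICP. A minimal sketch in Python/NumPy, assuming corresponding point pairs are already established; the point data are synthetic and this is not the authors' implementation:

```python
import numpy as np

def rigid_register(source, target):
    """Closed-form least-squares rigid alignment (Kabsch/SVD) of two
    corresponding point sets: finds R, t with target ~ R @ source + t.
    This is the inner step of each ICP iteration."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Synthetic stand-in for a filtered conoscopic point cloud (units arbitrary)
rng = np.random.default_rng(0)
cloud = rng.random((100, 3))

# Known ground-truth pose: 30 degree rotation about z plus a translation
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.0])
moved = cloud @ R_true.T + t_true

R_est, t_est = rigid_register(cloud, moved)
```

With exact correspondences the closed form recovers the pose in one step; in practice ICP alternates this step with nearest-neighbor matching between the endoscopic surface and the filtered cloud.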
Our initial surface acquisitions are hand-held, so each surface takes approximately 55 s to acquire. At that rate we see no temporal effects due to acquisition synchronization or probe speed. Our surface registrations were able to locate applied targets with submillimeter target registration errors.
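Target registration error as reported here is the post-registration Euclidean distance between each applied target and its ground-truth location. A minimal illustration (the target coordinates are hypothetical):

```python
import numpy as np

def target_registration_error(registered, truth):
    """Per-target Euclidean distance after registration; TRE is
    typically summarized as the mean (or RMS) over all targets."""
    d = np.linalg.norm(registered - truth, axis=1)
    return d.mean(), np.sqrt((d ** 2).mean())

# Hypothetical target positions in mm: ground truth vs. mapped locations
truth = np.array([[0.0, 0.0, 0.0],
                  [10.0, 0.0, 0.0]])
registered = truth + np.array([[0.3, 0.4, 0.0],    # 0.5 mm off
                               [0.0, 0.0, 0.5]])   # 0.5 mm off
mean_tre, rms_tre = target_registration_error(registered, truth)
```

Submillimeter TRE then simply means these per-target distances average below 1 mm.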
The results showed that the textured surfaces could be reconstructed with submillimeter mean registration errors. While this paper focuses on the kidney, the method could be applied to any anatomical structure to which a line of sight can be created via open or minimally invasive surgical techniques.
Keywords: Image-guided surgery · Kidney surgery · Minimally invasive surgery · Conoscopy
This work was funded in part by the National Institutes of Health: Grant R01 CA162477 from the National Cancer Institute, Grant R01 NS049251 from the National Institute of Neurological Disorders and Stroke, and Grant R44 DK081240 from the National Institute of Diabetes and Digestive and Kidney Diseases.
Compliance with ethical standards
Conflict of interest
Rowena Ong, Courtenay Glisson, Jessica Burgner-Kahrs, Amber Simpson, Andrei Danilchenko, Ray Lathrop, Duke Herrell, Robert Webster III, Michael Miga, and Robert L. Galloway declare that they have no conflict of interest.
Human and animal participants
The work used ex vivo kidneys obtained post-euthanasia from animals killed under IACUC-approved protocols. No human data were used in this study.