
Automatic Detection and Localization of da Vinci Tool Tips in 3D Ultrasound

  • Omid Mohareri
  • Mahdi Ramezani
  • Troy Adebar
  • Purang Abolmaesumi
  • Septimiu Salcudean
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7330)

Abstract

Radical prostatectomy (RP) is viewed by many as the gold standard treatment for clinically localized prostate cancer. State-of-the-art radical prostatectomy involves the da Vinci surgical system, a laparoscopic robot that provides the surgeon with excellent 3D visualization of the surgical site and improved dexterity over standard laparoscopic instruments. Given the limited field of view of the surgical site in Robot-Assisted Laparoscopic Radical Prostatectomy (RALRP), several groups have proposed the integration of Transrectal Ultrasound (TRUS) imaging into the surgical workflow to assist with the resection of the prostate and sparing of the Neuro-Vascular Bundle (NVB). Rapid and automatic registration of TRUS imaging coordinates to the da Vinci tools or camera is a critical component of this integration. We propose a fully automatic registration technique based on accurate and automatic localization, in 3D TRUS, of robot tool tips pressed against the air-tissue boundary of the prostate. The detection approach uses a multi-scale filtering technique to uniquely identify and localize the tool tip in the ultrasound volume, and could also be used to detect other surface fiducials in 3D ultrasound. Feasibility experiments using a phantom and two ex vivo tissue samples yield promising results, with a target registration error (defined as the root mean square distance of corresponding points after registration) of \(1.80\ mm\), demonstrating the system’s accuracy in registering 3D TRUS to the da Vinci surgical system.
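The abstract defines the target registration error (TRE) as the root-mean-square distance of corresponding points after registration. Below is a minimal sketch of that kind of point-based rigid registration and TRE computation, assuming corresponding fiducial (tool-tip) positions are available in both the TRUS and da Vinci coordinate frames. The closed-form least-squares fit (Umeyama/Kabsch style) and all point values are illustrative assumptions, not the authors' exact pipeline.

```python
# Hedged sketch: rigid point-based registration between 3D TRUS fiducial
# coordinates and corresponding da Vinci tool-tip positions, followed by
# the TRE defined in the abstract (RMS distance of corresponding points
# after registration). Point values and names are illustrative only.
import numpy as np


def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||."""
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t


def target_registration_error(src, dst, R, t):
    """RMS distance of corresponding points after applying (R, t) to src."""
    residuals = (R @ src.T).T + t - dst
    return np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))


if __name__ == "__main__":
    # Illustrative tool-tip positions localized in the 3D TRUS volume (mm)
    # and their counterparts in the da Vinci coordinate frame (mm).
    trus_pts = np.array([[10.0, 20.0, 5.0],
                         [15.0, 22.0, 7.0],
                         [12.0, 28.0, 6.0],
                         [18.0, 25.0, 9.0]])
    rot_z_90 = np.array([[0.0, -1.0, 0.0],
                         [1.0,  0.0, 0.0],
                         [0.0,  0.0, 1.0]])
    davinci_pts = trus_pts @ rot_z_90.T + np.array([100.0, -50.0, 30.0])

    R, t = rigid_register(trus_pts, davinci_pts)
    print("TRE (mm):", target_registration_error(trus_pts, davinci_pts, R, t))
```

With noise-free synthetic correspondences, as above, the reported TRE is essentially zero; in practice the TRE reflects fiducial localization error in the ultrasound volume and in the robot kinematics.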

Keywords

Robot-assisted surgery · da Vinci surgical robot · 3D ultrasound · fiducial detection


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Omid Mohareri (1)
  • Mahdi Ramezani (1)
  • Troy Adebar (1)
  • Purang Abolmaesumi (1)
  • Septimiu Salcudean (1)

  1. Robotics and Control Laboratory, Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada
