
Preclinical evaluation of a markerless, real-time, augmented reality guidance system for robot-assisted radical prostatectomy



Intra-operative augmented reality (AR) can mitigate incomplete cancer removal by overlaying the anatomical boundaries extracted from medical imaging data onto the camera image. In this paper, we present the first completely markerless AR guidance system for robot-assisted laparoscopic radical prostatectomy (RALRP) that transforms medical data from transrectal ultrasound (TRUS) to the endoscope camera image. Moreover, we reduce the total number of transformations by combining the hand–eye and camera calibrations into a single step.
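Conceptually, the overlay pipeline maps a point from the TRUS frame into the endoscope frame with the rigid transform \(^\mathrm{DV}T_\mathrm{TRUS}\) and then projects it to pixels with \(\mathbf{M}\). A minimal sketch of that chain follows; all matrix values here are placeholders chosen for illustration, not the calibration results reported in the paper:

```python
import numpy as np

# Placeholder 4x4 rigid transform from the TRUS frame to the da Vinci
# (endoscope) frame; in the paper this comes from the registration of
# Mohareri et al. Here we use an arbitrary 5 cm translation along z.
T_dv_trus = np.eye(4)
T_dv_trus[:3, 3] = [0.0, 0.0, 0.05]

# Placeholder intrinsics and 3x4 projection matrix M (endoscope -> image).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
M = K @ np.eye(3, 4)

def trus_point_to_pixel(p_trus):
    """Map a 3-D point in the TRUS frame to endoscope image coordinates."""
    p_dv = T_dv_trus @ np.append(p_trus, 1.0)  # TRUS -> endoscope frame
    uvw = M @ p_dv                             # endoscope -> image (homogeneous)
    return uvw[:2] / uvw[2]                    # dehomogenize to pixels
```

With these placeholder values, a point on the endoscope's optical axis lands at the principal point (640, 360).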


Our proposed solution requires two transformations: TRUS to robot, \(^\mathrm{DV}T_\mathrm{TRUS}\), and the camera projection matrix, \(\mathbf{M}\) (i.e., the transformation from the endoscope frame to the camera image frame). \(^\mathrm{DV}T_\mathrm{TRUS}\) is estimated by the method proposed in Mohareri et al. (J Urol 193(1):302–312, 2015). \(\mathbf{M}\) is estimated by selecting corresponding 3D–2D data points in the endoscope and image coordinate frames, respectively, using a CAD model of the surgical instrument and a preoperative camera intrinsic matrix, under the assumption of a projective camera. The parameters are estimated using the Levenberg–Marquardt algorithm. Overall mean re-projection errors (MRE) are reported for simulated data and for real data acquired in a water bath. We show that \(\mathbf{M}\) can be re-estimated if the focus is changed during surgery.
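For intuition, the standard linear (DLT) solution to the same 3D–2D estimation problem can be sketched as below; the paper's actual pipeline refines the estimate with Levenberg–Marquardt, and the correspondences here would come from the instrument CAD model rather than synthetic data:

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Direct linear transform: estimate a 3x4 projective camera M (up to
    scale) from n >= 6 correspondences between 3-D points X (n,3) in the
    endoscope frame and 2-D pixels x (n,2) in the image."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        P = [Xw, Yw, Zw, 1.0]
        rows.append(P + [0.0] * 4 + [-u * p for p in P])
        rows.append([0.0] * 4 + P + [-v * p for p in P])
    # The right singular vector of the smallest singular value solves A m = 0.
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)

def project(M, X):
    """Project (n,3) points through M and dehomogenize to (n,2) pixels."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])
    p = (M @ Xh.T).T
    return p[:, :2] / p[:, 2:3]
```

With noise-free correspondences the DLT recovers \(\mathbf{M}\) exactly (up to scale); with noisy clinical data, an iterative refinement such as Levenberg–Marquardt over the re-projection error is the usual next step.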


Using simulated data, we obtained an overall MRE in the range of 11.69–13.32 pixels for the monoscopic and the stereo left and right cameras. For the water bath experiment, the overall MRE is in the range of 26.04–30.59 pixels for the monoscopic and stereo cameras. The overall system error from the TRUS frame to the camera world frame is 4.05 mm. Details of the procedure are given in the supplementary material.
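The reported MRE is simply the mean Euclidean pixel distance between observed 2-D points and the projections of their 3-D counterparts. A minimal sketch (the numbers in the test below are made up, not the paper's data):

```python
import numpy as np

def mean_reprojection_error(M, X, x_obs):
    """Mean Euclidean distance, in pixels, between observed 2-D points
    x_obs (n,2) and the 3-D points X (n,3) projected through the
    3x4 projection matrix M."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])  # homogeneous coordinates
    p = (M @ Xh.T).T
    x_pred = p[:, :2] / p[:, 2:3]                  # dehomogenize to pixels
    return float(np.mean(np.linalg.norm(x_pred - x_obs, axis=1)))
```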


We demonstrate a markerless AR guidance system for RALRP. Because no calibration markers are needed, the camera projection matrix can be re-estimated whenever it changes during surgery, e.g., due to a focus change.




References

  1. Adebar TK, Yip MC, Salcudean SE, Rohling RN, Nguan CY, Goldenberg SL (2012) Registration of 3D ultrasound through an air–tissue boundary. IEEE Trans Med Imaging 31(11):2133–2142

  2. Davis JW, Kreaden US, Gabbert J, Thomas R (2014) Learning curve assessment of robot-assisted radical prostatectomy compared with open-surgery controls from the Premier perspective database. J Endourol 28(5):560–566

  3. Geiger A, Moosmann F, Car Ö, Schuster B (2012) Automatic camera and range sensor calibration using a single shot. In: 2012 IEEE international conference on robotics and automation, pp 3936–3943. IEEE

  4. Hartley R, Zisserman A (2003) Multiple view geometry in computer vision. Cambridge University Press, Cambridge

  5. Hartley RI (1997) In defense of the eight-point algorithm. IEEE Trans Pattern Anal Mach Intell 19(6):580–593

  6. Kalia M, Mathur P, Tsang K, Black P, Navab N, Salcudean S (2020) Evaluation of a marker-less, intra-operative, augmented reality guidance system for robot-assisted laparoscopic radical prostatectomy. Int J Comput Assist Radiol Surg 15:1225–1233

  7. Kolagunda A, Sorensen S, Mehralivand S, Saponaro P, Treible W, Turkbey B, Pinto P, Choyke P, Kambhamettu C (2018) A mixed reality guidance system for robot assisted laparoscopic radical prostatectomy. In: OR 2.0 context-aware operating theaters, computer assisted robotic endoscopy, clinical image-based procedures, and skin image analysis, pp 164–174. Springer

  8. Linte CA, Davenport KP, Cleary K, Peters C, Vosburgh KG, Navab N, Jannin P, Peters TM, Holmes DR III, Robb RA (2013) On mixed reality environments for minimally invasive therapy guidance: systems architecture, successes and challenges in their implementation from laboratory to clinic. Comput Med Imaging Graph 37(2):83–97

  9. Mitterberger M, Horninger W, Aigner F, Pinggera GM, Steppan I, Rehder P, Frauscher F (2010) Ultrasound of the prostate. Cancer Imaging 10(1):40

  10. Mohareri O, Ischia J, Black PC, Schneider C, Lobo J, Goldenberg L, Salcudean SE (2015) Intraoperative registered transrectal ultrasound guidance for robot-assisted laparoscopic radical prostatectomy. J Urol 193(1):302–312

  11. Porpiglia F, Fiori C, Checcucci E, Amparore D, Bertolo R (2018) Augmented reality robot-assisted radical prostatectomy: preliminary experience. Urology 115:184

  12. Samei G, Tsang K, Kesch C, Lobo J, Hor S, Mohareri O, Chang S, Goldenberg SL, Black PC, Salcudean S (2019) A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med Image Anal

  13. Silberstein JL, Eastham JA (2014) Significance and management of positive surgical margins at the time of radical prostatectomy. Indian J Urol 30(4):423

  14. Thompson S, Penney G, Billia M, Challacombe B, Hawkes D, Dasgupta P (2013) Design and evaluation of an image-guidance system for robot-assisted radical prostatectomy. BJU Int 111(7):1081–1090

  15. Ye M, Zhang L, Giannarou S, Yang GZ (2016) Real-time 3D tracking of articulated tools for robotic surgery. In: International conference on medical image computing and computer-assisted intervention, pp 386–394. Springer

  16. Zhang Z (2000) A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell 22(11):1330–1334



Acknowledgements

We are thankful for financial support provided by the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council (NSERC) and the Charles Laszlo Chair in Biomedical Engineering held by Professor Salcudean. The Canada Foundation for Innovation (CFI) provided infrastructure support. We would also like to thank Intuitive Surgical for providing the da Vinci Research API and support.

Author information



Corresponding author

Correspondence to Megha Kalia.

Ethics declarations

Conflicts of interest

The authors have no conflicts of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

This article does not contain patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 11 KB)


About this article


Cite this article

Kalia, M., Avinash, A., Navab, N. et al. Preclinical evaluation of a markerless, real-time, augmented reality guidance system for robot-assisted radical prostatectomy. Int J CARS 16, 1181–1188 (2021).



Keywords

  • Surgical augmented reality
  • Robot-assisted radical prostatectomy
  • Markerless camera projection matrix estimation