
Shape-Based Pose Estimation of Robotic Surgical Instruments

  • Conference paper
  • First Online:
Computer Assisted and Robotic Endoscopy and Clinical Image-Based Procedures (CARE 2017, CLIP 2017)

Abstract

We describe a detector of robotic instrument parts in image-guided surgery. The detector consists of a huge ensemble of scale-variant, pose-dedicated rigid appearance templates. The templates, which are equipped with pose-related keypoints and segmentation masks, allow for explicit pose estimation and segmentation of multiple end-effectors, as well as fine-grained non-maximum suppression. We train the templates by grouping examples of end-effector articulations, imaged from various viewpoints, in the resulting space of instrument shapes. The proposed shape-based grouping forms tight clusters of pose-specific end-effector appearance. Experimental results show that the proposed method can effectively estimate the end-effector pose and delineate its boundary while being trained on moderately sized data clusters. We then show that matching such a huge ensemble of templates takes less than one second on commodity hardware.
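The shape-based grouping described in the abstract can be pictured as clustering training examples by their keypoint geometry, so that each cluster collects one pose-specific appearance. The sketch below is only an illustration under assumptions, not the authors' implementation: it represents each example by a flattened vector of normalized 2-D keypoints and runs a plain k-means with deterministic farthest-point seeding; the function names (`cluster_shapes`, `_farthest_point_init`) are hypothetical.

```python
import numpy as np

def _farthest_point_init(X, k):
    """Deterministic seeding: greedily pick k points that lie far apart."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.linalg.norm(X[:, None] - np.array(centers)[None], axis=2).min(axis=1)
        centers.append(X[d.argmax()])
    return np.array(centers)

def cluster_shapes(keypoints, n_clusters, n_iters=50):
    """Group training examples into pose-specific clusters by shape.

    keypoints: (N, K, 2) array of K 2-D keypoints per example, assumed
    already normalized for scale and translation. Each resulting cluster
    could then serve as the training set for one rigid appearance template.
    """
    X = keypoints.reshape(len(keypoints), -1).astype(float)  # flatten to shape vectors
    centers = _farthest_point_init(X, n_clusters)
    for _ in range(n_iters):
        # assign every shape vector to its nearest cluster center
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # recompute each center as the mean of its assigned shapes
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers.reshape(n_clusters, -1, 2)
```

Farthest-point seeding keeps the toy example reproducible; for tight, well-separated pose clusters it places one seed per cluster, which is the regime the paper's grouping aims for.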


Notes

  1. Eklund, A., Dufort, P.: Non-separable 2D, 3D, and 4D Filtering with CUDA.


Acknowledgment

This work was partially supported by the National Science Center under the agreement UMO-2014/13/D/ST7/03358.

Author information


Corresponding author

Correspondence to Daniel Wesierski.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Wesierski, D., Cygert, S. (2017). Shape-Based Pose Estimation of Robotic Surgical Instruments. In: Cardoso, M., et al. (eds.) Computer Assisted and Robotic Endoscopy and Clinical Image-Based Procedures. CARE CLIP 2017. Lecture Notes in Computer Science, vol. 10550. Springer, Cham. https://doi.org/10.1007/978-3-319-67543-5_1


  • DOI: https://doi.org/10.1007/978-3-319-67543-5_1

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67542-8

  • Online ISBN: 978-3-319-67543-5

  • eBook Packages: Computer Science; Computer Science (R0)
