
Needle Tip Force Estimation Using an OCT Fiber and a Fused convGRU-CNN Architecture

  • Nils Gessert
  • Torben Priegnitz
  • Thore Saathoff
  • Sven-Thomas Antoni
  • David Meyer
  • Moritz Franz Hamann
  • Klaus-Peter Jünemann
  • Christoph Otte
  • Alexander Schlaefer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11073)

Abstract

Needle insertion is common during minimally invasive interventions such as biopsy or brachytherapy. During soft tissue needle insertion, forces acting at the needle tip cause tissue deformation and needle deflection. Accurate needle tip force measurement provides information on needle-tissue interaction and helps to detect and compensate for potential misplacement. For this purpose, we introduce an image-based needle tip force estimation method using an optical fiber that images the deformation of an epoxy layer below the needle tip over time. For calibration and force estimation, we introduce a novel deep learning-based fused convolutional GRU-CNN model that effectively exploits the spatio-temporal data structure. The needle is easy to manufacture, and our model achieves a mean absolute error of \(1.76 \pm 1.5\) mN with a cross-correlation coefficient of 0.9996, clearly outperforming other methods. We test needles with different materials to demonstrate that the approach can be adapted for different sensitivities and force ranges. Furthermore, we validate our approach in an ex-vivo prostate needle insertion scenario.
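The two accuracy figures reported above (mean absolute error and cross-correlation coefficient) are standard regression metrics for comparing an estimated force trace against a ground-truth measurement. The following is a minimal sketch of how they can be computed, not the authors' evaluation code; the synthetic force traces are purely illustrative.

```python
import numpy as np

def mae(pred, target):
    """Mean absolute error between estimated and measured forces (same unit, e.g. mN)."""
    return np.mean(np.abs(pred - target))

def cross_correlation(pred, target):
    """Pearson cross-correlation coefficient between the two force traces."""
    return np.corrcoef(pred, target)[0, 1]

# Illustrative synthetic traces (not data from the paper): a ramp-shaped
# ground-truth tip force and a noisy estimate of it.
rng = np.random.default_rng(0)
target = np.linspace(0.0, 500.0, 1000)       # ground-truth tip force in mN
pred = target + rng.normal(0.0, 2.0, 1000)   # estimate with small sensor noise

print(f"MAE: {mae(pred, target):.2f} mN")
print(f"Cross-correlation: {cross_correlation(pred, target):.4f}")
```

A near-perfect cross-correlation (close to 1) indicates that the estimated trace follows the shape of the true force signal, while the MAE quantifies the absolute deviation in millinewtons.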

Keywords

Force estimation · Optical coherence tomography · Convolutional GRU · Convolutional neural network · Needle placement

Acknowledgements

This work was partially supported by DFG grants SCHL 1844/2-1 and SCHL 1844/2-2.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Nils Gessert (1)
  • Torben Priegnitz (1)
  • Thore Saathoff (1)
  • Sven-Thomas Antoni (1)
  • David Meyer (2)
  • Moritz Franz Hamann (2)
  • Klaus-Peter Jünemann (2)
  • Christoph Otte (1)
  • Alexander Schlaefer (1)

  1. Institute of Medical Technology, Hamburg University of Technology, Hamburg, Germany
  2. Department of Urology, University Hospital Schleswig-Holstein, Kiel, Germany
