
Weakly Supervised Representation Learning for Endomicroscopy Image Analysis

  • Yun Gu
  • Khushi Vyas
  • Jie Yang
  • Guang-Zhong Yang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11071)

Abstract

This paper proposes a weakly supervised representation learning framework for probe-based confocal laser endomicroscopy (pCLE). Unlike previous frame-based and mosaic-based methods, the proposed framework adopts deep convolutional neural networks and integrates frame-based feature learning, global diagnosis prediction and local tumor detection into a unified end-to-end model. The latent objects in pCLE mosaics are inferred via semantic label propagation, and the deep convolutional neural networks are trained with a composite loss function. Experiments on 700 pCLE samples demonstrate that the proposed method, trained with only global supervision, achieves higher accuracy in both global and local diagnosis prediction.
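To make the training setup described above more concrete, the sketch below shows one way a composite objective with only mosaic-level (global) labels could be wired up. This is a minimal illustrative sketch, not the authors' implementation: the class WeaklySupervisedPCLE, the function composite_loss, the max-pooling aggregation of frame scores into a mosaic score, and the weighting factor alpha are assumptions introduced here for clarity; the paper's actual semantic label propagation and loss terms differ in detail.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class WeaklySupervisedPCLE(nn.Module):
        """Sketch: a shared frame encoder produces per-frame (local) scores;
        the mosaic-level (global) prediction is pooled from them, so only
        global labels are needed at training time."""

        def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
            super().__init__()
            self.backbone = backbone            # e.g. a CNN trunk that maps a frame to a feat_dim vector
            self.frame_head = nn.Linear(feat_dim, num_classes)

        def forward(self, frames):              # frames: (B, T, C, H, W), a mosaic split into T frames
            b, t = frames.shape[:2]
            feats = self.backbone(frames.flatten(0, 1))            # (B*T, feat_dim)
            frame_logits = self.frame_head(feats).view(b, t, -1)   # local, per-frame scores
            mosaic_logits = frame_logits.max(dim=1).values          # MIL-style pooling -> global score
            return mosaic_logits, frame_logits

    def composite_loss(mosaic_logits, frame_logits, mosaic_labels, alpha=0.5):
        """Global cross-entropy plus a weak local term: each frame inherits the
        mosaic label (a crude stand-in for label propagation), down-weighted by alpha."""
        global_loss = F.cross_entropy(mosaic_logits, mosaic_labels)
        t = frame_logits.shape[1]
        propagated = mosaic_labels.unsqueeze(1).expand(-1, t).reshape(-1)
        local_loss = F.cross_entropy(frame_logits.flatten(0, 1), propagated)
        return global_loss + alpha * local_loss

In this hypothetical form, the local term regularizes frame-level predictions toward the mosaic label while the pooled global term drives the end-to-end classification, which is the general weak-supervision pattern the abstract describes.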

Keywords

Probe-based confocal laser endomicroscopy · Feature learning · Semantic exclusivity

Notes

Acknowledgement

This research is partly supported by the Committee of Science and Technology, Shanghai, China (No. 17JC1403000) and the 973 Plan, China (No. 2015CB856004). Yun Gu is supported by the Chinese Scholarship Council (CSC). We also thank NVIDIA for providing the device used in this work. The tissue specimens were obtained under the Imperial tissue bank ethical protocol as part of the R-12047 project.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Yun Gu (1, 2)
  • Khushi Vyas (2)
  • Jie Yang (1)
  • Guang-Zhong Yang (2)
  1. Institute of Image Processing and Pattern Recognition, Shanghai Jiao Tong University, Shanghai, China
  2. Hamlyn Centre for Robotic Surgery, Imperial College London, London, UK
