Deep Attentional Features for Prostate Segmentation in Ultrasound

  • Yi Wang
  • Zijun Deng
  • Xiaowei Hu
  • Lei Zhu (corresponding author)
  • Xin Yang
  • Xuemiao Xu
  • Pheng-Ann Heng
  • Dong Ni
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11073)

Abstract

Automatic prostate segmentation in transrectal ultrasound (TRUS) is of essential importance for image-guided prostate biopsy and treatment planning. However, developing such automatic solutions remains very challenging due to the ambiguous boundary and inhomogeneous intensity distribution of the prostate in TRUS. This paper develops a novel deep neural network equipped with deep attentional feature (DAF) modules for better prostate segmentation in TRUS by fully exploiting the complementary information encoded in different layers of the convolutional neural network (CNN). Our DAF utilizes the attention mechanism to selectively leverage the multi-level features integrated from different layers to refine the features at each individual layer, suppressing non-prostate noise at the shallow layers of the CNN and incorporating more prostate details into the features at the deep layers. We evaluate the efficacy of the proposed network on challenging prostate TRUS images, and the experimental results demonstrate that our network outperforms state-of-the-art methods by a large margin.
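
The attention-based refinement described in the abstract can be illustrated with a minimal PyTorch sketch: a multi-level feature aggregated from several CNN layers is resized to the current layer's resolution, attention weights predicted from the concatenation of the two select the relevant multi-level context, and the attended context is fused back into the single-layer feature. The module name, channel sizes, and fusion details below are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of attention-based multi-level feature refinement.
# Channel sizes and the exact fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionalFeatureRefinement(nn.Module):
    """Refines a single-layer feature map using attention over a
    multi-level feature (MLF) aggregated from several CNN layers."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Predict per-pixel, per-channel attention weights from the
        # concatenation of the layer feature and the multi-level feature.
        self.attention = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Fuse the attended multi-level context back into the layer feature.
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, layer_feat: torch.Tensor, mlf: torch.Tensor) -> torch.Tensor:
        # Match the spatial resolution of the multi-level feature
        # to that of the current layer.
        mlf = F.interpolate(mlf, size=layer_feat.shape[2:],
                            mode="bilinear", align_corners=False)
        weights = self.attention(torch.cat([layer_feat, mlf], dim=1))
        attended = weights * mlf  # keep only the relevant multi-level context
        return self.fuse(torch.cat([layer_feat, attended], dim=1))


if __name__ == "__main__":
    # Toy usage: refine a 64-channel layer feature with a coarser MLF.
    daf = AttentionalFeatureRefinement(channels=64)
    layer_feat = torch.randn(1, 64, 56, 56)
    mlf = torch.randn(1, 64, 28, 28)
    print(daf(layer_feat, mlf).shape)  # torch.Size([1, 64, 56, 56])
```

In this sketch the refinement is applied independently at each CNN layer, so shallow layers can suppress background clutter while deep layers regain spatial detail from the aggregated features; the real DAF design may differ in how the multi-level feature is built and fused.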

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (61701312; 61571304; 61772206), in part by the Natural Science Foundation of SZU (No. 2018010), in part by the Shenzhen Peacock Plan (KQTD2016053112051497), in part by Hong Kong Research Grants Council (No. 14202514) and Innovation and Technology Commission under TCFS (No. GHP/002/13SZ), and in part by the Guangdong Natural Science Foundation (No. 2017A030311027). Xiaowei Hu is funded by the Hong Kong Ph.D. Fellowship.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Yi Wang (1, 2)
  • Zijun Deng (3)
  • Xiaowei Hu (4)
  • Lei Zhu (4, 5) (corresponding author)
  • Xin Yang (4)
  • Xuemiao Xu (3)
  • Pheng-Ann Heng (4)
  • Dong Ni (1, 2)
  1. National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, China
  2. Medical UltraSound Image Computing (MUSIC) Lab, Shenzhen, China
  3. School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
  4. Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
  5. Centre for Smart Health, School of Nursing, The Hong Kong Polytechnic University, Hong Kong, China