Deep Convolutional Encoder-Decoders for Prostate Cancer Detection and Classification
Prostate cancer accounts for approximately 11% of all cancer cases. Definitive diagnosis is made by histopathological examination of tissue biopsies. Recently, strong correlations have been established between pre-biopsy multi-parametric MR image findings and histopathology results. We investigate novel deep learning networks that provide tumor localization and classification based solely on prostate multi-parametric MR images with biopsy-confirmed lesions. We propose multi-channel image-to-image convolutional encoder-decoders in which responses signify localized lesions and output channels represent different tumor classes. We take simple point locations in the labeled ground truth data and train the networks to output Gaussian kernels around those points across multiple channels. This approach allows for both localization and classification within a single run. The input data consist of axial T2-weighted images, apparent diffusion coefficient maps, high b-value diffusion-weighted images, and K-trans parameter maps from 202 patients. The images were co-registered on a per-patient basis, and exhaustive comparisons were performed with 5-fold cross-validation across three models of increasing complexity. The highest average classification area under the curve (AUC) achieved was 83.4%, using a medium-complexity model in which no skip-connections were used across layers. In individual k-folds, AUCs above 90% were achieved. The results demonstrate promise for directly determining tumor malignancy without performing an invasive biopsy procedure.
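The Gaussian-kernel target construction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the `(row, col, class_index)` point format, and the `sigma` default are assumptions made for the example. Each annotated lesion point is rendered as a Gaussian peak on the output channel corresponding to its tumor class, giving the network a dense localization-plus-classification target.

```python
import numpy as np

def gaussian_heatmap_target(points, num_classes, height, width, sigma=8.0):
    """Build a multi-channel training target from point annotations.

    points: iterable of (row, col, class_index) tuples (hypothetical
    format). Each lesion point becomes a Gaussian peak, with peak
    value 1.0, on the channel of its tumor class.
    """
    target = np.zeros((num_classes, height, width), dtype=np.float32)
    rows = np.arange(height, dtype=np.float32)[:, None]   # column vector of row indices
    cols = np.arange(width, dtype=np.float32)[None, :]    # row vector of column indices
    for r, c, cls in points:
        # Isotropic 2-D Gaussian centered on the annotated point
        g = np.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2.0 * sigma ** 2))
        # Take the per-pixel maximum so nearby lesions do not sum above 1
        target[cls] = np.maximum(target[cls], g)
    return target
```

A network trained against such targets can then be read out at inference time by locating peaks per channel: the peak position gives the lesion location and the channel index gives the predicted class.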
Keywords: Prostate cancer · Detection · Characterization · Deep learning
Data used in this research were obtained from The Cancer Imaging Archive (TCIA) sponsored by the SPIE, NCI/NIH, AAPM, and Radboud University.