Large-Scale Mammography CAD with Deformable Conv-Nets

  • Stephen Morrell
  • Zbigniew Wojna
  • Can Son Khoo
  • Sebastien Ourselin
  • Juan Eugenio Iglesias
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11040)


State-of-the-art deep learning methods for image processing are evolving into increasingly complex meta-architectures with a growing number of modules. Among them, region-based fully convolutional networks (R-FCN) and deformable convolutional networks (DCN) can improve CAD for mammography: R-FCN optimizes for speed and low memory consumption, which is crucial for processing mammograms at the high resolutions (up to 50 µm) used by radiologists, while deformable convolution and pooling are versatile enough to model mammographic findings of widely varying morphology and scale. In this study, we present a neural network architecture based on R-FCN/DCN that we adapted from the natural-image domain to suit mammograms, in particular their larger image size, without compromising resolution. We trained the network on a large, recently released dataset (Optimam) including 6,500 cancerous mammograms. By combining this modern architecture with such a rich dataset, we achieved an area under the ROC curve of 0.879 for breast-wise detection in the DREAM Challenge (130,000 withheld images), which surpassed all other submissions in the competitive phase.
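The key idea behind deformable convolution, as the abstract notes, is that each kernel sampling point is displaced by a learned, fractional offset, so the effective receptive field can adapt to findings of different shapes and scales. The following is a minimal single-channel numpy sketch of that sampling scheme for illustration only; it is not the paper's implementation (which uses the DCN operators on GPU), and the function names and the stride-1, zero-padded layout are assumptions.

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Bilinearly interpolate a 2-D array at fractional coordinates (y, x)."""
    h, w = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = y0 + 1, x0 + 1

    def val(yy, xx):
        # Zero padding outside the image.
        return img[yy, xx] if 0 <= yy < h and 0 <= xx < w else 0.0

    wy, wx = y - y0, x - x0
    return ((1 - wy) * (1 - wx) * val(y0, x0) + (1 - wy) * wx * val(y0, x1)
            + wy * (1 - wx) * val(y1, x0) + wy * wx * val(y1, x1))

def deform_conv2d(img, weight, offsets):
    """Single-channel 2-D deformable convolution (stride 1, 'valid' output).

    img:     (H, W) input
    weight:  (kh, kw) kernel
    offsets: (H_out, W_out, kh, kw, 2) learned (dy, dx) per sampling point;
             with all-zero offsets this reduces to an ordinary correlation.
    """
    kh, kw = weight.shape
    h_out, w_out = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            acc = 0.0
            for a in range(kh):
                for b in range(kw):
                    dy, dx = offsets[i, j, a, b]
                    # Sample at the regular grid position plus its offset.
                    acc += weight[a, b] * bilinear_sample(img, i + a + dy, j + b + dx)
            out[i, j] = acc
    return out
```

In a full network the offsets are not fixed: they are produced per location by an auxiliary convolutional layer and trained end-to-end, and deformable RoI pooling applies the same idea to the pooling bins.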



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Stephen Morrell (1)
  • Zbigniew Wojna (1)
  • Can Son Khoo (1)
  • Sebastien Ourselin (2)
  • Juan Eugenio Iglesias (1)
  1. Medical Physics and Biomedical Engineering, University College London, London, UK
  2. School of Biomedical Engineering and Imaging Sciences, King’s College London, London, UK
