
Self-paced Convolutional Neural Network for Computer Aided Detection in Medical Imaging Analysis

  • Xiang Li
  • Aoxiao Zhong
  • Ming Lin
  • Ning Guo
  • Mu Sun
  • Arkadiusz Sitek
  • Jieping Ye
  • James Thrall
  • Quanzheng Li
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10541)

Abstract

Tissue characterization has long been an important component of Computer Aided Diagnosis (CAD) systems for automatic lesion detection and further clinical planning. Motivated by the superior performance of deep learning methods on various computer vision problems, there has been increasing work applying deep learning to medical image analysis. However, developing a robust and reliable deep learning model for computer-aided diagnosis remains highly challenging because of the high heterogeneity of medical images combined with the relative scarcity of training samples: annotation and labeling of medical images are far more expensive and time-consuming than in other applications and often require manual work by multiple domain experts. In this work, we propose a multi-stage, self-paced learning framework that uses a convolutional neural network (CNN) to classify Computed Tomography (CT) image patches. The key contribution of this approach is that we augment the training set by refining the unlabeled instances with a self-paced learning CNN. Implementing the framework on high-performance computing servers, including the NVIDIA DGX1 machine, our experiments show that the self-paced boosted network consistently outperformed the original network even with very scarce manual labels. This performance gain indicates that applications with limited training samples, such as medical image analysis, can benefit from the proposed framework.
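
The key step of the framework, as described above, is an iterative loop in which the current classifier pseudo-labels the unlabeled patches it is most confident about and folds them back into the training set. The following is a minimal sketch of that loop only, assuming a Python/scikit-learn setting: LogisticRegression stands in for the CT patch CNN, and the function name, round count, and confidence threshold are illustrative placeholders, not values from the paper.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def self_paced_augmentation(X_labeled, y_labeled, X_unlabeled,
                                n_rounds=3, confidence_threshold=0.95):
        """Iteratively promote confidently classified unlabeled patches into the training set."""
        model = LogisticRegression(max_iter=1000)   # stand-in for the patch CNN (assumption)
        X_train, y_train = X_labeled.copy(), y_labeled.copy()
        pool = X_unlabeled.copy()

        for _ in range(n_rounds):
            model.fit(X_train, y_train)                 # (re-)train on the current labeled set
            if len(pool) == 0:
                break
            proba = model.predict_proba(pool)           # class probabilities for the unlabeled pool
            confidence = proba.max(axis=1)
            easy = confidence >= confidence_threshold   # "easy" samples the model is confident about
            if not easy.any():
                break
            # Self-paced step: pseudo-label the easy samples and add them to the training set.
            pseudo_labels = model.classes_[proba[easy].argmax(axis=1)]
            X_train = np.vstack([X_train, pool[easy]])
            y_train = np.concatenate([y_train, pseudo_labels])
            pool = pool[~easy]                          # shrink the remaining unlabeled pool

        return model

In the paper's setting the classifier is the patch CNN itself and sample selection follows the self-paced learning formulation; the sketch above only illustrates the data-augmentation flow.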

Keywords

Deep learning · Self-paced learning · Medical image analysis


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Xiang Li (1)
  • Aoxiao Zhong (2)
  • Ming Lin (3)
  • Ning Guo (1)
  • Mu Sun (4)
  • Arkadiusz Sitek (4)
  • Jieping Ye (3)
  • James Thrall (1)
  • Quanzheng Li (1)

  1. Massachusetts General Hospital, Boston, USA
  2. Zhejiang University, Hangzhou, China
  3. University of Michigan, Ann Arbor, USA
  4. Beijing Institute of Technology, Beijing, China
