
TUNA-Net: Task-Oriented UNsupervised Adversarial Network for Disease Recognition in Cross-domain Chest X-rays

  • Yuxing Tang (corresponding author)
  • Youbao Tang
  • Veit Sandfort
  • Jing Xiao
  • Ronald M. Summers
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11769)

Abstract

In this work, we address the problem of unsupervised domain adaptation for radiology image interpretation across domains. Specifically, we study how to adapt a disease recognition model from a labeled source domain to an unlabeled target domain, so as to reduce the effort of labeling each new dataset. To address the shortcoming of cross-domain, unpaired image-to-image translation methods, which typically ignore class-specific semantics, we propose a task-driven, discriminatively trained, cycle-consistent generative adversarial network, termed TUNA-Net. It preserves (1) low-level details, (2) high-level semantic information, and (3) mid-level feature representations during the image-to-image translation process, in favor of the target disease recognition task. The TUNA-Net framework is general and can be readily adapted to other learning tasks. We evaluate the proposed framework on two public chest X-ray datasets for pneumonia recognition. The TUNA-Net model can adapt labeled adult chest X-rays in the source domain such that they appear as if they were drawn from the pediatric X-rays of the unlabeled target domain, while preserving the disease semantics. Extensive experiments show the superiority of the proposed method compared to state-of-the-art unsupervised domain adaptation approaches. Notably, TUNA-Net achieves an AUC of 96.3% for pediatric pneumonia classification, which is very close to that of the supervised approach (98.1%), but without requiring labels on the target domain.
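To make the described objective concrete, the following is a minimal, hypothetical PyTorch-style sketch of a TUNA-Net-like generator objective: a CycleGAN-style adversarial term and cycle-consistency term (low-level details) combined with a task classification loss on translated images (high-level semantics). All module names, arguments, and loss weights below are illustrative assumptions, not the authors' implementation, and the mid-level feature-consistency term from the paper is omitted for brevity.

```python
# Hedged sketch of a task-driven, cycle-consistent adaptation objective.
# G_st / G_ts: source->target and target->source generators (assumed nn.Modules)
# D_t: target-domain discriminator; C: disease classifier trained on source labels
import torch
import torch.nn as nn

adv_loss = nn.MSELoss()              # least-squares GAN loss (assumption)
cycle_loss = nn.L1Loss()             # low-level (pixel) cycle consistency
task_loss = nn.BCEWithLogitsLoss()   # binary pneumonia classification

def generator_objective(G_st, G_ts, D_t, C, x_s, y_s,
                        lam_cycle=10.0, lam_task=1.0):
    """One generator-side term for source->target adaptation (illustrative)."""
    fake_t = G_st(x_s)          # translate a labeled source X-ray to target style
    rec_s = G_ts(fake_t)        # cycle the translated image back to the source domain

    # Adversarial term: translated image should fool the target-domain discriminator.
    pred_fake = D_t(fake_t)
    l_adv = adv_loss(pred_fake, torch.ones_like(pred_fake))

    # Cycle-consistency term: reconstruction should match the original input.
    l_cyc = cycle_loss(rec_s, x_s)

    # Task term: translated image must still carry the source disease label.
    l_task = task_loss(C(fake_t), y_s)

    return l_adv + lam_cycle * l_cyc + lam_task * l_task
```

In this reading, the task loss is what makes the translation "task-oriented": it penalizes translations that change the disease semantics, even if they look realistic to the discriminator.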


Acknowledgments

This research was supported by the Intramural Research Program of the National Institutes of Health Clinical Center and by the Ping An Technology Co., Ltd. through a Cooperative Research and Development Agreement. The authors thank NVIDIA for GPU donations.

Supplementary material

Supplementary material 1: 490281_1_En_48_MOESM1_ESM.pdf (740 KB)


Copyright information

© 2019. This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply.

Authors and Affiliations

  • Yuxing Tang (1, corresponding author)
  • Youbao Tang (1)
  • Veit Sandfort (1)
  • Jing Xiao (2)
  • Ronald M. Summers (1)
  1. National Institutes of Health Clinical Center, Bethesda, USA
  2. Ping An Technology Co., Ltd., Shenzhen, China
