Adversarial Regression Training for Visualizing the Progression of Chronic Obstructive Pulmonary Disease with Chest X-Rays

  • Ricardo Bigolin Lanfredi
  • Joyce D. Schroeder
  • Clement Vachet
  • Tolga Tasdizen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11769)


Knowledge of what spatial elements of medical images deep learning methods use as evidence is important for model interpretability, trustworthiness, and validation. Such techniques are lacking for models trained on regression tasks. We propose a method, called visualization for regression with a generative adversarial network (VR-GAN), that formulates adversarial training specifically for datasets whose regression target values characterize disease severity. We use a conditional generative adversarial network in which the generator learns to shift the output of a regressor by creating disease effect maps that are added to the original images. Meanwhile, the regressor is trained to predict the original regression value for the modified images. A model trained with this technique learns to visualize how the image would appear at different stages of the disease. We analyze our method on a dataset of chest x-rays associated with pulmonary function tests, which are used for diagnosing chronic obstructive pulmonary disease (COPD). For validation, we compute the difference between two registered x-rays of the same patient at different time points and correlate it with the generated disease effect map. The proposed method outperforms a technique based on classification and produces realistic-looking images, modifying them in ways consistent with what radiologists usually observe for this disease. Implementation code is available online.
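The adversarial objectives described above can be sketched with a toy, one-dimensional stand-in for the two models. This is only an illustration under stated assumptions: the linear "regressor" R(v) = w·v, the linear "generator" G(x, δ) = g·δ, the synthetic scalar "images", and the hand-derived gradients are all hypothetical simplifications and not the paper's actual architectures, losses, or data.

```python
import numpy as np

# Toy sketch of the VR-GAN training dynamics (hypothetical scalar
# stand-ins, not the paper's actual models or chest x-ray data).
rng = np.random.default_rng(0)

x = rng.normal(size=200)   # "images" (scalars)
y = 2.0 * x                # regression targets (disease severity), true slope 2

w = 0.1    # regressor:  R(v) = w * v
g = 0.0    # generator:  G(x, delta) = g * delta  (the "disease effect map")
lr = 0.01

for _ in range(2000):
    delta = rng.normal(size=x.shape)   # requested change in severity
    x_mod = x + g * delta              # image plus generated effect map

    # Generator objective (regressor frozen): make the regressor read
    # y + delta on the modified image.
    grad_g = np.mean(2.0 * (w * x_mod - (y + delta)) * w * delta)

    # Regressor objective (generator frozen): predict the ORIGINAL value y
    # for both real and modified images, resisting the generator's shift.
    grad_w = (np.mean(2.0 * (w * x - y) * x)
              + np.mean(2.0 * (w * x_mod - y) * x_mod))

    g -= lr * grad_g
    w -= lr * grad_w

# At the equilibrium of this toy game, adding the effect map shifts the
# regressor's prediction by roughly the requested delta (w * g near 1).
```

In this sketch the regressor's counter-objective is what creates the adversarial tension: the generator cannot simply leave the image unchanged, because it is scored on shifting the regressor's output, while the regressor keeps trying to undo that shift. In the paper, this tension is what pressures the generator toward image modifications that carry genuine disease evidence.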


Keywords: COPD · Chest x-ray · Regression interpretation · Visual attribution · Adversarial training · Disease effect · VR-GAN



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Scientific Computing and Imaging Institute, University of Utah, Salt Lake City, USA
  2. Department of Radiology and Imaging Sciences, University of Utah, Salt Lake City, USA
