
Attention-Based Multi-scale Gated Recurrent Encoder with Novel Correlation Loss for COVID-19 Progression Prediction

  • Conference paper
  • First Online:
Medical Image Computing and Computer Assisted Intervention – MICCAI 2021 (MICCAI 2021)

Abstract

COVID-19 image analysis has mostly focused on diagnostic tasks using single timepoint scans acquired upon disease presentation or admission. We present a deep learning-based approach to predict lung infiltrate progression from serial chest radiographs (CXRs) of COVID-19 patients. Our method first utilizes convolutional neural networks (CNNs) for feature extraction from patches within the concerned lung zone, and also from neighboring and remote boundary regions. The framework further incorporates a multi-scale Gated Recurrent Unit (GRU) with a correlation module for effective predictions. The GRU accepts CNN feature vectors from three different areas as input and generates a fused representation. The correlation module attempts to minimize the correlation loss between the hidden representations of the concerned and neighboring area feature vectors, while maximizing it between those of the concerned and remote regions. Further, we employ an attention module over the output hidden states of each encoder timepoint to generate a context vector. This vector is used as an input to a decoder module to predict patch severity grades at a future timepoint. Finally, we ensemble the patch classification scores to calculate patient-wise grades. Specifically, our framework predicts zone-wise disease severity for a patient on a given day by learning representations from the previous temporal CXRs. Our novel multi-institutional dataset comprises sequential CXR scans from N = 93 patients. Our approach outperforms transfer learning and radiomic feature-based baseline approaches on this dataset.
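The paper's implementation is not reproduced on this page, so the following is a minimal PyTorch-style sketch of two components named in the abstract, under stated assumptions: the correlation is taken to be a Pearson-style correlation between paired hidden vectors (so the combined objective rewards agreement between the concerned and neighboring zones and penalizes agreement between the concerned and remote zones), and the attention is a simplified additive (Bahdanau-style) pooling that scores each encoder timepoint without conditioning on a decoder state. The names CorrelationLoss, pearson_corr, and AdditiveAttention are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn


def pearson_corr(a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Mean Pearson correlation between paired rows of two (batch, dim) tensors."""
    a = a - a.mean(dim=1, keepdim=True)
    b = b - b.mean(dim=1, keepdim=True)
    return ((a * b).sum(dim=1) / (a.norm(dim=1) * b.norm(dim=1) + eps)).mean()


class CorrelationLoss(nn.Module):
    """Illustrative correlation objective: pull the concerned-zone representation
    toward the neighboring-zone one and push it away from the remote-zone one."""

    def __init__(self, weight: float = 1.0):
        super().__init__()
        self.weight = weight

    def forward(self, h_concerned, h_neighbor, h_remote):
        # Lower loss when concerned/neighboring agree; higher when concerned/remote agree.
        return self.weight * (-pearson_corr(h_concerned, h_neighbor)
                              + pearson_corr(h_concerned, h_remote))


class AdditiveAttention(nn.Module):
    """Simplified additive attention over encoder hidden states (one per timepoint),
    returning a single context vector for a downstream decoder."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, encoder_states):  # encoder_states: (batch, timepoints, hidden_dim)
        weights = torch.softmax(self.score(encoder_states).squeeze(-1), dim=1)
        context = (weights.unsqueeze(-1) * encoder_states).sum(dim=1)
        return context, weights  # (batch, hidden_dim), (batch, timepoints)


# Toy example: three zone representations and a short three-timepoint CXR sequence.
h_c, h_n, h_r = (torch.randn(4, 128) for _ in range(3))
corr_loss = CorrelationLoss()(h_c, h_n, h_r)
context, attn = AdditiveAttention(128)(torch.randn(4, 3, 128))
```

Presumably this correlation term is trained jointly with the severity-prediction objective described in the abstract; the exact loss weighting and architecture details are in the full text.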



Acknowledgments

The reported research was supported by 2020 OVPR and IEDM seed grants at Stony Brook University, NIGMS T32GM008444, and NIH 75N92020D00021 (subcontract). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information


Corresponding author

Correspondence to Prateek Prasanna.


Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (PDF 119 KB)


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Konwer, A. et al. (2021). Attention-Based Multi-scale Gated Recurrent Encoder with Novel Correlation Loss for COVID-19 Progression Prediction. In: de Bruijne, M., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2021. MICCAI 2021. Lecture Notes in Computer Science, vol. 12905. Springer, Cham. https://doi.org/10.1007/978-3-030-87240-3_79


  • DOI: https://doi.org/10.1007/978-3-030-87240-3_79

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-87239-7

  • Online ISBN: 978-3-030-87240-3

  • eBook Packages: Computer Science, Computer Science (R0)
