Abstract
Generating realistic breast masses is an important task because large annotated breast mass databases are scarce. In this study, a novel framework for realistic breast mass generation has been devised that exploits the characteristics of the breast mass (i.e., the BIRADS category). For that purpose, the visual-semantic BIRADS description characterizing a breast mass is embedded into a deep network. The visual-semantic description is encoded together with image features and used to generate realistic masses that conform to that description. To verify the effectiveness of the proposed method, two public mammogram datasets were used. Qualitative and quantitative experimental results show that realistic breast masses can be generated according to the BIRADS category.
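The core idea described above, conditioning a generator on an embedded BIRADS visual-semantic description alongside a noise vector, can be sketched as follows. This is only an illustrative toy, not the paper's architecture: the embedding function, dimensions, and single linear "generator" layer are all hypothetical stand-ins for the deep text encoder and generative network the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper does not specify its architecture here.
NOISE_DIM, DESC_DIM, IMG_SIZE = 64, 32, 28

def embed_description(tokens, dim=DESC_DIM):
    """Toy embedding of BIRADS description keywords (e.g. mass shape,
    margin, category) into a fixed-length vector. Stands in for the
    learned text encoder described in the abstract."""
    vec = np.zeros(dim)
    for tok in tokens:
        vec[hash(tok) % dim] += 1.0
    return vec / max(len(tokens), 1)

# Toy "generator": one linear map from [noise ; description embedding]
# to a flattened image. A real model would be a deep (de)convolutional net
# trained adversarially.
W = rng.standard_normal((NOISE_DIM + DESC_DIM, IMG_SIZE * IMG_SIZE)) * 0.01

def generate_mass(tokens):
    z = rng.standard_normal(NOISE_DIM)          # random noise vector
    cond = np.concatenate([z, embed_description(tokens)])
    img = np.tanh(cond @ W)                     # pixel values in [-1, 1]
    return img.reshape(IMG_SIZE, IMG_SIZE)

img = generate_mass(["oval", "circumscribed", "category-3"])
print(img.shape)
```

The key design point the sketch illustrates is the concatenation of the description embedding with the noise input, so that different BIRADS descriptions steer the generator toward different mass appearances while the noise provides per-sample variation.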
Acknowledgements
This work was supported by Institute for Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-01779, A machine learning and statistical inference framework for explainable artificial intelligence).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Lee, H., Kim, S.T., Lee, JH., Ro, Y.M. (2019). Realistic Breast Mass Generation Through BIRADS Category. In: Shen, D., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2019. MICCAI 2019. Lecture Notes in Computer Science(), vol 11769. Springer, Cham. https://doi.org/10.1007/978-3-030-32226-7_78
DOI: https://doi.org/10.1007/978-3-030-32226-7_78
Print ISBN: 978-3-030-32225-0
Online ISBN: 978-3-030-32226-7