Deep Learning Approaches for Gynaecological Ultrasound Image Segmentation: A Radio-Frequency vs B-mode Comparison
Ovarian cancer is one of the pathologies with the worst prognosis in adult women, and its early diagnosis is very difficult. Clinical evaluation of gynaecological ultrasound images is performed visually and depends on the experience of the physician. Beyond this dependency on specialists, the malignancy of certain types of ovarian tumors cannot be asserted until their surgical removal. This work explores the use of ultrasound data for segmentation of the ovary and the ovarian follicles, using two different convolutional neural networks, a fully connected residual network and a U-Net, with binary and multi-class approaches. Five different types of ultrasound data (from beam-formed radio-frequency to brightness mode) were used as input. The best performance was obtained with B-mode data, for both ovary and follicle segmentation. No significant differences were found between the two convolutional neural networks. The multi-class approach was beneficial, as it provided the model with information on the spatial relation between the follicles and the ovary. This study demonstrates the suitability of combining convolutional neural networks with beam-formed radio-frequency data and with brightness-mode data for the segmentation of ovarian structures. Future steps involve the processing of pathological data and the investigation of biomarkers of pathological ovaries.
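The benefit of the multi-class approach described above stems from the anatomy: follicles lie inside the ovary, so a single label map encodes that spatial relation, which separate binary masks do not. A minimal sketch of this encoding, using hypothetical toy masks and label values (0 = background, 1 = ovary, 2 = follicle) that are assumptions for illustration, not the paper's actual pipeline:

```python
import numpy as np

# Hypothetical 8x8 binary ground-truth masks; the follicle region
# lies inside the ovary region, mimicking the anatomical nesting.
ovary = np.zeros((8, 8), dtype=np.uint8)
ovary[2:7, 2:7] = 1          # ovary occupies a central block
follicle = np.zeros((8, 8), dtype=np.uint8)
follicle[3:5, 3:5] = 1       # follicle nested within the ovary

def to_multiclass(ovary_mask, follicle_mask):
    """Combine two binary masks into one multi-class label map:
    0 = background, 1 = ovary tissue, 2 = follicle.
    Follicle labels overwrite ovary labels where the masks overlap,
    so the map explicitly encodes follicles as regions inside the ovary."""
    labels = np.zeros_like(ovary_mask)
    labels[ovary_mask == 1] = 1
    labels[follicle_mask == 1] = 2
    return labels

labels = to_multiclass(ovary, follicle)
print(np.unique(labels))  # three classes present: [0 1 2]
```

Training a network on such a joint label map lets the loss penalise, for example, follicle predictions outside the ovary, which two independent binary models cannot do.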
Keywords: B-mode ultrasound data · Beam-formed ultrasound data · Image segmentation · Neural networks · Ovarian cancer
This work is financed by National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia as part of project “UID/EEA/50014/2019”.