Automatic Detection of Tumor Budding in Colorectal Carcinoma with Deep Learning
Colorectal cancer patients would benefit from valid, reliable, and efficient detection of tumor budding (TB), a proven prognostic biomarker. We explored the application of deep learning to detect TB in Hematoxylin and Eosin (H&E) stained slides, using convolutional neural networks to classify image patches as tumor buds, tumor glands, or background. As a reference standard for training, we stained slides with both H&E and immunohistochemistry (IHC): one pathologist first annotated buds in IHC and then transferred the annotations to the corresponding H&E image. We show the effectiveness of the proposed three-class approach, which substantially reduces the number of false positives, especially when combined with a hard-negative mining technique. Finally, we report the results of an observer study investigating the agreement between pathologists in detecting TB in IHC and H&E.
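The hard-negative mining step mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the network outputs a per-patch probability for the tumor-bud class, and selects the background patches the model most confidently mistakes for buds, so they can be added back into the training set. All function and variable names here are hypothetical.

```python
import numpy as np

def mine_hard_negatives(bud_probs, labels, k, bg_class=2):
    """Return indices of the k background patches the model most
    confidently mistakes for tumor buds (hard negatives).

    bud_probs: (N,) predicted probability of the 'tumor bud' class
    labels:    (N,) ground-truth class per patch
    bg_class:  integer label of the background class
    """
    bg_idx = np.flatnonzero(labels == bg_class)   # true background patches
    order = np.argsort(-bud_probs[bg_idx])        # most bud-like first
    return bg_idx[order[:k]]

# Toy example with 6 patches; classes: 0 = bud, 1 = gland, 2 = background.
probs = np.array([0.9, 0.1, 0.8, 0.05, 0.6, 0.3])
labels = np.array([0, 1, 2, 2, 2, 2])
hard = mine_hard_negatives(probs, labels, k=2)
# Patches 2 and 4 are background but scored highly as buds.
```

The selected patches would then be appended to the negative training pool before the next round of fine-tuning, which is the standard way hard-negative mining curbs false positives.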
Keywords: Deep learning · Computational pathology · Colorectal carcinoma · Tumor budding
This project was funded by a research grant from the Dutch Cancer Society, project number 10602/2016-2. The authors would like to thank Irene Otte-Holler and Rob van de Loo for staining and scanning the WSIs.