
Semi-automatic Brain Tumor Segmentation by Drawing Long Axes on Multi-plane Reformat

  • David Gering
  • Kay Sun
  • Aaron Avery
  • Roger Chylla
  • Ajeet Vivekanandan
  • Lisa Kohli
  • Haley Knapp
  • Brad Paschke
  • Brett Young-Moxon
  • Nik King
  • Thomas Mackie
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11384)

Abstract

A semi-automatic image segmentation method, called SAMBAS, based on a workflow familiar to clinical radiologists is described. The user initializes 3D segmentation by drawing a long axis on a multi-plane reformat (MPR). As the user draws, a 2D segmentation updates in real time for interactive feedback. When necessary, additional long axes, short axes, or other editing operations may be drawn on one or more MPR planes. The method learns probability distributions from the drawing to perform the MPR segmentation, and in turn, it learns from the MPR segmentation to perform the 3D segmentation. As a preliminary experiment, a batch simulation was performed in which long and short axes were automatically drawn on each of 285 multispectral MR brain scans of glioma patients in the 2018 BraTS Challenge training data. The average Dice coefficient for tumor core was 0.86, and the Hausdorff-95% distance was 4.4 mm. In another experiment, a convolutional neural network was trained on the same data and applied to the BraTS validation and test data. Its outputs, computed offline, were integrated into the interactive method. Ten volunteers used the interface on the BraTS validation and test data. On the 66 scans of the validation data, the average Dice coefficient for tumor core improved from 0.76 with deep learning alone to 0.82 with the interactive system.
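
To make the seeding step concrete, the sketch below (Python/NumPy) shows one plausible way to learn intensity distributions from a user-drawn long axis and produce a 2D segmentation on a single MPR plane. It is a minimal illustration under stated assumptions, not the authors' SAMBAS implementation: the function name, the Gaussian intensity model, the margin parameter, and the background-sampling band are all assumptions made for illustration.

```python
# Illustrative sketch only: seed a 2D segmentation from a user-drawn long axis
# by fitting Gaussian intensity models to "inside" samples (along the axis)
# and "outside" samples (an annulus beyond the axis extent). This is NOT the
# published SAMBAS algorithm; names and thresholds are assumptions.
import numpy as np


def segment_from_long_axis(slice2d, p0, p1, margin=1.5):
    """slice2d: 2D float array (one MPR plane); p0, p1: (row, col) axis endpoints.

    Assumes the endpoints and the background band lie inside the image.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    length = np.linalg.norm(p1 - p0)

    # Sample intensities along the long axis -> foreground (tumor) samples.
    t = np.linspace(0.0, 1.0, max(int(length), 2))
    pts = p0[None, :] + t[:, None] * (p1 - p0)[None, :]
    rr, cc = np.round(pts[:, 0]).astype(int), np.round(pts[:, 1]).astype(int)
    fg = slice2d[rr, cc]

    # Distance of every pixel from the axis midpoint; pixels well beyond the
    # axis half-length are treated as background samples.
    center = (p0 + p1) / 2.0
    rows, cols = np.indices(slice2d.shape)
    dist = np.hypot(rows - center[0], cols - center[1])
    bg = slice2d[(dist > margin * length / 2.0) & (dist < (margin + 0.5) * length)]

    # Fit simple Gaussian intensity models to each class.
    mu_f, sd_f = fg.mean(), fg.std() + 1e-6
    mu_b, sd_b = bg.mean(), bg.std() + 1e-6

    # Classify pixels inside a disc bounded by the axis length using the
    # log-likelihood ratio of the two Gaussians (constants cancel).
    llr = (-((slice2d - mu_f) ** 2) / (2 * sd_f ** 2) - np.log(sd_f)) \
        - (-((slice2d - mu_b) ** 2) / (2 * sd_b ** 2) - np.log(sd_b))
    return (llr > 0) & (dist <= margin * length / 2.0)
```

In the described workflow, such per-plane estimates would be refined as the user adds long axes, short axes, or edits, and the MPR result would in turn drive the 3D segmentation; the sketch covers only the initial 2D seeding on one plane.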

Keywords

Brain tumor · Image segmentation · Semi-automatic · Machine learning

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • David Gering
  • Kay Sun
  • Aaron Avery
  • Roger Chylla
  • Ajeet Vivekanandan
  • Lisa Kohli
  • Haley Knapp
  • Brad Paschke
  • Brett Young-Moxon
  • Nik King
  • Thomas Mackie

  1. HealthMyne, Madison, USA
