Robust extraction for low-contrast liver tumors using modified adaptive likelihood estimation
Liver tumor extraction is essential for the planning and treatment of liver ablation surgery. For accurate and robust tumor segmentation, we propose a semiautomatic method based on adaptive likelihood classification with a modified likelihood model.
First, a minimal ellipse (or quasi-ellipsoid) enclosing the liver tumor is generated for initialization. Then, a hybrid intensity likelihood modification based on nonparametric density estimation is proposed to enhance local likelihood contrast and reduce its inhomogeneity. A prior elliptical (or quasi-ellipsoidal) shape constraint is integrated directly into the likelihood to further prevent the algorithm from leaking into adjacent tissues of similar intensity. Finally, an adaptive likelihood classification is proposed for accurate segmentation of tumors with low contrast, high noise, or heterogeneous densities.
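The abstract does not give the exact formulation, but the pipeline can be illustrated with a minimal sketch: kernel density estimates of tumor and background intensities serve as nonparametric likelihoods, an elliptical prior built from the initial selection suppresses leakage, and voxels are classified by a shape-weighted likelihood ratio. All names and parameters here (`classify_tumor`, `shape_weight`, the soft prior falloff) are illustrative assumptions, and the authors' adaptive classification step is simplified to a fixed likelihood-ratio test.

```python
import numpy as np
from scipy.stats import gaussian_kde

def elliptical_prior(shape, center, radii):
    """Soft elliptical prior: 1 inside the ellipse, decaying outside
    (an assumed form, not the paper's exact constraint)."""
    grid = np.indices(shape).astype(float)
    d2 = sum(((grid[i] - center[i]) / radii[i]) ** 2 for i in range(len(shape)))
    return np.exp(-np.maximum(d2 - 1.0, 0.0))

def classify_tumor(image, seed_mask, shape_weight=2.0):
    """Classify voxels via KDE intensity likelihoods modulated by an
    elliptical prior derived from the initial selection (seed_mask)."""
    tumor_int = image[seed_mask]        # intensities inside the initial ellipse
    bg_int = image[~seed_mask][::100]   # subsampled background, for tractability
    p_tumor = gaussian_kde(tumor_int)(image.ravel()).reshape(image.shape)
    p_bg = gaussian_kde(bg_int)(image.ravel()).reshape(image.shape)
    coords = np.array(np.nonzero(seed_mask), dtype=float)
    center = coords.mean(axis=1)
    radii = (coords.max(axis=1) - coords.min(axis=1)) / 2.0 + 1e-6
    prior = elliptical_prior(image.shape, center, radii) ** shape_weight
    # Keep voxels where the shape-weighted tumor likelihood dominates
    return p_tumor * prior > p_bg
```

On a 2-D slice, `classify_tumor(slice_hu, ellipse_mask)` returns a binary tumor mask; the same code runs on 3-D volumes, since the prior is constructed per axis.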
Experiments were performed on the 3Dircadb and LiTS datasets, where the average volumetric overlap errors were 27.05% and 35.72%, respectively. The algorithm's robustness was validated by comparing the results of five operators making multiple initial selections on different tumors.
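For context, the volumetric overlap error used above is the standard complement of the Jaccard index between the segmented and reference masks; a minimal sketch:

```python
import numpy as np

def volumetric_overlap_error(pred, truth):
    """VOE = 1 - |A intersect B| / |A union B|, in percent; lower is better."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return 100.0 * (1.0 - inter / union)
```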
The proposed method achieved good results on a variety of tumors, including low-contrast tumors with blurred boundaries, and reliable results were obtained across different initializations by different operators.
Keywords: Liver tumor segmentation · Hybrid intensity likelihood modification · Shape constraint modification · Adaptive likelihood classification
The authors thank Julia Wu from MIT for English correction and acknowledge the support of the National Natural Science Foundation of China (Grant Number 81471759).
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed consent
For this type of study, formal consent is not required.