A comparison of regularization models for few-view CT image reconstruction

In this paper I analyse some regularization models for the reconstruction of X-ray Computed Tomography (CT) images from few-view projections. It is well known that the widely used, low-cost Filtered Back Projection method is not suitable for low-dose data, since it produces noisy images with artifacts. Iterative reconstruction methods, based on the discretization of the model, are preferred in this case. However, since the problem is ill-posed and admits infinitely many solutions, regularization is necessary to obtain a good reconstruction. Different iterative regularization methods have been proposed in the literature, but an organized comparison among them is not available. Here, some regularization approaches are compared in the case of few-view tomography by means of simulated projections from both a phantom and a real image.

One technique to reduce the radiation exposure is to lower the number of X-ray projections. In this case, the protocols are referred to as sparse tomography (or sparse-view, few-view tomography). Figure 1 shows a schematic of the acquisition and reconstruction process. The first column sketches the geometry of the acquisition process, where an X-ray source moves along a circular trajectory; the second column shows the set of acquired data (also called sinogram); the third column shows the reconstructions obtained with the Filtered Back Projection (FBP) method [2]. The first row represents the classical full-dose CT case, where the source spans the whole circular trajectory; the second row considers sparse-view full-angle tomography, where a reduced number of views is taken over the whole circular orbit. A different sparse-view geometry using few projections is limited-angle tomography (see the third row of Fig. 1). Here, a further reduction of X-ray scans is obtained by limiting the source trajectory to a C-shaped path, i.e. by restricting the 360-degree angular scanning interval to a range smaller than 180 degrees. In some tomographic applications the human anatomy does not allow a complete circular motion of the X-ray source, thus the use of a reduced range is mandatory and the resulting technique is called tomosynthesis. An example is breast imaging, where the breast is held stationary between the detector surface and the compression plate. The resulting scans are fast, but the sinogram is incomplete and the reconstruction process is quite difficult.
As clearly visible from Fig. 1, the traditional analytic FBP algorithm produces images with striking artifacts. Hence, iterative methods, which minimize a suitable objective function modelling the acquisition process and the noise, are commonly used in these cases [3] in place of FBP. However, since the reconstruction problem is ill-posed [2], it is necessary to introduce regularization.
The aim of this work is to compare different regularization models for CT image reconstruction. In the literature, several works adopting one of these models and showing its efficiency have been presented, but a fair comparison among them is missing.
The paper is organized as follows. In Sect. 2 some regularization models for image reconstruction are briefly described; in Sect. 3 numerical experiments using different regularization approaches for 2D image reconstruction from few views are presented and discussed; finally, in Sect. 4 we draw some conclusions.

Iterative regularized approach
Model-based iterative methods (for a possible classification, see [4] and the references therein) solve the linear system obtained from the discretization of both the object and the acquisition device (the detector). For simplicity, in the following we present the analysis for the 2D case; the extension to 3D is straightforward.
Suppose the discrete object has size n_x × n_y, the detector has n_p cells and the projections are collected from n_θ angles. The linear system

    A x = y                                                             (1)

represents the discretization of the Radon transform [5]: it relates the object x ∈ R^(n_x × n_y) to the sinogram y ∈ R^(n_p × n_θ), collecting all the projections, through the matrix A, called the projection matrix. In the few-view case n_p × n_θ < n_x × n_y, and the linear system (1) admits infinitely many solutions.
To overcome this problem, the linear system is usually replaced by a least-squares problem:

    min_x ||A x - y||_2^2.                                              (2)

Moreover, since Eq. (1) is the discretization of a mildly ill-posed problem (the integral Radon equation), the solution of (2) is sensitive to the noise present in the data and regularization is needed [2].
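To make this point concrete, the following minimal numpy sketch builds a toy ill-conditioned system standing in for the discretized Radon operator (the matrix construction and the noise level are purely illustrative assumptions) and shows how the plain least-squares solution of (2) is corrupted by even a small amount of noise on the data.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
m, n = 240, 240

# Toy ill-conditioned matrix with rapidly decaying singular values,
# mimicking the mild ill-posedness of the discretized Radon transform.
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(10.0 ** np.linspace(0, -6, n)) @ V.T

x_true = rng.random(n)
y_clean = A @ x_true
y_noisy = y_clean + 1e-4 * rng.standard_normal(m)   # small Gaussian noise

def rel_err(x):
    return np.linalg.norm(x - x_true) / np.linalg.norm(x_true)

x_ls_clean = lsqr(A, y_clean, atol=0, btol=0, iter_lim=5000)[0]
x_ls_noisy = lsqr(A, y_noisy, atol=0, btol=0, iter_lim=5000)[0]
print("relative error, noise-free data:", rel_err(x_ls_clean))
print("relative error, noisy data     :", rel_err(x_ls_noisy))
```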

Regularization by iteration
The simplest form of regularization is regularization by iteration [6]. The normal equations of (2) can be solved by a fast iterative method, such as the Conjugate Gradient Least-Squares (CGLS) algorithm, stopped after a few iterations, well before convergence [7]. Several criteria have been proposed to suitably stop the iterations before the noise enters the solution [8,9].
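The sketch below illustrates the semi-convergence behaviour that motivates early stopping, using the same kind of toy ill-conditioned system as above; scipy's lsqr (analytically equivalent to CGLS applied to the normal equations) is used here as a stand-in for CGLS, and the iteration counts and exact error values are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
m, n = 240, 240
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(10.0 ** np.linspace(0, -6, n)) @ V.T    # toy ill-conditioned system
x_true = rng.random(n)
y = A @ x_true + 1e-4 * rng.standard_normal(m)           # noisy data

# Semi-convergence: the reconstruction error typically decreases during the
# first iterations and then increases again as the noise enters the solution.
for k in (2, 8, 30, 200):
    x_k = lsqr(A, y, atol=0, btol=0, iter_lim=k)[0]
    err = np.linalg.norm(x_k - x_true) / np.linalg.norm(x_true)
    print(f"iterations = {k:4d}   relative error = {err:.3f}")
```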

Regularization by function
Another possible, and widely used, strategy is to employ regularization functions imposing prior information on the solution. The so-called model-based iterative methods compute the CT image by solving a constrained or unconstrained minimization problem involving a data-fit function F(x) and a regularization function R(x). Common choices for the data-fit function F(x) are the least-squares function, which models Gaussian white noise, and the Kullback-Leibler divergence, which models Poisson noise. In this work, we always consider the least-squares data-fit:

    F(x) = (1/2) ||A x - y||_2^2.                                       (3)

Concerning the regularization function R(x), several choices are possible. The unconstrained formulation of the minimization problem has the form:

    min_x F(x) + λ R(x),                                                (4)

where λ is the regularization parameter balancing the data-fit and regularization terms. A non-negativity constraint on the solution can also be added:

    min_{x ≥ 0} F(x) + λ R(x).                                          (5)

The most classical regularization approach is the so-called Tikhonov regularization, where

    R(x) = ||D x||_2^2                                                  (6)

and D is usually chosen as one of the following:
1. the identity operator I;
2. the discrete gradient operator G, usually computed with finite differences.
The Tikhonov minimization problem (4) can be re-written as the augmented least-squares problem:

    min_x ||Ã x - ỹ||_2^2,   with   Ã = [A; sqrt(λ) D],  ỹ = [y; 0],    (7)

where [ · ; · ] denotes vertical stacking. Problem (7) can be solved by applying CGLS to its normal equations.
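A minimal sketch of this augmented formulation is given below, with D chosen as the discrete gradient operator G built with finite differences; the function names and the use of scipy's lsqr in place of a hand-written CGLS are assumptions of this illustration, not details taken from the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def gradient_operator(ny, nx):
    """Sparse 2D finite-difference operator G: horizontal and vertical
    differences of a (ny, nx) image flattened in row-major order."""
    dh = sp.diags([-1.0, 1.0], [0, 1], shape=(nx - 1, nx))
    dv = sp.diags([-1.0, 1.0], [0, 1], shape=(ny - 1, ny))
    Gh = sp.kron(sp.eye(ny), dh)      # differences along each row
    Gv = sp.kron(dv, sp.eye(nx))      # differences along each column
    return sp.vstack([Gh, Gv]).tocsr()

def tikhonov_cgls(A, y, D, lam, n_iter=100):
    """Solve min_x ||A x - y||^2 + lam ||D x||^2 through the augmented
    system (7); lsqr is used here as a stand-in for CGLS."""
    A_tilde = sp.vstack([sp.csr_matrix(A), np.sqrt(lam) * D])
    y_tilde = np.concatenate([y, np.zeros(D.shape[0])])
    return lsqr(A_tilde, y_tilde, iter_lim=n_iter)[0]
```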
Widely used alternatives to the Tikhonov function are sparsity-promoting regularization functions, such as the l1 norm. In this case

    R(x) = ||x||_1,

and the resulting non-differentiable minimization problem (5) is solved here by the FISTA method [10].
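Below is a minimal FISTA sketch for the l1-regularized, non-negatively constrained least-squares problem; the function name, the fixed iteration budget and the use of the spectral norm of A as Lipschitz constant are illustrative assumptions.

```python
import numpy as np

def fista_l1(A, y, lam, n_iter=200, nonneg=True):
    """Minimal FISTA sketch for min_{x>=0} 0.5 ||A x - y||^2 + lam ||x||_1.
    The prox of the l1 term plus the non-negativity constraint reduces to a
    one-sided soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)
        u = z - grad / L
        if nonneg:
            x_new = np.maximum(u - lam / L, 0.0)
        else:
            x_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + (t - 1.0) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```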
Another sparsity-promoting regularizer is the Total Variation (TV), defined as:

    TV(x) = Σ_i ||(∇x)_i||_2,

where ∇(·) is the discrete gradient operator. The resulting minimization problem (5) is solved again by FISTA. Thanks to its edge-preserving properties and its effectiveness in removing noise, TV is the most widely used regularization function in medical CT imaging [11,12]. A smoothed version of TV is often considered, obtained by introducing a small positive parameter β which makes the TV function differentiable at zero:

    TV_β(x) = Σ_i sqrt( ||(∇x)_i||_2^2 + β^2 ).

In the numerical experiments we set β = 10^-3. When considering TV_β, the Scaled Gradient Projection (SGP) method is used to solve the differentiable problem (5) (see [12] for more details on the SGP method for CT image reconstruction).
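The following sketch shows how the smoothed TV and its gradient can be computed for a 2D image, together with a simplified (unscaled, fixed-step) gradient projection loop; the paper uses the full SGP method [12], so this is only an illustrative stand-in, and the step-size bound and function names are assumptions.

```python
import numpy as np

def tv_beta_grad(x, beta=1e-3):
    """Value and gradient of the smoothed TV of a 2D image
    (forward differences with a replicated boundary)."""
    dx = np.diff(x, axis=1, append=x[:, -1:])     # horizontal differences
    dy = np.diff(x, axis=0, append=x[-1:, :])     # vertical differences
    mag = np.sqrt(dx ** 2 + dy ** 2 + beta ** 2)
    px, py = dx / mag, dy / mag
    grad = -(px + py)                              # adjoint of the forward differences
    grad[:, 1:] += px[:, :-1]
    grad[1:, :] += py[:-1, :]
    return mag.sum(), grad

def gradient_projection(A, y, shape, lam, beta=1e-3, n_iter=200):
    """Simplified fixed-step gradient projection for
    min_{x>=0} 0.5 ||A x - y||^2 + lam * TV_beta(x)."""
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 8.0 * lam / beta)  # rough Lipschitz bound
    x = np.zeros(shape[0] * shape[1])
    for _ in range(n_iter):
        _, tv_g = tv_beta_grad(x.reshape(shape), beta)
        grad = A.T @ (A @ x - y) + lam * tv_g.ravel()
        x = np.maximum(x - step * grad, 0.0)       # projection onto the non-negative orthant
    return x.reshape(shape)
```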

Numerical experiments
Some numerical tests have been performed on both a phantom from the Tomophantom package [13] and a real image from the Mayo Clinic data set [14], shown in Fig. 2. For the implementation of the forward and backward projection operators, and of some of the model-based iterative reconstruction methods, we have used the Python version of the Core Imaging Library (CIL) [15] (www.ccpi.ac.uk/cil).
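For readers without access to CIL, a comparable few-view experiment can be reproduced with scikit-image, as sketched below; this is not the paper's pipeline, and the Shepp-Logan phantom, the noise level and the reconstruction filter are illustrative assumptions.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize

# Illustrative stand-in for the CIL pipeline: simulate 60 equally spaced
# views in [0, 180) degrees and reconstruct with FBP.
image = resize(shepp_logan_phantom(), (256, 256))
theta = np.linspace(0.0, 180.0, 60, endpoint=False)

sinogram = radon(image, theta=theta)               # forward projection
sinogram += 0.01 * sinogram.max() * np.random.default_rng(0).standard_normal(sinogram.shape)

fbp = iradon(sinogram, theta=theta, filter_name='ramp')   # few-view FBP: streak artifacts expected
```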
We compare the traditional analytic FBP method with the following regularized iterative approaches:
• the CGLS method, stopped after a few (8) iterations (hence acting as regularization by iteration) and near convergence (at 100 iterations); in the following we refer to these as CGLS(8) and CGLS(100), respectively;
• Tikhonov regularization with D = G, denoted Tik(G);
• l1 regularization, denoted L1;
• Total Variation regularization, denoted TV;
• smoothed Total Variation regularization TV_β, denoted TV(s).
We evaluate the reconstructions obtained by the different methods by computing the Mean Squared Error (MSE), Mean Absolute Error (MAE) and Peak Signal-to-Noise Ratio (PSNR) metrics, defined as:

    MSE = (1/N) Σ_i (x̂_i - x_i)^2,   MAE = (1/N) Σ_i |x̂_i - x_i|,   PSNR = 10 log_10( max(x)^2 / MSE ),

where N = n_x × n_y is the number of pixels, x̂ is the reconstructed image and x is the ground truth image. Using the forward projector available in CIL, we simulate the projections considering 60 angles equally spaced in [0, 180] degrees. In all the experiments reported in this section, the regularization parameter λ is heuristically chosen by trial and error so as to obtain the lowest possible MSE value.
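A direct numpy transcription of these metrics is sketched below; the peak value used in the PSNR is assumed here to be the maximum of the ground truth image.

```python
import numpy as np

def mse(x_rec, x_true):
    """Mean Squared Error between reconstruction and ground truth."""
    return np.mean((x_rec - x_true) ** 2)

def mae(x_rec, x_true):
    """Mean Absolute Error."""
    return np.mean(np.abs(x_rec - x_true))

def psnr(x_rec, x_true):
    """Peak Signal-to-Noise Ratio, with the peak taken as max of the ground truth."""
    return 10.0 * np.log10(x_true.max() ** 2 / mse(x_rec, x_true))
```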

Tests on the phantom image
The first test problem consists in reconstructing the phantom image from 60 noise-free projections. Figure 3 shows some of the reconstructions. The FBP image (top left) is clearly corrupted by streaking artifacts, due to the sparse views and the lack of data; the artifacts are reduced, but still visible, in the CGLS(100) image (top center). Since we are considering noise-free data in this test, stopping CGLS early is not crucial here.
In Fig. 4 we report some plots of a single row of the image. In all the plots, the blue line represents the ground truth. In the top left panel, the ground truth signal is compared with the FBP one, which is highly oscillating, as expected. In the top right panel, we can see the results obtained with CGLS(8) (orange line), CGLS(100) (green line) and Tik(G) (red line). Tik(G) appears smoother than CGLS. The bottom right plot confirms that TV regularization is outstanding in recovering the true signal and removing noise.
In the second test problem, white Gaussian noise with variance 0.01 is added to the previously tested phantom sinogram. Figure 5 contains some reconstructions. When the data are affected by noise we can appreciate the difference between the CGLS(8) (top left) and CGLS(100) (top center) reconstructions. The noise on the projections produces semi-convergence and, if CGLS is not properly stopped, it also affects the solution, as visible in the CGLS(100) image. The CGLS(8) image is less noisy, but slightly blurred. The Tik(G) reconstruction (top right) is quite blurred and artifacts are visible in the flat regions. We can infer that the 2-norm regularization is not effective in recovering the object contours from noisy data. In the L1 image (bottom left) the circles have sharp edges, but the noise has not been completely removed. The TV method (bottom center) provides a reconstruction with high contrast and no noise, but the image looks cartoon-like: the objects are overly flat and homogeneous. This is a well-known unwanted effect of the TV function. Finally, the TV(s) image visually appears as the most accurate reconstruction, similar to the TV one but less blocky.
Table 1 summarizes the results of these two experiments, reporting the values of the MSE, MAE and PSNR metrics for all the reconstructions obtained on the phantom test image. The first three columns refer to noise-free data and the last three columns to noisy sinograms. They are in agreement with the visual results reported in Figs. 3 and 4 and confirm that the TV-based iterative reconstructions (in particular the TV(s) model) outperform the others for all the considered metrics.

Tests on a chest CT image
The last test problem is created from the real CT image of a chest shown in Fig. 2 by simulating 60 projections in [0, 180] degrees and adding white noise with variance 0.005. As clearly visible from Fig. 2, the CT image of the chest is quite different from the previously considered phantom: it presents many small details over a dark background and smooth regions along the borders of the lungs. Figure 6 shows some reconstructions obtained with the considered methods. In the first row, the FBP image (on the left) is clearly damaged by noise; the L1 image (on the right) appears well contrasted, with neat object contours. Some speckle-like noise is still visible, but it does not alter the objects' shape. In the second row, the TV image (on the left) appears smooth and the details inside the lungs are noticeable; however, the dark regions look spotty, as an effect of the TV function. Also in this test problem, the TV(s) image (bottom right) appears to be the best reconstruction.
The plots in Fig. 7 confirm the considerations drawn from the images. TV regularization (bottom) suppresses the noise but cannot reproduce the peaks, whereas the L1 reconstruction (top right) is a bit noisier but follows the fast changes of the blue line (which represents the ground truth in all the plots). The other reconstructions, CGLS and Tikhonov (top left), are not very accurate.
Finally, Table 2 reports the MSE, MAE and PSNR obtained in this experiment. The best values (highlighted in bold) are again obtained with the TV(s) regularization.

Conclusions
In this paper I have investigated the effects of regularization in the reconstruction of X-ray CT images from few views. Due to the lack of data, the linear system deriving from the model discretization has infinitely many solutions. Regularization is necessary not only to suppress noise (as in all ill-posed inverse problems), but also to choose a suitable solution among the many possible ones.
The most common regularization approaches have been analysed: regularization by iteration and regularization by function, with Tikhonov regularization, Total Variation and the L1 norm. The results obtained on 2D test problems with simulated projections from a sparse geometry show that the images reconstructed by the various regularization approaches are quite different, hence a correct choice of the regularization is of primary importance to obtain an image with the desired features. For example, TV regularization produces highly contrasted images where the noise is almost completely absent, but the objects are flat and cartoon-like. By contrast, the images reconstructed with L1 regularization are slightly noisy, but well contrasted and more realistic. The smoothed TV regularizer is a good compromise between TV and L1: it produces images with very little residual noise and well-contrasted objects. A further important consideration regards computational time. The reconstructions obtained with regularization by iteration (i.e., by stopping the iterative method after very few iterations) are quite blurred, but they are computed in much less time than with the other methods. This is essential in real applications where the time available for the CT exam is limited.
To complete the analysis, in the future I intend to consider 3D experiments and, possibly, real data. Other regularization techniques will also be tested, such as Generalized Total Variation or nonlocal means filtering, to cite only a few. Finally, the role of the regularization parameter will be investigated.