Abstract
A general class of variational models with concave priors is considered for obtaining certain sparse solutions, for which the nonsmoothness and non-Lipschitz continuity of the objective functions pose significant challenges from both an analytical and a numerical point of view. For computing a stationary point of the underlying variational problem, a Newton-type scheme with provable convergence properties is proposed. The possible non-positive definiteness of the generalized Hessian is handled by a tailored regularization technique, which is motivated by reweighting as well as the classical trust-region method. Our numerical experiments demonstrate selected applications in image processing, support vector machines, and optimal control of partial differential equations.
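The abstract's core idea can be illustrated with a minimal sketch: a Newton iteration on a smoothed ℓ2–ℓp objective whose concave prior can make the generalized Hessian indefinite, cured by a trust-region-style spectral shift. This is an illustrative toy under stated assumptions (an ε-smoothed prior, a simple eigenvalue shift, Armijo backtracking), not the authors' R-regularized algorithm; all names and parameter values here are hypothetical.

```python
import numpy as np

def smoothed_objective(x, A, b, mu, p, eps):
    """0.5*||Ax - b||^2 + mu * sum_i (x_i^2 + eps)^(p/2), a smoothed
    concave-prior model (illustrative stand-in for the paper's class)."""
    r = A @ x - b
    return 0.5 * r @ r + mu * np.sum((x**2 + eps) ** (p / 2))

def regularized_newton(A, b, mu=0.1, p=0.5, eps=1e-4, tol=1e-10, max_iter=100):
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(max_iter):
        r = A @ x - b
        w = (x**2 + eps) ** (p / 2 - 1)
        grad = A.T @ r + mu * p * x * w
        # Second derivative of the prior term; it can be negative because
        # the prior is concave in |x_i|, so the Hessian may be indefinite.
        d2 = mu * p * (w + (p - 2) * x**2 * (x**2 + eps) ** (p / 2 - 2))
        H = A.T @ A + np.diag(d2)
        # Trust-region-style regularization: shift the spectrum so the
        # smallest eigenvalue is safely positive before solving.
        lam_min = np.linalg.eigvalsh(H)[0]
        if lam_min < 1e-8:
            H = H + (1e-8 - lam_min) * np.eye(n)
        step = np.linalg.solve(H, grad)
        # Armijo backtracking to guarantee descent on the nonconvex objective.
        f0, t = smoothed_objective(x, A, b, mu, p, eps), 1.0
        while (smoothed_objective(x - t * step, A, b, mu, p, eps)
               > f0 - 1e-4 * t * grad @ step and t > 1e-12):
            t *= 0.5
        x = x - t * step
        if np.linalg.norm(t * step) < tol:
            break
    return x
```

On a toy problem the iteration drives small coefficients toward zero while retaining large ones, which is the sparsity-promoting effect of the concave prior; the spectral shift plays the role that the paper's tailored regularization plays for the generalized Hessian.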
References
ℓ_1-magic. http://users.ece.gatech.edu/~justin/l1magic/
Attouch, H., Buttazzo, G., Michaille, G.: Variational Analysis in Sobolev and BV Spaces: Applications to PDEs and Optimization. SIAM, Philadelphia (2006)
Aubert, G., Kornprobst, P.: Mathematical Problems in Image Processing. Springer, New York (2002)
Bruckstein, A.M., Donoho, D.L., Elad, M.: From sparse solutions of systems of equations to sparse modeling of signals and images. SIAM Rev. 51, 34–81 (2009)
Burke, J.V., Lewis, A.S., Overton, M.L.: A robust gradient sampling algorithm for nonsmooth, nonconvex optimization. SIAM J. Optim. 15, 751–779 (2005)
Cai, J.F., Osher, S., Shen, Z.: Split Bregman methods and frame based image restoration. Multiscale Model. Simul. 8, 337–369 (2009)
Candès, E.J., Tao, T.: Near optimal signal recovery from random projections: universal encoding strategies? IEEE Trans. Inf. Theory 52, 5406–5425 (2006)
Chan, T.F., Mulet, P.: On the convergence of the lagged diffusivity fixed point method in total variation image restoration. SIAM J. Numer. Anal. 36, 354–367 (1999)
Chapelle, O.: Training a support vector machine in the primal. Neural Comput. 19, 1155–1178 (2007)
Charbonnier, P., Blanc-Féraud, L., Aubert, G., Barlaud, M.: Deterministic edge-preserving regularization in computed imaging. IEEE Trans. Image Process. 6, 298–311 (1997)
Chartrand, R., Yin, W.: Iteratively reweighted algorithms for compressive sensing. In: Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 3869–3872 (2008)
Chen, X.: Smoothing methods for nonsmooth, nonconvex minimization. Math. Program., Ser. B 134, 71–99 (2012)
Chen, S.S., Donoho, D.L., Saunders, M.A.: Atomic decomposition by basis pursuit. SIAM Rev. 43, 129–159 (2001)
Chen, X., Xu, F., Ye, Y.: Lower bound theory of nonzero entries in solutions of ℓ2–ℓp minimization. SIAM J. Sci. Comput. 32, 2832–2852 (2010)
Clason, C., Kunisch, K.: A measure space approach to optimal source placement. Comput. Optim. Appl. 53, 155–171 (2012)
Conn, A.R., Gould, N.I.M., Toint, P.L.: Trust-Region Methods. SIAM, Philadelphia (2000)
Daubechies, I., DeVore, R., Fornasier, M., Güntürk, C.: Iteratively reweighted least squares minimization for sparse recovery. Commun. Pure Appl. Math. 63, 1–38 (2010)
Dennis, J.E. Jr., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. SIAM, Philadelphia (1996)
Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its oracle properties. J. Am. Stat. Assoc. 96, 1348–1360 (2001)
Fischer, A.: Solution of monotone complementarity problems with locally Lipschitzian functions. Math. Program. 76, 513–532 (1997)
Geman, D., Reynolds, G.: Constrained restoration and the recovery of discontinuities. IEEE Trans. Pattern Anal. Mach. Intell. 14, 367–383 (1992)
Hintermüller, M., Wu, T.: Nonconvex TVq-models in image restoration: analysis and a trust-region regularization based superlinearly convergent solver. SIAM J. Imaging Sci. (to appear)
Hintermüller, M., Ito, K., Kunisch, K.: The primal-dual active set strategy as a semismooth Newton method. SIAM J. Optim. 13, 865–888 (2003)
Huang, J., Horowitz, J.L., Ma, S.: Asymptotic properties of bridge estimators in sparse high-dimensional regression models. Ann. Stat. 36, 587–613 (2008)
Huber, P.J.: Robust estimation of a location parameter. Ann. Math. Stat. 35, 73–101 (1964)
Knight, K., Fu, W.: Asymptotics for lasso-type estimators. Ann. Stat. 28, 1356–1378 (2000)
Kunisch, K., Pock, T.: A bilevel optimization approach for parameter learning in variational models. SIAM J. Imaging Sci. 6, 938–983 (2013)
Lin, Q.: Sparsity and nonconvex nonsmooth optimization. Ph.D. thesis, University of Washington (2009)
Nikolova, M.: Analysis of the recovery of edges in images and signals by minimizing nonconvex regularized least-squares. Multiscale Model. Simul. 4, 960–991 (2005)
Nikolova, M., Chan, R.H.: The equivalence of half-quadratic minimization and the gradient linearization iteration. IEEE Trans. Image Process. 16, 1623–1627 (2007)
Nikolova, M., Ng, M.K., Zhang, S., Ching, W.K.: Efficient reconstruction of piecewise constant images using nonsmooth nonconvex minimization. SIAM J. Imaging Sci. 1, 2–25 (2008)
Nikolova, M., Ng, M.K., Tam, C.P.: Fast nonconvex nonsmooth minimization methods for image restoration and reconstruction. IEEE Trans. Image Process. 19, 3073–3088 (2010)
Nocedal, J., Wright, S.: Numerical Optimization, 2nd edn. Springer, New York (2006)
Qi, L., Sun, J.: A nonsmooth version of Newton’s method. Math. Program. 58, 353–367 (1993)
Ramlau, R., Zarzer, C.A.: On the minimization of a Tikhonov functional with a non-convex sparsity constraint. Electron. Trans. Numer. Anal. 39, 476–507 (2012)
Stadler, G.: Elliptic optimal control problems with L1-control cost and applications for the placement of control devices. Comput. Optim. Appl. 44, 159–181 (2009)
Vogel, C.R., Oman, M.E.: Iterative methods for total variation denoising. SIAM J. Sci. Comput. 17, 227–238 (1996)
Weston, J., Mukherjee, S., Chapelle, O., Pontil, M., Poggio, T., Vapnik, V.: Feature selection for SVMs. Adv. Neural Inf. Process. Syst. 13, 668–674 (2000)
Acknowledgements
This research was supported by the Austrian Science Fund (FWF) through START project Y305 “Interfaces and Free Boundaries” and through SFB project F3204 “Mathematical Optimization and Applications in Biomedical Sciences”. The authors would like to thank T. Pock (TU Graz) for communication regarding the overcomplete-dictionary problem. We also thank the referee for pointing out a recent thesis [28], which contains an extensive algorithmic study of sparse optimization based on nonconvex regularization.
Cite this article
Hintermüller, M., Wu, T. A superlinearly convergent R-regularized Newton scheme for variational models with concave sparsity-promoting priors. Comput Optim Appl 57, 1–25 (2014). https://doi.org/10.1007/s10589-013-9583-2