
A Regularized Semi-Smooth Newton Method with Projection Steps for Composite Convex Programs

Journal of Scientific Computing

Abstract

The goal of this paper is to study approaches that bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: (1) many well-known operator splitting methods, such as forward–backward splitting and Douglas–Rachford splitting, actually define a fixed-point mapping; (2) the optimal solutions of the composite convex program are exactly the solutions of a system of nonlinear equations derived from the fixed-point mapping. Solving this system of nonlinear equations enables us to develop second-order type methods. The equations may be non-differentiable, but they are often semi-smooth, and their generalized Jacobian matrices are positive semidefinite due to monotonicity. By combining a regularization approach with a known hyperplane projection technique, we propose an adaptive semi-smooth Newton method and establish its convergence to global optimality. Preliminary numerical results on \(\ell _1\)-minimization problems demonstrate that our second-order type algorithms are able to achieve superlinear or quadratic convergence.
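To make the construction concrete, here is a minimal sketch (in Python/NumPy; not the authors' implementation) of the ingredients described above for the \(\ell _1\)-regularized least-squares problem \(\min_x \tfrac12\|Ax-b\|_2^2 + \mu\|x\|_1\): the forward–backward splitting residual \(F(x) = x - \mathrm{prox}_{t\mu\|\cdot\|_1}\bigl(x - tA^\top(Ax-b)\bigr)\), one element of its generalized Jacobian, a regularized (Levenberg–Marquardt-type) semi-smooth Newton step, and a hyperplane projection step. The step size \(t = 1/\|A\|_2^2\), the regularization rule \(\lambda_k = c\,\|F(x_k)\|\), and the omission of a line search before the projection are simplifying assumptions for illustration only.

```python
# Minimal sketch, not the authors' code: regularized semi-smooth Newton
# iteration with a hyperplane projection step for
#     min_x 0.5*||A x - b||^2 + mu*||x||_1,
# driving the forward-backward splitting residual F(x) to zero.
import numpy as np


def soft_threshold(y, lam):
    """Proximal operator of lam*||.||_1 (soft-thresholding)."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)


def fbs_residual(x, A, b, mu, t):
    """F(x) = x - prox_{t*mu*||.||_1}(x - t*grad f(x)); F(x*) = 0 iff x* is optimal."""
    grad = A.T @ (A @ x - b)
    return x - soft_threshold(x - t * grad, t * mu)


def fbs_jacobian(x, A, b, mu, t):
    """One element of the generalized (Clarke) Jacobian of F at x:
       J = I - D (I - t A^T A), with D = diag(1 if |x_i - t*grad_i| > t*mu else 0)."""
    n = x.size
    grad = A.T @ (A @ x - b)
    d = (np.abs(x - t * grad) > t * mu).astype(float)
    return np.eye(n) - d[:, None] * (np.eye(n) - t * (A.T @ A))


def newton_projection(A, b, mu, c=0.1, tol=1e-10, maxit=50):
    m, n = A.shape
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L, L = ||A||_2^2
    x = np.zeros(n)
    for _ in range(maxit):
        F = fbs_residual(x, A, b, mu, t)
        nrmF = np.linalg.norm(F)
        if nrmF < tol:
            break
        # Regularized semi-smooth Newton step: solve (J + lam*I) d = -F,
        # with lam proportional to the current residual norm.
        J = fbs_jacobian(x, A, b, mu, t)
        d = np.linalg.solve(J + c * nrmF * np.eye(n), -F)
        u = x + d                                 # trial point (line search omitted)
        Fu = fbs_residual(u, A, b, mu, t)
        if np.linalg.norm(Fu) < tol:
            x = u
            break
        # Hyperplane projection step: project x onto {z : <F(u), z - u> = 0}.
        x = x - (Fu @ (x - u)) / (Fu @ Fu) * Fu
    return x, np.linalg.norm(fbs_residual(x, A, b, mu, t))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x, res = newton_projection(A, b, mu=1e-3)
    print("final residual norm:", res)
```

The projection step exploits the monotonicity of \(F\): when the line-search condition of the full method holds, the hyperplane through the trial point strictly separates the current iterate from the solution set, so projecting onto it cannot increase the distance to any solution.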


Notes

  1. Downloadable from http://yall1.blogs.rice.edu.


Acknowledgements

The authors would like to thank Professor Defeng Sun for valuable discussions on semi-smooth Newton methods, and Professor Michael Ulbrich and Dr. Andre Milzarek for sharing their code SNF. In particular, the authors are grateful to Dr. Andre Milzarek for carefully reading the manuscript and for his help in improving the convergence analysis. The authors also thank the associate editor and two anonymous referees for their valuable comments and suggestions.

Author information

Correspondence to Zaiwen Wen.

Additional information

X. Xiao: Research supported by the Fundamental Research Funds for the Central Universities under Grant DUT16LK30. Z. Wen: Research supported in part by NSFC Grants 91730302 and 11421101, and by the National Basic Research Project under Grant 2015CB856002. L. Zhang: Research partially supported by NSFC Grants 11571059 and 91330206.


Cite this article

Xiao, X., Li, Y., Wen, Z. et al. A Regularized Semi-Smooth Newton Method with Projection Steps for Composite Convex Programs. J Sci Comput 76, 364–389 (2018). https://doi.org/10.1007/s10915-017-0624-3

