
Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions

Journal of Global Optimization

Abstract

Successive quadratic approximation (SQA) methods are numerically efficient for minimizing the sum of a smooth function and a convex function. The iteration complexity of inexact SQA methods has been analyzed recently. In this paper, we present an algorithmic framework of inexact SQA methods with four types of line searches and analyze its global complexity under milder assumptions. First, we establish its well-definedness and several descent properties. Second, under a quadratic growth condition and a uniform positive lower bound on the stepsizes, we show that both the function value sequence and the iterate sequence converge linearly. Moreover, we obtain an \(o(1/k)\) complexity bound without the quadratic growth condition, improving existing \({\mathcal {O}}(1/k)\) complexity results. Finally, we show that a local gradient-Lipschitz-continuity condition suffices to guarantee a uniform positive lower bound for the stepsizes.
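
To make the algorithmic setting concrete, the following sketch illustrates the simplest instance of the SQA idea for the composite problem \(\min _x f(x)+g(x)\). It assumes the regularizer \(g=\lambda \Vert \cdot \Vert _1\), takes the quadratic model to be \(\tfrac{1}{2t}\Vert \cdot \Vert ^2\) so that each subproblem reduces to an exact proximal-gradient step, and selects the stepsize by a backtracking sufficient-decrease line search. These choices, together with the function names sqa_l1 and soft_threshold, are illustrative assumptions made for this sketch only; they are not the inexact-SQA framework or the four line-search rules analyzed in the paper.

```python
import numpy as np

# Minimal sketch of a successive quadratic approximation (SQA) iteration for
# F(x) = f(x) + g(x) with smooth f and convex g.  We assume the simplest
# quadratic model H_k = (1/t) I, so each subproblem is solved exactly by a
# proximal-gradient step, and we assume g = lam * ||.||_1.  The line search
# below is one common sufficient-decrease rule, not the paper's exact rules.

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (the assumed regularizer g)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sqa_l1(f, grad_f, lam, x0, t0=1.0, beta=0.5, sigma=1e-4, max_iter=200):
    """Prox-gradient instance of SQA for min_x f(x) + lam * ||x||_1."""
    x, t = x0.copy(), t0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            # Minimizer of the model f(x) + <g, d> + ||d||^2/(2t) + lam*||x+d||_1
            x_new = soft_threshold(x - t * g, t * lam)
            d = x_new - x
            # Decrease predicted by the linearized model (nonpositive by construction)
            model_decrease = g @ d + lam * (np.linalg.norm(x_new, 1)
                                            - np.linalg.norm(x, 1))
            # Backtracking sufficient-decrease test on the true objective
            if f(x_new) + lam * np.linalg.norm(x_new, 1) \
               <= f(x) + lam * np.linalg.norm(x, 1) + sigma * model_decrease:
                break
            t *= beta  # shrink the stepsize and retry
        x = x_new
    return x

# Tiny smoke test on a LASSO-type least-squares objective.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
x_star = sqa_l1(f, grad_f, lam=0.1, x0=np.zeros(2))
```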




Acknowledgements

We are grateful for the support of the National Natural Science Foundation of China (Nos. 11971480 and 11501569). We also thank the anonymous reviewers for their insightful comments.

Author information


Correspondence to Hui Zhang.



About this article


Cite this article

Peng, W., Zhang, H., Zhang, X. et al. Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions. J Glob Optim 78, 69–89 (2020). https://doi.org/10.1007/s10898-020-00892-1

