
On the Gradient Projection Method for Weakly Convex Functions on a Proximally Smooth Set


Abstract

Let a weakly convex function (in general, nonconvex and nonsmooth) satisfy the quadratic growth condition. It is proved that the gradient projection method for minimizing such a function converges with linear rate on a proximally smooth (nonconvex) set of special form (for example, on a smooth manifold), provided that the weak convexity constant of the function is less than the constant in the quadratic growth condition and the proximal smoothness constant of the set is sufficiently large. The connection between the quadratic growth condition on the function and other regularity conditions is discussed.
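For orientation, the two hypotheses on \(f\) can be written out explicitly; the constants \(\mu\) and \(\kappa\) and the solution set \(X^*\) are named here only for illustration. Weak convexity with constant \(\mu\) means that

\[ f(y)\ \ge\ f(x)+(g,y-x)-\frac{\mu}{2}\,\|y-x\|^2 \qquad \forall\,x,y,\ \ \forall\,g\in\partial f(x), \]

and the quadratic growth condition with constant \(\kappa\) means that

\[ f(x)-\inf_Q f\ \ge\ \frac{\kappa}{2}\,\operatorname{dist}^2(x,X^*), \]

where \(X^*\) is the set of minimizers of \(f\) on the set \(Q\); the result requires \(\mu<\kappa\).

The method itself is the classical projected gradient iteration \(x_{k+1}=P_Q(x_k-t\,\nabla f(x_k))\). The following minimal sketch runs it on the unit sphere, a smooth manifold and hence a proximally smooth set; the quadratic objective, test matrix, step size, and iteration count are illustrative choices, not parameters from the paper.

    import numpy as np

    def project_sphere(x):
        # Metric projection onto Q = {x : ||x|| = 1};
        # unique and well defined for every x != 0.
        return x / np.linalg.norm(x)

    def gradient_projection(grad, project, x0, t, iters=1000):
        # The iteration x_{k+1} = P_Q(x_k - t * grad f(x_k)).
        x = project(x0)
        for _ in range(iters):
            x = project(x - t * grad(x))
        return x

    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    A = B + B.T                      # symmetric test matrix
    grad = lambda x: A @ x           # gradient of f(x) = (1/2) x^T A x
    x = gradient_projection(grad, project_sphere, rng.standard_normal(5), t=0.05)

    # On the sphere, min f = (1/2) * lambda_min(A), attained at a
    # corresponding unit eigenvector; compare the two values.
    print(x @ A @ x, np.linalg.eigvalsh(A)[0])

For a generic symmetric matrix the smallest eigenvalue is simple, so this objective satisfies quadratic growth on the sphere and the iterates converge linearly to a minimizer (here, a unit eigenvector for \(\lambda_{\min}\)).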


Notes

  1. In a finite-dimensional space, the continuity of the mapping \(U_Q(R)\setminus Q\ni x\to P_{Q}x\) need not be assumed: it follows from the uniqueness of the metric projection together with its upper semicontinuity [10, Chap. 3, Sec. 1, Proposition 23].

  2. The set \(\partial f(x)=\{f'(x)\in\mathbb{R}^n\mid f(y)\ge f(x)+(f'(x),y-x)\ \forall\,y\in\mathbb{R}^n\}\) is called the subdifferential of the convex function \(f\) at the point \(x\).
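     For example (a one-dimensional illustration, not taken from the paper), for \(f(x)=|x|\) the inequality \(|y|\ge g\,y\) for all \(y\in\mathbb{R}\) holds exactly when \(|g|\le 1\), so

     \[ \partial f(0)=[-1,1], \qquad \partial f(x)=\{\operatorname{sign}x\} \quad \text{for } x\ne 0. \]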

References

  1. B. T. Polyak, “Gradient methods for minimizing functionals,” USSR Comput. Math. Math. Phys. 3 (4), 864–878 (1963).

  2. E. S. Levitin and B. T. Polyak, “Constrained minimization methods,” USSR Comput. Math. Math. Phys. 6 (5), 1–50 (1966).

  3. B. T. Polyak, Introduction to Optimization (Nauka, Moscow, 1983) [in Russian].

  4. D. Davis, D. Drusvyatskiy, K. J. MacPhee, and C. Paquette, Subgradient Methods for Sharp Weakly Convex Functions, arXiv: 1803.02461v1 (2018).

  5. X. Li, Zh. Zhu, A. Man-Cho So, and J. D. Lee, Incremental Methods for Weakly Convex Optimization, arXiv: 1907.11687v1 (2019).

  6. H. Karimi, J. Nutini, and M. Schmidt, “Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition,” in Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Comput. Sci. (Springer, Cham, 2016), Vol. 9851.

  7. D. Drusvyatskiy and A. S. Lewis, “Error bounds, quadratic growth and linear convergence of proximal methods,” Math. Oper. Res. 43 (3), 919–948 (2018).

  8. J.-P. Vial, “Strong and weak convexity of sets and functions,” Math. Oper. Res. 8 (2), 231–259 (1983).

  9. F. H. Clarke, R. J. Stern, and P. R. Wolenski, “Proximal smoothness and the lower-\(C^{2}\) property,” J. Convex Anal. 2 (1-2), 117–144 (1995).

  10. J.-P. Aubin and I. Ekeland, Applied Nonlinear Analysis (John Wiley, New York, 1984).

  11. R. A. Poliquin, R. T. Rockafellar, and L. Thibault, “Local differentiability of distance functions,” Trans. Amer. Math. Soc. 352 (11), 5231–5249 (2000).

  12. M. Bounkhel and L. Thibault, “On various notions of regularity of sets in nonsmooth analysis,” Nonlinear Anal. 48 (2), 223–246 (2002).

  13. P.-A. Absil and J. Malick, “Projection-like retractions on matrix manifolds,” SIAM J. Optim. 22 (1), 135–158 (2012).

  14. R. T. Rockafellar and R. J. B. Wets, Variational Analysis (Springer-Verlag, Berlin, 2009).

  15. M. V. Balashov, “The gradient projection method on matrix manifolds,” Comput. Math. Math. Phys. 60 (9), 1453–1461 (2020).

  16. Yu. Nesterov, Introductory Lectures on Convex Optimization. A Basic Course (Springer-Verlag, Boston, MA, 2004).

  17. M. V. Balashov, “About the gradient projection algorithm for a strongly convex function and a proximally smooth set,” J. Convex Anal. 24 (2), 493–500 (2017).

  18. J.-P. Aubin and A. Cellina, Differential Inclusions (Springer-Verlag, Berlin, 1984).

  19. H. Liu, W. Wu, and A. M.-Ch. So, “Quadratic optimization with orthogonality constraints: explicit Łojasiewicz exponent and linear convergence of line-search methods,” in ICML’16: Proceedings of the 33rd International Conference on Machine Learning, Proc. Machine Learning Res. (2016), Vol. 48, pp. 1158–1167.

  20. M. V. Balashov, “The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient,” Sb. Math. 211 (4), 481–504 (2020).

  21. R. Schneider and A. Uschmajew, “Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality,” SIAM J. Optim. 25 (1), 622–646 (2015).

  22. M. V. Balashov, B. T. Polyak, and A. A. Tremba, “Gradient projection and conditional gradient methods for constrained nonconvex minimization,” Numer. Funct. Anal. Optim. 41 (7), 822–849 (2019).


Funding

This work was supported by the Russian Science Foundation under grant 16-11-10015.

Author information

Correspondence to M. V. Balashov.


Cite this article

Balashov, M.V. On the Gradient Projection Method for Weakly Convex Functions on a Proximally Smooth Set. Math Notes 108, 643–651 (2020). https://doi.org/10.1134/S0001434620110024

