Abstract
Let a weakly convex function (in general, nonconvex and nonsmooth) satisfy the quadratic growth condition. It is proved that the gradient projection method for minimizing such a function converges at a linear rate on a proximally smooth (nonconvex) set of special form (for example, on a smooth manifold), provided that the weak convexity constant of the function is less than the constant in the quadratic growth condition and the proximal smoothness constant of the set is sufficiently large. The connection between the quadratic growth condition on the function and other conditions is discussed.
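To make the iteration concrete, here is a minimal Python sketch (not from the paper; the matrix \(A\), the step size, and the iteration count are illustrative) of the gradient projection step \(x_{k+1}=P_Q(x_k-t\,f'(x_k))\) applied to the quadratic \(f(x)=(Ax,x)\) on the unit sphere, a proximally smooth set with constant \(R=1\):

    import numpy as np

    def gradient_projection_sphere(A, x0, step=0.1, iters=1000):
        # x_{k+1} = P_Q(x_k - t * f'(x_k)) for f(x) = (Ax, x)
        # on Q = {x : ||x|| = 1}, where P_Q(y) = y / ||y|| for y != 0.
        x = x0 / np.linalg.norm(x0)
        for _ in range(iters):
            g = 2.0 * A @ x                # f'(x) = 2Ax
            y = x - step * g               # gradient step
            x = y / np.linalg.norm(y)      # metric projection onto the sphere
        return x

    # Usage: the minimizer of (Ax, x) on the sphere is an eigenvector
    # corresponding to the smallest eigenvalue of the symmetric matrix A.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = (M + M.T) / 2
    x_star = gradient_projection_sphere(A, rng.standard_normal(5))
    print(x_star @ A @ x_star)             # approx. smallest eigenvalue of A

In this example the linear rate corresponds to geometric convergence of the iterates to an eigenvector for the smallest eigenvalue (cf. [26] for quadratic optimization with orthogonality constraints).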
Notes
In a finite-dimensional space, the continuity of the mapping \(U_Q(R)\setminus Q\ni x\mapsto P_{Q}x\) need not be assumed: it follows from the uniqueness of the metric projection and its upper semicontinuity [10, Chap. 3, Sec. 1, Proposition 23].
The set \(\partial f(x)=\{f'(x)\in\mathbb{R}^n\mid f(y)\ge f(x)+(f'(x),y-x)\ \forall\,y\in\mathbb{R}^n\}\) is called the subdifferential of the convex function \(f\) at the point \(x\).
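For instance (a standard illustration with \(n=1\), not specific to this paper), for \(f(x)=|x|\) one has
\[
\partial f(x)=\begin{cases}\{\operatorname{sign}x\}, & x\ne0,\\ [-1,1], & x=0,\end{cases}
\]
since \(|y|\ge g\,y\) holds for all \(y\in\mathbb{R}\) precisely when \(|g|\le1\).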
References
B. T. Polyak, “Gradient methods for minimizing functionals,” USSR Comput. Math. Math. Phys. 3 (4), 864–878 (1963).
E. S. Levitin and B. T. Polyak, “Constrained minimization methods,” USSR Comput. Math. Math. Phys. 6 (5), 1–50 (1966).
B. T. Polyak, Introduction to Optimization (Nauka, Moscow, 1983) [in Russian].
D. Davis, D. Drusvyatskiy, K. J. MacPhee, and C. Paquette, Subgradient Methods for Sharp Weakly Convex Functions, arXiv: 1803.02461v1 (2018).
X. Li, Zh. Zhu, A. Man-Cho So, and J. D. Lee, Incremental Methods for Weakly Convex Optimization, arXiv: 1907.11687v1 (2019).
H. Karimi, J. Nutini, and M. Schmidt, “Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition,” in Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Comput. Sci. (Springer, Cham, 2016), Vol. 9851.
D. Drusvyatskiy and A. S. Lewis, “Error bounds, quadratic growth and linear convergence of proximal methods,” Math. Oper. Res. 43 (3), 919–948 (2018).
J.-P. Vial, “Strong and weak convexity of sets and functions,” Math. Oper. Res. 8 (2), 231–259 (1983).
F. H. Clarke, R. J. Stern, and P. R. Wolenski, “Proximal smoothness and the lower-\(C^{2}\) property,” J. Convex Anal. 2 (1–2), 117–144 (1995).
J.-P. Aubin and I. Ekeland, Applied Nonlinear Analysis (John Wiley, New York, 1984).
R. A. Poliquin, R. T. Rockafellar, and L. Thibault, “Local differentiability of distance functions,” Trans. Amer. Math. Soc. 352 (11), 5231–5249 (2000).
M. Bounkhel and L. Thibault, “On various notions of regularity of sets in nonsmooth analysis,” Nonlinear Anal. 48 (2), 223–246 (2002).
P.-A. Absil and J. Malick, “Projection-like retractions on matrix manifolds,” SIAM J. Optim. 22 (1), 135–158 (2012).
R. T. Rockafellar and R. J. B. Wets, Variational Analysis (Springer-Verlag, Berlin, 2009).
M. V. Balashov, “The gradient projection method on matrix manifolds,” Comput. Math. Math. Phys. 60 (9), 1453–1461 (2020).
Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course (Springer-Verlag, Boston, MA, 2004).
M. V. Balashov, “About the gradient projection algorithm for a strongly convex function and a proximally smooth set,” J. Convex Anal. 24 (2), 493–500 (2017).
J.-P. Aubin and A. Cellina, Differential Inclusions (Springer-Verlag, Berlin, 1984).
H. Liu, W. Wu, and A. Man-Cho So, “Quadratic optimization with orthogonality constraints: explicit Łojasiewicz exponent and linear convergence of line-search methods,” in Proceedings of the 33rd International Conference on Machine Learning (ICML’16), Proc. Machine Learning Res. (2016), Vol. 48, pp. 1158–1167.
M. V. Balashov, “The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient,” Sb. Math. 211 (4), 481–504 (2020).
R. Schneider and A. Uschmajew, “Convergence results for projected line search methods on varieties of low-rank matrices via Łojasiewicz inequality,” SIAM J. Optim. 25 (1), 622–646 (2015).
M. V. Balashov, B. T. Polyak, and A. A. Tremba, “Gradient projection and conditional gradient methods for constrained nonconvex minimization,” Numer. Funct. Anal. Optim. 41 (7), 822–849 (2019).
Funding
This work was supported by the Russian Science Foundation under grant 16-11-10015.
Cite this article
Balashov, M.V. On the Gradient Projection Method for Weakly Convex Functions on a Proximally Smooth Set. Math Notes 108, 643–651 (2020). https://doi.org/10.1134/S0001434620110024