Abstract
The Monteiro–Svaiter accelerated hybrid proximal extragradient method (2013), in which one step of Newton's method is used at every iteration to approximately solve an auxiliary problem, is considered. The Monteiro–Svaiter method is optimal (with respect to the number of gradient and Hessian evaluations of the objective function) for sufficiently smooth convex optimization problems in the class of methods that use only the gradient and Hessian of the objective. An optimal tensor method involving higher derivatives is proposed by replacing the Newton step with a step of the tensor method recently proposed by Yu. E. Nesterov (2018) and by using a special generalization of the step-size selection condition in the outer accelerated proximal extragradient scheme. The resulting tensor method with derivatives up to the third order inclusive proves fairly practical, since the complexity of its iteration is comparable to that of a Newton iteration. Thus, a constructive solution is obtained for Nesterov's problem (2018) of closing the gap between the tight lower bounds and the overstated upper bounds on the convergence rate of existing tensor methods of order \(p \geqslant 3\).
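The inner building block in the second-order case is a cubically regularized Newton step in the sense of Nesterov and Polyak (2006), i.e., the \(p = 2\) instance of the tensor step. As a minimal illustration only (not the authors' method; the function name `cubic_newton_step`, the bisection tolerance, and the test problem are assumptions of this sketch), one such step for a convex objective can be computed via a one-dimensional search on the step radius:

```python
import numpy as np

def cubic_newton_step(grad, hess, x, M, tol=1e-10):
    """One cubically regularized Newton step (Nesterov-Polyak, 2006).

    Minimizes the model  g.h + 0.5 h^T H h + (M/6)||h||^3  over h,
    using the standard characterization  h(r) = -(H + (M r / 2) I)^{-1} g,
    where the radius r is chosen so that ||h(r)|| = r (bisection on r).
    """
    g, H = grad(x), hess(x)
    I = np.eye(len(x))

    def step_for(r):
        # Candidate step for a given trial radius r.
        return np.linalg.solve(H + 0.5 * M * r * I, -g)

    # phi(r) = ||h(r)|| - r is decreasing for convex H; bracket its root.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(step_for(hi)) > hi:
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(step_for(mid)) > mid:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return x + step_for(hi)
```

For a convex quadratic the cubic model upper-bounds the decrease, so a single such step is guaranteed not to increase the objective; the accelerated scheme described in the abstract wraps steps of this kind in an outer proximal extragradient loop with a special step-size condition.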
REFERENCES
Y. Arjevani, O. Shamir, and R. Shiff, Math. Program. https://arxiv.org/abs/1705.07260.
Yu. Nesterov, Preprint No. 2018005, Univ. Catholique de Louvain (Center for Operations Research and Econometrics (CORE), 2018).
A. V. Gasnikov and D. A. Kovalev, Komp’yut. Issled. Model. 10 (3), 305–314 (2018).
A. S. Nemirovskii and D. B. Yudin, Complexity of Problems and Efficiency of Optimization Methods (Nauka, Moscow, 1979) [in Russian].
R. Monteiro and B. Svaiter, SIAM J. Optim. 23 (2), 1092–1125 (2013).
Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course (Springer Science and Business Media, 2013).
Yu. E. Nesterov, Introduction to Convex Optimization (MTsNMO, Moscow, 2010) [in Russian].
H. Lin, J. Mairal, and Z. Harchaoui, J. Machine Learning Res. 18 (212), 1–54 (2018).
Yu. Nesterov and B. Polyak, Math. Program. 108 (1), 177–205 (2006).
A. V. Gasnikov, Modern Numerical Optimization Methods: Universal Gradient Descent Method (Mosk. Fiz.-Tekh. Inst., Moscow, 2018) [in Russian].
T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein, Introduction to Algorithms, 2nd ed. (MIT Press, Cambridge, Mass., 2001).
Translated by I. Ruzanova
Gasnikov, A.V., Gorbunov, E.A., Kovalev, D.A. et al. Reachability of Optimal Convergence Rate Estimates for High-Order Numerical Convex Optimization Methods. Dokl. Math. 99, 91–94 (2019). https://doi.org/10.1134/S1064562419010289