
Automation and Remote Control, Volume 80, Issue 1, pp. 102–111

On the Properties of the Method of Minimization for Convex Functions with Relaxation on the Distance to Extremum

  • V. N. Krutikov
  • N. S. Samoilenko
  • V. V. Meshechkin
Optimization, System Analysis, and Operations Research

Abstract

We present a subgradient minimization method, similar to the method of minimal iterations for solving systems of equations, which inherits the latter's convergence properties on quadratic functions. For a certain choice of parameters, the proposed algorithm coincides with the previously known method for minimizing piecewise linear functions and belongs to the family of minimization methods with relaxation of the distance to the extremum developed by B.T. Polyak, in which the step length is computed from the a priori known minimum value of the function. We relate the parameters of this method to a constraint on the degree of homogeneity of the function and obtain estimates of its convergence rate on convex functions. We prove that on certain classes of functions the method converges at the rate of a geometric progression. We also discuss the computational capabilities of this approach for solving high-dimensional problems.
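
The step rule mentioned above (Polyak's step, which uses the a priori known minimum value of the function) can be illustrated with a minimal sketch, assuming a Python/NumPy setting; the function names, test problem, and stopping tolerance below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def polyak_subgradient(f, subgrad, x0, f_star, max_iter=1000, tol=1e-8):
    """Subgradient method with Polyak's step length.

    With the optimal value f_star known in advance, the step
        x_{k+1} = x_k - ((f(x_k) - f_star) / ||g_k||^2) * g_k
    does not increase the distance to a minimizer of a convex f
    (relaxation of the distance to the extremum).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = subgrad(x)
        gap = f(x) - f_star
        if gap <= tol or not np.any(g):
            break
        x = x - (gap / np.dot(g, g)) * g
    return x


# Illustrative test problem: the piecewise linear convex function
# f(x) = max_i |x_i|, whose minimum value 0 is attained at the origin.
def f(x):
    return np.max(np.abs(x))


def subgrad(x):
    # One subgradient of max_i |x_i|: the sign of a coordinate of largest magnitude.
    g = np.zeros_like(x)
    i = int(np.argmax(np.abs(x)))
    g[i] = np.sign(x[i])
    return g


x_min = polyak_subgradient(f, subgrad, x0=np.array([3.0, -2.0, 1.5]), f_star=0.0)
print(x_min)  # approaches [0, 0, 0]
```

On this particular example each Polyak step zeroes the coordinate of largest magnitude, so the minimizer is reached in at most n iterations; for general convex functions the convergence-rate estimates discussed in the paper are the relevant statements.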

Keywords

subgradient; convex function; linear algebra; minimum of a function; convergence rate

References

  1. Shor, N.Z., An Application of Gradient Descent to Solve a Network Transportation Problem, Proc. Seminar on Theory and Applied Prob. Cyber. Oper. Research, Kiev: Nauch. Sovet po Kibernetike AN USSR, 1962, no. 1, pp. 9–17.
  2. Polyak, B.T., One General Approach to Solving Extremal Problems, Dokl. AN USSR, 1967, vol. 174, no. 1, pp. 33–36.
  3. Polyak, B.T., Vvedenie v optimizatsiyu (Introduction to Optimization), Moscow: Nauka, 1983.
  4. Wolfe, P., Note on a Method of Conjugate Subgradients for Minimizing Nondifferentiable Functions, Math. Program., 1974, vol. 7, no. 1, pp. 380–383.
  5. Lemarechal, C., An Extension of Davidon Methods to Non-Differentiable Problems, Math. Program. Study, 1975, vol. 3, pp. 95–109.
  6. Nemirovskii, A.S. and Yudin, D.B., Slozhnost’ zadach i effektivnost’ metodov optimizatsii (Complexity of Problems and Efficiency of Optimization Methods), Moscow: Nauka, 1979.
  7. Shor, N.Z., Metody minimizatsii nedifferentsiruemykh funktsii i ikh prilozheniya (Minimization Methods for Non-Differentiable Functions and Their Applications), Kiev: Naukova Dumka, 1979.
  8. Krutikov, V.N. and Petrova, T.V., A Relaxation Method of Minimization with Space Dilation in the Direction of the Subgradient, Ekonom. Mat. Metody, 2003, vol. 39, no. 1, pp. 106–119.
  9. Krutikov, V.N. and Gorskaya, T.A., A Family of Relaxation Subgradient Methods with Two-Rank Correction of the Metric Matrices, Ekonom. Mat. Metody, 2009, vol. 45, no. 4, pp. 37–80.
  10. Nurminskii, E.A. and Tien, D., Method of Conjugate Subgradients with Constrained Memory, Autom. Remote Control, 2014, vol. 75, no. 4, pp. 646–656.
  11. Krutikov, V.N. and Vershinin, Ya.N., A Multistep Subgradient Method for Solving Nonsmooth Minimization Problems of High Dimension, Vestn. Tomsk. Gos. Univ., Mat. Mekh., 2014, no. 3, pp. 5–19.
  12. Krutikov, V.N. and Vershinin, Ya.N., A Subgradient Minimization Method with Correction of Descent Vectors Based on Pairs of Training Relations, Vestn. Kemer. Gos. Univ., 2014, no. 1–1 (57), pp. 46–54.
  13. Gol’shtein, E.G., Nemirovskii, A.S., and Nesterov, Yu.E., Method of Levels, Its Generalization and Applications, Ekonom. Mat. Metody, 1995, vol. 31, no. 3, pp. 164–180.
  14. Nesterov, Yu.E., Smooth Minimization of Non-Smooth Functions, Math. Program., 2005, vol. 103, no. 1, pp. 127–152.
  15. Nesterov, Yu., Universal Gradient Methods for Convex Optimization Problems, Math. Program., Ser. A, 2015, vol. 152, pp. 381–404.
  16. Gasnikov, A.V. and Nesterov, Yu.E., A Universal Method for Stochastic Composite Optimization Problems, e-print, 2016. https://arxiv.org/ftp/arxiv/papers/1604/1604.05275.pdf
  17. Nesterov, Yu., Subgradient Methods for Huge-Scale Optimization Problems, Math. Program., Ser. A, 2013, vol. 146, no. 1–2, pp. 275–297.
  18. Polyak, B.T., Minimization of Non-Smooth Functionals, Zh. Vychisl. Mat. Mat. Fiz., 1969, vol. 9, no. 3, pp. 507–521.
  19. Samoilenko, N.S., Krutikov, V.N., and Meshechkin, V.V., A Study of One Variation of the Subgradient Method, Vestn. Kemer. Gos. Univ., 2015, no. 5 (2), pp. 55–58.
  20. Fadeev, D.K. and Fadeeva, V.N., Vychislitel’nye metody lineinoi algebry (Computational Methods of Linear Algebra), Moscow: Fizmatgiz, 1963.
  21. Voevodin, V.V. and Kuznetsov, Yu.A., Matritsy i vychisleniya (Matrices and Computations), Moscow: Nauka, 1984.
  22. Camerini, P., Fratta, L., and Maffioli, F., On Improving Relaxation Methods by Modified Gradient Techniques, Math. Program., 1975, no. 3, pp. 26–34.

Copyright information

© Pleiades Publishing, Ltd. 2019

Authors and Affiliations

  • V. N. Krutikov (1)
  • N. S. Samoilenko (1)
  • V. V. Meshechkin (1)
  1. Kemerovo State University, Kemerovo, Russia
