Journal of Optimization Theory and Applications, Volume 91, Issue 1, pp 185–214

Perturbed steepest-descent technique in multiextremal problems

  • S. K. Zavriev
Contributed Papers


The steepest-descent technique dealing with perturbed values of the objective function and its gradient, and with inexact line searches, is considered. Attention is given to the case where the perturbations do not decrease along the algorithm trajectories; the aim is to investigate how perturbations at every iteration affect the algorithm's original attractor set.

Based on the Liapunov direct method for attraction analysis of discrete-time processes, a sharp estimate of the attractor set generated by the perturbed steepest-descent technique is obtained in terms of the perturbation magnitudes. Some global optimization properties of finite-difference analogues of the gradient method are discovered; these properties are not inherent in methods that use exact gradients.
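The setting described above can be illustrated with a small sketch: steepest descent driven by central-difference gradient estimates of a noisy objective, where neither the noise level nor the difference step shrinks along the trajectory. This is an illustrative toy, not the paper's algorithm; the function `f`, the step sizes, and the noise magnitude are all assumptions chosen for the example.

```python
import random

def fd_gradient(f, x, h):
    """Central-difference gradient estimate of f at x with fixed step h."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def perturbed_steepest_descent(f, x0, h=0.1, step=0.05, iters=200):
    """Steepest descent on a (possibly noisy) objective f, using
    finite-difference gradients; h and the noise do NOT decrease
    along the trajectory, matching the perturbation setting above."""
    x = list(x0)
    for _ in range(iters):
        g = fd_gradient(f, x, h)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Hypothetical noisy quadratic with minimum at (1, -2); the additive
# noise plays the role of the nondecreasing perturbations.
random.seed(0)
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 \
        + 0.001 * random.uniform(-1.0, 1.0)

x_final = perturbed_steepest_descent(f, [5.0, 5.0])
```

Since the perturbations never vanish, the iterates do not converge to the exact minimizer; instead they enter and remain in a neighborhood of it whose size is governed by the perturbation magnitude, which is exactly the kind of attractor-set estimate the paper quantifies.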

Key Words

Gradient methods, Liapunov direct methods, global optimization, stability under perturbations





Copyright information

© Plenum Publishing Corporation 1996

Authors and Affiliations

  • S. K. Zavriev
    1. Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia
