
Basic Descent Methods

Linear and Nonlinear Programming

Part of the book series: International Series in Operations Research & Management Science (ISOR, volume 228)


Abstract

We turn now to a description of the basic techniques used for iteratively solving unconstrained minimization problems. These techniques are, of course, important for practical application since they often offer the simplest, most direct alternatives for obtaining solutions; but perhaps their greatest importance is that they establish certain reference plateaus with respect to difficulty of implementation and speed of convergence. Thus in later chapters as more efficient techniques and techniques capable of handling constraints are developed, reference is continually made to the basic techniques of this chapter both for guidance and as points of comparison.
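As a concrete illustration of the kind of basic technique this chapter develops, here is a minimal sketch of steepest descent with an Armijo backtracking line search, applied to a small convex quadratic. The function names and parameter values (`alpha0`, `beta`, `sigma`, the test problem `Q`, `b`) are illustrative choices for this sketch, not taken from the book:

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                     tol=1e-8, max_iter=1000):
    """Steepest descent with an Armijo backtracking line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient nearly zero: stop
            break
        d = -g                        # steepest-descent direction
        t = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta                 # shrink the step
        x = x + t * d
    return x

# Minimize f(x) = 1/2 x^T Q x - b^T x, whose minimizer solves Q x = b
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(Q).dot(x) - b.dot(x)
grad = lambda x: Q.dot(x) - b
x_star = steepest_descent(f, grad, np.zeros(2))
```

On a quadratic like this, the iterates converge linearly at a rate governed by the condition number of `Q`; the later chapters' more sophisticated methods are measured against exactly this kind of baseline behavior.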


Notes

  1. See the Interlocking Eigenvalues Lemma in Sect. 10.6 for a proof that only one eigenvalue becomes large.



Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Luenberger, D.G., Ye, Y. (2021). Basic Descent Methods. In: Linear and Nonlinear Programming. International Series in Operations Research & Management Science, vol 228. Springer, Cham. https://doi.org/10.1007/978-3-030-85450-8_8
