Chapter 5: Large unconstrained optimization problems

Part of the Lecture Notes in Computer Science book series (LNCS, volume 165)





5.8 References

  1. Cline, A.K., Moler, C.B., Stewart, G.W., and Wilkinson, J.H. [1979]. An estimate of the condition number of a matrix, SIAM J. Numer. Anal. 16, 368–375.
  2. Coleman, T.F. and Moré, J.J. [1982]. Estimation of sparse Hessian matrices and graph coloring problems, Technical Report 82-535, Department of Computer Science, Cornell University, Ithaca, NY.
  3. Dembo, R., Eisenstat, S., and Steihaug, T. [1982]. Inexact Newton methods, SIAM J. Numer. Anal. 19, 400–408.
  4. Dennis, J.E. and Moré, J.J. [1977]. Quasi-Newton methods, motivation and theory, SIAM Review 19, 46–89.
  5. Dennis, J.E. and Schnabel, R.B. [1979]. Least change secant updates for quasi-Newton methods, SIAM Review 21, 443–459.
  6. Dennis, J.E. and Schnabel, R.B. [1983]. Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice-Hall.
  7. Fletcher, R. [1980]. Practical Methods of Optimization: Unconstrained Optimization, John Wiley and Sons.
  8. Fletcher, R. and Reeves, C.M. [1964]. Function minimization by conjugate gradients, Computer Journal 7, 149–154.
  9. Gay, D.M. [1981]. Computing optimal locally constrained steps, SIAM J. Sci. Stat. Comput. 2, 186–197.
  10. Griewank, A. and Toint, Ph.L. [1982]. Partitioned variable metric updates for large structured optimization problems, Numerische Mathematik 39, 429–448.
  11. Griewank, A. and Toint, Ph.L. [1983]. On the existence of convex decompositions of partially separable functions, to appear in Mathematical Programming.
  12. Gill, P.E., Murray, W., and Wright, M.H. [1981]. Practical Optimization, Academic Press, New York.
  13. Gill, P.E., Murray, W., Saunders, M.A., and Wright, M.H. [1983]. Computing forward-difference intervals for numerical optimization, SIAM J. Sci. Stat. Comput. 4, 310–321.
  14. Marwil, E. [1978]. Exploiting sparsity in Newton-type methods, Ph.D. Dissertation, Applied Mathematics, Cornell University, Ithaca, NY.
  15. Moré, J.J. [1982]. Recent developments in algorithms and software for trust region methods, Technical Report ANL/MCS-TM-2, Argonne National Laboratory, Argonne, IL.
  16. Moré, J.J. and Sorensen, D.C. [1982]. Newton's method, Technical Report ANL-82-8, Argonne National Laboratory, Argonne, IL.
  17. Moré, J.J. and Sorensen, D.C. [1981]. Computing a trust region step, Technical Report ANL-81-83, Argonne National Laboratory, Argonne, IL.
  18. Nazareth, L. [1979]. A relationship between the BFGS and conjugate gradient algorithms and its implication for new algorithms, SIAM J. Numer. Anal. 16, 794–800.
  19. O'Leary, D.P. [1980]. A discrete Newton algorithm for minimizing a function of many variables, Technical Report 910, Computer Science Center, University of Maryland, College Park, MD.
  20. Paige, C. and Saunders, M. [1982]. LSQR: An algorithm for sparse linear equations and sparse least squares, ACM Trans. Math. Software 8, 43–71.
  21. Powell, M.J.D. [1970]. A new algorithm for unconstrained optimization, in ‘Nonlinear Programming', J.B. Rosen, O.L. Mangasarian, and K. Ritter, eds., Academic Press, 31–66.
  22. Powell, M.J.D. and Toint, Ph.L. [1979]. On the estimation of sparse Hessian matrices, SIAM J. Numer. Anal. 16, 1060–1074.
  23. Powell, M.J.D. and Toint, Ph.L. [1981]. The Shanno-Toint procedure for updating sparse symmetric matrices, IMA Journal of Numerical Analysis 1, 403–413.
  24. Powell, M.J.D. [1981]. A note on quasi-Newton formulae for sparse second derivative matrices, Mathematical Programming 20, 144–151.
  25. Shanno, D.F. [1978]. Conjugate-gradient methods with inexact searches, Math. of Oper. Res. 3, 244–256.
  26. Shanno, D.F. [1980]. On the variable metric methods for sparse Hessians, Math. Comp. 34, 499–514.
  27. Sorensen, D.C. [1981]. An example concerning quasi-Newton estimation of a sparse Hessian, SIGNUM Newsletter 16, 8–10.
  28. Sorensen, D.C. [1982]. Trust region methods for unconstrained optimization, SIAM J. Numer. Anal. 19, 409–426.
  29. Steihaug, T. [1983]. The conjugate gradient method and trust regions in large scale optimization, SIAM J. Numer. Anal. 20, 626–637.
  30. Thomas, S.W. [1975]. Sequential estimation techniques for quasi-Newton algorithms, Ph.D. Dissertation, Cornell University, Ithaca, NY.
  31. Toint, Ph.L. [1977]. On sparse and symmetric updating subject to a linear equation, Math. Comp. 32, 839–851.
  32. Toint, Ph.L. [1981a]. A sparse quasi-Newton update derived variationally with a non-diagonally weighted Frobenius norm, Math. Comp. 37, 425–434.
  33. Toint, Ph.L. [1981b]. A note on sparsity exploiting quasi-Newton methods, Mathematical Programming 21, 172–181.
  34. Thapa, M.N. [1979]. A note on sparse quasi-Newton methods, Technical Report 79-13, Dept. of Operations Research, Stanford University, Stanford, CA.

Copyright information

© Springer-Verlag Berlin Heidelberg 1984
