Truncated-Newton algorithms for large-scale unconstrained optimization
 Ron S. Dembo,
 Trond Steihaug
Abstract
We present an algorithm for large-scale unconstrained optimization based on Newton's method. In large-scale optimization, solving the Newton equations at each iteration can be expensive and may not be justified when far from a solution. Instead, an inaccurate solution to the Newton equations is computed using a conjugate gradient method. The resulting algorithm is shown to have strong convergence properties and has the unusual feature that the asymptotic convergence rate is a user-specified parameter which can be set to anything between linear and quadratic convergence. Some numerical results on a 916-variable test problem are given. Finally, we contrast the computational behavior of our algorithm with that of Newton's method and of a nonlinear conjugate gradient algorithm.
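The outer/inner structure the abstract describes — Newton iterations whose linear systems are solved only approximately by conjugate gradients — can be sketched as below. This is an illustrative reconstruction, not the authors' exact algorithm: the forcing sequence `eta = min(0.5, sqrt(||g||))`, the Armijo backtracking line search, and the negative-curvature fallback are assumptions chosen for the sketch.

```python
import numpy as np

def cg_truncated(Hv, b, rtol):
    """Conjugate gradients on H d = b, truncated once the residual
    norm drops below rtol; falls back to the steepest-descent
    direction if negative curvature is detected (assumed safeguard)."""
    d = np.zeros_like(b)
    r = b.copy()                      # residual b - H d, with d = 0
    p = r.copy()
    rr = r.dot(r)
    for _ in range(len(b)):
        Hp = Hv(p)
        pHp = p.dot(Hp)
        if pHp <= 0:                  # negative curvature encountered
            return b if np.allclose(d, 0) else d
        alpha = rr / pHp
        d = d + alpha * p
        r = r - alpha * Hp
        rr_new = r.dot(r)
        if np.sqrt(rr_new) <= rtol:   # truncation test: ||r|| <= rtol
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return d

def truncated_newton(f, grad, hess_vec, x0, max_iter=100, tol=1e-8):
    """Outer Newton loop: solve H d = -g inexactly via CG, truncating
    at ||r|| <= eta_k * ||g||.  The forcing term eta_k governs the
    asymptotic rate; eta_k -> 0 here gives superlinear convergence."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        eta = min(0.5, np.sqrt(gnorm))            # illustrative choice
        d = cg_truncated(lambda v: hess_vec(x, v), -g, eta * gnorm)
        t = 1.0                                   # Armijo backtracking
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
    return x
```

On a convex quadratic f(x) = ½xᵀAx − bᵀx the iterates converge to the solution of Ax = b even though each inner CG solve is stopped early; the shrinking forcing term tightens the inner accuracy as the gradient vanishes.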
References

O. Axelsson, “Solution of linear systems of equations: Iterative methods”, in: V.A. Barker, ed., Sparse matrix techniques (Springer-Verlag, New York, 1976) pp. 1–51.
R. Chandra, “Conjugate gradient methods for partial differential equations”, Ph.D. dissertation, Yale University (New Haven, CT, 1978). Also available as Department of Computer Science Research Report No. 129.
R. Chandra, S.C. Eisenstat and M.H. Schultz, “The modified conjugate residual method for partial differential equations”, in: R. Vichnevetsky, ed., Advances in computer methods for partial differential equations II, Proceedings of the Second International Symposium on Computer Methods for Partial Differential Equations, Lehigh University, Bethlehem, PA (International Association for Mathematics and Computers in Simulation, June 1977) pp. 13–19.
A.R. Curtis, M.J.D. Powell and J.K. Reid, “On the estimation of sparse Jacobian matrices”, Journal of the Institute of Mathematics and its Applications 13 (1974) 117–119.
R.S. Dembo, S.C. Eisenstat and T. Steihaug, “Inexact Newton methods”, SIAM Journal on Numerical Analysis 19 (1982) 400–408.
R.S. Dembo and J.G. Klincewicz, “A scaled reduced gradient algorithm for network flow problems with convex separable costs”, Mathematical Programming Studies 15 (1981) 125–147.
R.S. Dembo and T. Steihaug, “A test problem for large-scale unconstrained minimization”, School of Organization and Management, Yale University (New Haven, CT), Working Paper Series B (in preparation).
J.E. Dennis Jr. and J.J. Moré, “Quasi-Newton methods, motivation and theory”, SIAM Review 19 (1977) 46–89.
R. Fletcher, “Conjugate gradient methods for indefinite systems”, in: G.A. Watson, ed., Numerical Analysis, Proceedings of the Biennial Conference, Dundee, Scotland, 1975 (Springer-Verlag, New York, 1976) pp. 73–89.
R. Fletcher, Unconstrained optimization (John Wiley and Sons, New York, 1980).
N.K. Garg and R.A. Tapia, “QDN: A variable storage algorithm for unconstrained optimization”, Technical Report, Department of Mathematical Sciences, Rice University (Houston, TX, 1977).
P.E. Gill and W. Murray, “Safeguarded steplength algorithm for optimization using descent methods”, Technical Report NPL NA 37, National Physical Laboratory (1974).
P.E. Gill, W. Murray and S.G. Nash, “A conjugate-gradient approach to Newton-type methods”, presented at the ORSA/TIMS Joint National Meeting (Colorado Springs, November 1980).
P.E. Gill, W. Murray and M.H. Wright, Practical optimization (Academic Press, New York, 1981).
M.R. Hestenes, Conjugate direction methods in optimization (Springer-Verlag, New York, 1980).
M.R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems”, Journal of Research of the National Bureau of Standards 49 (1952) 409–436.
D.G. Luenberger, “Hyperbolic pairs in the method of conjugate gradients”, SIAM Journal on Applied Mathematics 17 (1969) 1263–1267.
B. Murtagh and M. Saunders, “Large-scale linearly constrained optimization”, Mathematical Programming 14 (1978) 41–72.
J. Nocedal, “Updating quasi-Newton matrices with limited storage”, Mathematics of Computation 35 (1980) 773–782.
D.P. O'Leary, “A discrete Newton algorithm for minimizing a function of many variables”, Mathematical Programming 23 (1982) 20–33.
J.M. Ortega and W.C. Rheinboldt, Iterative solution of nonlinear equations in several variables (Academic Press, New York, 1970).
M.J.D. Powell and Ph.L. Toint, “On the estimation of sparse Hessian matrices”, SIAM Journal on Numerical Analysis 16 (1979) 1060–1074.
D.F. Shanno, “On the convergence of a new conjugate gradient method”, SIAM Journal on Numerical Analysis 15 (1978) 1247–1257.
D.F. Shanno and K.H. Phua, “Algorithm 500: Minimization of unconstrained multivariate functions”, ACM Transactions on Mathematical Software 2 (1976) 87–94.
T. Steihaug, “Quasi-Newton methods for large scale nonlinear problems”, Ph.D. dissertation, Yale University (New Haven, CT, 1980). Also available as Working Paper Series B No. 49.
Ph.L. Toint, “Some numerical results using a sparse matrix updating formula in unconstrained optimization”, Mathematics of Computation 32 (1978) 839–851.
Title
Truncated-Newton algorithms for large-scale unconstrained optimization
 Journal

Mathematical Programming
Volume 26, Issue 2, pp. 190–212
 Cover Date
1983-06-01
 DOI
 10.1007/BF02592055
 Print ISSN
0025-5610
 Online ISSN
1436-4646
 Publisher
Springer-Verlag
 Keywords

 Unconstrained Optimization
 Modified Newton Methods
 Conjugate Gradient Algorithms
 Authors

 Ron S. Dembo ^{(1)}
 Trond Steihaug ^{(2)}
 Author Affiliations

1. School of Organization and Management, Yale University, New Haven, CT 06520, USA
2. Department of Mathematical Sciences, Rice University, Houston, TX 77001, USA