Abstract
An adaptive conjugate gradient algorithm is presented. The search direction is computed as the sum of the negative gradient and a vector determined by minimizing the quadratic approximation of the objective function at the current point. Using a special approximation of the inverse Hessian of the objective function, which depends on a positive parameter, we obtain a search direction that satisfies both the sufficient descent condition and the Dai-Liao conjugacy condition. The parameter in the search direction is determined in an adaptive manner by clustering the eigenvalues of the matrix defining it. The global convergence of the algorithm is proved for uniformly convex functions. Using a set of 800 unconstrained optimization test problems, we show that our algorithm is significantly more efficient and more robust than the CG-DESCENT algorithm. By solving five applications from the MINPACK-2 test problem collection, each with 10^6 variables, we show that the suggested adaptive conjugate gradient algorithm is a top performer compared with CG-DESCENT.
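For orientation, the following is a minimal, illustrative Python sketch of a generic Dai-Liao-type conjugate gradient iteration of the kind the abstract refers to. The fixed parameter t, the Armijo backtracking line search, and the simple restart safeguard are stand-in assumptions for this sketch; they do not reproduce the paper's adaptive eigenvalue-clustering rule for the parameter or its Wolfe line search conditions.

import numpy as np

def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao-type nonlinear CG (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search on the Armijo condition (stand-in for Wolfe)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Dai-Liao parameter: beta_k = (y_k - t*s_k)^T g_{k+1} / (y_k^T s_k)
        ys = y @ s
        beta = ((y - t * s) @ g_new) / ys if abs(ys) > 1e-12 else 0.0
        d_new = -g_new + beta * s            # new search direction
        if g_new @ d_new >= 0.0:             # safeguard: restart if not a descent direction
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Example usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0]); b = np.ones(3)
x_star = dai_liao_cg(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(3))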
This paper is dedicated to Prof. Boris T. Polyak on the occasion of his 80th birthday. Prof. Polyak's contributions to linear and nonlinear optimization methods, linear algebra, numerical mathematics, and linear and nonlinear control systems are well known. His articles and books give careful attention to both mathematical rigor and practical relevance. In all his publications he proves to be a refined expert in understanding the nature, purpose and limitations of nonlinear optimization algorithms and of applied mathematics in general. It is my great pleasure and honour to dedicate this paper to Prof. Polyak, a pioneer and a great contributor to his areas of interest.
References
Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213, 361–369 (2009)
Andrei, N.: Another collection of large-scale unconstrained optimization test functions. ICI Technical Report, January 30 (2013)
Aris, R.: The Mathematical Theory of Diffusion and Reaction in Permeable Catalysts. Oxford University Press, New York (1975)
Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection. Mathematics and Computer Science Division, Argonne National Laboratory. Preprint MCS-P153-0692, June (1992)
Axelsson, O., Lindskog, G.: On the rate of convergence of the preconditioned conjugate gradient methods. Numer. Math. 48, 499–523 (1986)
Babaie-Kafaki, S.: An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method. Bull. Iran. Math. Soc. 40(1), 235–242 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)
Bebernes, J., Eberly, D.: Mathematical problems from combustion theory. In: Applied Mathematical Sciences, vol. 83. Springer, New York (1989)
Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
Dai, Y.H., Liao, L.Z., Duan, L.: On restart procedures for the conjugate gradient method. Numer. Algorithms 35, 249–260 (2004)
Dennis, J.E., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs (1983)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Glowinski, R.: Numerical Methods for Nonlinear Variational Problems. Springer, Berlin (1984)
Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
Hager, W.W., Zhang, H.: Algorithm 851: CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Kaporin, I.E.: New convergence results and preconditioning strategies for the conjugate gradient methods. Numer. Linear Algebra Appl. 1(2), 179–210 (1994)
Kratzer, D., Parter, S.V., Steuerwalt, M.: Block splittings for the conjugate gradient method. Comput. Fluids 11, 255–279 (1983)
Luenberger, D.G., Ye, Y.: Linear and Nonlinear Programming. International Series in Operations Research & Management Science, 3rd edn. Springer Science+Business Media, New York (2008)
Nitsche, J.C.C.: Lectures On Minimal Surfaces, vol. 1. Cambridge University Press, Cambridge (1989)
Nocedal, J.: Conjugate gradient methods and nonlinear optimization. In: Adams, L., Nazareth, J.L. (eds.) Linear and Nonlinear Conjugate Gradient Related Methods, pp. 9–23. SIAM, Philadelphia (1996)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inform. Rech. Opér. 3e Année 16, 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
Polyak, B.T.: Introduction to Optimization. Optimization Software, Publications Division, New York (1987)
Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983). Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
Stiefel, E.: Über einige Methoden der Relaxationsrechnung. Z. Angew. Math. Phys. 3, 1–33 (1952)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer Science + Business Media, New York (2006)
Winther, R.: Some superlinear convergence results for the conjugate gradient method. SIAM J. Numer. Anal. 17, 14–17 (1980)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)
Wolfe, P.: Convergence conditions for ascent methods. II: some corrections. SIAM Rev. 13, 185–188 (1971)