A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization

Optimization and Its Applications in Control and Data Sciences

Part of the book series: Springer Optimization and Its Applications ((SOIA,volume 115))

Abstract

An adaptive conjugate gradient algorithm is presented. The search direction is computed as the sum of the negative gradient and a vector determined by minimizing the quadratic approximation of the objective function at the current point. Using a special approximation of the inverse Hessian of the objective function, which depends on a positive parameter, we obtain a search direction that satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition. The parameter in the search direction is determined in an adaptive manner by clustering the eigenvalues of the matrix defining it. The global convergence of the algorithm is proved for uniformly convex functions. On a set of 800 unconstrained optimization test problems, we show that our algorithm is significantly more efficient and more robust than the CG-DESCENT algorithm. By solving five applications from the MINPACK-2 test problem collection, each with 10^6 variables, we show that the suggested adaptive conjugate gradient algorithm is a top performer compared with CG-DESCENT.
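To make the construction concrete, the sketch below implements a generic Dai–Liao-type conjugate gradient iteration [11] in Python. It is a minimal illustration, not the chapter's algorithm: the parameter t is held fixed here, whereas the chapter's contribution is to choose it adaptively by clustering the eigenvalues of the matrix defining the search direction, and a simple Armijo backtracking stand-in replaces the Wolfe line search [34, 35] a real implementation would use. All function names are hypothetical.

```python
import numpy as np

def backtracking(f, x, g, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking line search (a stand-in: an actual CG code
    would enforce the Wolfe conditions [34, 35])."""
    while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao-type CG iteration [11]; t is fixed here,
    not chosen adaptively as in the chapter."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, x, g, d)
        s = alpha * d                       # step s_k = x_{k+1} - x_k
        x = x + s
        g_new = grad(x)
        y = g_new - g                       # gradient difference y_k
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t s_k) / (y_k^T s_k)
        beta = (g_new @ (y - t * s)) / (y @ s)
        d = -g_new + beta * s               # new search direction
        g = g_new
    return x

# Example: minimize the strictly convex quadratic f(x) = 0.5 x^T x
# x_star = dai_liao_cg(lambda x: 0.5 * x @ x, lambda x: x, np.ones(100))
```

In exact arithmetic this update satisfies the Dai–Liao condition d_{k+1}^T y_k = -t g_{k+1}^T s_k; with t fixed the scheme is a classical Dai–Liao method, and the adaptive, eigenvalue-clustering choice of t is precisely what the chapter adds.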

This paper is dedicated to Prof. Boris T. Polyak on the occasion of his 80th birthday. Prof. Polyak's contributions to linear and nonlinear optimization methods, linear algebra, numerical mathematics, and linear and nonlinear control systems are well known. His articles and books give careful attention to both mathematical rigor and practical relevance. In all his publications he proves to be a refined expert on the nature, purpose, and limitations of nonlinear optimization algorithms and of applied mathematics in general. It is my great pleasure and honour to dedicate this paper to Prof. Polyak, a pioneer and a great contributor in his areas of interest.

References

  1. Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213, 361–369 (2009)

  2. Andrei, N.: Another collection of large-scale unconstrained optimization test functions. ICI Technical Report, January 30 (2013)

  3. Aris, R.: The Mathematical Theory of Diffusion and Reaction in Permeable Catalysts. Oxford University Press, New York (1975)

  4. Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection. Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692, June (1992)

  5. Axelsson, O., Lindskog, G.: On the rate of convergence of the preconditioned conjugate gradient methods. Numer. Math. 48, 499–523 (1986)

  6. Babaie-Kafaki, S.: An eigenvalue study on the sufficient descent property of a modified Polak–Ribière–Polyak conjugate gradient method. Bull. Iran. Math. Soc. 40(1), 235–242 (2014)

  7. Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)

  8. Bebernes, J., Eberly, D.: Mathematical Problems from Combustion Theory. Applied Mathematical Sciences, vol. 83. Springer, New York (1989)

  9. Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)

  10. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)

  11. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  12. Dai, Y.H., Liao, L.Z., Duan, L.: On restart procedures for the conjugate gradient method. Numer. Algorithms 35, 249–260 (2004)

  13. Dennis, J.E., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs (1983)

  14. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  15. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  16. Glowinski, R.: Numerical Methods for Nonlinear Variational Problems. Springer, Berlin (1984)

  17. Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)

  18. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  19. Hager, W.W., Zhang, H.: Algorithm 851: CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)

  20. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  21. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  22. Kaporin, I.E.: New convergence results and preconditioning strategies for the conjugate gradient methods. Numer. Linear Algebra Appl. 1(2), 179–210 (1994)

  23. Kratzer, D., Parter, S.V., Steuerwalt, M.: Block splittings for the conjugate gradient method. Comput. Fluids 11, 255–279 (1983)

  24. Luenberger, D.G., Ye, Y.: Linear and Nonlinear Programming, 3rd edn. International Series in Operations Research & Management Science. Springer Science+Business Media, New York (2008)

  25. Nitsche, J.C.C.: Lectures on Minimal Surfaces, vol. 1. Cambridge University Press, Cambridge (1989)

  26. Nocedal, J.: Conjugate gradient methods and nonlinear optimization. In: Adams, L., Nazareth, J.L. (eds.) Linear and Nonlinear Conjugate Gradient Related Methods, pp. 9–23. SIAM, Philadelphia (1996)

  27. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inform. Rech. Opér. 3(16), 35–43 (1969)

  28. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)

  29. Polyak, B.T.: Introduction to Optimization. Optimization Software, Publications Division, New York (1987)

  30. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983). Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)

  31. Stiefel, E.: Über einige Methoden der Relaxationsrechnung. Z. Angew. Math. Phys. 3, 1–33 (1952)

  32. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer Science+Business Media, New York (2006)

  33. Winther, R.: Some superlinear convergence results for the conjugate gradient method. SIAM J. Numer. Anal. 17, 14–17 (1980)

  34. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)

  35. Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13, 185–188 (1971)

Author information

Correspondence to Neculai Andrei.

Copyright information

© 2016 Springer International Publishing Switzerland

Cite this chapter

Andrei, N. (2016). A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization. In: Goldengorin, B. (eds) Optimization and Its Applications in Control and Data Sciences. Springer Optimization and Its Applications, vol 115. Springer, Cham. https://doi.org/10.1007/978-3-319-42056-1_1
