
A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function

  • Zahra Khoshgam
  • Ali Ashrafi

Abstract

This paper employs a modified secant equation within the framework of the hybrid conjugate gradient (CG) method based on Andrei’s approach to solve large-scale unconstrained optimization problems. The CG parameter of this hybrid method is a convex combination of the CG parameters of the Hestenes–Stiefel and Dai–Yuan algorithms, and the main feature of such hybrid methods is that the search direction is the Newton direction. The modified secant equation is derived by means of a fifth-order tensor model to improve the curvature information of the objective function. Moreover, to achieve convergence for general functions, a revised version of the method based on a linear combination of this secant equation and Li and Fukushima’s modified secant equation is suggested. Under proper conditions, the global convergence of the new hybrid CG algorithm is established even without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the proposed hybrid conjugate gradient algorithm.
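The convex-combination idea described in the abstract can be illustrated with a minimal sketch. This is not the authors’ exact algorithm (their weight is chosen adaptively from a secant condition so the direction matches the Newton direction, and they use a modified secant equation); here the weight `theta` is simply held fixed, and a plain Armijo backtracking line search is assumed.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Sketch of a hybrid CG method: the parameter beta is a convex
    combination (fixed weight `theta`; Andrei's method chooses it
    adaptively) of the Hestenes-Stiefel and Dai-Yuan parameters."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_k
        dy = d @ y
        if abs(dy) < 1e-12:
            beta = 0.0                      # safeguard: restart with -g
        else:
            beta_hs = (g_new @ y) / dy          # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / dy      # Dai-Yuan
            beta = (1 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = hybrid_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                   lambda x: A @ x - b,
                   np.zeros(2))
```

For a strictly convex quadratic, both the HS and DY parameters coincide under exact line searches; the hybrid’s practical value appears on general nonconvex problems, where the adaptive weight and modified secant equation of the paper come into play.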

Keywords

Unconstrained optimization · Large-scale optimization · Hybrid conjugate gradient method · Secant equation · Global convergence

Mathematics Subject Classification

49Mxx · 49M37 · 90Cxx · 90C06 · 90C26 · 90C30

Notes

Acknowledgements

This research was supported by the Research Council of Semnan University.

References

  1. Andrei N (2008) Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer Algor 47:143–156
  2. Andrei N (2010) Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. Numer Algor 54(1):23–46
  3. Andrei N (2011) Open problems in conjugate gradient algorithms for unconstrained optimization. Bull Malays Math Sci Soc 34(2):319–330
  4. Babaie-Kafaki S, Ghanbari R (2014) A modified scaled conjugate gradient method with global convergence for nonconvex functions. Belg Math Soc-SIM 21(3):465–477
  5. Babaie-Kafaki S, Fatemi M, Mahdavi-Amiri N (2011) Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numer Algor 58(3):315–331
  6. Biglari F, Abu Hassan M, Leong WJ (2011) New quasi-Newton methods via higher order tensor models. J Comput Appl Math 235(8):2412–2422
  7. Dai YH, Yuan YX (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10:177–182
  8. Dai YH, Yuan YX (2001) An efficient hybrid conjugate gradient method for unconstrained optimization. Ann Oper Res 103:33–47
  9. Dai YH, Han JY, Liu GH, Sun DF, Yin HX, Yuan YX (1999) Convergence properties of nonlinear conjugate gradient methods. SIAM J Optim 10(2):348–358
  10. Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213
  11. Fletcher R (2013) Practical methods of optimization. Wiley, New York
  12. Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7:149–154
  13. Gould NIM, Orban D, Toint Ph L (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
  14. Guo Q, Liu JG, Wang DH (2008) A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule. J Appl Math Comput 28:435–446
  15. Hestenes MR, Stiefel EL (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49(5):409–436
  16. Li DH, Fukushima M (2001) A modified BFGS method and its global convergence in nonconvex minimization. J Comput Appl Math 129(1–2):15–35
  17. Liu Y, Storey C (1991) Efficient generalized conjugate gradient algorithms. J Optim Theory Appl 69:129–137
  18. Nocedal J, Wright SJ (2006) Numerical optimization. Springer, New York
  19. Polyak BT (1969) The conjugate gradient method in extreme problems. Comput Math Math Phys 9(4):94–112
  20. Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York
  21. Wei Z, Yu G, Yuan G et al (2004) The superlinear convergence of a modified BFGS-type method for unconstrained optimization. Comput Optim Appl 29:315–332
  22. Wei Z, Li G, Qi L (2006) New quasi-Newton methods for unconstrained optimization problems. Appl Math Comput 175:1156–1188
  23. Xiao LD, Wei JL, Yu BH (2017) Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition. RAIRO Oper Res 51:67–77
  24. Yabe H, Takano M (2004) Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput Optim Appl 28:203–225
  25. Yuan YX (1991) A modified BFGS algorithm for unconstrained optimization. IMA J Numer Anal 11:325–332
  26. Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102:147–167
  27. Zhou W, Zhang L (2006) A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim Methods Softw 21(5):707–714

Copyright information

© SBMAC - Sociedade Brasileira de Matemática Aplicada e Computacional 2019

Authors and Affiliations

  1. Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, Semnan, Iran
