
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization

  • Research paper
  • Published in: 4OR

Abstract

A scaled conjugate gradient method is proposed by hybridizing, following Andrei's approach, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez. Since the method is designed based on a revised form of a modified secant equation suggested by Zhang et al., an interesting feature is that it employs the available function values in addition to the gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method satisfy the sufficient descent condition, which leads to global convergence. Numerical comparisons of the method with an efficient scaled conjugate gradient method proposed by Andrei, made on a set of unconstrained optimization test problems from the CUTEr collection, demonstrate the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
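For background, the modified secant equation of Zhang et al. (1999), of which the method employs a revised form, can be sketched as follows; this is the standard statement from the cited literature, not the paper's own revision. Writing $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$ and $f_k = f(x_k)$, the updated matrix $B_{k+1}$ is required to satisfy

\[
B_{k+1} s_k = z_k, \qquad z_k = y_k + \frac{\vartheta_k}{s_k^T u_k}\, u_k, \qquad \vartheta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^T s_k,
\]

where $u_k$ is any vector with $s_k^T u_k \neq 0$ (a common choice is $u_k = s_k$). The correction term $\vartheta_k$ is precisely where the function values enter in addition to the gradient values. Likewise, the performance profile of Dolan and Moré (2002) used in the comparisons reports, for each solver $s$, the fraction of the $n_p$ test problems on which that solver is within a factor $\tau \geq 1$ of the best performance:

\[
\rho_s(\tau) = \frac{1}{n_p} \left|\left\{\, p : r_{p,s} \leq \tau \,\right\}\right|, \qquad r_{p,s} = \frac{t_{p,s}}{\min_{s'} t_{p,s'}},
\]

where $t_{p,s}$ is the performance measure (e.g. CPU time or number of function evaluations) of solver $s$ on problem $p$.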


References

  • Andrei N (2007a) A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl Math Lett 20(6):645–650
  • Andrei N (2007b) Scaled conjugate gradient algorithms for unconstrained optimization. Comput Optim Appl 38(3):401–416
  • Andrei N (2007c) Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim Methods Softw 22(4):561–571
  • Andrei N (2008) A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. Optimization 57(4):549–570
  • Andrei N (2010) Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur J Oper Res 204(3):410–420
  • Babaie-Kafaki S (2011) A modified BFGS algorithm based on a hybrid secant equation. Sci China Math 54(9):2019–2036
  • Babaie-Kafaki S (2012a) A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. Comput Optim Appl 52(2):409–414
  • Babaie-Kafaki S (2012b) A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods. J Optim Theory Appl 154(3):916–932
  • Babaie-Kafaki S, Ghanbari R, Mahdavi-Amiri N (2010) Two new conjugate gradient methods based on modified secant equations. J Comput Appl Math 234(5):1374–1386
  • Barzilai J, Borwein JM (1988) Two-point stepsize gradient methods. IMA J Numer Anal 8(1):141–148
  • Birgin EG, Martínez JM (2001) A spectral conjugate gradient method for unconstrained optimization. Appl Math Optim 43(2):117–128
  • Dai YH, Han JY, Liu GH, Sun DF, Yin HX, Yuan YX (1999) Convergence properties of nonlinear conjugate gradient methods. SIAM J Optim 10(2):348–358
  • Dai YH, Yuan J, Yuan YX (2002) Modified two-point stepsize gradient methods for unconstrained optimization. Comput Optim Appl 22(1):103–109
  • Dai YH, Liao LZ, Li D (2004) On restart procedures for the conjugate gradient method. Numer Algorithms 35(2–4):249–260
  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213
  • Gould NIM, Orban D, Toint PL (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
  • Hager WW, Zhang H (2006a) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137
  • Hager WW, Zhang H (2006b) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58
  • Han JY, Liu GH, Sun DF, Yin HX (2001) Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications. Acta Math Appl Sin 17(2):38–46
  • Nocedal J, Wright SJ (2006) Numerical optimization. Springer, New York
  • Oren SS, Spedicato E (1976) Optimal conditioning of self-scaling variable metric algorithms. Math Program 10(1):70–90
  • Polak E, Ribière G (1969) Note sur la convergence de méthodes de directions conjuguées. Rev Française Informat Recherche Opérationnelle 3(16):35–43
  • Polyak BT (1969) The conjugate gradient method in extremal problems. USSR Comput Math Math Phys 9(4):94–112
  • Shanno DF (1978) Conjugate gradient methods with inexact searches. Math Oper Res 3(3):244–256
  • Shanno DF, Phua KH (1976) Algorithm 500: minimization of unconstrained multivariate functions. ACM Trans Math Softw 2(1):87–94
  • Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York
  • Wolfe P (1969) Convergence conditions for ascent methods. SIAM Rev 11(2):226–235
  • Xu C, Zhang JZ (2001) A survey of quasi-Newton equations and quasi-Newton methods for optimization. Ann Oper Res 103(1–4):213–234
  • Yabe H, Ogasawara H, Yoshino M (2007) Local and superlinear convergence of quasi-Newton methods based on modified secant conditions. J Comput Appl Math 205(1):617–632
  • Yabe H, Takano M (2004) Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput Optim Appl 28(2):203–225
  • Yuan YX, Byrd RH (1995) Non-quasi-Newton updates for unconstrained optimization. J Comput Math 13(2):95–107
  • Zhang J, Xu C (2001) Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J Comput Appl Math 137(2):269–278
  • Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102(1):147–167
  • Zhang JZ, Xue Y, Zhang K (2003) A structured secant method based on a new quasi-Newton equation for nonlinear least-squares problems. BIT 43(1):217–229


Acknowledgments

This research was supported in part by a grant from IPM (No. 91900051) and in part by the Research Council of Semnan University. The author is grateful to the anonymous reviewers for their valuable suggestions, which helped improve the quality of this work. He also thanks Professor Michael Navon for providing the line search code.

Author information

Correspondence to Saman Babaie-Kafaki.


About this article

Cite this article

Babaie-Kafaki, S. A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR-Q J Oper Res 11, 361–374 (2013). https://doi.org/10.1007/s10288-013-0233-4

