A modified conjugate gradient method based on the self-scaling memoryless BFGS update

Original Paper · Numerical Algorithms

Abstract

Many conjugate gradient methods have been proposed for solving unconstrained optimization problems. Compared with a standard conjugate gradient method, an accelerated conjugate gradient method often performs better numerically on unconstrained problems. In this paper, we propose a modified conjugate gradient method with the Wolfe line search that generates descent directions. Under mild conditions, we prove that the proposed method is globally convergent. Numerical results indicate that the proposed method is competitive on our test functions.
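The abstract names the two building blocks of the method: a search direction derived from the self-scaling memoryless BFGS update, and a Wolfe line search. As a rough illustration of how these pieces fit together, the sketch below implements the classical Perry-Shanno template that such methods build on; it is a minimal sketch under stated assumptions, not the authors' modified method. The direction is d = -Hg, where H is one memoryless BFGS update of the identity scaled by the Oren-Luenberger parameter τ = yᵀy / sᵀy, and the step length comes from SciPy's Wolfe line search. The function name and restart safeguards are illustrative assumptions.

    import numpy as np
    from scipy.optimize import line_search  # strong Wolfe line search

    def ssml_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=5000):
        # Hypothetical Perry-Shanno-type iteration: the direction d = -H g
        # uses H = self-scaling memoryless BFGS update of (1/tau) * I.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                     # first step: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g, np.inf) <= tol:
                break
            alpha = line_search(f, grad, x, d)[0]  # Wolfe step length
            if alpha is None:                      # line search failed: restart
                d = -g
                alpha = line_search(f, grad, x, d)[0]
                if alpha is None:
                    break
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y                             # positive under Wolfe conditions
            if sy <= 1e-12:
                d = -g_new                         # safeguard restart
            else:
                tau = (y @ y) / sy                 # Oren-Luenberger scaling
                # d = -H g for the self-scaled memoryless BFGS matrix H:
                d = (-(g_new / tau)
                     + ((y @ g_new) / (tau * sy)) * s
                     + ((s @ g_new) / (tau * sy)) * y
                     - 2.0 * ((s @ g_new) / sy) * s)
            x, g = x_new, g_new
        return x

    if __name__ == "__main__":
        from scipy.optimize import rosen, rosen_der
        print(ssml_bfgs_cg(rosen, rosen_der, np.zeros(10)))  # approx. all ones

Because the Wolfe conditions force sᵀy > 0, the update matrix stays positive definite and d is a descent direction, which mirrors the descent property the abstract claims for the modified method.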



Funding

This work was supported by the National Natural Science Foundation of China (No. 11961010, 61967004), Guangxi Key Laboratory of Automatic Testing Technology and Instruments, China (No. YQ19111).

Author information

Correspondence to Xiangli Li.



About this article


Cite this article

Li, X., Zhao, W. A modified conjugate gradient method based on the self-scaling memoryless BFGS update. Numer Algor 90, 1017–1042 (2022). https://doi.org/10.1007/s11075-021-01220-8

