
Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization


Abstract

In this paper, we present two families of modified three-term conjugate gradient methods for solving large-scale smooth unconstrained optimization problems. We show that the new families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positivity of \({y_{k}^{T}} s_{k}\). For uniformly convex functions, we show that our families are globally convergent under the weak-Wolfe-Powell line search technique and standard assumptions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Numerical experiments on 260 standard test problems, comparing against seven recently developed conjugate gradient methods, show that the members of our families are efficient and effective.
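The ingredients named in the abstract can be made concrete with a short sketch. The three-term direction below uses the well-known descent modified Polak-Ribiére-Polyak choice of Zhang, Zhou, and Li, not the authors' new families, which only the full text defines; the bisection-style weak-Wolfe-Powell line search and all parameter values (`delta`, `sigma`, tolerances) are illustrative assumptions. With this choice, \(g_{k+1}^{T} d_{k+1} = -\|g_{k+1}\|^{2}\) holds by construction, one instance of the sufficient descent property the abstract refers to.

```python
import numpy as np

def weak_wolfe_powell(f, g, x, d, delta=1e-4, sigma=0.4, alpha=1.0, max_iter=60):
    """Find a step alpha satisfying the weak-Wolfe-Powell conditions:
       f(x+a d) <= f(x) + delta*a*g(x)^T d   (sufficient decrease)
       g(x+a d)^T d >= sigma*g(x)^T d        (curvature)."""
    lo, hi = 0.0, np.inf
    fx, gtd = f(x), g(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + delta * alpha * gtd:
            hi = alpha                                   # step too long: shrink
        elif g(x + alpha * d) @ d < sigma * gtd:
            lo = alpha                                   # step too short: grow
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def three_term_prp(f, g, x0, tol=1e-8, max_iter=5000):
    """Three-term CG with the Zhang-Zhou-Li descent modified PRP direction:
       d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,
       which gives g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 for any step length."""
    x = np.asarray(x0, dtype=float)
    gk = g(x)
    d = -gk
    for _ in range(max_iter):
        if np.linalg.norm(gk) <= tol:
            break
        alpha = weak_wolfe_powell(f, g, x, d)
        x_new = x + alpha * d
        g_new = g(x_new)
        y = g_new - gk                         # gradient difference y_k
        gg = gk @ gk
        beta = (g_new @ y) / gg                # PRP conjugacy parameter
        theta = (g_new @ d) / gg               # third-term coefficient
        d = -g_new + beta * d - theta * y      # three-term search direction
        x, gk = x_new, g_new
    return x

# Illustrative run on a simple strictly convex quadratic f(x) = 0.5 x^T diag(A) x
A = np.array([1.0, 10.0, 100.0])
x_star = three_term_prp(lambda x: 0.5 * np.sum(A * x * x), lambda x: A * x, np.ones(3))
```

On a uniformly convex problem such as this quadratic, the iterates converge to the unique minimizer, consistent with the convergence theory the abstract describes for this class of functions.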




Acknowledgments

The authors would like to thank the anonymous reviewers for their useful suggestions and valuable comments which were greatly helpful to improve the quality of this paper.

Author information

Correspondence to M. R. Eslahchi.


Appendix


In this Appendix, we present the complete numerical results of the first set of experiments of Section 4 in Table 7. The first and second columns give the problem number (see Table 3) and its dimension, respectively. In the remaining columns, the symbol − indicates that the method failed to solve the problem.

Table 7 The numerical results of five modified three-term CG algorithms


About this article


Cite this article

Bojari, S., Eslahchi, M.R. Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer Algor 83, 901–933 (2020). https://doi.org/10.1007/s11075-019-00709-7


Keywords

  • Large-scale optimization problem
  • Nonconvex optimization
  • Conjugate gradient method
  • Weak-Wolfe-Powell line search technique
  • Global convergence

Mathematics Subject Classification (2010)

  • 90C30
  • 90C06
  • 90C26