
A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems

  • Original Research
Journal of Applied Mathematics and Computing

Abstract

In this paper, we propose a new hybrid conjugate gradient method for solving unconstrained optimization problems. The new method is defined as a convex combination of the DY and DL conjugate gradient methods. Its distinguishing feature is that the search direction follows the Newton direction without storing or computing the second derivative (the Hessian matrix): the secant equation is used to eliminate the troublesome term that the Newton method requires. The search direction satisfies not only the descent property but also the sufficient descent condition, and global convergence is proved under the strong Wolfe line search. Numerical comparisons show the efficiency of the new algorithm, which outperforms both the DY and DL algorithms.
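To illustrate the general shape of such a hybrid scheme (not the paper's specific algorithm, whose convex-combination parameter is derived from the Newton direction via the secant equation), the sketch below combines the standard DY and DL parameters, β_DY = ‖g₊‖² / (dᵀy) and β_DL = g₊ᵀ(y − t·s) / (dᵀy), with a fixed weight θ. The backtracking Armijo line search stands in for the strong Wolfe conditions used in the paper; the function names and the values of θ and t are illustrative assumptions.

```python
import numpy as np

def hybrid_dy_dl_cg(f, grad, x0, theta=0.5, t=0.1, tol=1e-8, max_iter=200):
    """Generic hybrid CG sketch: beta = theta*beta_DY + (1-theta)*beta_DL.
    A simple Armijo backtracking search replaces the strong Wolfe search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart if not a descent direction
            d = -g
        alpha, c1 = 1.0, 1e-4   # backtracking Armijo line search
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g           # gradient difference used by both formulas
        denom = d @ y
        if abs(denom) < 1e-16:
            d = -g_new          # restart on a degenerate denominator
        else:
            beta_dy = (g_new @ g_new) / denom
            beta_dl = g_new @ (y - t * s) / denom
            beta = theta * beta_dy + (1 - theta) * beta_dl
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 0.5 x'Ax - b'x, minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_dy_dl_cg(f, grad, np.array([5.0, -3.0]))
```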



Data availability statement

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.

References

1. Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30(4), 673–681 (2015). https://doi.org/10.1080/10556788.2014.966825

2. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969). https://doi.org/10.1137/1011036

3. Wolfe, P.: Convergence conditions for ascent methods. II: some corrections. SIAM Rev. 13(2), 185–188 (1971). https://doi.org/10.1137/1013035

4. Narayanan, S., Kaelo, P.: A linear hybridization of Dai–Yuan and Hestenes–Stiefel conjugate gradient method for unconstrained optimization. Numer. Math. Theor. Meth. Appl. 14, 527–539 (2021). https://doi.org/10.4208/nmtma.OA-2020-0056

5. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952). https://doi.org/10.6028/jres.049.044

6. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964). https://doi.org/10.1093/comjnl/7.2.149

7. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inf. Rech. Oper. 3(16), 35–43 (1969). https://doi.org/10.1051/m2an/196903R100351

8. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969). https://doi.org/10.1016/0041-5553(69)90035-4

9. Fletcher, R.: Practical Methods of Optimization, vol. 1: Unconstrained Optimization. John Wiley and Sons, New York (1987)

10. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69(1), 129–137 (1991). https://doi.org/10.1007/BF00940464

11. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999). https://doi.org/10.1137/S1052623497318992

12. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001). https://doi.org/10.1007/s002450010019

13. Yao, S., Qin, B.: A hybrid of DL and WYL nonlinear conjugate gradient methods. Abstr. Appl. Anal. 2014, Article ID 279891 (2014). https://doi.org/10.1155/2014/279891

14. Xu, X., Kong, F.Y.: New hybrid conjugate gradient methods with the generalized Wolfe line search. SpringerPlus 5(1), 1–10 (2016). https://doi.org/10.1186/s40064-016-2522-9

15. Djordjević, S.S.: New hybrid conjugate gradient method as a convex combination of FR and PRP methods. Filomat 30(11), 3083–3100 (2016). https://doi.org/10.2298/FIL1611083D

16. Djordjević, S.S.: New hybrid conjugate gradient method as a convex combination of LS and CD methods. Filomat 31(6), 1813–1825 (2017). https://doi.org/10.2298/FIL1706813D

17. Kaelo, P., Narayanan, S., Thuto, M.V.: A modified quadratic hybridization of Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient method for unconstrained optimization problems. Int. J. Optim. Control Theor. Appl. (IJOCTA) 7(2), 177–185 (2017). https://doi.org/10.11121/ijocta.01.2017.00339

18. Hassan, B.A., Owaid, O.A., Yasen, Z.T.: A variant of hybrid conjugate gradient methods based on the convex combination for optimization. Indones. J. Electr. Eng. Comput. Sci. 20(2), 1007–1015 (2020). https://doi.org/10.11591/ijeecs.v20.i2.pp1007-1015

19. Alhawarat, A., Salleh, D., Masmali, I.A.: A convex combination between two different search directions of conjugate gradient method and application in image restoration. Math. Probl. Eng. (2021). https://doi.org/10.1155/2021/9941757

20. Mohamed, N.S., Mamat, M., Rivaie, M., Shaharudin, S.M.: A comparison on classical-hybrid conjugate gradient method under exact line search. Int. J. Adv. Intell. Inform. 5(2), 150–168 (2019). https://doi.org/10.26555/ijain.v5i2.356

21. Akinduko, O.B.: A new conjugate gradient method with sufficient descent property. Earthline J. Math. Sci. 6(1), 163–174 (2021). https://doi.org/10.34198/ejms.6121.163174

22. Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008). https://doi.org/10.1007/s11075-007-9152-9

23. Liu, J.K., Li, S.J.: New hybrid conjugate gradient method for unconstrained optimization. Appl. Math. Comput. 245, 36–43 (2014). https://doi.org/10.1016/j.amc.2014.07.096

24. Jardow, F.N., Al-Naemi, G.M.: A new hybrid conjugate gradient algorithm as a convex combination of MMWU and RMIL nonlinear problems. J. Interdiscip. Math. 24(3), 637–655 (2021). https://doi.org/10.1080/09720502.2020.1815346

25. Djordjević, S.S.: New hybrid conjugate gradient method as a convex combination of LS and FR methods. Acta Math. Sci. 39(1), 214–228 (2019). https://doi.org/10.1007/s10473-019-0117-6

26. Djordjević, S.S.: New hybrid conjugate gradient method as a convex combination of HS and FR conjugate gradient methods. J. Appl. Math. Comput. 2(9), 366–378 (2018). https://doi.org/10.26855/jamc.2018.09.002

27. Abubakar, A.B., Kumam, P., Malik, M., Chaipunya, P., Ibrahim, A.H.: A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection. AIMS Math. 6(6), 6506–6527 (2021). https://doi.org/10.3934/math.2021383

28. Li, M.: A three term Polak–Ribière–Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method. J. Ind. Manag. Optim. 16(1), 245–260 (2020). https://doi.org/10.3934/jimo.2018149

29. Livieris, I.E., Tampakas, V., Pintelas, P.: A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numer. Algorithms 79, 1169–1185 (2018). https://doi.org/10.1007/s11075-018-0479-1

30. Andrei, N.: A hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization as a convex combination of Hestenes–Stiefel and Dai–Yuan algorithms. Stud. Inform. Control 17(4), 373–392 (2008)

31. Zoutendijk, G.: Nonlinear programming, computational methods. In: Integer and Nonlinear Programming, pp. 37–86 (1970)

32. Dai, Y., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103(1), 33–47 (2001). https://doi.org/10.1023/A:1012930416777

33. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)

34. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). https://doi.org/10.1007/s101070100263


Acknowledgements

The authors would like to express their sincere thanks to the editor and the anonymous reviewers for their helpful comments and suggestions to improve this paper.

Author information

Corresponding author

Correspondence to Mourad Ghiat.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Hamel, N., Benrabia, N., Ghiat, M. et al. A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems. J. Appl. Math. Comput. 69, 2531–2548 (2023). https://doi.org/10.1007/s12190-022-01821-z


