Optimization Letters, Volume 9, Issue 5, pp 999–1015

A modified Perry conjugate gradient method and its global convergence

Original Paper

Abstract

In this work, we propose a new conjugate gradient method that modifies Perry's method and ensures sufficient descent independent of the accuracy of the line search. An important property of the proposed method is that it achieves high-order accuracy in approximating the second-order curvature information of the objective function by utilizing a new modified secant condition. Moreover, we establish that the method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that the proposed method is, in terms of both efficiency and robustness, generally superior to classical conjugate gradient methods.
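The abstract's ingredients can be illustrated with a minimal sketch of a nonlinear conjugate gradient iteration using Perry's (1978) choice of the update parameter together with a weak-Wolfe line search. This is only an illustration of the classical Perry scheme that the paper modifies; the paper's actual modified direction, secant condition, and safeguards are not reproduced here, and the bisection line search below is a simple stand-in for a production Wolfe search.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, g0d = f(x), grad(x) @ d          # g0d < 0 for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:
            # Armijo (sufficient decrease) fails: shrink the step
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * g0d:
            # curvature condition fails: grow the step
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def perry_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with Perry's beta: beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g       # step and gradient difference
        beta = g_new @ (y - s) / (d @ y)  # Perry's (1978) parameter
        d = -g_new + beta * d             # unmodified Perry direction
        x, g = x_new, g_new
    return x
```

Note that the unmodified Perry direction above does not guarantee sufficient descent for general functions; ensuring that property independently of the line search accuracy is precisely the point of the paper's modification.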

Keywords

Unconstrained optimization · Conjugate gradient method · Sufficient descent property · Hybrid secant equation · Global convergence

References

  1. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38, 401–416 (2007)
  2. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22, 561–571 (2007)
  3. Andrei, N.: A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 22, 1–17 (2014)
  4. Babaie-Kafaki, S.: A modified BFGS algorithm based on a hybrid secant equation. Sci. China Math. 54(9), 2019–2036 (2011)
  5. Babaie-Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261, 172–182 (2014)
  6. Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
  7. Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234, 1374–1386 (2010)
  8. Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numer. Algorithms 58(3), 315–331 (2011)
  9. Babaie-Kafaki, S., Mahdavi-Amiri, N.: Two hybrid nonlinear conjugate gradient methods based on a modified secant equation. Optimization 63(7), 1–16 (2012)
  10. Babaie-Kafaki, S., Mahdavi-Amiri, N.: Two modified hybrid conjugate gradient methods based on a hybrid secant equation. Math. Model. Anal. 18(1), 32–52 (2013)
  11. Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)
  12. Bongartz, I., Conn, A., Gould, N., Toint, P.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21(1), 123–160 (1995)
  13. Chen, W., Liu, Q.: Sufficient descent nonlinear conjugate gradient methods with conjugacy condition. Numer. Algorithms 53, 113–131 (2010)
  14. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
  15. Dai, Y.H., Yuan, Y.X.: Nonlinear Conjugate Gradient Methods. Shanghai Scientific and Technical Publishers, Shanghai (2000)
  16. Dai, Z., Wen, F.: A modified CG-DESCENT method for unconstrained optimization. J. Comput. Appl. Math. 235(11), 3332–3341 (2011)
  17. Dai, Z.F., Tian, B.S.: Global convergence of some modified PRP nonlinear conjugate gradient methods. Optim. Lett. 5(4), 1–16 (2010)
  18. Dolan, E., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  19. Du, S.Q., Chen, Y.Y.: Global convergence of a modified spectral FR conjugate gradient method. Appl. Math. Comput. 202(2), 766–770 (2008)
  20. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  21. Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40, 191–216 (2008)
  22. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
  23. Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23(4), 2150–2168 (2013)
  24. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
  25. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
  26. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
  27. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)
  28. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202, 523–539 (2007)
  29. Livieris, I.E., Pintelas, P.: Globally convergent modified Perry conjugate gradient method. Appl. Math. Comput. 218(18), 9197–9207 (2012)
  30. Livieris, I.E., Pintelas, P.: A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization. J. Comput. Appl. Math. 239, 396–405 (2013)
  31. Lu, A., Liu, H., Zheng, X., Cong, W.: A variant spectral-type FR conjugate gradient method and its global convergence. Appl. Math. Comput. 217(12), 5547–5552 (2011)
  32. Narushima, Y., Yabe, H.: Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization. J. Comput. Appl. Math. 236(17), 4303–4317 (2012)
  33. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
  34. Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26, 1073–1078 (1978)
  35. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle 16, 35–43 (1969)
  36. Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12, 241–254 (1977)
  37. Shanno, D.F., Phua, K.H.: Minimization of unconstrained multivariate functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
  38. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
  39. Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28, 203–225 (2004)
  40. Yu, G., Guan, L., Chen, W.: Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization. Optim. Methods Softw. 23(2), 275–293 (2008)
  41. Yu, G.H.: Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. PhD thesis, Sun Yat-Sen University (2007)
  42. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)
  43. Zhang, J.Z., Xu, C.X.: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137, 269–278 (2001)
  44. Zhang, L.: Two modified Dai–Yuan nonlinear conjugate gradient methods. Numer. Algorithms 50, 1–16 (2009)
  45. Zhang, L., Zhou, W.: Two descent hybrid conjugate gradient methods for optimization. J. Comput. Appl. Math. 216, 251–264 (2008)
  46. Zhang, L., Zhou, W., Li, D.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)
  47. Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104, 561–572 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. Department of Mathematics, University of Patras, Patras, Greece