
Calcolo, 56:2

A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations

  • Farzad Rahpeymaii
  • Keyvan Amini (corresponding author)
  • Tofigh Allahviranloo
  • Mohsen Rostamy Malkhalifeh
Article

Abstract

In this paper, we introduce a new three-term conjugate gradient (NTTCG) method to solve unconstrained smooth optimization problems. NTTCG builds on the conjugate gradient methods proposed by Dai and Yuan (SIAM J Optim 10:177–182, 1999) and by Polak and Ribière (Rev Francaise Inform Rech Oper 3(16):35–43, 1969). The descent property of the direction generated by NTTCG at each iteration is established, and global convergence of the new method is proved under standard assumptions. An extension of this algorithm, called the new three-term conjugate subgradient (NTTCS) method, is proposed to solve absolute value equations (AVE). Numerical experiments on unconstrained CUTEst problems and on AVE are reported.
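
For orientation, the sketch below shows a classical Polak–Ribière(+) nonlinear conjugate gradient loop with a simple backtracking Armijo line search. It is not the NTTCG/NTTCS method of the paper, which uses a three-term search direction and Wolfe-type line-search conditions; the function names and the Rosenbrock test problem are illustrative only. (The AVE mentioned in the abstract is the system Ax − |x| = b.)

    import numpy as np

    def pr_plus_cg(f, grad, x0, tol=1e-6, max_iter=5000):
        """Polak-Ribiere(+) nonlinear CG with a backtracking Armijo line search."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                   # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # backtracking Armijo line search (the paper instead uses Wolfe conditions)
            alpha, c1, rho, fx = 1.0, 1e-4, 0.5, f(x)
            while f(x + alpha * d) > fx + c1 * alpha * g.dot(d) and alpha > 1e-16:
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            # PR+ coefficient: truncating at zero acts as an automatic restart
            beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
            d = -g_new + beta * d
            if g_new.dot(d) >= 0.0:              # safeguard: keep d a descent direction
                d = -g_new
            x, g = x_new, g_new
        return x

    if __name__ == "__main__":
        # illustrative test problem: the two-dimensional Rosenbrock function
        f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
        grad = lambda x: np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])
        print(pr_plus_cg(f, grad, [-1.2, 1.0]))  # iterates approach (1, 1)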

Keywords

Conjugate gradient method · Smooth optimization · Conjugate subgradient method · Absolute value equations · Wolfe conditions

Mathematics Subject Classification

90C30 · 93E24 · 34A34

References

  1. Al-Bayati, A.Y., Sharif, W.H.: A new three-term conjugate gradient method for unconstrained optimization. Can. J. Sci. Eng. Math. 1(5), 108–124 (2010)
  2. Andrei, N.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization. Optimization 60, 1457–1471 (2011)
  3. Andrei, N.: A simple three-term conjugate gradient algorithm for unconstrained optimization. J. Comput. Appl. Math. 241, 19–29 (2013)
  4. Andrei, N.: On three-term conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 219, 6316–6327 (2013)
  5. Aslam Noor, M., Iqbal, J., Inayat Noor, Kh., Al-Said, E.: On an iterative method for solving absolute value equations. Optim. Lett. 6, 1027–1033 (2012)
  6. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)
  7. Bioucas-Dias, J., Figueiredo, M.: A new TwIST: two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Trans. Image Process. 16, 2992–3004 (2007)
  8. Caccetta, L., Qu, B., Zhou, G.: A globally and quadratically convergent method for absolute value equations. Comput. Optim. Appl. 48(1), 45–58 (2011)
  9. Conn, A.R., Gould, N.I.M., Toint, P.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50, 177–195 (1991)
  10. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
  11. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, H.X., Yuan, Y.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10, 345–358 (1999)
  12. Deng, S., Wan, Z.: A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems. Appl. Numer. Math. 92, 70–81 (2015)
  13. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  14. Dong, X.L., Li, H.W., He, Y.B.: New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction. Appl. Math. Comput. 269, 606–617 (2015)
  15. Esmaeili, H., Kimiaei, M.: An improved adaptive trust-region method for unconstrained optimization. Math. Model. Anal. 19(4), 469–490 (2014)
  16. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  17. Griewank, A.: The global convergence of Broyden-like methods with a suitable line search. J. Austral. Math. Soc. Ser. B 28, 75–92 (1986)
  18. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60(3), 545–557 (2015)
  19.
  20. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
  21. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
  22. Hale, E.T., Yin, W., Zhang, Y.: Fixed-point continuation applied to compressed sensing: implementation and numerical experiment. J. Comput. Math. 28(2), 170–194 (2010)
  23. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
  24. Iqbal, J., Iqbal, A., Arif, M.: Levenberg–Marquardt method for solving systems of absolute value equations. J. Comput. Appl. Math. 282, 134–138 (2015)
  25. Kimiaei, M., Ghaderi, S.: A new restarting adaptive trust-region method for unconstrained optimization. J. Oper. Res. Soc. China 5(4), 487–507 (2017)
  26. Lu, Z., Chen, X.: Generalized conjugate gradient methods for \(\ell_1\) regularized convex quadratic programming with finite convergence. Math. Oper. Res. 43(1), 275–303 (2017)
  27. Mangasarian, O.L.: A generalized Newton method for absolute value equations. Optim. Lett. 3, 101–108 (2009)
  28. Mangasarian, O.L., Meyer, R.R.: Absolute value equations. Linear Algebra Appl. 419, 359–367 (2006)
  29. Moosaei, H., Ketabchi, S., Jafari, H.: Minimum norm solution of the absolute value equations via simulated annealing algorithm. Afr. Mat. 26(7–8), 1221–1228 (2015)
  30. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
  31. Moosaei, H., Ketabchi, S., Noor, A., Iqbal, J., Hooshyarbakhsh, V.: Some techniques for solving absolute value equations. Appl. Math. Comput. 268, 696–705 (2015)
  32. Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease. ACM Trans. Math. Softw. 20(3), 286–307 (1994)
  33. Ortega, J.M., Rheinboldt, W.C.: Iterative Solution of Nonlinear Equations in Several Variables. Academic Press, New York (1970)
  34. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Francaise Inform. Rech. Oper. 3(16), 35–43 (1969)
  35. Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12, 241–254 (1977)
  36. Rockafellar, R.T.: New applications of duality in convex programming. In: Proceedings of the Fourth Conference on Probability, Brasov, Romania (1971)
  37. Rohn, J.: A theorem of the alternatives for the equation \(Ax+B|x|=b\). Linear Multilinear Algebra 52, 421–426 (2004)
  38. Sorber, L., Barel, M.V., Lathauwer, L.D.: Unconstrained optimization of real functions in complex variables. SIAM J. Optim. 22(3), 879–898 (2012)
  39. Wright, S.J., Nowak, R.D., Figueiredo, M.A.T.: Sparse reconstruction by separable approximation. IEEE Trans. Signal Process. 57(7), 2479–2493 (2009)
  40. Yuan, G., Wei, Z., Li, G.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255, 86–96 (2014)
  41. Yuan, G., Sheng, Z., Liu, W.: The modified HZ conjugate gradient algorithm for large-scale nonsmooth optimization. https://doi.org/10.1371/journal.pone.0164289
  42. Zhang, L., Zhou, W., Li, D.H.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–640 (2006)
  43. Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22, 697–711 (2007)
  44. Zhang, J., Xiao, Y., Wei, Z.: Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization. Math. Probl. Eng. (2009). https://doi.org/10.1155/2009/243290
  45. Zhang, C., Wei, Q.: Global and finite convergence of a generalized Newton method for absolute value equations. J. Optim. Theory Appl. 143(2), 391–403 (2009)
  46. Zhou, Q., Zhou, F., Cao, F.: A nonmonotone trust region method based on simple conic models for unconstrained optimization. Appl. Math. Comput. 225, 295–305 (2013)
  47. Ujevic, N.: A new iterative method for solving linear systems. Appl. Math. Comput. 179, 725–730 (2006)

Copyright information

© Istituto di Informatica e Telematica (IIT) 2018

Authors and Affiliations

  • Farzad Rahpeymaii (1)
  • Keyvan Amini (2, corresponding author)
  • Tofigh Allahviranloo (1)
  • Mohsen Rostamy Malkhalifeh (1)

  1. Department of Mathematics, Science and Research Branch, Islamic Azad University, Tehran, Iran
  2. Department of Mathematics, Faculty of Sciences, Razi University, Kermanshah, Iran
