
A Modified Dai-Yuan Conjugate Gradient Algorithm for Large-Scale Optimization Problems

  • Conference paper
  • First Online:
Cloud Computing and Security (ICCCS 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11063)

Included in the following conference series: International Conference on Cloud Computing and Security (ICCCS)

  • 1667 Accesses

Abstract

It is well known that the DY (Dai-Yuan) conjugate gradient method is one of the most efficient optimization algorithms, since it makes full use of the current information of the search direction and the gradient of the objective function. Unfortunately, the DY conjugate gradient algorithm handles large-scale optimization models poorly, and few researchers have paid much attention to modifying it. Therefore, to solve large-scale unconstrained optimization problems, a modified DY conjugate gradient algorithm under the Yuan-Wei-Lu line search is proposed. The proposed algorithm possesses not only a descent property but also a trust-region property. Moreover, the algorithm satisfies the requirements for global convergence, and the corresponding numerical tests show that it outperforms comparable optimization algorithms.
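
For readers unfamiliar with the method, the classical DY conjugate gradient iteration updates x_{k+1} = x_k + alpha_k d_k with d_{k+1} = -g_{k+1} + beta_k d_k, where beta_k^{DY} = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). The Python sketch below illustrates only this classical scheme; it is not the paper's modified algorithm, and it substitutes a simple backtracking Armijo line search for the Yuan-Wei-Lu line search, whose details are not included in this preview. The function and parameter names are illustrative.

import numpy as np


def dy_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=10000):
    # Classical Dai-Yuan (DY) conjugate gradient method.
    # NOTE: illustrative only -- a backtracking Armijo line search is used
    # here as a stand-in; the paper employs the Yuan-Wei-Lu line search,
    # and its modified DY coefficient is not reproduced in this preview.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking (Armijo) line search along d
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx, slope = f(x), np.dot(g, d)
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # DY coefficient: beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))
        denom = np.dot(d, g_new - g)
        beta = np.dot(g_new, g_new) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Example: the two-dimensional Rosenbrock function
    f = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(dy_conjugate_gradient(f, grad, [-1.2, 1.0]))

In standard DY theory, the descent and global-convergence guarantees rely on a Wolfe-type line search; the paper's modification under the Yuan-Wei-Lu line search is stated to additionally yield both a descent property and a trust-region property.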



Acknowledgements

We would like to thank the reviewers and editors for their meaningful suggestions. This work is supported by the National Natural Science Foundation of China (Grant No. 11661009), the Guangxi Science Fund for Distinguished Young Scholars (No. 2015GXNSFGA139001), and the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046).

Author information


Corresponding author

Correspondence to Tingting Li.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Yuan, G., Li, T. (2018). A Modified Dai-Yuan Conjugate Gradient Algorithm for Large-Scale Optimization Problems. In: Sun, X., Pan, Z., Bertino, E. (eds.) Cloud Computing and Security. ICCCS 2018. Lecture Notes in Computer Science, vol. 11063. Springer, Cham. https://doi.org/10.1007/978-3-030-00006-6_12


  • DOI: https://doi.org/10.1007/978-3-030-00006-6_12

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-00005-9

  • Online ISBN: 978-3-030-00006-6

  • eBook Packages: Computer Science, Computer Science (R0)
