
Computational Approaches in Large-Scale Unconstrained Optimization

Chapter in: Big Data Optimization: Recent Developments and Challenges

Part of the book series: Studies in Big Data (SBD, volume 18)

Abstract

As a topic of great significance in nonlinear analysis and mathematical programming, unconstrained optimization is widely and increasingly used in engineering, economics, management, industry, and other areas. Unconstrained optimization also arises in the reformulation of constrained optimization problems, in which the constraints are replaced by penalty terms in the objective function. In many big data applications, solving an unconstrained optimization problem with thousands or millions of variables is indispensable, and in such settings methods with low memory requirements are essential tools. Here, we study two families of methods for solving large-scale unconstrained optimization problems: conjugate gradient methods and limited-memory quasi-Newton methods, both of which are built on line searches. We discuss the convergence properties and numerical behavior of these methods and review their recent advances, thereby supplying helpful computational tools for engineers and mathematicians engaged in solving large-scale unconstrained optimization problems.
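
To make the two families concrete, the following minimal Python sketch illustrates one representative member of each: a nonlinear conjugate gradient iteration with the nonnegative Polak-Ribière-Polyak (PRP+) parameter, and the two-loop recursion that underlies limited-memory BFGS. The sketch is illustrative rather than taken from the chapter; the function names, the Armijo backtracking constants, and the PRP+ choice of the conjugate gradient parameter are assumptions, and the convergence theory of such methods is usually stated under Wolfe-type line-search conditions rather than plain backtracking.

import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient sketch with the PRP+ parameter
    beta = max(beta_PRP, 0) and a simple Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking: shrink t until sufficient decrease holds
        t, c, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + t * d) > fx + c * t * gTd:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PRP+ parameter
        d = -g_new + beta * d
        if g_new @ d >= 0:                   # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: returns d = -H g, where H is the limited-memory
    BFGS inverse-Hessian approximation built from the m most recent pairs
    s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k. The pairs are assumed to
    satisfy the curvature condition s^T y > 0 (guaranteed by a Wolfe search)."""
    q = np.array(g, dtype=float)
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                               # initial scaling H0 = gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q

# Example: minimize f(x) = ||x||^2 from the all-ones start
# x_star = prp_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5))

The two-loop recursion stores only the m most recent pairs (s_k, y_k), so each direction computation costs O(mn) time and memory for an n-variable problem (m is typically between 3 and 20) and never forms an n-by-n matrix; this low storage cost is precisely the feature that makes both families attractive for large-scale problems.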



Acknowledgments

This research was supported by the Research Council of Semnan University. The author is grateful to Professor Ali Emrouznejad for his valuable suggestions, which helped to improve the presentation.

Author information


Correspondence to Saman Babaie-Kafaki.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Babaie-Kafaki, S. (2016). Computational Approaches in Large-Scale Unconstrained Optimization. In: Emrouznejad, A. (ed.) Big Data Optimization: Recent Developments and Challenges. Studies in Big Data, vol 18. Springer, Cham. https://doi.org/10.1007/978-3-319-30265-2_17


  • DOI: https://doi.org/10.1007/978-3-319-30265-2_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30263-8

  • Online ISBN: 978-3-319-30265-2

  • eBook Packages: Engineering (R0)
