Part of the book series: Springer Optimization and Its Applications (SOIA, volume 158)

Abstract

As seen in the preceding chapters, the conjugate gradient algorithms presented so far rest on a few principles: hybridization or modification of the standard schemes, memoryless or scaled memoryless BFGS preconditioning, or the three-term concept. The corresponding conjugate gradient algorithms are defined by the descent condition, by the “pure” or the Dai–Liao conjugacy conditions, or by the minimization of a one- or two-parameter quadratic approximation of the objective function. A number of convergence results are available, mainly based on the Zoutendijk and Nocedal conditions under the Wolfe line search (Dai, 2011). These algorithms perform well numerically and are able to solve large-scale unconstrained optimization problems and applications. However, conjugate gradient methods remain a very active area of research, and further computational schemes have been introduced to improve numerical performance. They are too numerous to cover exhaustively in this study; a short description of some of them follows.
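As a concrete illustration of the framework these schemes share, the following is a minimal sketch of a nonlinear conjugate gradient iteration. It is not one of the book's algorithms: the Polak–Ribière+ conjugacy parameter and the backtracking (Armijo) line search used here are illustrative stand-ins for the more elaborate update formulas and the Wolfe line search discussed in the text.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimal nonlinear CG sketch: Polak-Ribiere+ update with a
    backtracking (Armijo) line search standing in for the Wolfe search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # start with the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: shrink alpha until the Armijo
        # sufficient-decrease condition holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ conjugacy parameter, clipped at zero so that
        # the method effectively restarts when beta would be negative.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # safeguard: enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: minimize 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = conjugate_gradient(f, grad, np.zeros(2))
```

The descent safeguard and the clipping of beta are simple devices; the chapter's methods obtain descent and conjugacy through the conditions named above rather than such ad hoc restarts.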

Author information

Correspondence to Neculai Andrei.

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Andrei, N. (2020). Other Conjugate Gradient Methods. In: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer Optimization and Its Applications, vol 158. Springer, Cham. https://doi.org/10.1007/978-3-030-42950-8_11
