
Extensions of CGS algorithms: Generalized least-square solutions

  • Contributed Papers
  • Published in: Journal of Optimization Theory and Applications

Abstract

The CGS (conjugate Gram-Schmidt) algorithms of Hestenes and Stiefel are formulated so as to obtain least-square solutions of a system of equations g(x) = 0 in n independent variables. Both the linear case g(x) = Ax - h and the nonlinear case are discussed. In the linear case, a least-square solution is obtained in no more than n steps, and a method of obtaining the least-square solution of minimum length is given. In the nonlinear case, the CGS algorithm is combined with the Gauss-Newton process to minimize sums of squares of nonlinear functions. Results of numerical experiments with several versions of CGS on test functions indicate that the algorithms are effective.
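The linear case sketched in the abstract — conjugate directions applied to the least-square problem, with termination in at most n steps — can be illustrated with a standard CGLS iteration (conjugate gradients on the normal equations). This is a minimal sketch of that conjugate-gradient family, not necessarily the paper's exact CGS formulation; the function name `cgls` and its interface are illustrative.

```python
import numpy as np

def cgls(A, b, tol=1e-10, max_iter=None):
    """Minimize ||A x - b||_2 by conjugate gradients on the normal
    equations (a sketch in the Hestenes-Stiefel spirit).

    In exact arithmetic this terminates in at most n steps, where
    n is the number of columns of A."""
    m, n = A.shape
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b - A @ x          # residual in the data space
    s = A.T @ r            # negative gradient of the least-squares objective
    p = s.copy()           # first conjugate direction
    gamma = s @ s
    for _ in range(max_iter):
        q = A @ p
        alpha = gamma / (q @ q)      # exact line-search step length
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        if np.sqrt(gamma_new) < tol:  # normal-equations residual is small
            break
        p = s + (gamma_new / gamma) * p  # update conjugate direction
        gamma = gamma_new
    return x
```

Starting from x = 0 keeps every iterate in the row space of A, so for a rank-deficient A this kind of iteration converges to the least-square solution of minimum length — the same property the abstract highlights for the linear case.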


References

  1. Hestenes, M. R., and Stiefel, E., Methods of Conjugate Gradients for Solving Linear Systems, Journal of Research of the National Bureau of Standards, Section B, Vol. 49, No. 6, 1952.

  2. Dennemeyer, R. F., and Mookini, E. H., CGS Algorithms for Unconstrained Minimization of Functions, Journal of Optimization Theory and Applications, Vol. 16, Nos. 1–2, 1975.

  3. Hestenes, M. R., Pseudoinverses and Conjugate Gradients, Communications of the ACM, Vol. 18, No. 1, 1975.

  4. Ortega, J., and Rheinboldt, W., Iterative Solution of Nonlinear Equations in Several Variables, Academic Press, New York, New York, 1970.

  5. Fletcher, R., and Powell, M., A Rapidly Convergent Descent Method for Minimization, Computer Journal, Vol. 6, No. 2, 1964.

  6. Beale, E., On an Iterative Method of Finding a Local Minimum of a Function of More Than One Variable, Princeton University, Statistical Techniques Research Group, Technical Report No. 25, 1958.

  7. Cragg, E., and Levy, A., Study on a Supermemory Gradient Method for the Minimization of Functions, Journal of Optimization Theory and Applications, Vol. 4, pp. 191–205, 1969.

  8. Kowalik, J., and Osborne, M., Methods for Unconstrained Optimization Problems, American Elsevier Publishing Company, New York, New York, 1968.

  9. Box, M., A Comparison of Several Current Optimization Methods and the Use of Transformations in Constrained Problems, Computer Journal, Vol. 9, 1966.

  10. Bard, Y., Comparison of Gradient Methods for the Solution of Nonlinear Parametric Estimation Problems, SIAM Journal on Numerical Analysis, Vol. 7, pp. 159–186, 1970.


Additional information

Communicated by M. R. Hestenes

The author wishes to express appreciation and to acknowledge the ideas and help of Professor M. R. Hestenes which made this paper possible.

Cite this article

Dennemeyer, R.F. Extensions of CGS algorithms: Generalized least-square solutions. J Optim Theory Appl 24, 387–419 (1978). https://doi.org/10.1007/BF00932885
