Abstract
A conjugate-gradient optimization method which is invariant to nonlinear scaling of a quadratic form is introduced. The technique has the property that the search directions generated are identical to those produced by the classical Fletcher-Reeves algorithm applied to the quadratic form. The approach enables certain nonquadratic functions to be minimized in a finite number of steps. Several examples which illustrate the efficacy of the method are included.
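The abstract states that the new method's search directions coincide with those generated by the classical Fletcher-Reeves algorithm applied to the underlying quadratic form. As context, here is a minimal sketch of that Fletcher-Reeves baseline on a quadratic with exact line search — not the paper's scaling-invariant variant, whose formulas the abstract does not give. The quadratic form, test matrix, and tolerance below are illustrative assumptions.

```python
import numpy as np

def fletcher_reeves(A, b, x0, max_iter=None):
    """Classical Fletcher-Reeves conjugate gradients with exact line
    search on the quadratic form f(x) = 0.5 x^T A x - b^T x, where A is
    symmetric positive definite.  In exact arithmetic the iteration
    terminates at the minimizer in at most n steps."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                         # gradient of the quadratic form
    d = -g                                # initial direction: steepest descent
    for _ in range(max_iter or len(b)):
        if np.linalg.norm(g) < 1e-12:
            break
        alpha = (g @ g) / (d @ A @ d)     # exact minimizing step along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative 2-variable quadratic: the minimizer solves A x = b,
# and the method reaches it in two steps.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = fletcher_reeves(A, b, np.zeros(2))
```

For a general nonquadratic objective the step length `alpha` would come from a line search rather than the closed form above; the paper's contribution is that finite termination can still be obtained for certain nonlinear scalings of such a quadratic.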
Communicated by H. Y. Huang
Boland, W.R., Kamgnia, E.R. & Kowalik, J.S. A conjugate-gradient optimization method invariant to nonlinear scaling. J Optim Theory Appl 27, 221–230 (1979). https://doi.org/10.1007/BF00933228