Abstract
This paper shows that the generalized Newton algorithm GN(r), developed by Kalaba and Tishler (Ref. 1), can be described as a fixed-point algorithm. In addition to specifying sufficient conditions for the convergence of GN(r), we show that, for r = 1, 2, 3, its rate of convergence increases with the order of the derivatives used.
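The GN(r) update itself is defined in Ref. 1 and is not reproduced in this abstract. As a minimal illustration of the abstract's claim that using higher-order derivatives raises the order of convergence, the sketch below compares two classical scalar fixed-point iterations: Newton's method (first derivative, quadratic convergence) and Halley's method (second derivative, cubic convergence). This is a hypothetical analogy to the GN(r) family, not the authors' multivariate algorithm.

```python
# Hedged sketch: classical one-dimensional analogues of the idea that
# higher-order derivatives buy a higher order of convergence.
# This is NOT the GN(r) algorithm of Ref. 1, only an illustration.

def newton(x, g, dg, tol=1e-12, max_iter=50):
    """Order-2 iteration x <- x - g/g' (uses the first derivative only)."""
    for n in range(max_iter):
        step = g(x) / dg(x)
        x -= step
        if abs(step) < tol:
            return x, n + 1
    return x, max_iter

def halley(x, g, dg, d2g, tol=1e-12, max_iter=50):
    """Order-3 iteration that also uses the second derivative."""
    for n in range(max_iter):
        gx, dgx, d2gx = g(x), dg(x), d2g(x)
        step = 2.0 * gx * dgx / (2.0 * dgx**2 - gx * d2gx)
        x -= step
        if abs(step) < tol:
            return x, n + 1
    return x, max_iter

# Solve g(x) = x**3 - 2 = 0 (the cube root of 2) from x0 = 2.
g   = lambda x: x**3 - 2
dg  = lambda x: 3 * x**2
d2g = lambda x: 6 * x

root_n, iters_n = newton(2.0, g, dg)
root_h, iters_h = halley(2.0, g, dg, d2g)
# For the same tolerance, the higher-order method converges in fewer
# iterations, mirroring the abstract's claim for GN(r), r = 1, 2, 3.
```

Both iterations are fixed-point schemes x ← F(x) whose fixed points are the roots of g, which is also the viewpoint this paper takes for GN(r).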
References
Kalaba, R., and Tishler, A., A Generalized Newton Algorithm Using Higher-Order Derivatives, Journal of Optimization Theory and Applications, Vol. 39, pp. 1–17, 1983.
Isaacson, E., and Keller, H., Analysis of Numerical Methods, John Wiley and Sons, New York, New York, 1966.
Johnson, L., and Riess, R., Numerical Analysis, Addison-Wesley, Reading, Massachusetts, 1977.
Kalaba, R., and Tishler, A., A Computer Program to Minimize a Function with Many Variables Using Computer-Evaluated Exact Higher-Order Derivatives, Applied Mathematics and Computation, Vol. 13, pp. 143–172, 1983.
The authors are indebted to I. Zang for drawing their attention to an error in an earlier draft of this paper. Suggestions and comments by N. Levin and D. Trietsch are also gratefully acknowledged.
Cite this article
Kalaba, R., Tishler, A., and Wang, J. S., Rate of Convergence of the Generalized Newton Algorithm Using the Fixed-Point Approach, Journal of Optimization Theory and Applications, Vol. 43, pp. 543–555, 1984. https://doi.org/10.1007/BF00935005