
Numerische Mathematik, Volume 18, Issue 4, pp 289–297

Derivative free analogues of the Levenberg-Marquardt and Gauss algorithms for nonlinear least squares approximation

  • Kenneth M. Brown
  • J. E. Dennis, Jr.

Abstract

In this paper we give two derivative-free computational algorithms for nonlinear least squares approximation. The algorithms are finite difference analogues of the Levenberg-Marquardt and Gauss methods. Local convergence theorems for the algorithms are proven. In the special case when the residuals are zero at the minimum, we show that certain computationally simple choices of the parameters lead to quadratic convergence. Numerical examples are included.
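The abstract describes finite-difference analogues of the Levenberg-Marquardt and Gauss methods, in which the Jacobian of the residual vector is approximated from residual evaluations alone. The following is a minimal illustrative sketch of that idea, not the authors' algorithm: the forward-difference step, the damping update rule, and the test problem are all assumptions chosen for the example.

```python
import numpy as np

def fd_jacobian(r, x, h=1e-7):
    """Forward-difference approximation to the Jacobian of the residual
    vector r at x; derivative-free, using only residual evaluations."""
    r0 = r(x)
    J = np.empty((r0.size, x.size))
    for j in range(x.size):
        xh = x.copy()
        xh[j] += h
        J[:, j] = (r(xh) - r0) / h
    return J

def lm_least_squares(r, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Derivative-free Levenberg-Marquardt sketch: solve
    (J^T J + lam*I) s = -J^T r with J replaced by a finite-difference
    approximation, adapting the damping parameter lam heuristically."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        res = r(x)
        J = fd_jacobian(r, x)
        A = J.T @ J + lam * np.eye(x.size)
        s = np.linalg.solve(A, -J.T @ res)
        x_new = x + s
        if np.sum(r(x_new) ** 2) < np.sum(res ** 2):
            x, lam = x_new, lam * 0.5   # accept step, relax damping
        else:
            lam *= 10.0                 # reject step, increase damping
        if np.linalg.norm(s) < tol:
            break
    return x

# Illustrative zero-residual problem (the case where the paper shows
# quadratic convergence): fit y = exp(a*t) + b to exact data.
t = np.linspace(0.0, 1.0, 5)
y = np.exp(1.5 * t) + 0.5
residual = lambda p: np.exp(p[0] * t) + p[1] - y
p = lm_least_squares(residual, [1.0, 0.0])
```

Setting `lam = 0` throughout would give the corresponding Gauss (Gauss-Newton) analogue; the damping term is what distinguishes the Levenberg-Marquardt variant.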

Keywords

Finite Difference, Mathematical Method, Convergence Theorem, Computational Algorithm, Local Convergence



Copyright information

© Springer-Verlag 1971

Authors and Affiliations

  • Kenneth M. Brown
    1. IBM Philadelphia Scientific Center, Philadelphia, USA
    2. Institute of Technology, University of Minnesota, Minneapolis, USA
  • J. E. Dennis, Jr.
    3. Cornell University, Ithaca, USA
