BIT Numerical Mathematics, Volume 17, Issue 1, pp 72–90

A comparison of some algorithms for the nonlinear least squares problem

  • Håkan Ramsin
  • Per-Åke Wedin
Abstract

The problem of minimizing a sum of squares of nonlinear functions is studied. To solve this problem one usually exploits the fact that the objective function has this special form, which leads to the Gauss-Newton method or modifications thereof. To study how these specialized methods compare with general-purpose nonlinear optimization routines, test problems were generated in which the parameters determining the local behaviour of the algorithms could be controlled. On the order of 1000 test problems were generated for testing three algorithms: the Gauss-Newton method, the Levenberg-Marquardt method, and a quasi-Newton method.
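As a concrete illustration of the basic Gauss-Newton iteration named in the abstract, here is a minimal NumPy sketch. The exponential test model and the `gauss_newton` helper are illustrative assumptions, not the authors' test-problem generator; each step solves the linear least squares problem J(x)p = −r(x) and updates x ← x + p.

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=50):
    """Minimize 0.5*||r(x)||^2 by the Gauss-Newton iteration.

    r: residual vector function, J: its Jacobian matrix,
    x0: starting point. Each step solves the linear least
    squares problem J(x) p = -r(x) and sets x <- x + p.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Gauss-Newton step: linear least squares for the correction p.
        p, *_ = np.linalg.lstsq(J(x), -r(x), rcond=None)
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x

# Hypothetical test problem: fit y = a*exp(b*t) to exact data
# generated with a = 2, b = -1 (a zero-residual problem, where
# Gauss-Newton converges rapidly).
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(-1.0 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack((np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)))
x = gauss_newton(r, J, np.array([1.0, 0.0]))
```

For problems with large residuals at the solution, Gauss-Newton can converge slowly or fail, which motivates the Levenberg-Marquardt damping and quasi-Newton alternatives compared in the paper.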

Copyright information

© BIT Foundations 1977

Authors and Affiliations

  • Håkan Ramsin (1, 2)
  • Per-Åke Wedin (1, 2)
  1. CERN, Genève 23, Switzerland
  2. Dept. of Information Processing, Umeå University, Umeå, Sweden
