
Convergence of Newton's method for convex best interpolation

  • Original article
  • Journal: Numerische Mathematik

Summary. In this paper, we consider the problem of finding a convex function that interpolates given points and has a minimal \(L^2\) norm of the second derivative. This problem reduces to a system of equations involving semismooth functions. We study a Newton-type method utilizing Clarke's generalized Jacobian and prove that its local convergence is superlinear. For a special choice of a matrix in the generalized Jacobian, we obtain the Newton method proposed by Irvine et al. [17] and settle the question of its convergence. By using a line search strategy, we present a global extension of the Newton method considered. The efficiency of the proposed global strategy is confirmed by numerical experiments.
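The general technique the abstract describes — a damped Newton iteration that solves a semismooth system \(F(x)=0\) using one element of Clarke's generalized Jacobian per step, globalized by an Armijo line search on the merit function \(\tfrac12\|F(x)\|^2\) — can be sketched as follows. Note this is an illustrative sketch only: the piecewise-linear test system `F(x) = x + max(x, 0) - b` and the particular Jacobian element chosen at the kink are stand-in assumptions, not the paper's interpolation equations.

```python
import numpy as np

def F(x, b):
    # Stand-in semismooth residual (assumption, not the paper's system):
    # componentwise F_i(x) = x_i + max(x_i, 0) - b_i
    return x + np.maximum(x, 0.0) - b

def J_elem(x):
    # One element of Clarke's generalized Jacobian of F:
    # derivative of max(., 0) is 1 for x_i > 0, 0 for x_i < 0;
    # at the kink x_i = 0 we pick the value 1 (any choice in [0, 1] is valid).
    return np.eye(len(x)) + np.diag((x >= 0.0).astype(float))

def semismooth_newton(b, x0, tol=1e-10, max_iter=50):
    """Damped semismooth Newton: at each step solve V d = -F(x) for one
    generalized-Jacobian element V, then backtrack (Armijo) on the merit
    function 0.5 * ||F(x)||^2 until sufficient decrease holds."""
    x = x0.copy()
    for _ in range(max_iter):
        r = F(x, b)
        if np.linalg.norm(r) < tol:
            break
        d = np.linalg.solve(J_elem(x), -r)   # Newton direction
        t, merit = 1.0, 0.5 * r @ r
        while 0.5 * np.linalg.norm(F(x + t * d, b))**2 > (1 - 1e-4 * t) * merit:
            t *= 0.5                          # backtracking line search
        x = x + t * d
    return x

b = np.array([2.0, -3.0, 0.5])
x = semismooth_newton(b, np.zeros(3))
# Exact solution here: x_i = b_i / 2 where b_i > 0, x_i = b_i where b_i < 0
```

Near a solution where the generalized Jacobian elements are nonsingular, the full step `t = 1` is eventually accepted and the iteration converges superlinearly, which is the local behavior the paper establishes for its interpolation system.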


Additional information

Received October 26, 1998 / Revised version received October 20, 1999 / Published online August 2, 2000


Cite this article

Dontchev, A., Qi, H. & Qi, L. Convergence of Newton's method for convex best interpolation. Numer. Math. 87, 435–456 (2001). https://doi.org/10.1007/PL00005419
