Summary. In this paper, we consider the problem of finding a convex function which interpolates given points and has a minimal \(L^2\) norm of the second derivative. This problem reduces to a system of equations involving semismooth functions. We study a Newton-type method utilizing Clarke's generalized Jacobian and prove that its local convergence is superlinear. For a special choice of a matrix in the generalized Jacobian, we obtain the Newton method proposed by Irvine et al. [17] and settle the question of its convergence. By using a line search strategy, we present a global extension of the Newton method considered. The efficiency of the proposed global strategy is confirmed with numerical experiments.
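To make the flavor of the method concrete: below is a minimal, illustrative sketch of a damped semismooth Newton iteration of the kind the abstract describes, applied not to the paper's interpolation system but to a simple model semismooth equation \(F(x) = Ax + \max(x,0) - b = 0\) (componentwise max). The function `semismooth_newton`, the model problem, and all parameter choices are this sketch's own assumptions, not the authors' algorithm; the selection of a diagonal matrix \(D\) below stands in for choosing an element of Clarke's generalized Jacobian, and the Armijo back-tracking loop stands in for the line search globalization.

```python
import numpy as np

def semismooth_newton(A, b, x0, tol=1e-10, sigma=1e-4, max_iter=50):
    """Illustrative damped semismooth Newton method (not the paper's algorithm).

    Solves F(x) = A x + max(x, 0) - b = 0, with max taken componentwise.
    A generalized-Jacobian element is J = A + D, where D is diagonal with
    D_ii = 1 where x_i > 0 and D_ii = 0 where x_i < 0; at a kink (x_i = 0)
    we arbitrarily pick D_ii = 1. An Armijo back-tracking line search on the
    merit function theta(x) = 0.5 * ||F(x)||^2 globalizes the iteration.
    """
    x = x0.astype(float)
    F = lambda y: A @ y + np.maximum(y, 0.0) - b
    for _ in range(max_iter):
        Fx = F(x)
        theta = 0.5 * Fx @ Fx
        if np.sqrt(2.0 * theta) < tol:
            break
        # One element of Clarke's generalized Jacobian of F at x.
        D = np.diag((x >= 0.0).astype(float))
        d = np.linalg.solve(A + D, -Fx)          # Newton direction
        # For this merit function, grad(theta).d = -2*theta, so the
        # Armijo condition reads theta(x + t d) <= (1 - 2*sigma*t)*theta(x).
        t = 1.0
        Ft = F(x + t * d)
        while 0.5 * Ft @ Ft > (1.0 - 2.0 * sigma * t) * theta:
            t *= 0.5                             # back-track
            Ft = F(x + t * d)
        x = x + t * d
    return x
```

On a well-conditioned instance such as \(A = 2I\), the iteration resolves the active/inactive pattern of the max term in a few steps, which mirrors the fast local behavior the paper proves for its Newton method.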
Received October 26, 1998 / Revised version received October 20, 1999 / Published online August 2, 2000
Dontchev, A., Qi, H. & Qi, L. Convergence of Newton's method for convex best interpolation. Numer. Math. 87, 435–456 (2001). https://doi.org/10.1007/PL00005419