Abstract
Applied to nonlinear modeling problems, the maximum-likelihood estimation principle leads to nonconvex optimization problems and yields inconsistent estimators in the errors-in-variables setting. This chapter presents a computationally cheap and statistically consistent estimation method based on a bias correction procedure, called adjusted least squares estimation. The adjusted least squares method is applied to curve fitting (static modeling) and system identification. Section 7.1 presents a general nonlinear data modeling framework. The model class consists of affine varieties of bounded complexity (dimension and degree), and the fitting criteria are algebraic and geometric. Section 7.2 shows that the underlying computational problem is polynomially structured low-rank approximation. In the algebraic fitting method, the approximating matrix is unstructured, and the corresponding problem can be solved globally and efficiently. The geometric fitting method aims to solve the polynomially structured low-rank approximation problem itself, which is nonconvex and has no analytic solution. The equivalence of nonlinear data modeling and low-rank approximation unifies existing curve fitting methods, showing that algebraic fitting is a relaxation of geometric fitting, obtained by removing the structural constraint. Motivated by the fact that the algebraic fitting method is efficient but statistically inconsistent, Sect. 7.3.3 proposes a bias correction procedure. The resulting adjusted least squares method yields a consistent estimator, and simulation results show that it is effective even for small sample sizes. Section 7.4 considers the class of discrete-time, single-input, single-output, nonlinear dynamical systems described by a polynomial difference equation, called polynomial time-invariant systems.
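The algebraic fitting relaxation can be sketched concretely for conic sections: stack the monomial evaluations of the data into an (unstructured) matrix and take the right singular vector associated with its smallest singular value. The following is a minimal illustrative sketch under these assumptions, not the chapter's implementation; the function name and normalization choice are made up for the example.

```python
import numpy as np

def fit_conic_algebraic(x, y):
    """Algebraic fit of a conic: minimize the 2-norm of the equation error
    ||Phi(d) theta|| over ||theta|| = 1, where Phi(d) contains the degree-2
    monomials of the data. The minimizer is the right singular vector of
    Phi(d) corresponding to its smallest singular value."""
    # Polynomial "lifting" of the data: one row of monomials per data point
    Phi = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(Phi)
    return Vt[-1]  # coefficients of x^2, xy, y^2, x, y, 1

# Usage: noiseless points on the unit circle x^2 + y^2 - 1 = 0
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
theta = fit_conic_algebraic(np.cos(t), np.sin(t))
theta = theta / theta[0]  # fix the scale so the x^2 coefficient is 1
# theta ≈ [1, 0, 1, 0, 0, -1], i.e. the circle is recovered exactly
```

With noisy data the same computation gives the (biased) algebraic fit; roughly speaking, the bias correction of Sect. 7.3.3 adjusts the Gram matrix Phi(d)ᵀPhi(d) for the noise-induced terms before extracting the smallest eigenvector.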
The identification problem is: (1) find the monomials appearing in the difference equation representation of the system (structure selection), and (2) estimate the coefficients of the equation (parameter estimation). Since the model representation is linear in the parameters, parameter estimation by minimization of the 2-norm of the equation error leads to unstructured low-rank approximation. However, this requires knowledge of the model structure, and even with the correct model structure the method is statistically inconsistent. For structure selection we propose 1-norm regularization, and for bias correction the adjusted least squares method.
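The 1-norm-regularized structure selection step can be illustrated with a simple proximal-gradient (ISTA) lasso solver applied to a dictionary of candidate monomials. The simulated system, the dictionary, and the regularization weight below are assumptions made for the example, not taken from the chapter.

```python
import numpy as np

def lasso_ista(A, b, lam, n_iter=20000):
    """Minimize 0.5*||A x - b||_2^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - b) / L                           # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# Simulated polynomial time-invariant system: y(t) = 0.5 y(t-1) + u(t-1)^2
rng = np.random.default_rng(0)
T = 200
u = rng.uniform(-1, 1, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + u[t - 1] ** 2

# Dictionary of candidate monomials in y(t-1), u(t-1), up to degree 2
A = np.column_stack([y[:-1], u[:-1], y[:-1]**2, y[:-1] * u[:-1], u[:-1]**2])
b = y[1:]
x = lasso_ista(A, b, lam=0.01)
# The two monomials actually present, y(t-1) and u(t-1)^2, receive
# coefficients near 0.5 and 1; the spurious candidates are shrunk to
# (near) zero, revealing the structure of the difference equation.
```

The nonzero pattern of `x` selects the monomials; the selected coefficients would then be re-estimated with the bias-corrected (adjusted least squares) method rather than taken from the shrunken lasso solution.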
With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.
J. von Neumann
Copyright information
© 2019 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Markovsky, I. (2019). Nonlinear Modeling Problems. In: Low-Rank Approximation. Communications and Control Engineering. Springer, Cham. https://doi.org/10.1007/978-3-319-89620-5_7
Print ISBN: 978-3-319-89619-9
Online ISBN: 978-3-319-89620-5
eBook Packages: Intelligent Technologies and Robotics (R0)