Abstract
Polynomially structured low-rank approximation problems occur in
- algebraic curve fitting, e.g., conic section fitting,
- subspace clustering (generalized principal component analysis), and
- nonlinear and parameter-varying system identification.
The maximum likelihood estimation principle applied to these nonlinear models leads to nonconvex optimization problems and yields inconsistent estimators in the errors-in-variables (measurement errors) setting. We propose a computationally cheap and statistically consistent estimator based on a bias-correction procedure, called adjusted least squares. The method has been used successfully for conic section fitting and was recently generalized to algebraic curve fitting. The contribution of this chapter is the application of the polynomially structured low-rank approximation problem, and in particular of the adjusted least-squares method, to subspace clustering and to nonlinear and parameter-varying system identification. The input-output notion of a dynamical model, classical in system identification, is replaced by the behavioral definition of a model as a set, represented by implicit nonlinear difference equations.
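To make the low-rank formulation concrete, here is a minimal sketch (in Python with NumPy; the function name and test data are illustrative, not taken from the chapter) of algebraic conic section fitting: the data points are lifted by the degree-2 monomials, and the conic parameters are recovered as the right singular vector of the lifted data matrix associated with its smallest singular value.

```python
import numpy as np

def fit_conic_ls(x, y):
    """Algebraic least-squares conic fit: minimize ||D theta|| subject to
    ||theta|| = 1, where D holds the monomials [x^2, x*y, y^2, x, y, 1]."""
    D = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    # the smallest right singular vector of D gives the conic coefficients
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    return Vt[-1]

# noisy samples from the unit circle x^2 + y^2 - 1 = 0
rng = np.random.default_rng(0)
t = np.linspace(0, 2*np.pi, 50, endpoint=False)
x = np.cos(t) + 0.01*rng.standard_normal(50)
y = np.sin(t) + 0.01*rng.standard_normal(50)
theta = fit_conic_ls(x, y)
theta /= theta[0]   # normalize so the x^2 coefficient equals 1
# theta is close to [1, 0, 1, 0, 0, -1] for the unit circle
```

For low noise this estimate is accurate, but in the errors-in-variables setting it is biased; the adjusted least-squares correction discussed in the abstract addresses exactly this.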
Notes
1. We use the notation \(d\) for data in problems involving static models and \(w\) for data in problems involving dynamical models.
2. The choice of the monomials is related to the model class selection in system identification.
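The bias-correction idea behind the adjusted least-squares estimator can be sketched for the conic (degree-2 monomial) case. This is a sketch under stated assumptions, not the chapter's implementation: it assumes i.i.d. Gaussian measurement noise of known variance \(\sigma^2\) on both coordinates, and the function names are illustrative. Each entry of the moment matrix \(D^\top D\) is a sum of monomials in the noisy data; replacing the monomial \(x^k\) by the polynomial \(h_k(x)\) with \(E\,h_k(x_0+\varepsilon)=x_0^k\) removes the noise-induced bias.

```python
import numpy as np

# monomial exponents (p, q) for x^p * y^q in the parameterization
# [x^2, x*y, y^2, x, y, 1]
EXP = [(2, 0), (1, 1), (0, 2), (1, 0), (0, 1), (0, 0)]

def h(x, k, s2):
    """Polynomial h_k with E[h_k(x0 + e)] = x0^k for e ~ N(0, s2), k <= 4."""
    return {0: np.ones_like(x),
            1: x,
            2: x**2 - s2,
            3: x**3 - 3*s2*x,
            4: x**4 - 6*s2*x**2 + 3*s2**2}[k]

def fit_conic_adjusted(x, y, s2):
    """Adjusted least-squares conic fit: build a bias-corrected moment
    matrix and take the eigenvector of its smallest eigenvalue."""
    Psi = np.zeros((6, 6))
    for a, (pa, qa) in enumerate(EXP):
        for b, (pb, qb) in enumerate(EXP):
            # unbiased estimate of sum_i x0_i^(pa+pb) * y0_i^(qa+qb)
            Psi[a, b] = np.sum(h(x, pa + pb, s2) * h(y, qa + qb, s2))
    _, V = np.linalg.eigh(Psi)   # eigenvalues in ascending order
    return V[:, 0]               # eigenvector of the smallest eigenvalue

# noisy points on the unit circle, noise variance assumed known
rng = np.random.default_rng(1)
t = np.linspace(0, 2*np.pi, 200, endpoint=False)
s2 = 0.05**2
x = np.cos(t) + np.sqrt(s2)*rng.standard_normal(200)
y = np.sin(t) + np.sqrt(s2)*rng.standard_normal(200)
theta = fit_conic_adjusted(x, y, s2)
theta /= theta[0]   # normalize so the x^2 coefficient equals 1
```

Unlike the plain least-squares estimate, this corrected estimate remains consistent as the number of samples grows, which is the statistical property the chapter exploits.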
Acknowledgments
Funding from the European Research Council under the European Union’s Seventh Framework Programme (FP7/2007–2013)/ERC Grant agreement number 258581 “Structured low-rank approximation: Theory, algorithms, and applications” is gratefully acknowledged.
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this chapter
Markovsky, I., Usevich, K. (2014). Nonlinearly Structured Low-Rank Approximation. In: Fu, Y. (eds) Low-Rank and Sparse Modeling for Visual Analysis. Springer, Cham. https://doi.org/10.1007/978-3-319-12000-3_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-11999-1
Online ISBN: 978-3-319-12000-3