Nonlinearly Structured Low-Rank Approximation

  • Ivan Markovsky
  • Konstantin Usevich


Polynomially structured low-rank approximation problems occur in
  • algebraic curve fitting, e.g., conic section fitting,

  • subspace clustering (generalized principal component analysis), and

  • nonlinear and parameter-varying system identification.

The maximum likelihood estimation principle applied to these nonlinear models leads to nonconvex optimization problems and yields inconsistent estimators in the errors-in-variables (measurement errors) setting. We propose a computationally cheap and statistically consistent estimator based on a bias correction procedure, called adjusted least-squares estimation. The method has been successfully used for conic section fitting and was recently generalized to algebraic curve fitting. The contribution of this book chapter is the application of the polynomially structured low-rank approximation problem and, in particular, the adjusted least-squares method to subspace clustering and to nonlinear and parameter-varying system identification. The input-output notion of a dynamical model, classical in system identification, is replaced by the behavioral definition of a model as a set, represented by implicit nonlinear difference equations.
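To make the bias-correction idea concrete, the following is a minimal sketch of adjusted least-squares conic section fitting, assuming independent Gaussian measurement noise of known variance on each coordinate. The naive algebraic fit minimizes the quadratic form of the moment matrix of the lifted data points; the adjusted estimator replaces each monomial moment by a polynomial whose expectation under the noise equals the true moment, which removes the asymptotic bias. The function names and the specific monomial ordering are illustrative choices, not taken from the chapter.

```python
import numpy as np

# Exponents (a, b) of the monomials x^a * y^b in the conic model
# t1*x^2 + t2*x*y + t3*y^2 + t4*x + t5*y + t6 = 0
EXPS = [(2, 0), (1, 1), (0, 2), (1, 0), (0, 1), (0, 0)]

def adjusted_power(x, k, s2):
    """Polynomial t_k with E[t_k(x + e)] = x^k for Gaussian noise e
    of variance s2 (Hermite-type bias correction), degrees 0..4."""
    if k == 0:
        return np.ones_like(x)
    if k == 1:
        return x
    if k == 2:
        return x**2 - s2
    if k == 3:
        return x**3 - 3 * s2 * x
    if k == 4:
        return x**4 - 6 * s2 * x**2 + 3 * s2**2
    raise ValueError("degree not supported")

def als_conic_fit(x, y, s2=0.0):
    """Adjusted least-squares conic fit: build the bias-corrected
    moment matrix and return its eigenvector of smallest eigenvalue."""
    n = len(EXPS)
    Psi = np.zeros((n, n))
    for j in range(n):
        for k in range(n):
            a = EXPS[j][0] + EXPS[k][0]  # total power of x in u_j * u_k
            b = EXPS[j][1] + EXPS[k][1]  # total power of y
            # Noise in x and y is independent, so the correction factors
            Psi[j, k] = np.sum(adjusted_power(x, a, s2) *
                               adjusted_power(y, b, s2))
    w, V = np.linalg.eigh(Psi)   # eigenvalues in ascending order
    return V[:, 0]               # model parameters, up to scale
```

With s2 = 0 the correction vanishes and the estimator reduces to the ordinary algebraic (eigenvector) fit; for example, points on the unit circle recover a parameter vector proportional to (1, 0, 1, 0, 0, -1), i.e., x^2 + y^2 - 1 = 0.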


Keywords: Structured low-rank approximation, Conic section fitting, Subspace clustering, Nonlinear system identification



Funding from the European Research Council under the European Union’s Seventh Framework Programme (FP7/2007–2013)/ERC Grant agreement number 258581 “Structured low-rank approximation: Theory, algorithms, and applications” is gratefully acknowledged.


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Department ELEC, Vrije Universiteit Brussel, Brussels, Belgium
