
When are Simple LS Estimators Enough? An Empirical Study of LS, TLS, and GTLS

Short Paper, published in the International Journal of Computer Vision.

Abstract

A variety of least-squares estimators of significantly different complexity and generality are available for solving over-constrained linear systems. The most theoretically general is not necessarily the best choice in practice; problem conditions may be such that simpler, faster algorithms, though theoretically inferior, yield acceptable errors. We investigate when this happens using homography estimation as the reference problem. We study the errors of the LS, TLS, equilibrated TLS, and GTLS algorithms under different noise types and varying levels of intensity and correlation. To allow direct comparisons with algorithms from the applied-mathematics and computer-vision communities, we consider both inhomogeneous and homogeneous systems. We add noise to image co-ordinates and to system-matrix entries in separate experiments, to take into account the effect of pre-processing data transformations on noise properties (heteroscedasticity). We find that the theoretically most general algorithms may not always be worth their higher complexity; comparable results are obtained under moderate levels of noise intensity and correlation. We identify such levels quantitatively for the reference problem, thus suggesting when the simpler algorithms can be applied with limited error despite their restrictive assumptions.
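To make the LS/TLS distinction concrete, the following is a minimal NumPy sketch (illustrative only, not the paper's experimental code): LS assumes the system matrix A is exact and only b is noisy, whereas TLS allows for errors in both, taking the solution from the right singular vector of the augmented matrix [A | b] associated with its smallest singular value. The noise level, system size, and true parameters below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small over-constrained inhomogeneous system A x ~= b with known solution.
x_true = np.array([2.0, -1.0])
A = rng.normal(size=(50, 2))
b = A @ x_true

# Perturb both A and b: the errors-in-variables setting that motivates TLS.
A_noisy = A + 0.01 * rng.normal(size=A.shape)
b_noisy = b + 0.01 * rng.normal(size=b.shape)

# Ordinary LS: minimises ||A x - b||, treating A as exact.
x_ls, *_ = np.linalg.lstsq(A_noisy, b_noisy, rcond=None)

# TLS: SVD of the augmented matrix [A | b]; the solution comes from the
# right singular vector for the smallest singular value.
C = np.column_stack([A_noisy, b_noisy])
_, _, Vt = np.linalg.svd(C)
v = Vt[-1]
x_tls = -v[:-1] / v[-1]
```

At moderate noise levels like this one, both estimates land close to the true parameters, which is the regime the paper quantifies; the gap between them grows as the perturbation of A becomes stronger or correlated.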



Author information

Correspondence to Arvind Nayak.


Cite this article

Nayak, A., Trucco, E. & Thacker, N.A. When are Simple LS Estimators Enough? An Empirical Study of LS, TLS, and GTLS. Int J Comput Vision 68, 203–216 (2006). https://doi.org/10.1007/s11263-006-6486-z
