# Direct Estimation of Homogeneous Vectors: An Ill-Solved Problem in Computer Vision

Matthew Harker · Paul O’Leary

Conference paper. Part of the Lecture Notes in Computer Science book series (LNCS, volume 4338).

## Abstract

Computer Vision theory is firmly rooted in Projective Geometry, whereby geometric objects can be modeled effectively by homogeneous vectors. We begin from Gauss’s 200-year-old theory of least squares to derive a generic algorithm for the direct estimation of homogeneous vectors. We uncover the common link among previous methods, showing that direct estimation is not an ill-conditioned problem, as is popularly believed, but has merely been an ill-solved one. Results show improvements in goodness of fit and numerical stability, and demonstrate that “data normalization” is unnecessary for a well-founded algorithm.
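As context for the abstract, a minimal sketch of the standard *direct* (algebraic) estimation baseline that the paper builds on: a homogeneous vector θ is estimated by minimizing ‖Dθ‖ subject to ‖θ‖ = 1, where D is the design matrix built from the data; the minimizer is the right singular vector of D for the smallest singular value. The conic-fitting setup and function name below are illustrative assumptions, not the authors’ proposed algorithm.

```python
import numpy as np

def fit_conic_direct(x, y):
    """Direct (algebraic) conic fit: returns the unit homogeneous vector
    theta = (a, b, c, d, e, f) minimizing the algebraic residuals
    a*x^2 + b*x*y + c*y^2 + d*x + e*y + f over the data points."""
    # Design matrix: one row of monomials per data point.
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    # Minimizer of ||D theta|| with ||theta|| = 1 is the right singular
    # vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

# Noise-free points on the ellipse x^2/4 + y^2 = 1, i.e. x^2 + 4y^2 - 4 = 0.
t = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
x, y = 2.0 * np.cos(t), np.sin(t)
theta = fit_conic_direct(x, y)

# Algebraic residuals vanish (to machine precision) for noise-free data.
res = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)]) @ theta
```

With noisy data, this unit-norm constraint is exactly the setting where conditioning and “data normalization” debates arise, which is the problem the paper addresses.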

## Keywords

Cost Function · Direct Estimation · Geometric Object · Residual Vector · Generalized Eigenvector
