Abstract
The problem of approximating a matrix by another matrix of lower rank is considered for the case where a modest portion of the matrix elements are missing. The solution is obtained by applying Newton's method to find a zero of a vector field on a product manifold. As a preliminary, the algorithm is formulated for the well-known case with no missing elements, which also yields a rederivation of the correction equation in a block Jacobi-Davidson method. Numerical examples show that the Newton algorithm becomes more efficient than an alternating least squares procedure as the amount of missing values increases.
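The alternating least squares baseline mentioned above can be sketched as follows. This is not the paper's implementation, only a minimal generic sketch: each row of the left factor U (and then of the right factor V) is updated by solving a small least-squares problem restricted to the observed entries. The function name `als_missing` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def als_missing(A, W, k, iters=100, seed=0):
    """Rank-k approximation A ~ U @ V.T fitted only on entries where W is True.

    A : (m, n) data matrix (values at unobserved positions are ignored)
    W : (m, n) boolean mask, True where A is observed
    k : target rank

    Plain alternating least squares sketch (not the paper's Newton method).
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, k))
    V = rng.standard_normal((n, k))
    for _ in range(iters):
        for i in range(m):            # update row i of U from its observed columns
            cols = W[i]
            U[i] = np.linalg.lstsq(V[cols], A[i, cols], rcond=None)[0]
        for j in range(n):            # update row j of V from its observed rows
            rows = W[:, j]
            V[j] = np.linalg.lstsq(U[rows], A[rows, j], rcond=None)[0]
    return U, V
```

Each inner solve involves only the rows of the opposite factor that correspond to observed entries, which is what makes the cost per sweep grow with the number of observed elements; the paper's point is that a Newton iteration on the Grassmann product manifold can outperform this scheme as more entries are missing.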
Communicated by Haesun Park.
Cite this article
Simonsson, L., Eldén, L. Grassmann algorithms for low rank approximation of matrices with missing values. Bit Numer Math 50, 173–191 (2010). https://doi.org/10.1007/s10543-010-0253-9
Keywords
- Grassmann manifold
- Matrix
- Low rank approximation
- Newton’s method
- Singular value decomposition
- Least squares
- Missing values