Noisy Matrix Completion Using Alternating Minimization

  • Suriya Gunasekar
  • Ayan Acharya
  • Neeraj Gaur
  • Joydeep Ghosh
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8189)


The task of matrix completion involves estimating the entries of a matrix, M ∈ ℝ m×n , when only a subset, Ω ⊂ {(i,j):1 ≤ i ≤ m,1 ≤ j ≤ n}, of the entries is observed. A popular family of low rank models for this task approximates the matrix as a product of two low rank factors, \(\widehat{M}=UV^T\), where U ∈ ℝ m×k , V ∈ ℝ n×k , and k ≪ min {m,n}. The algorithm of choice in practice for recovering M from the partially observed matrix under the low rank assumption is alternating least squares (ALS) minimization, which alternately optimizes over U and V, minimizing the squared error over the observed entries with respect to one factor while keeping the other fixed. Despite being widely used in practice, only recently were theoretical guarantees established bounding the error between the matrix estimated by ALS and the original matrix M. In this work we extend the results for the noiseless setting and provide the first recovery guarantees for alternating minimization under noise. We specifically show that for well conditioned matrices corrupted by random noise of bounded Frobenius norm, if the number of observed entries is \(\mathcal{O}\left(k^7n\log n\right)\), then the ALS algorithm recovers the original matrix within an error bound that depends on the norm of the noise matrix. The sample complexity is the same as that derived in [7] for noise-free matrix completion using ALS.
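The alternating scheme described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name, random initialization, and iteration count are assumptions, and the sketch omits the clipping/trimming steps used in the theoretical analysis.

```python
import numpy as np

def als_complete(M_obs, mask, k, n_iters=50, seed=0):
    """Rank-k completion of M_obs via alternating least squares.

    mask[i, j] == 1 marks observed entries; unobserved entries of
    M_obs are ignored (they may hold any value, e.g. zero).
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(seed)
    # Random initialization (the analysis in the paper instead uses a
    # spectral initialization from the observed entries).
    U = rng.standard_normal((m, k))
    V = rng.standard_normal((n, k))
    for _ in range(n_iters):
        # Fix V: each row of U solves a small least-squares problem
        # restricted to that row's observed entries.
        for i in range(m):
            obs = mask[i] == 1
            if obs.any():
                U[i], *_ = np.linalg.lstsq(V[obs], M_obs[i, obs], rcond=None)
        # Fix U: symmetrically update each row of V.
        for j in range(n):
            obs = mask[:, j] == 1
            if obs.any():
                V[j], *_ = np.linalg.lstsq(U[obs], M_obs[obs, j], rcond=None)
    return U @ V.T
```

Each inner update is a k-dimensional least-squares problem, so every alternation is convex even though the joint problem in (U, V) is not; this is what makes ALS attractive at scale compared to nuclear-norm approaches.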


Keywords: Singular Value Decomposition · Recommender System · Matrix Completion · Alternating Least Squares · Image Inpainting


  1. Bertalmio, M., Vese, L., Sapiro, G., Osher, S.: Simultaneous structure and texture image inpainting. IEEE Transactions on Image Processing (2003)
  2. Candès, E.J., Plan, Y.: Matrix completion with noise. CoRR (2009)
  3. Candès, E.J., Recht, B.: Exact matrix completion via convex optimization. Foundations of Computational Mathematics (2009)
  4. Candès, E.J., Tao, T.: The power of convex relaxation: near-optimal matrix completion. IEEE Transactions on Information Theory (2010)
  5. Golub, G.H., Van Loan, C.F.: Matrix Computations (Johns Hopkins Studies in Mathematical Sciences), 3rd edn. The Johns Hopkins University Press (1996)
  6. Jain, P., Netrapalli, P., Sanghavi, S.: Low-rank matrix completion using alternating minimization. ArXiv e-prints (December 2012)
  7. Jain, P., Netrapalli, P., Sanghavi, S.: Low-rank matrix completion using alternating minimization. In: STOC (2013)
  8. Keshavan, R.H., Montanari, A., Oh, S.: Matrix completion from a few entries. IEEE Transactions on Information Theory (2010)
  9. Keshavan, R.H., Montanari, A., Oh, S.: Matrix completion from noisy entries. JMLR (2010)
  10. Koren, Y., Bell, R., Volinsky, C.: Matrix factorization techniques for recommender systems. IEEE Computer (2009)
  11. Mitra, K., Sheorey, S., Chellappa, R.: Large-scale matrix factorization with missing data under additional constraints. In: NIPS (2010)
  12. So, A.M.C., Ye, Y.: Theory of semidefinite programming for sensor network localization. In: ACM-SIAM Symposium on Discrete Algorithms (2005)
  13. Takács, G., Pilászy, I., Németh, B., Tikk, D.: Investigation of various matrix factorization methods for large recommender systems. In: KDD Workshop on Large-Scale Recommender Systems and the Netflix Prize Competition (2008)
  14. Wang, Y., Xu, H.: Stability of matrix factorization for collaborative filtering. In: ICML (2012)
  15. Yu, K., Tresp, V.: Learning to learn and collaborative filtering. In: NIPS Workshop on Inductive Transfer: 10 Years Later (2005)
  16. Zhou, Y., Wilkinson, D., Schreiber, R., Pan, R.: Large-scale parallel collaborative filtering for the Netflix Prize. In: Fleischer, R., Xu, J. (eds.) AAIM 2008. LNCS, vol. 5034, pp. 337–348. Springer, Heidelberg (2008)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Suriya Gunasekar, Ayan Acharya, Neeraj Gaur, Joydeep Ghosh
  1. Department of ECE, University of Texas at Austin, USA
