Algorithmica, Volume 72, Issue 1, pp 215–236

Provable ICA with Unknown Gaussian Noise, and Implications for Gaussian Mixtures and Autoencoders

  • Sanjeev Arora
  • Rong Ge
  • Ankur Moitra
  • Sushant Sachdeva

DOI: 10.1007/s00453-015-9972-2

Cite this article as:
Arora, S., Ge, R., Moitra, A. et al. Algorithmica (2015) 72: 215. doi:10.1007/s00453-015-9972-2


We present a new algorithm for independent component analysis which has provable performance guarantees. In particular, suppose we are given samples of the form \(y = Ax + \eta \), where \(A\) is an unknown but non-singular \(n \times n\) matrix, \(x\) is a random variable whose coordinates are independent and have a fourth moment strictly less than that of a standard Gaussian random variable, and \(\eta \) is an \(n\)-dimensional Gaussian random variable with unknown covariance \(\varSigma \). We give an algorithm that provably recovers \(A\) and \(\varSigma \) up to an additive \(\epsilon \), and whose running time and sample complexity are polynomial in \(n\) and \(1 / \epsilon \). To accomplish this, we introduce a novel “quasi-whitening” step that may be useful in other applications where there is additive Gaussian noise of unknown covariance. We also give a general framework for finding all local optima of a function, given an oracle for approximately finding just one. This framework is a crucial step in our algorithm, one that has been overlooked in previous attempts, and it allows us to control the accumulation of error when we recover the columns of \(A\) one by one via local search.
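The generative model from the abstract can be sketched in a few lines of NumPy. This is only an illustration of the model, not the paper's algorithm: the Rademacher sources, the dimension, and the noise scale are all assumptions chosen for concreteness. Rademacher coordinates satisfy the key hypothesis, since their fourth moment is \(1 < 3 = \mathbb {E}[g^4]\) for \(g \sim N(0,1)\), and the Gaussian noise contributes zero fourth cumulant, which is what a moment-based method can exploit.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 50_000  # dimension and sample count (illustrative choices)

# Independent source coordinates with fourth moment strictly below the
# Gaussian's: Rademacher signs have E[x^4] = 1 < 3.
x = rng.choice([-1.0, 1.0], size=(n, m))

A = rng.standard_normal((n, n))            # unknown non-singular mixing matrix
L = 0.3 * rng.standard_normal((n, n))
Sigma = L @ L.T                            # unknown noise covariance (PSD)
eta = rng.multivariate_normal(np.zeros(n), Sigma, size=m).T

y = A @ x + eta                            # observed samples, shape (n, m)

# Excess kurtosis (fourth cumulant) of a source coordinate: for Rademacher
# variables x^2 = x^4 = 1 identically, so kappa4 = 1 - 3*1 = -2 exactly.
kappa4 = np.mean(x[0] ** 4) - 3 * np.mean(x[0] ** 2) ** 2
```

The negative fourth cumulant of the sources, against the vanishing fourth cumulant of the Gaussian noise, is the signal that survives in the fourth-order moments of \(y\); the paper's quasi-whitening step is what makes that signal accessible when \(\varSigma \) is unknown.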


Keywords: Independent component analysis · Mixture models · Method of moments · Cumulants

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Sanjeev Arora (1)
  • Rong Ge (2)
  • Ankur Moitra (3)
  • Sushant Sachdeva (4)

  1. Princeton University, Princeton, USA
  2. Microsoft Research, Cambridge, USA
  3. Massachusetts Institute of Technology, Cambridge, USA
  4. Yale University, New Haven, USA
