ICA 2007: Independent Component Analysis and Signal Separation, pp. 227–235
Robust Independent Component Analysis Using Quadratic Negentropy
Abstract
We present a robust algorithm for independent component analysis that uses the sum of marginal quadratic negentropies as a dependence measure. It can handle arbitrary source density functions by using kernel density estimation, yet remains robust for small sample sizes because it avoids empirical expectations and instead computes the integrals of quadratic densities in closed form. In addition, our algorithm is scalable: the gradient of our contrast function can be calculated in O(LN) using the fast Gauss transform, where L is the number of sources and N is the number of samples. In our experiments, we evaluated the performance of our algorithm for various source distributions and compared it with other well-known algorithms. The results show that the proposed algorithm consistently outperforms the others. Moreover, it is extremely robust to outliers and is particularly effective when the number of observed samples is small and the number of mixed sources is large.
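The closed-form evaluation described above rests on a standard identity from information-theoretic learning: the integral of a product of Gaussians is itself a Gaussian, so every term of the quadratic negentropy of a kernel density estimate can be computed exactly, with no empirical expectation. The following is a minimal sketch of that idea for a single demixed source, not the authors' implementation: it assumes a Gaussian KDE with bandwidth `sigma`, a zero-mean unit-variance Gaussian reference density, and uses the naive O(N²) pairwise sum (the paper replaces this with the fast Gauss transform to reach O(LN) overall).

```python
import numpy as np

def gauss(d, var):
    """1-D Gaussian density N(d; 0, var), applied elementwise."""
    return np.exp(-d**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def quadratic_negentropy(y, sigma=0.5):
    """Quadratic negentropy J(y) = ∫ (p̂(x) - φ(x))² dx for one source,
    where p̂ is a Gaussian KDE of the samples y and φ is the standard
    normal density.  Each of the three expanded terms is an exact
    Gaussian-convolution integral, so no sample-mean approximation of
    an expectation is involved.  (Illustrative sketch; bandwidth and
    reference density are assumptions, not taken from the paper.)
    """
    y = np.asarray(y, dtype=float)
    d = y[:, None] - y[None, :]
    # ∫ p̂² dx  = (1/N²) Σ_i Σ_j G(y_i - y_j; 2σ²)   -- naive O(N²) pass
    term_pp = gauss(d, 2.0 * sigma**2).mean()
    # ∫ p̂ φ dx = (1/N) Σ_i G(y_i; σ² + 1)
    term_pg = gauss(y, sigma**2 + 1.0).mean()
    # ∫ φ² dx  = 1 / (2√π)
    term_gg = 1.0 / (2.0 * np.sqrt(np.pi))
    return term_pp - 2.0 * term_pg + term_gg
```

The full contrast would sum this quantity over all L demixed outputs and be maximized over orthogonal demixing matrices; since J is an integral of a squared difference, it is non-negative and vanishes only when the smoothed marginal matches the Gaussian reference.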
Keywords
Mutual Information · Independent Component Analysis · Kernel Density Estimation · Source Distribution