Reliability of ICA Estimates with Mutual Information

  • Harald Stögbauer
  • Ralph G. Andrzejak
  • Alexander Kraskov
  • Peter Grassberger
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3195)

Abstract

Obtaining the most independent components from a mixture (under a chosen model) is only the first part of an ICA analysis. After that, it is necessary to measure the actual dependencies between the components and the reliability of the decomposition. One has to identify one- and multidimensional components (i.e., clusters of mutually dependent components), as well as channels that are too close to Gaussian to be separated reliably. To determine the dependencies we use a new, highly accurate mutual information (MI) estimator. The variability of the MI under remixing provides a measure of stability: a rapid growth of the MI under mixing identifies stable components, whereas a low variability identifies unreliable components. The method is illustrated on artificial datasets, and its usefulness for real-world data is demonstrated on biomedical recordings.
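The remixing test sketched in the abstract can be illustrated in a few lines. The sketch below is an assumption-laden stand-in, not the authors' implementation: it uses a simple histogram-based MI estimator (the paper relies on a much more accurate nearest-neighbour estimator), and all function names are illustrative. Two independent non-Gaussian sources are rotated pairwise by increasing angles; for a well-separated pair the MI grows rapidly away from angle zero, which is the signature of a stable decomposition.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    # Histogram-based MI estimate in nats. Chosen only to keep the
    # sketch dependency-free; the paper uses a nearest-neighbour
    # (Kraskov-type) estimator, which is far more accurate.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mi_under_remixing(s1, s2, angles):
    # Rotate the pair (s1, s2) by each angle and record the MI of the
    # rotated pair. A steep rise of this curve away from angle 0
    # indicates a stable (well-separated) pair of components; a flat
    # curve indicates an unreliable separation.
    curve = []
    for a in angles:
        c, s = np.cos(a), np.sin(a)
        curve.append(mutual_information(c * s1 - s * s2,
                                        s * s1 + c * s2))
    return curve

rng = np.random.default_rng(0)
s1 = rng.uniform(-1, 1, 20000)   # two independent, non-Gaussian sources
s2 = rng.uniform(-1, 1, 20000)
angles = np.linspace(0.0, np.pi / 4, 9)
curve = mi_under_remixing(s1, s2, angles)
print(curve[0], curve[-1])
```

For these uniform sources the MI is near zero for the unmixed pair and grows markedly toward the 45° mixture, so the pair would be classified as stable. For two Gaussian sources the curve would stay flat, flagging the pair as unreliable.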

Keywords

Mutual Information, Independent Component Analysis, Dependency Matrix, Gaussian Signal



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Harald Stögbauer (1)
  • Ralph G. Andrzejak (1)
  • Alexander Kraskov (1)
  • Peter Grassberger (1)

  1. John von Neumann Institute for Computing, Forschungszentrum Jülich, Jülich, Germany
