Pattern Analysis and Applications

Volume 13, Issue 1, pp 1–13

Auto-regressive independent process analysis without combinatorial efforts

  • Zoltán Szabó
  • Barnabás Póczos
  • András Lőrincz
Theoretical Advances

Abstract

We treat the problem of searching for hidden multi-dimensional independent auto-regressive processes (auto-regressive independent process analysis, AR-IPA). Independent subspace analysis (ISA) can be used to solve the AR-IPA task. The so-called separation theorem simplifies the ISA task considerably: it enables one to reduce the task to a one-dimensional blind source separation task followed by the grouping of the coordinates. However, this grouping still involves two combinatorial problems: (a) the number of the independent subspaces and their dimensions must be determined, and (b) the permutation of the estimated coordinates must be found. Here, we generalize the separation theorem. We also present a non-combinatorial procedure that, under certain conditions, can treat these two combinatorial problems. Numerical simulations have been conducted; we investigate problems that fulfill the sufficient conditions of the theory and also ones that do not. The success of the numerical simulations indicates that further generalizations of the separation theorem may be feasible.
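
To make the reduction concrete, the sketch below traces the pipeline the abstract describes: fit an AR model to the observations, run a one-dimensional ICA on the AR innovations, and group the estimated coordinates by clustering a pairwise dependence measure rather than searching over permutations. It is an illustration only, not the algorithm of the paper: the least-squares AR fit, the choice of FastICA, the dependence measure (correlation of absolute values), the hierarchical clustering step, and all names (ar_innovations, group_coordinates, ar_ipa, n_groups) are assumptions made for this sketch, and it presumes the number of subspaces is given, whereas the paper treats determining it as part of the problem.

    # Illustrative AR-IPA sketch (not the authors' algorithm); assumes the
    # number of subspaces n_groups is known in advance.
    import numpy as np
    from sklearn.decomposition import FastICA
    from scipy.cluster.hierarchy import linkage, fcluster

    def ar_innovations(x, p=1):
        """Least-squares fit of a multivariate AR(p) model; returns residuals.
        x: (T, D) observed time series."""
        T, D = x.shape
        # Regressors: x[t-1], ..., x[t-p], stacked column-wise.
        X = np.hstack([x[p - k - 1:T - k - 1] for k in range(p)])
        Y = x[p:]
        coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return Y - X @ coeffs

    def group_coordinates(c, n_groups):
        """Non-combinatorial grouping: cluster the 1D ICA outputs by the
        correlation of their absolute values, which is (near) zero across
        truly independent subspaces."""
        dep = np.abs(np.corrcoef(np.abs(c).T))
        dist = np.clip(1.0 - dep, 0.0, None)       # dependence -> distance
        condensed = dist[np.triu_indices_from(dist, k=1)]
        return fcluster(linkage(condensed, method="average"),
                        t=n_groups, criterion="maxclust")

    def ar_ipa(x, n_groups, p=1):
        e = ar_innovations(x, p)  # step 1: reduce AR-IPA to ISA on innovations
        c = FastICA(n_components=e.shape[1],
                    random_state=0).fit_transform(e)  # step 2: 1D BSS
        return c, group_coordinates(c, n_groups)      # step 3: grouping

For instance, with x of shape (10000, 6) generated from two independent 3-dimensional AR sources mixed by a random matrix, ar_ipa(x, n_groups=2) should recover the partition of the six estimated coordinates into the two hidden subspaces, up to relabeling.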

Keywords

Independent component analysis · Independent process analysis · Auto-regressive processes

Acknowledgments

This research has been supported by the EC NEST ‘Perceptual Consciousness: Explication and Testing’ grant under contract 043261. Opinions and errors in this manuscript are the authors’ responsibility; they do not necessarily reflect those of the EC or other project members.

Copyright information

© Springer-Verlag London Limited 2009

Authors and Affiliations

  • Zoltán Szabó (1)
  • Barnabás Póczos (1, 2)
  • András Lőrincz (1)
  1. Department of Information Systems, Eötvös Loránd University, Budapest, Hungary
  2. Department of Computing Science, University of Alberta, Edmonton, Canada
