Independent Subspace Analysis Using k-Nearest Neighborhood Distances

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3697)


A novel algorithm is introduced for independent subspace analysis (ISA), the problem of estimating independent multi-dimensional source subspaces. The algorithm solves the ISA task by estimating multi-dimensional differential entropies. Two variants are examined, both of which utilize distances between the k-nearest neighbors of the sample points. Numerical simulations demonstrate the usefulness of the algorithms.
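The entropy estimators described above rest on the classical result that a multi-dimensional differential entropy can be estimated from k-nearest-neighbor distances alone. As a hedged illustration (not the paper's exact algorithm), the following is a minimal sketch of the Kozachenko-Leonenko k-NN entropy estimator; the function names and the brute-force distance computation are illustrative choices:

```python
import numpy as np
from math import gamma, log, pi

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(n):
    """Exact digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j."""
    return -EULER_GAMMA + np.sum(1.0 / np.arange(1, n))

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate in nats.

    x : (N, d) array of i.i.d. samples.
    H_hat = psi(N) - psi(k) + log(c_d) + (d/N) * sum_i log(eps_i),
    where eps_i is the distance from sample i to its k-th nearest
    neighbor and c_d is the volume of the unit d-ball.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (fine for modest N;
    # a k-d tree would be used in practice).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)          # exclude self-distances
    eps = np.sort(dist, axis=1)[:, k - 1]   # distance to k-th neighbor
    log_cd = (d / 2) * log(pi) - log(gamma(d / 2 + 1))  # log volume of unit d-ball
    return digamma_int(n) - digamma_int(k) + log_cd + d * np.mean(np.log(eps))
```

For a 2-D standard normal sample the estimate should approach the true entropy, log(2πe) ≈ 2.84 nats, as the sample size grows; an ISA algorithm of the kind described here would minimize the sum of such subspace entropy estimates over candidate demixing matrices.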







Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  1. Department of Information Systems, Eötvös Loránd University, and Research Group on Intelligent Information Systems, Hungarian Academy of Sciences, Budapest, Hungary
