An Information Geometrical View of Stationary Subspace Analysis

  • Motoaki Kawanabe
  • Wojciech Samek
  • Paul von Bünau
  • Frank C. Meinecke
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6792)

Abstract

Stationary Subspace Analysis (SSA) [3] is an unsupervised learning method that finds subspaces in which data distributions stay invariant over time. It has proven useful for studying non-stationarities in a variety of applications [4,5,9,10]. In this paper, we present the first SSA algorithm based on a full generative model of the data. This new derivation relates SSA to previous work on finding interesting subspaces in high-dimensional data, in a manner analogous to the three easy routes to independent component analysis [6], and provides an information geometric view.
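As a rough illustration of the idea summarized above, the sketch below estimates a stationary subspace by minimizing the summed Kullback-Leibler divergence between Gaussian models of the projected epoch-wise distributions and a standard normal, which is the general flavour of objective used in SSA [3]. This is a minimal sketch under stated assumptions, not the generative-model algorithm derived in this paper: the whitening step, the Gaussian epoch models, the crude random-rotation search, and all names (`epoch_moments`, `ssa_loss`, `fit_ssa`, `n_epochs`, `n_restarts`) are illustrative.

```python
# Minimal SSA-flavoured sketch (illustrative assumptions; not the paper's algorithm).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def whiten(X):
    """Center the data and whiten it so the pooled covariance becomes the identity."""
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ evecs @ np.diag(evals ** -0.5) @ evecs.T

def epoch_moments(X, n_epochs):
    """Split samples (T x D) into epochs and return per-epoch means and covariances."""
    chunks = np.array_split(X, n_epochs)
    means = np.array([c.mean(axis=0) for c in chunks])
    covs = np.array([np.cov(c, rowvar=False) for c in chunks])
    return means, covs

def ssa_loss(R, means, covs, d):
    """Sum over epochs of the Gaussian KL divergence (up to constants, assuming
    whitened data) between the projected epoch distribution and N(0, I)."""
    P = R[:d]                                   # d x D candidate stationary projection
    loss = 0.0
    for mu, C in zip(means, covs):
        Cp = P @ C @ P.T
        loss += -np.linalg.slogdet(Cp)[1] + mu @ P.T @ P @ mu
    return loss

def fit_ssa(X, d, n_epochs=10, n_restarts=500):
    """Crude random-rotation search for a d-dimensional stationary projection."""
    Xw = whiten(X)
    D = Xw.shape[1]
    means, covs = epoch_moments(Xw, n_epochs)
    best_loss, best_P = np.inf, None
    for _ in range(n_restarts):
        A = rng.standard_normal((D, D))
        R = expm(A - A.T)                       # rotation from a skew-symmetric generator
        loss = ssa_loss(R, means, covs, d)
        if loss < best_loss:
            best_loss, best_P = loss, R[:d]
    return best_P, best_loss
```

Calling `fit_ssa(X, d=2)` on a (T x D) data matrix would return an estimated two-dimensional projection whose epoch-wise projected distributions vary as little as possible in this KL sense; the algorithm of [3] and the maximum-likelihood formulation presented in this paper optimize related objectives with gradient-based optimization over rotations rather than this naive random search.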

Keywords

stationary subspace analysis · generative model · maximum likelihood estimation · Kullback-Leibler divergence · information geometry

References

  1. Amari, S.: Differential-Geometrical Methods in Statistics. Lecture Notes in Statistics. Springer, Berlin (1985)
  2. Blanchard, G., Sugiyama, M., Kawanabe, M., Spokoiny, V., Müller, K.-R.: In search of non-Gaussian components of a high-dimensional distribution. Journal of Machine Learning Research 7, 247–282 (2006)
  3. von Bünau, P., Meinecke, F.C., Király, F., Müller, K.-R.: Finding stationary subspaces in multivariate time series. Physical Review Letters 103, 214101 (2009)
  4. von Bünau, P., Meinecke, F.C., Müller, J.S., Lemm, S., Müller, K.-R.: Boosting high-dimensional change point detection with stationary subspace analysis. In: Workshop on Temporal Segmentation at NIPS (2009)
  5. von Bünau, P., Meinecke, F.C., Scholler, S., Müller, K.-R.: Finding stationary brain sources in EEG data. In: Proceedings of the 32nd Annual Conference of the IEEE EMBS, pp. 2810–2813 (2010)
  6. Cardoso, J.F.: The three easy routes to independent component analysis; contrasts and geometry. In: Proc. ICA 2001, pp. 1–6 (2001)
  7. Diederichs, E., Juditsky, A., Spokoiny, V., Schütte, C.: Sparse non-Gaussian component analysis. IEEE Trans. Inform. Theory 56, 3033–3047 (2010)
  8. Friedman, J.H., Tukey, J.W.: A projection pursuit algorithm for exploratory data analysis. IEEE Trans. Computers 23, 881–890 (1974)
  9. Hara, S., Kawahara, Y., Washio, T., von Bünau, P.: Stationary subspace analysis as a generalized eigenvalue problem. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds.) ICONIP 2010, Part I. LNCS, vol. 6443, pp. 422–429. Springer, Heidelberg (2010)
  10. Meinecke, F.C., von Bünau, P., Kawanabe, M., Müller, K.-R.: Learning invariances with stationary subspace analysis. In: IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), pp. 87–92 (2009)
  11. Pham, D.T., Cardoso, J.F.: Blind separation of instantaneous mixtures of non stationary sources. In: Proc. ICA 2000, Helsinki, Finland, pp. 187–192 (2000)
  12. Plumbley, M.D.: Geometrical methods for non-negative ICA: manifolds, Lie groups and toral subalgebras. Neurocomputing 67, 161–197 (2005)
  13. Theis, F.: Colored subspace analysis: Dimension reduction based on a signal's autocorrelation structure. IEEE Trans. Circuits and Systems I 57(7), 1463–1474 (2010)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Motoaki Kawanabe (1, 2)
  • Wojciech Samek (1, 2)
  • Paul von Bünau (1)
  • Frank C. Meinecke (1)
  1. Fraunhofer Institute FIRST, Berlin, Germany
  2. Berlin Institute of Technology, Berlin, Germany
