
A FastICA Algorithm for Non-negative Independent Component Analysis

  • Zhijian Yuan
  • Erkki Oja
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3195)

Abstract

The non-negative ICA problem is here defined by the constraint that the sources are non-negative with probability one. This case occurs in many practical applications such as spectral or image analysis. It has been shown in [10] that there is a straightforward way to find the sources: if one whitens the non-zero-mean observations and rotates them to positive factors, then these factors must be the original sources. A fast algorithm resembling the FastICA method is suggested here, rigorously analyzed, and demonstrated on a simple image separation example.
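To make the whiten-and-rotate idea concrete, the following is a minimal NumPy sketch. It is not the FastICA-type update proposed in the paper; it only illustrates the underlying principle with a plain gradient search for a rotation that removes negative outputs, in the spirit of the non-negative PCA algorithm of [12]. The data shapes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def whiten(X):
    """Whiten observations X (n_signals x n_samples): the zero-mean part of the
    whitened data has identity covariance; the mean is kept, as in the abstract."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T   # whitening matrix
    return V @ X, V

def nonneg_ica(Z, n_iter=500, lr=0.1):
    """Find a rotation W so that Y = W Z is (approximately) non-negative,
    by gradient descent on 0.5 * E[||min(y, 0)||^2]  (illustrative, not the
    paper's FastICA-type update)."""
    n = Z.shape[0]
    W = np.eye(n)
    for _ in range(n_iter):
        Y = W @ Z
        Y_neg = np.minimum(Y, 0.0)            # only the negative parts contribute
        grad = Y_neg @ Z.T / Z.shape[1]       # gradient of the cost w.r.t. W
        W -= lr * grad
        U, _, Vt = np.linalg.svd(W)           # project back onto rotations
        W = U @ Vt
    return W

# Toy usage: two uniform non-negative sources, random positive mixing.
rng = np.random.default_rng(0)
S = rng.uniform(0, 1, size=(2, 2000))
A = rng.uniform(0.5, 1.5, size=(2, 2))
X = A @ S
Z, V = whiten(X)
W = nonneg_ica(Z)
Y = W @ Z   # recovers S up to permutation and scaling
```

The re-orthogonalization step keeps W a rotation, so whitening is preserved and only the rotational degrees of freedom are searched, which is exactly the setting in which the rotation-to-positivity criterion identifies the sources.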

Keywords

Permutation Matrix · Positive Matrix Factorization · Source Vector · Generic Cost Function · FastICA Algorithm


References

  1. Amari, S.-I., Cichocki, A.: Adaptive Blind Signal and Image Processing. John Wiley & Sons, Chichester (2002)
  2. Henry, R.C.: Multivariate receptor models - current practice and future trends. Chemometrics and Intelligent Laboratory Systems 60(1-2), 43–48 (2002)
  3. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. John Wiley & Sons, Chichester (2001)
  4. Lee, D.D., Seung, H.S.: Learning the parts of objects by non-negative matrix factorization. Nature 401, 788–791 (1999)
  5. Lee, J.S., Lee, D.D., Choi, S., Lee, D.S.: Application of non-negative matrix factorization to dynamic positron emission tomography. In: Lee, T.-W., Jung, T.-P., Makeig, S., Sejnowski, T.J. (eds.) Proceedings of the International Conference on Independent Component Analysis and Signal Separation (ICA2001), San Diego, California, December 9-13, pp. 629–632 (2001)
  6. Oja, E.: The nonlinear PCA learning rule in independent component analysis. Neurocomputing 17(1), 25–46 (1997)
  7. Oja, E., Plumbley, M.D.: Blind separation of positive sources by globally convergent gradient search. Neural Computation 16 (2004)
  8. Paatero, P., Tapper, U.: Positive matrix factorization: A non-negative factor model with optimal utilization of error estimates of data values. Environmetrics 5, 111–126 (1994)
  9. Parra, L., Spence, C., Sajda, P., Ziehe, A., Müller, K.-R.: Unmixing hyperspectral data. In: Advances in Neural Information Processing Systems 12 (Proc. NIPS 1999), pp. 942–948. MIT Press, Cambridge (2000)
  10. Plumbley, M.D.: Conditions for nonnegative independent component analysis. IEEE Signal Processing Letters 9(6), 177–180 (2002)
  11. Plumbley, M.D.: Algorithms for non-negative independent component analysis. IEEE Transactions on Neural Networks 14(3), 534–543 (2003)
  12. Plumbley, M.D., Oja, E.: A 'non-negative PCA' algorithm for independent component analysis. IEEE Transactions on Neural Networks 15(1), 66–76 (2004)
  13. Tsuge, S., Shishibori, M., Kuroiwa, S., Kita, K.: Dimensionality reduction using non-negative matrix factorization for information retrieval. In: IEEE International Conference on Systems, Man, and Cybernetics, Tucson, AZ, USA, October 7-10, vol. 2, pp. 960–965 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Zhijian Yuan
  • Erkki Oja
  Neural Networks Research Centre, Helsinki University of Technology, Finland
