Monotonic Convergence of a Nonnegative ICA Algorithm on Stiefel Manifold

  • Mao Ye
  • Xuqian Fan
  • Qihe Liu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4232)


When the independent sources are known to be nonnegative and well-grounded, meaning that each source has a nonzero pdf in a neighborhood of zero, several nonnegative independent component analysis (ICA) algorithms have been proposed to separate such positive sources. In this paper, using properties of skew-symmetric matrices, we give a rigorous convergence proof for a nonnegative ICA algorithm on the Stiefel manifold, and we present sufficient conditions for convergence. Simulations confirm the convergence theory. Our techniques may also be useful for analyzing general ICA algorithms on the Stiefel manifold.
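The paper's algorithm is not reproduced on this page, but the mechanism the abstract names — an update whose generator is skew-symmetric, so each step stays on the orthogonal/Stiefel manifold — can be sketched. The cost function, step size, and Cayley-transform retraction below are illustrative assumptions in the spirit of the geodesic-flow and nonnegative-ICA literature cited here (Nishimori; Plumbley), not the authors' exact method.

```python
import numpy as np

def nonneg_ica_step(W, Z, eta=0.05):
    """One orthogonality-preserving update (illustrative sketch).

    W : (n, n) orthogonal demixing matrix; Z : (n, T) whitened data.
    Cost (Plumbley-style): J(W) = mean squared negative part of Y = W Z.
    The Euclidean gradient G is turned into a skew-symmetric generator
    B = G W^T - W G^T; a Cayley transform of B is orthogonal, so W stays
    on the manifold after each multiplicative step.
    """
    T = Z.shape[1]
    Y = W @ Z
    G = (2.0 / T) * np.minimum(Y, 0.0) @ Z.T     # gradient of J w.r.t. W
    B = G @ W.T - W @ G.T                        # skew-symmetric: B^T = -B
    I = np.eye(W.shape[0])
    Q = np.linalg.solve(I + (eta / 2) * B, I - (eta / 2) * B)  # Cayley step
    return Q @ W

# Toy demo: mix nonnegative, well-grounded sources, whiten, then iterate.
rng = np.random.default_rng(0)
S = rng.random((2, 1000))                        # nonnegative sources
X = rng.standard_normal((2, 2)) @ S              # observed mixtures
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Z = np.diag(d ** -0.5) @ E.T @ Xc                # whitened data
W = np.eye(2)
for _ in range(200):
    W = nonneg_ica_step(W, Z)
print(np.allclose(W @ W.T, np.eye(2)))           # orthogonality preserved
```

Because the update is a product of orthogonal matrices, the manifold constraint holds exactly (up to floating-point error) at every iteration; the convergence question the paper addresses is whether such iterates also reach a separating rotation.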


Keywords: Independent Component Analysis, Learning Rate, Convergence Condition, Blind Signal




References

  1. Cichocki, A., Amari, S.-i.: Adaptive Blind Signal and Image Processing. John Wiley and Sons, Chichester (2002)
  2. Edelman, A., Arias, T.A., Smith, S.T.: The Geometry of Algorithms with Orthogonality Constraints. SIAM J. Matrix Anal. Applicat. 20(2), 303–353 (1998)
  3. Fiori, S.: A Theory for Learning by Weight Flow on Stiefel-Grassman Manifold. Neural Comput. 13, 1625–1647 (2001)
  4. Golub, G.H., Van Loan, C.F.: Matrix Computations. The Johns Hopkins University Press, Baltimore (1996)
  5. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. John Wiley and Sons, Chichester (2001)
  6. Nishimori, Y.: Learning Algorithm for ICA by Geodesic Flows on Orthogonal Group. In: Proc. Int. Joint Conf. Neural Networks, Washington, DC, vol. 2, pp. 933–938 (1999)
  7. Oja, E., Plumbley, M.D.: Blind Separation of Positive Sources by Globally Convergent Gradient Search. Neural Computation 16(9), 1811–1825 (2004)
  8. Oja, E., Plumbley, M.D.: Blind Separation of Positive Sources using Nonnegative PCA. In: Proc. Int. Conf. on Independent Component Analysis and Blind Signal Separation (ICA 2003), pp. 11–16 (2003)
  9. Plumbley, M.D.: Conditions for Nonnegative Independent Component Analysis. IEEE Signal Processing Letters 9(6), 177–180 (2002)
  10. Plumbley, M.D.: Algorithms for Nonnegative Independent Component Analysis. IEEE Trans. Neural Networks 14(3), 534–543 (2003)
  11. Plumbley, M.D.: Lie Group Methods for Optimization with Orthogonality Constraints. In: Puntonet, C.G., Prieto, A.G. (eds.) ICA 2004. LNCS, vol. 3195, pp. 1245–1252. Springer, Heidelberg (2004)
  12. Vidyasagar, M.: Nonlinear Systems Analysis. Prentice-Hall, Englewood Cliffs (1993)
  13. Ye, M.: Global Convergence Analysis of a Discrete Time Nonnegative ICA Algorithm. IEEE Trans. Neural Networks 17(1), 253–256 (2006)
  14. Zufiria, P.J.: On the Discrete Time Dynamics of the Basic Hebbian Neural Network Node. IEEE Trans. Neural Networks 13(6), 1342–1352 (2002)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Mao Ye (1)
  • Xuqian Fan (2)
  • Qihe Liu (1)
  1. Computational Intelligence Lab, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, P.R. China
  2. Department of Mathematics, Jinan University, Guangzhou, P.R. China
