Lie Group Methods for Optimization with Orthogonality Constraints

  • Mark D. Plumbley
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3195)


Optimization of a cost function J(W) under an orthogonality constraint WW^T = I is a common requirement for ICA methods. In this paper, we will review the use of Lie group methods to perform this constrained optimization. Instead of searching in the space of n × n matrices W, we will introduce the concept of the Lie group SO(n) of special orthogonal (rotation) matrices, and the corresponding Lie algebra so(n). Using so(n) for our coordinates, we can multiplicatively update W by a rotation matrix R so that W' = RW always remains orthogonal. Steepest descent and conjugate gradient algorithms can be used in this framework.


Keywords: Line Search, Steepest Descent, Conjugate Gradient Method, Independent Component Analysis, Conjugate Gradient Algorithm





Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Mark D. Plumbley
    1. Department of Electronic Engineering, Queen Mary University of London, London, UK
