Abstract
A mathematical proof was recently obtained in (Liu, Chiu, Xu, 2004) for the so-called one-bit-matching conjecture, which states that all sources can be separated as long as there is a one-to-one same-sign correspondence between the kurtosis signs of all source probability density functions (pdf's) and the kurtosis signs of all model pdf's (Xu, Cheung, Amari, 1998a); the conjecture is widely believed and implicitly supported by many empirical studies. However, that proof holds only in a weak sense: the conjecture is true when the global optimum of an ICA criterion is reached. It therefore cannot account for the successes of many existing iterative algorithms, which usually converge to a local optimum. In this paper, a new mathematical proof is obtained in a strong sense, namely that the conjecture remains true when any local optimum is reached, with the help of investigating convex-concave programming on a polyhedral set. Theorems are also proved not only on the partial separation of sources when the kurtosis signs match only partially, but also on an interesting duality between maximization and minimization in source separation. Moreover, corollaries are obtained from the theorems stating that seeking a one-to-one same-sign correspondence can be replaced by a use of this duality, i.e., super-gaussian sources can be separated via maximization and sub-gaussian sources via minimization. A further corollary confirms the symmetric orthogonalization implementation of the kurtosis-extreme approach for separating multiple sources in parallel, which works empirically but has lacked a mathematical proof. Furthermore, a linkage is established to combinatorial optimization from a Stiefel manifold perspective, with algorithms that guarantee convergence and the satisfaction of constraints.
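As a hypothetical numerical illustration (not code from the paper), the duality and the symmetric orthogonalization scheme discussed above can be sketched in NumPy: two sub-gaussian (uniform) sources are recovered by minimizing the sum of output kurtoses over orthogonal demixing matrices of whitened mixtures, with a plain gradient step followed by symmetric orthogonalization. The mixing matrix, step size, and iteration count are illustrative assumptions; super-gaussian sources would use the maximization side (`sign = +1.0`).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Two sub-gaussian sources (uniform pdf's have kurtosis -1.2), unit variance.
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # illustrative mixing matrix (an assumption)
X = A @ S

# Whiten the mixtures so the demixing matrix can be sought on the
# orthogonal (Stiefel) manifold.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
V = (E / np.sqrt(d)) @ E.T          # symmetric whitening matrix
Z = V @ X

def kurtosis(y):
    """Empirical kurtosis of a zero-mean, unit-variance signal."""
    return np.mean(y ** 4) - 3.0

# Duality: sub-gaussian sources are recovered by *minimizing* the sum of
# output kurtoses over orthogonal W; super-gaussian sources by maximizing.
sign = -1.0
U0, _, Vt0 = np.linalg.svd(rng.standard_normal((2, 2)))
W = U0 @ Vt0                        # random orthogonal initialization
for _ in range(300):
    Y = W @ Z
    G = 4.0 * (Y ** 3) @ Z.T / n    # gradient of the kurtosis sum w.r.t. W
    W = W + 0.05 * sign * G         # gradient step (descent here)
    U, _, Vt = np.linalg.svd(W)     # symmetric orthogonalization:
    W = U @ Vt                      # project back onto the orthogonal group

# The overall source-to-output transform should be a signed permutation.
P = W @ V @ A
print(np.round(P, 2))
```

Reaching any local minimum suffices here, which is exactly the strong-sense claim: each row of `P` ends up dominated by a single entry close to ±1.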
The work described in this paper was fully supported by a grant from the Research Grant Council of the Hong Kong SAR (Project No: CUHK4225/04E).
References
Amari, S.I., Cichocki, A., Yang, H.: A New Learning Algorithm for Blind Separation of Sources. In: Advances in Neural Information Processing Systems, vol. 8, pp. 757–763. MIT Press, Cambridge (1996)
Amari, S., Chen, T.-P., Cichocki, A.: Stability analysis of adaptive blind source separation. Neural Networks 10, 1345–1351 (1997)
Bazaraa, M.S., Sherali, H.D., Shetty, C.M.: Nonlinear Programming: Theory and Algorithms. John Wiley & Sons, Inc., New York (1993)
Bell, A., Sejnowski, T.: An Information-maximization Approach to Blind Separation and Blind Deconvolution. Neural Computation 7, 1129–1159 (1995)
Cardoso, J.-F.: Blind signal separation: Statistical Principles. Proc. of IEEE 86, 2009–2025 (1998)
Cheung, C.C., Xu, L.: Some Global and Local Convergence Analysis on the Information-theoretic Independent Component Analysis Approach. Neurocomputing 30, 79–102 (2000)
Comon, P.: Independent component analysis - a new concept? Signal Processing 36, 287–314 (1994)
Delfosse, N., Loubaton, P.: Adaptive Blind Separation of Independent Sources: A Deflation Approach. Signal Processing 45, 59–83 (1995)
Edelman, A., Arias, T.A., Smith, S.T.: The Geometry of Algorithms with Orthogonality Constraints. SIAM J. Matrix Anal. APPL. 20, 303–353 (1998)
Everson, R., Roberts, S.: Independent Component Analysis: A Flexible Nonlinearity and Decorrelating Manifold Approach. Neural Computation 11, 1957–1983 (1999)
Girolami, M.: An Alternative Perspective on Adaptive Independent Component Analysis Algorithms. Neural Computation 10, 2103–2114 (1998)
Hopfield, J.J., Tank, D.W.: Neural computation of decisions in optimization problems. Biological Cybernetics 52, 141–152 (1985)
Hyvarinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. John Wiley & Sons, Inc., New York (2001)
Lee, T.W., Girolami, M., Sejnowski, T.J.: Independent Component Analysis Using an Extended Infomax Algorithm for Mixed Subgaussian and Supergaussian Sources. Neural Computation 11, 417–441 (1999)
Liu, Z.Y., Chiu, K.C., Xu, L.: One-Bit-Matching Conjecture for Independent Component Analysis. Neural Computation 16, 383–399 (2004)
Moreau, E., Macchi, O.: High Order Contrasts for Self-adaptive Source Separation. International Journal of Adaptive Control and Signal Processing 10, 19–46 (1996)
Pearlmutter, B.A., Parra, L.C.: A Context-sensitive Generalization of ICA. In: Proc. of Int. Conf. on Neural Information Processing, Hong Kong. Springer, Hong Kong (1996)
Tong, L., Inouye, Y., Liu, R.: Waveform-preserving Blind Estimation of Multiple Independent Sources. Signal Processing 41, 2461–2470 (1993)
Welling, M., Weber, M.: A Constrained EM Algorithm for Independent Component Analysis. Neural Computation 13, 677–689 (2001)
Xu, L.: Independent Component Analysis and Extensions with Noise and Time: A Bayesian Ying-Yang Learning Perspective. Neural Information Processing Letters and Reviews 1, 1–52 (2003)
Xu, L.: Distribution Approximation, Combinatorial Optimization, and Lagrange- Barrier. In: Proc. of International Joint Conference on Neural Networks 2003 (IJCNN 2003), Jantzen Beach, Portland, Oregon, July 20-24, pp. 2354–2359 (2003)
Xu, L., Cheung, C.C., Amari, S.I.: Further Results on Nonlinearity and Separation Capability of a Linear Mixture ICA Method and Learned LPM. In: Fyfe, C. (ed.) Proceedings of I&ANN 1998, vol. 8, pp. 39–45 (1998a)
Xu, L., Cheung, C.C., Amari, S.I.: Learned Parametric Mixture Based ICA Algorithm. Neurocomputing 22, 69–80 (1998b)
Xu, L., Cheung, C.C., Yang, H.H., Amari, S.I.: Independent component analysis by the information-theoretic approach with mixture of density. In: Proc. of 1997 IEEE Intl. Conf on Neural Networks, Houston, TX, vol. 3, pp. 1821–1826 (1997)
Xu, L.: Bayesian Ying-Yang Learning Based ICA Models. In: Proc. 1997 IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing VI, Florida, pp. 476–485 (1997)
Xu, L., Yang, H.H., Amari, S.I.: Signal Source Separation by Mixtures: Accumulative Distribution Functions or Mixture of Bell-shape Density Distribution Functions. Presentation at FRONTIER FORUM, Institute of Physical and Chemical Research, Japan (April 1996)
Xu, L.: Combinatorial optimization neural nets based on a hybrid of Lagrange and transformation approaches. In: Proc. of World Congress on Neural Networks, San Diego, pp. 399–404 (1994)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Xu, L. (2005). One-Bit-Matching ICA Theorem, Convex-Concave Programming, and Combinatorial Optimization. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_2
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4