Neural Processing Letters, Volume 22, Issue 2, pp 183–204

On the Effect of the Form of the Posterior Approximation in Variational Learning of ICA Models

Abstract

We show that the choice of the posterior approximation affects the solution found in variational Bayesian learning of linear independent component analysis (ICA) models. Assuming the sources to be independent a posteriori favours a solution with orthogonal mixing vectors. Linear mixing models with either temporally correlated sources or non-Gaussian source models are considered, but the analysis extends to nonlinear mixtures as well.
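
To make the orthogonality bias concrete: in a linear Gaussian model x = As + n with isotropic noise of variance σ² and unit-variance Gaussian sources, the true source posterior has precision Λ = AᵀA/σ² + I, which is diagonal exactly when the mixing vectors (the columns of A) are orthogonal. The sketch below is a minimal illustration rather than the paper's method; the function name, noise level, and unit Gaussian source prior are assumptions made for the example. It computes the KL cost of forcing a fully factorized Gaussian posterior, which vanishes for orthogonal mixing and is strictly positive otherwise.

    # Illustrative numpy sketch (not the paper's code): the extra KL cost of a
    # fully factorized Gaussian posterior q(s) = prod_i q(s_i) relative to the
    # true Gaussian posterior p(s | x) in the model x = A s + n.
    import numpy as np

    def mean_field_kl_gap(A, noise_var=0.1):
        # Posterior precision of s for x = A s + n, n ~ N(0, noise_var I), s ~ N(0, I).
        L = A.T @ A / noise_var + np.eye(A.shape[1])
        # The best fully factorized Gaussian q keeps the diagonal of L; the
        # remaining KL(q || p) is 0.5 * (sum_i log L_ii - log det L),
        # which is zero iff L (hence A'A) is diagonal.
        sign, logdet = np.linalg.slogdet(L)
        return 0.5 * (np.sum(np.log(np.diag(L))) - logdet)

    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 2))      # generic, non-orthogonal mixing matrix
    Q, _ = np.linalg.qr(M)               # orthogonal mixing vectors (Q'Q = I)
    print(mean_field_kl_gap(Q))          # ~0: the factorized posterior is exact
    print(mean_field_kl_gap(M))          # >0: factorization penalizes this mixing

Since this KL gap is subtracted from the variational lower bound, learning with a factorized posterior can lower its cost by rotating the mixing matrix towards orthogonal mixing vectors, which is the effect the paper analyses.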

Keywords

independent component analysis, variational Bayesian learning

Abbreviations

ICA: Independent component analysis

MoG: Mixture of Gaussians

PCA: Principal component analysis


Copyright information

© Springer 2005

Authors and Affiliations

  1. Neural Networks Research Centre, Helsinki University of Technology, Finland
  2. Laboratory of Computational Engineering, Helsinki University of Technology, Finland
