Independent Component Analysis and Blind Signal Separation

Volume 3889 of the series Lecture Notes in Computer Science, pp. 32–39

Csiszár’s Divergences for Non-negative Matrix Factorization: Family of New Algorithms

  • Andrzej Cichocki (Laboratory for Advanced Brain Signal Processing, RIKEN BSI)
  • Rafal Zdunek (Laboratory for Advanced Brain Signal Processing, RIKEN BSI)
  • Shun-ichi Amari (Laboratory for Mathematical Neuroscience, RIKEN BSI)




In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches that allow us to obtain generalized forms of multiplicative NMF algorithms and to unify some existing algorithms. We also give flexible and relaxed forms of the NMF algorithms to increase convergence speed and to impose desired constraints such as sparsity and smoothness of the components. Moreover, the effects of various regularization terms and constraints are clearly shown. The scope of these results is broad, since the proposed generalized divergence functions include a large number of useful loss functions, such as the squared Euclidean distance and the Kullback-Leibler, Itakura-Saito, Hellinger, Pearson's chi-square, and Neyman's chi-square divergences. We have successfully applied the developed algorithms to blind (or semi-blind) source separation (BSS), where the sources may in general be statistically dependent, provided they satisfy other conditions or additional constraints such as nonnegativity, sparsity, and/or smoothness.
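For orientation, the Csiszár divergence named in the title has the standard textbook form: for nonnegative entries p_i, q_i and a convex function f with f(1) = 0,

    D_f(\mathbf{P} \,\|\, \mathbf{Q}) \;=\; \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right).

Familiar generators recover several of the distances named in the abstract: f(u) = u\log u - u + 1 gives the generalized Kullback-Leibler (I-)divergence, f(u) = (\sqrt{u} - 1)^2 the squared Hellinger distance, f(u) = (u - 1)^2 Pearson's chi-square, and f(u) = (1 - u)^2 / u Neyman's chi-square; the squared Euclidean and Itakura-Saito costs arise from related parametrized divergence families treated in the paper.

As a concrete, deliberately classical illustration of the multiplicative update scheme that the paper generalizes, the sketch below implements the well-known Lee-Seung updates for the squared Euclidean distance and the generalized Kullback-Leibler divergence, two members of the loss family above. This is a minimal NumPy sketch for illustration only, not the paper's new algorithms; the function name nmf_multiplicative and all parameter defaults are our own choices.

import numpy as np

def nmf_multiplicative(Y, rank, n_iter=200, loss="kl", eps=1e-9, seed=0):
    """Minimal multiplicative NMF: Y ~= A @ X with A, X >= 0.

    loss="euclidean": squared Euclidean distance (Lee-Seung updates).
    loss="kl":        generalized Kullback-Leibler (I-)divergence.
    Illustrative only; not the generalized algorithms of the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, rank)) + eps
    X = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        if loss == "euclidean":
            # A <- A * (Y X^T) / (A X X^T);  X <- X * (A^T Y) / (A^T A X)
            A *= (Y @ X.T) / (A @ X @ X.T + eps)
            X *= (A.T @ Y) / (A.T @ A @ X + eps)
        elif loss == "kl":
            # Reweight by the ratio Y / (A X); denominators are row/column sums.
            R = Y / (A @ X + eps)
            A *= (R @ X.T) / (X.sum(axis=1)[None, :] + eps)
            R = Y / (A @ X + eps)
            X *= (A.T @ R) / (A.sum(axis=0)[:, None] + eps)
        else:
            raise ValueError("loss must be 'euclidean' or 'kl'")
    return A, X

# Usage: factorize a synthetic nonnegative mixture and check the fit.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    Y = rng.random((30, 5)) @ rng.random((5, 40))
    A, X = nmf_multiplicative(Y, rank=5, loss="kl")
    print("relative error:", np.linalg.norm(Y - A @ X) / np.linalg.norm(Y))

Both branches preserve nonnegativity automatically, since each update multiplies the current factor by a ratio of nonnegative quantities; this is the property that the paper's generalized Csiszár-divergence algorithms retain while changing the loss function.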