
Information Approach to Blind Source Separation and Deconvolution

  • Chapter
Information Theory and Statistical Learning

Blind separation of sources aims to recover the sources from their mixtures without relying on any specific knowledge of the sources and/or of the mixing mechanism [1] (that is why the separation is called blind). Instead, it relies on the basic assumption that the sources are mutually independent.1 A popular measure of dependence is the mutual information. This chapter attempts to provide a systematic approach to blind source separation based on mutual information.
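To make the role of mutual information as a dependence measure concrete, here is a minimal numpy sketch (not from the chapter) computing I(X;Y) for a discrete joint distribution: it vanishes exactly when the joint factorizes into the product of its marginals, i.e. when the variables are independent. The function name and example distributions are illustrative choices.

```python
import numpy as np

def mutual_information(pxy):
    """Mutual information I(X;Y) in nats from a discrete joint distribution pxy."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of Y (row vector)
    mask = pxy > 0                        # skip zero cells (0 log 0 = 0 by convention)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

# Independent case: joint is the outer product of the marginals, so I(X;Y) = 0
p_indep = np.outer([0.5, 0.5], [0.3, 0.7])
# Fully dependent case: X = Y with equal probabilities, so I(X;Y) = H(X) = log 2
p_dep = np.array([[0.5, 0.0], [0.0, 0.5]])

print(mutual_information(p_indep))  # ≈ 0
print(mutual_information(p_dep))    # ≈ log 2 ≈ 0.693
```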

We shall focus on noiseless mixtures. Few methods deal explicitly with noise: often a preprocessing step is applied to reduce it, or a method designed for a noiseless model turns out to be rather insensitive to noise and can thus be applied when some noise is present. A general noiseless mixture model can be written as x(·) = A{s(·)}, where s(n) and x(n) denote the source and observation vectors at time n, and A is some transformation, which can be instantaneous, i.e. operating on each s(n) to produce x(n), or global (i.e. operating on the whole sequence s(·) of source vectors). The transformation A is not completely arbitrary; one often assumes it belongs to a certain class A. The most popular classes are that of linear (or affine) instantaneous transformations and that of linear convolutions. More complex nonlinear transformations have been considered, but for simplicity we shall limit ourselves to the above two linear classes. Separation may be realized by applying the inverse transformation A⁻¹ to x(·). However, A is unknown, and so is its inverse. The natural idea is to apply a transformation B ∈ A⁻¹, the set of all transformations which are inverses of a transformation in A, chosen to minimize some criterion. We consider here the independence criterion based on the mutual information measure.
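The instantaneous linear case can be sketched end to end in a few lines of numpy. The sketch below mixes two independent non-Gaussian sources with an unknown matrix A, then searches for a demixing transformation B: after whitening, only an unknown rotation remains, and the rotation angle is chosen by maximizing a sum of absolute excess kurtoses. This kurtosis contrast is a simple stand-in for the chapter's mutual-information criterion, used here only because it admits a closed-form check; the mixing matrix, source distributions, and grid search are all illustrative assumptions, not the chapter's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
# Two independent non-Gaussian sources: uniform (sub-Gaussian) and Laplace (super-Gaussian)
s = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # "unknown" instantaneous mixing matrix
x = A @ s                                # observed mixtures x(n) = A s(n)

# Whitening: decorrelate and normalize; afterwards only a rotation is unidentified
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))         # eigendecomposition of the 2x2 covariance
z = (E / np.sqrt(d)).T @ x               # whitened data, cov(z) ≈ identity

def contrast(y):
    """Sum of absolute excess kurtoses; maximal at separation (kurtosis stand-in
    for the mutual-information independence criterion)."""
    k = (y ** 4).mean(axis=1) - 3 * (y ** 2).mean(axis=1) ** 2
    return np.abs(k).sum()

def rotation(t):
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

# The contrast is periodic with period pi/2 (permutation/sign ambiguity),
# so a grid over [0, pi/2] suffices in two dimensions
thetas = np.linspace(0.0, np.pi / 2, 1801)
best = max(thetas, key=lambda t: contrast(rotation(t) @ z))
y = rotation(best) @ z                   # recovered sources, up to order, sign, scale
```

Each row of `y` should be strongly correlated with exactly one row of `s`, reflecting the inherent permutation and scale ambiguities of blind separation.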


References

  1. Cardoso, J.F.: Blind signal separation: statistical principles. Proc. IEEE, special issue on Blind Estimation and Identification 86(10), 2009–2025 (1998)

  2. Cardoso, J.F., Laheld, B.: Equivariant adaptive source separation. IEEE Trans. Signal Process. 44(12), 3017–3030 (1996)

  3. Comon, P.: Independent component analysis, a new concept. Signal Process. 36(3), 287–314 (1994)

  4. Cover, T., Thomas, J.: Elements of Information Theory. Wiley, New York (1991)

  5. Dai, Y.H.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13(3), 693–701 (2002)

  6. Huber, P.J.: Projection pursuit. Ann. Statist. 13(2), 435–475 (1985)

  7. Jones, M.C.: Discretized and interpolated kernel density estimates. J. Am. Statist. Assoc. 84, 733–741 (1989)

  8. Pham, D.T.: Contrast functions for blind separation and deconvolution of sources. In: Proceedings of the ICA 2001 Conference, pp. 37–42. San Diego, USA (2001)

  9. Pham, D.T.: Mutual information approach to blind separation of stationary sources. IEEE Trans. Inform. Theory 48, 1935–1946 (2002)

  10. Pham, D.T.: Fast algorithms for mutual information based independent component analysis. IEEE Trans. Signal Process. 52(10), 2690–2700 (2004)

  11. Pham, D.T.: Generalized mutual information approach to multichannel blind deconvolution. Signal Process. 87(9), 2045–2060 (2007)

  12. Pham, D.T., Garat, P.: Blind separation of mixtures of independent sources through a quasi maximum likelihood approach. IEEE Trans. Signal Process. 45(7), 1712–1725 (1997)

  13. Press, W.H., Flannery, B.P., Teukolsky, S.A., Vetterling, W.T.: Numerical Recipes in C: The Art of Scientific Computing, 2nd edn. Cambridge University Press, Cambridge (1993)

  14. Scott, D.W., Sheather, S.J.: Kernel density estimation with binned data. Commun. Statist. Theory Meth. 14, 1353–1359 (1985)

  15. Shalvi, O., Weinstein, E.: Super-exponential methods for blind deconvolution. IEEE Trans. Inform. Theory 39(2), 504–519 (1993)

  16. Silverman, B.W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall, London (1982)

  17. Tugnait, J.K.: Identification and deconvolution of multichannel linear non-Gaussian processes using higher order statistics and inverse filter criteria. IEEE Trans. Signal Process. 45(3), 658–672 (1997)

Correspondence to Pham Dinh-Tuan.


© 2009 Springer Science+Business Media, LLC

About this chapter

Dinh-Tuan, P. (2009). Information Approach to Blind Source Separation and Deconvolution. In: Emmert-Streib, F., Dehmer, M. (eds) Information Theory and Statistical Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-84816-7_7

  • Print ISBN: 978-0-387-84815-0

  • Online ISBN: 978-0-387-84816-7