
Exploiting Redundancy to Construct Listening Systems

  • Paris Smaragdis

Conclusions

This report highlighted some of the intricate relationships between the statistics of natural sounds and auditory processing. We have argued that many of the common steps taken to perform computational audition can be seen as processes driven by the nature of sound itself, rather than as steps inspired by human physiology or engineering practice. We have shown how different aspects of hearing can be explained by a single simple rule that exploits the statistical structure of sound. Although the methods we employed are very simple, the results are as promising as those of more complex approaches. We hope that the simplicity and elegance of this approach will inspire further work along these lines, and give rise to more investigations in the field of computationally evolving audition.
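The "simple rule" referred to above is redundancy reduction: learning a transform that removes statistical dependencies between samples of natural sound. As an illustration only (this is not the author's code, and the synthetic signal and frame length of 64 are arbitrary choices for the sketch), the PCA variant of this idea can be shown in a few lines of NumPy: the eigenvectors of the frame covariance act as learned filters whose outputs are decorrelated.

```python
import numpy as np

# Illustrative sketch of redundancy reduction by PCA on sound frames.
rng = np.random.default_rng(0)

# Synthetic "structured" audio: two harmonically related sinusoids plus noise.
t = np.arange(16000) / 16000.0
x = (np.sin(2 * np.pi * 440 * t)
     + 0.5 * np.sin(2 * np.pi * 660 * t)
     + 0.1 * rng.standard_normal(t.size))

# Slice the signal into overlapping frames; each column is one observation.
frame = 64
X = np.stack([x[i:i + frame] for i in range(0, x.size - frame, frame // 2)],
             axis=1)
X -= X.mean(axis=1, keepdims=True)

# PCA: eigen-decompose the covariance of the frames. The eigenvectors are
# the "filters" that the statistics of the input themselves dictate.
C = X @ X.T / X.shape[1]
eigvals, eigvecs = np.linalg.eigh(C)

# Projecting onto the principal components decorrelates the outputs:
# the covariance of Y is (numerically) diagonal.
Y = eigvecs.T @ X
CY = Y @ Y.T / Y.shape[1]
off_diag = np.abs(CY - np.diag(np.diag(CY))).max()
print(off_diag < 1e-8)  # second-order redundancy removed
```

On quasi-periodic inputs like this, the leading eigenvectors resemble sinusoidal filters, which is the sense in which Fourier-like front ends can emerge from the data rather than from engineering choices. ICA extends the same program beyond second-order statistics.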




Copyright information

© Springer Science + Business Media, Inc. 2005

Authors and Affiliations

  • Paris Smaragdis
    1. Mitsubishi Electric Research Laboratories, Japan
