
Speech separation algorithm for auditory scene analysis

  • Letters
  • Published:
Journal of Electronics (China)

Abstract

A simple and efficient algorithm is presented to separate concurrent speech signals. The parameters of the mixed speech are estimated by searching the neighborhood of given pitch estimates so as to minimize the error between the original and the synthetic spectra. The effectiveness of the proposed algorithm in separating closely spaced frequencies is demonstrated.
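The search described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a toy harmonic model in which each talker's magnitude spectrum is a sum of Gaussian-shaped peaks at multiples of a fundamental, refines two rough pitch guesses over a small grid, jointly fits the harmonic amplitudes by least squares at each candidate pitch pair, and keeps the pair whose synthetic spectrum best matches the mixture. All function names, the Gaussian peak shape, and the grid parameters are illustrative assumptions.

```python
import numpy as np

def harmonic_spectrum(f0, amps, freqs_hz, sigma=20.0):
    """Toy synthetic magnitude spectrum: Gaussian peaks at harmonics of f0."""
    spec = np.zeros_like(freqs_hz, dtype=float)
    for k, a in enumerate(amps, start=1):
        spec += a * np.exp(-0.5 * ((freqs_hz - k * f0) / sigma) ** 2)
    return spec

def separate(mixed_spec, freqs_hz, pitch_guesses, search_hz=5.0,
             step_hz=0.5, n_harm=8, sigma=20.0):
    """Search the neighborhood of each rough pitch estimate, jointly fit
    harmonic amplitudes for both talkers by least squares, and keep the
    pitch pair whose synthetic spectrum best matches the mixture."""
    grids = [np.arange(p - search_hz, p + search_hz + 1e-9, step_hz)
             for p in pitch_guesses]
    best = None
    for f0a in grids[0]:
        for f0b in grids[1]:
            # Harmonic "atoms" for both candidate pitches, stacked columnwise.
            basis = np.column_stack(
                [np.exp(-0.5 * ((freqs_hz - k * f0) / sigma) ** 2)
                 for f0 in (f0a, f0b) for k in range(1, n_harm + 1)])
            amps, *_ = np.linalg.lstsq(basis, mixed_spec, rcond=None)
            # Spectral mismatch between the mixture and the joint synthesis.
            err = float(np.sum((mixed_spec - basis @ amps) ** 2))
            if best is None or err < best[0]:
                # Split the joint amplitude vector back into the two talkers.
                best = (err, (f0a, f0b), (amps[:n_harm], amps[n_harm:]))
    return best
```

With two synthetic talkers at 100 Hz and 110 Hz and rough guesses of 102 Hz and 108 Hz, the grid search recovers the true pitches, illustrating why a local refinement around initial pitch estimates can resolve close fundamentals.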




Additional information

Supported by the National Natural Science Foundation of China (No.60172048)


Cite this article

Huang, X., Wei, G. Speech separation algorithm for auditory scene analysis. J. of Electron. (China) 21, 261–264 (2004). https://doi.org/10.1007/BF02687881

