
Audio Coding, pp. 145–160

Entropy and Coding

  • Yuli You
Chapter

Abstract

Let us consider a 2-bit quantizer that represents quantized values using the following set of quantization indexes: {0, 1, 2, 3}. Each quantization index is called a source symbol, or simply a symbol, and the set is called a symbol set. When applied to quantize a sequence of input samples, the quantizer produces a sequence of quantization indexes, such as the following: 1, 2, 1, 0, 1, 2, 1, 2, 1, 0, 1, 2, 2, 1, 2, 1, 2, 3, 2, 1, 2, 1, 1, 2, 1, 0, 1, 2, 1, 2. Called a source sequence, it needs to be converted into a sequence of codewords, or codes, that are suitable for transmission over a variety of channels. The primary concern is minimizing the average codeword length, so that transmitting the source sequence requires a lower bit rate.
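As a rough illustration of this idea, the minimal Python sketch below (not from the chapter; the names `sequence` and `probs` are purely illustrative) tallies the empirical symbol probabilities of the example source sequence and computes its Shannon entropy, which lower-bounds the average codeword length, in bits per symbol, achievable by any uniquely decodable code matched to those probabilities.

```python
from collections import Counter
from math import log2

# The example source sequence of 2-bit quantization indexes from the text above.
sequence = [1, 2, 1, 0, 1, 2, 1, 2, 1, 0, 1, 2, 2, 1, 2, 1, 2, 3, 2, 1,
            2, 1, 1, 2, 1, 0, 1, 2, 1, 2]

# Empirical probability of each symbol (quantization index) in the sequence.
counts = Counter(sequence)
total = len(sequence)
probs = {symbol: n / total for symbol, n in counts.items()}

# Shannon entropy of the empirical distribution: a lower bound on the average
# codeword length (bits per symbol) of any uniquely decodable code designed
# for these symbol probabilities.
entropy = -sum(p * log2(p) for p in probs.values())

print("symbol probabilities:", probs)
print(f"empirical entropy: {entropy:.3f} bits/symbol (fixed-length code: 2 bits/symbol)")
```

For this particular sequence the empirical entropy comes out to roughly 1.5 bits per symbol, noticeably below the 2 bits per symbol spent by a fixed-length 2-bit code, which is what motivates replacing fixed-length indexes with variable-length codewords.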

Keywords

Entropy · Acoustics

Copyright information

© Springer US 2010

Authors and Affiliations

  • Yuli You
  1. University of Minnesota in Twin Cities, Minneapolis, USA
