
A minimax lower bound for empirical quantizer design

  • Conference paper

Published in Computational Learning Theory (EuroCOLT 1997)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1208)

Abstract

We obtain a minimax lower bound for the expected distortion of empirically designed vector quantizers. We show that the mean squared distortion of any empirically designed vector quantizer is at least Ω(n^{-1/2}) away from the optimal distortion for some distribution on a bounded subset of R^d, where n is the number of i.i.d. data points that are used to train the empirical quantizer.
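Stated symbolically, the abstract's claim is the following minimax lower bound (a minimal sketch; the notation q_n for an empirically designed quantizer, D(P, q_n) for its expected mean squared distortion under source distribution P, and D*(P) for the optimal distortion are our labels, not taken from the paper):

    \inf_{q_n} \, \sup_{P} \, \Bigl( \mathbb{E}\bigl[ D(P, q_n) \bigr] - D^*(P) \Bigr) \;=\; \Omega\bigl( n^{-1/2} \bigr)

Here the supremum runs over distributions P supported on a bounded subset of \mathbb{R}^d, the infimum over all design rules that build a quantizer from n i.i.d. samples drawn from P, and the expectation over those training samples.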

The work of the last two authors was supported by OTKA Grant F 014174.




Editor information

Shai Ben-David


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

Cite this paper

Bartlett, P., Linder, T., Lugosi, G. (1997). A minimax lower bound for empirical quantizer design. In: Ben-David, S. (eds) Computational Learning Theory. EuroCOLT 1997. Lecture Notes in Computer Science, vol 1208. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62685-9_18

  • DOI: https://doi.org/10.1007/3-540-62685-9_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-62685-5

  • Online ISBN: 978-3-540-68431-2

  • eBook Packages: Springer Book Archive
