
Determine the Optimal Parameter for Information Bottleneck Method

  • Conference paper
PRICAI 2006: Trends in Artificial Intelligence (PRICAI 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4099)


Abstract

A natural question in the Information Bottleneck method is how many “groups” are appropriate. This dependence on prior knowledge restricts the applicability of many Information Bottleneck algorithms. In this paper we aim to remove this dependence by formulating the choice of the parameter as a model selection problem and solving it with the minimum message length principle. Empirical results in a document clustering scenario indicate that the proposed method works well for determining the optimal parameter value for the Information Bottleneck method.
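To make the model-selection idea concrete, the following is a minimal sketch (not the paper's code) of how an MML-style two-part code length could pick the number of clusters during agglomerative Information Bottleneck merging. The function names, the exact scoring formula (BIC-like model cost plus a conditional-entropy data cost), and the assumed sample size are all illustrative assumptions.

```python
# Hedged sketch: agglomerative Information Bottleneck with an MML-style
# two-part code length for choosing the number of clusters k.
# All names and the exact scoring formula are illustrative assumptions.
import numpy as np

def js_merge_cost(p_y1, p_y2, w1, w2):
    """Information lost when merging two clusters (weighted JS divergence)."""
    pi1, pi2 = w1 / (w1 + w2), w2 / (w1 + w2)
    p_bar = pi1 * p_y1 + pi2 * p_y2

    def kl(p, q):
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))

    return (w1 + w2) * (pi1 * kl(p_y1, p_bar) + pi2 * kl(p_y2, p_bar))

def aib_with_mml(p_xy, n_samples=1000):
    """Greedy AIB merging; returns (best_k, message length per k)."""
    p_x = p_xy.sum(axis=1)
    # each cluster: (weight p(t), conditional distribution p(y|t))
    clusters = [(p_x[i], p_xy[i] / p_x[i]) for i in range(len(p_x))]
    scores = {}
    while True:
        k = len(clusters)
        # data cost: ~ n * H(Y|T), the code length of Y given the clustering
        data_cost = -n_samples * sum(
            w * np.sum(py * np.log(np.maximum(py, 1e-12)))
            for w, py in clusters)
        # model cost: grows with k (half a log n per free parameter)
        model_cost = 0.5 * k * p_xy.shape[1] * np.log(n_samples)
        scores[k] = model_cost + data_cost
        if k == 1:
            break
        # merge the pair of clusters that loses the least information
        best = min(((i, j) for i in range(k) for j in range(i + 1, k)),
                   key=lambda ij: js_merge_cost(
                       clusters[ij[0]][1], clusters[ij[1]][1],
                       clusters[ij[0]][0], clusters[ij[1]][0]))
        (w1, p1), (w2, p2) = clusters[best[0]], clusters[best[1]]
        merged = (w1 + w2, (w1 * p1 + w2 * p2) / (w1 + w2))
        clusters = [c for i, c in enumerate(clusters) if i not in best]
        clusters.append(merged)
    return min(scores, key=scores.get), scores
```

The trade-off is visible in the score: merging clusters lowers the model cost but raises the data cost once genuinely different clusters are fused, so the minimum of the total code length sits at a data-supported number of clusters rather than at either extreme.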





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, G., Liu, D., Ye, Y., Rong, J. (2006). Determine the Optimal Parameter for Information Bottleneck Method. In: Yang, Q., Webb, G. (eds) PRICAI 2006: Trends in Artificial Intelligence. PRICAI 2006. Lecture Notes in Computer Science, vol 4099. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36668-3_123

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-36668-3_123

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36667-6

  • Online ISBN: 978-3-540-36668-3

  • eBook Packages: Computer Science (R0)
