Mutual Information Estimation with Random Forests

Conference paper · Neural Information Processing (ICONIP 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8835)


Abstract

We present a new method for estimating mutual information based on random forest classifiers. The method randomly permutes one of the two variables to create a dataset in which the variables are independent. We show that mutual information can then be estimated from the class probabilities of a probabilistic classifier trained to distinguish the original (dependent) data from the permuted (independent) data. The method inherits the robustness and flexibility of random forests and, unlike most other approaches to mutual information estimation, can handle mixtures of continuous and discrete data. We tested our method on a variety of data and found it to be accurate on medium-sized and large datasets but inaccurate on smaller ones. On the positive side, our method can estimate the mutual information between sets of both continuous and discrete variables and appears to be relatively insensitive to the addition of noise variables.
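
The estimator described in the abstract is an instance of the classifier-based density-ratio trick, and it can be sketched in a few lines. The following Python snippet is a minimal illustration of that idea, not the authors' code: the function name estimate_mi, its parameters, and the use of scikit-learn's RandomForestClassifier with out-of-bag probabilities are assumptions made for this example. With equal numbers of original and permuted samples, the classifier's odds P(dependent | x, y) / P(independent | x, y) approximate the density ratio p(x, y) / (p(x) p(y)), and averaging the log of these odds over the original sample estimates the mutual information in nats.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def estimate_mi(x, y, n_trees=500, seed=None):
    """Sketch of classifier-based MI estimation (result in nats).

    Illustrative assumption, not the paper's exact procedure.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)

    # Permuting y breaks the dependence, giving a sample from p(x)p(y).
    y_perm = y[rng.permutation(len(y))]

    # Label the original (dependent) data 1 and the permuted data 0.
    features = np.vstack([np.hstack([x, y]), np.hstack([x, y_perm])])
    labels = np.r_[np.ones(len(x)), np.zeros(len(x))]

    # Out-of-bag probabilities give held-out estimates without a split.
    clf = RandomForestClassifier(n_estimators=n_trees, oob_score=True,
                                 random_state=seed)
    clf.fit(features, labels)
    p = clf.oob_decision_function_[:len(x), 1]  # P(dependent | x, y)

    # Clip so the log-odds stay finite, then average over the joint sample:
    # I(X; Y) ~= E_{p(x,y)}[ log p(x,y) / (p(x) p(y)) ].
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return float(np.mean(np.log(p / (1 - p))))

# Correlated Gaussians with rho = 0.5; true MI = -0.5*log(1 - rho**2) ~ 0.14 nats.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.5 * x + np.sqrt(0.75) * rng.normal(size=5000)
print(estimate_mi(x, y, seed=0))
```

On this Gaussian example the true value is roughly 0.14 nats; shrinking the sample size illustrates the small-dataset inaccuracy the abstract reports.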

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Koeman, M., Heskes, T. (2014). Mutual Information Estimation with Random Forests. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_63

  • DOI: https://doi.org/10.1007/978-3-319-12640-1_63

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12639-5

  • Online ISBN: 978-3-319-12640-1

  • eBook Packages: Computer Science, Computer Science (R0)
