Abstract
We present a new method for estimating mutual information based on random forest classifiers. The method randomly permutes one of the two variables to create data in which the variables are independent. We show that mutual information can be estimated from the class probabilities of a probabilistic classifier trained to distinguish the independent data from the dependent data. The method inherits the robustness and flexibility of random forests and, unlike most other approaches to estimating mutual information, can handle mixtures of continuous and discrete data. We tested our method on a variety of data and found it to be accurate on medium and large datasets but inaccurate on smaller ones. On the positive side, our method can estimate the mutual information between sets of both continuous and discrete variables and appears relatively insensitive to the addition of noise variables.
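The permute-then-classify idea described above can be illustrated with a minimal sketch. This is not the authors' exact estimator; it is one plausible instantiation of the general scheme, assuming scikit-learn's `RandomForestClassifier` as the probabilistic classifier. With balanced classes, the classifier's posterior odds approximate the density ratio p(x, y) / (p(x) p(y)), whose log, averaged over the dependent samples, is the mutual information.

```python
# Sketch of classifier-based mutual information estimation
# (an illustrative reconstruction, not the paper's exact method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def mi_estimate(x, y, n_estimators=500, random_state=0):
    """Estimate I(X; Y) in nats from paired samples x, y (2-D arrays, n rows)."""
    rng = np.random.default_rng(random_state)
    # Permute y to break the dependence: these pairs follow p(x) p(y).
    y_perm = rng.permutation(y)
    # Class 1: original (dependent) pairs; class 0: permuted (independent) pairs.
    X = np.vstack([np.column_stack([x, y]), np.column_stack([x, y_perm])])
    labels = np.r_[np.ones(len(x)), np.zeros(len(x))]
    clf = RandomForestClassifier(n_estimators=n_estimators,
                                 oob_score=True, random_state=random_state)
    clf.fit(X, labels)
    # Out-of-bag class probabilities avoid overly confident in-sample estimates.
    p = clf.oob_decision_function_[: len(x), 1]
    p = np.clip(p, 1e-6, 1 - 1e-6)
    # With balanced classes, p / (1 - p) estimates p(x, y) / (p(x) p(y)).
    return float(np.mean(np.log(p / (1 - p))))
```

For strongly dependent data (e.g. `y = x + small noise`) the estimate should be clearly positive, while for independent samples it should sit near zero; as the abstract notes, accuracy degrades on small datasets.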
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Koeman, M., Heskes, T. (2014). Mutual Information Estimation with Random Forests. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_63
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12639-5
Online ISBN: 978-3-319-12640-1