Measuring Distances Between Variables by Mutual Information


Abstract

Information-theoretic concepts, such as mutual information, provide a general framework for detecting and evaluating dependencies between variables. In this work, we describe and review several aspects of mutual information as a measure of ‘distance’ between variables. After a brief overview of the mathematical background, including its recent generalization in the sense of Tsallis, our emphasis is on the numerical estimation of these quantities from finite datasets. The described concepts are exemplified using large-scale gene expression data, and the results are compared to those obtained from other measures, such as the Pearson correlation.
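The abstract contrasts mutual information, I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x) p(y))], with the Pearson correlation as a dependency measure. The paper's own estimators are not reproduced here; as a minimal sketch of the idea, the example below computes a naive plug-in (equal-width histogram) estimate of mutual information from a finite sample and compares it with the Pearson correlation. The bin count, sample sizes, and variable names are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Plug-in estimate of I(X;Y) in nats from an equal-width 2D histogram.

    NOTE: a naive illustration only; this estimator is biased for small
    samples, and the bin count is an arbitrary assumed choice, not the
    estimation scheme described in the paper.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint probabilities p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_linear = 0.8 * x + 0.6 * rng.normal(size=5000)  # linear dependence
y_nonlin = x**2 + 0.1 * rng.normal(size=5000)     # nonlinear dependence

for label, y in [("linear", y_linear), ("nonlinear", y_nonlin)]:
    r = np.corrcoef(x, y)[0, 1]
    mi = mutual_information(x, y)
    print(f"{label}: Pearson r = {r:+.2f}, MI = {mi:.2f} nats")
```

For the nonlinear case (y ≈ x² with symmetric x), the Pearson correlation is close to zero while the mutual information estimate remains clearly positive, illustrating the abstract's point that mutual information can detect dependencies that correlation-based measures miss.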