Measuring Distances Between Variables by Mutual Information
- Cite this paper as:
- Steuer R., Daub C.O., Selbig J., Kurths J. (2005) Measuring Distances Between Variables by Mutual Information. In: Baier D., Wernecke KD. (eds) Innovations in Classification, Data Science, and Information Systems. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg
Information-theoretic concepts, such as the mutual information, provide a general framework for detecting and evaluating dependencies between variables. In this work, we describe and review several aspects of the mutual information as a measure of 'distance' between variables. After a brief overview of the mathematical background, including its recent generalization in the sense of Tsallis, our emphasis is on the numerical estimation of these quantities from finite datasets. The described concepts are exemplified using large-scale gene expression data, and the results are compared to those obtained from other measures, such as the Pearson correlation.
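To illustrate the kind of estimation the abstract refers to, the following is a minimal sketch (not the authors' estimator) of a naive histogram-based ("plug-in") estimate of the mutual information from a finite sample, contrasted with the Pearson correlation on a nonlinear dependence that the correlation coefficient cannot detect. The bin count and sample sizes are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Naive histogram-based plug-in estimate of I(X;Y) in nats.

    Note: this estimator is biased upward for finite samples, which is
    one of the issues discussed when estimating MI from real data.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint probabilities p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, bins)
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_lin = 2 * x + rng.normal(size=5000)        # linear dependence
y_sq = x**2 + 0.1 * rng.normal(size=5000)    # nonlinear dependence
y_ind = rng.normal(size=5000)                # independent of x

# Pearson correlation is near zero for y_sq despite the strong
# dependence; the mutual information remains clearly nonzero.
print("linear:    r =", np.corrcoef(x, y_lin)[0, 1],
      " MI =", mutual_information(x, y_lin))
print("nonlinear: r =", np.corrcoef(x, y_sq)[0, 1],
      " MI =", mutual_information(x, y_sq))
print("indep.:    r =", np.corrcoef(x, y_ind)[0, 1],
      " MI =", mutual_information(x, y_ind))
```

The independent pair still yields a small positive MI estimate, which reflects the finite-sample bias of the plug-in estimator; correcting or assessing this bias is a central concern when applying MI to datasets such as gene expression measurements.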