Measuring Distances Between Variables by Mutual Information

  • Ralf Steuer
  • Carsten O. Daub
  • Joachim Selbig
  • Jürgen Kurths
Conference paper

DOI: 10.1007/3-540-26981-9_11

Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)
Cite this paper as:
Steuer R., Daub C.O., Selbig J., Kurths J. (2005) Measuring Distances Between Variables by Mutual Information. In: Baier D., Wernecke KD. (eds) Innovations in Classification, Data Science, and Information Systems. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg

Abstract

Information-theoretic concepts, such as mutual information, provide a general framework for detecting and evaluating dependencies between variables. In this work, we describe and review several aspects of mutual information as a measure of ‘distance’ between variables. After a brief overview of the mathematical background, including the recent generalization of mutual information in the sense of Tsallis, our emphasis is on the numerical estimation of these quantities from finite datasets. The described concepts are exemplified using large-scale gene expression data, and the results are compared to those obtained from other measures, such as the Pearson correlation.
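The paper itself is not reproduced on this page, but the idea summarized in the abstract can be illustrated with a minimal sketch: a naive histogram-based estimator of mutual information, applied to a nonlinear dependence that the Pearson correlation fails to detect. This is a generic illustration in Python/NumPy, not the authors' implementation; the choice of bin count and the test data are assumptions for the example.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Naive histogram (binning) estimate of I(X;Y) in nats.

    Joint probabilities are estimated from a 2D histogram; the
    marginals are obtained by summing over rows/columns. Note that
    this simple estimator is biased for finite samples, an issue the
    paper discusses in detail.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                    # joint distribution p(x, y)
    px = pxy.sum(axis=1)                # marginal p(x)
    py = pxy.sum(axis=0)                # marginal p(y)
    nz = pxy > 0                        # avoid log(0): 0 * log 0 := 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 5000)
y = x**2 + 0.05 * rng.normal(size=5000)   # nonlinear (quadratic) dependence

r = np.corrcoef(x, y)[0, 1]        # near zero: linear correlation misses it
mi = mutual_information(x, y)      # clearly positive: dependence is detected
```

Because the dependence is symmetric in x, the Pearson correlation is close to zero despite the strong functional relationship, while the mutual information remains clearly positive; this is the kind of contrast the abstract refers to when comparing the two measures on gene expression data.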


Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Ralf Steuer (1)
  • Carsten O. Daub (2)
  • Joachim Selbig (2)
  • Jürgen Kurths (1)
  1. Nonlinear Dynamics Group, University of Potsdam, Potsdam, Germany
  2. Max-Planck Institute for Molecular Plant Physiology, Golm, Germany
