
Clustering Gene Expression Data by Mutual Information with Gene Function

  • Conference paper
Artificial Neural Networks — ICANN 2001 (ICANN 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2130)


Abstract

We introduce a simple on-line algorithm for clustering paired samples of continuous and discrete data. The clusters are defined in the continuous data space, where they remain local, while within-cluster differences between the associated, implicitly estimated conditional distributions of the discrete variable are minimized. The discrete variable can be seen as an indicator of relevance or importance that guides the clustering. Minimizing the Kullback-Leibler divergence-based distortion criterion is equivalent to maximizing the mutual information between the generated clusters and the discrete variable. We apply the method to a time-series data set, yeast gene expression measured with DNA chips, with biological knowledge about the functions of the genes encoded into the discrete variable.
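The abstract does not spell out the on-line update rules. The following is a hypothetical toy sketch of the general idea only: soft clusters localized in the continuous space by Gaussian memberships, each paired with a multinomial estimate of the conditional distribution of the discrete variable, with stochastic updates that tend to increase the mutual information between cluster assignments and the discrete variable. The function name, the membership kernel, and the specific update rules are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def online_mi_clustering(X, c, n_clusters, n_classes, m0=None,
                         sigma=1.0, lr=0.05, n_epochs=20, seed=0):
    """Toy sketch (not the paper's exact method): prototypes m_j in the
    continuous space, each paired with a multinomial estimate psi_j of the
    conditional distribution p(c | cluster j). Stochastic updates shrink
    within-cluster KL divergence, i.e. grow I(cluster; c)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if m0 is None:
        m = X[rng.choice(n, n_clusters, replace=False)].copy()
    else:
        m = m0.astype(float).copy()
    psi = np.full((n_clusters, n_classes), 1.0 / n_classes)
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            x, ci = X[i], c[i]
            # Soft assignment: normalized Gaussian memberships keep clusters
            # local in the continuous data space.
            logits = -np.sum((m - x) ** 2, axis=1) / (2 * sigma ** 2)
            y = np.exp(logits - logits.max())
            y /= y.sum()
            # Heuristic stochastic step: move prototypes toward samples whose
            # class they predict well, away from samples they predict badly.
            g = np.log(psi[:, ci]) - y @ np.log(psi[:, ci])
            m += lr * (y * g)[:, None] * (x - m) / sigma ** 2
            # Running membership-weighted class frequencies estimate psi.
            psi *= 1.0 - lr * y[:, None]
            psi[:, ci] += lr * y
            psi /= psi.sum(axis=1, keepdims=True)
    return m, psi
```

On well-separated synthetic data, each learned conditional estimate psi_j concentrates on the class dominant among the samples nearest prototype j, which is the sense in which the clusters become informative about the discrete variable.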




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kaski, S., Sinkkonen, J., Nikkilä, J. (2001). Clustering Gene Expression Data by Mutual Information with Gene Function. In: Dorffner, G., Bischof, H., Hornik, K. (eds) Artificial Neural Networks — ICANN 2001. ICANN 2001. Lecture Notes in Computer Science, vol 2130. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44668-0_12


  • DOI: https://doi.org/10.1007/3-540-44668-0_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42486-4

  • Online ISBN: 978-3-540-44668-2

  • eBook Packages: Springer Book Archive
