Abstract
Clustering by maximizing the dependency between two paired, continuous-valued multivariate data sets is studied. The new method, associative clustering (AC), maximizes a Bayes factor between two clustering models that differ in only one respect: whether the clusterings of the two data sets are dependent or independent. The model both extends Information Bottleneck (IB)-type dependency modeling to continuous-valued data and gives it a well-founded, asymptotically well-behaving criterion for small data sets: with suitable prior assumptions the Bayes factor becomes equivalent to the hypergeometric probability of a contingency table, while for large data sets it becomes the standard mutual information. An optimization algorithm is introduced, with empirical comparisons to a combination of IB and K-means, and to plain K-means. Two case studies cluster genes 1) to find dependencies between gene expression and transcription factor binding, and 2) to find dependencies between expression in different organisms.
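The two limiting forms of the criterion mentioned in the abstract can be evaluated directly for a contingency table of co-occurrence counts. The following is a minimal sketch (in Python with NumPy/SciPy, which are assumptions, not tools named by the paper) that computes the hypergeometric probability of a table with fixed margins and the table's empirical mutual information; it only illustrates these two quantities and does not implement the AC optimization or the Bayes factor itself. The function names and the toy counts are hypothetical.

```python
import numpy as np
from scipy.special import gammaln

def log_hypergeometric_prob(table):
    """Log-probability of a contingency table with fixed row/column margins:
    P = (prod_i R_i!)(prod_j C_j!) / (N! * prod_ij n_ij!),
    the small-sample quantity the abstract relates to the Bayes factor."""
    table = np.asarray(table, dtype=float)
    row_sums = table.sum(axis=1)
    col_sums = table.sum(axis=0)
    n = table.sum()
    return (gammaln(row_sums + 1).sum()
            + gammaln(col_sums + 1).sum()
            - gammaln(n + 1)
            - gammaln(table + 1).sum())

def mutual_information(table):
    """Empirical mutual information (in nats) of the same table,
    the large-sample limit mentioned in the abstract."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p = table / n
    pr = p.sum(axis=1, keepdims=True)   # row marginals
    pc = p.sum(axis=0, keepdims=True)   # column marginals
    mask = p > 0
    return (p[mask] * np.log(p[mask] / (pr @ pc)[mask])).sum()

# Toy example: counts of paired samples assigned to cluster pair (i, j)
counts = [[30, 5, 2],
          [4, 25, 6],
          [1, 7, 20]]
print(log_hypergeometric_prob(counts))
print(mutual_information(counts))
```

A near-diagonal table like the toy example above is highly dependent; a table whose cells match the product of its margins is independent, and the two quantities printed separate these cases in opposite directions.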
Keywords
- Mutual Information
- Contingency Table
- Voronoi Region
- Probabilistic Latent Semantic Analysis
- Information Bottleneck
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Sinkkonen, J., Nikkilä, J., Lahti, L., Kaski, S. (2004). Associative Clustering. In: Boulicaut, J.-F., Esposito, F., Giannotti, F., Pedreschi, D. (eds) Machine Learning: ECML 2004. Lecture Notes in Computer Science, vol 3201. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30115-8_37
DOI: https://doi.org/10.1007/978-3-540-30115-8_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23105-9
Online ISBN: 978-3-540-30115-8
eBook Packages: Springer Book Archive