Abstract
In this chapter, we present a procedure for clustering (unsupervised learning) data generated from a model based on mixtures of independent component analyzers. Clustering techniques have been studied extensively in many fields for a long time, and they can be organized in different ways according to several theoretical criteria. A rough but widely accepted classification divides them into hierarchical and partitional clustering; see, for instance, the references below. Both categories provide a division of the data objects. The hierarchical approach additionally yields a hierarchical structure from a sequence of partitions, performed either from singleton clusters up to a single cluster containing all data objects (agglomerative or bottom-up strategy) or vice versa (divisive or top-down strategy). This structure is a binary tree (dendrogram) whose leaves are the data objects and whose internal nodes represent nested clusters of various sizes. The root node of the dendrogram represents the whole data set, the internal nodes describe the degree to which objects are proximal to one another, and the height of the dendrogram usually represents the distance between each pair of objects or clusters, or between an object and a cluster.
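The agglomerative (bottom-up) strategy described above can be sketched in a few lines: starting from singleton clusters, repeatedly merge the two closest clusters and record the distance at which each merge occurs, which is exactly the height a dendrogram plots. The following is a minimal illustrative sketch using single linkage on hypothetical 1-D toy data, not the chapter's ICA-mixture procedure.

```python
# Minimal sketch of agglomerative (bottom-up) hierarchical clustering
# with single linkage. The data and function names are hypothetical,
# chosen only to illustrate the merge sequence behind a dendrogram.

def single_linkage(points):
    """Merge the two closest clusters until one remains.

    Returns the merge history as (cluster_a, cluster_b, height),
    where height is the inter-cluster distance at the merge --
    the quantity a dendrogram shows on its vertical axis.
    """
    # Start from singleton clusters (the leaves of the dendrogram).
    clusters = [(i,) for i in range(len(points))]

    def dist(a, b):
        return abs(points[a] - points[b])

    def link(ca, cb):
        # Single linkage: cluster distance is the minimum distance
        # over all cross-cluster point pairs.
        return min(dist(a, b) for a in ca for b in cb)

    merges = []
    while len(clusters) > 1:
        # Find the closest pair of clusters.
        pairs = [(link(ca, cb), i, j)
                 for i, ca in enumerate(clusters)
                 for j, cb in enumerate(clusters) if i < j]
        h, i, j = min(pairs)
        merges.append((clusters[i], clusters[j], h))
        clusters = ([c for k, c in enumerate(clusters) if k not in (i, j)]
                    + [clusters[i] + clusters[j]])
    return merges

# Toy 1-D data: two tight groups, {0.0, 0.1} and {5.0, 5.2}.
history = single_linkage([0.0, 0.1, 5.0, 5.2])
for a, b, h in history:
    print(a, b, round(h, 2))
```

On this toy data the two tight pairs are merged first at small heights, and the final merge joining the two groups occurs at a much larger height; cutting the tree between those heights recovers the two groups as a flat partition.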
References
B. Everitt, S. Landau, M. Leese, Cluster Analysis, 4th edn. (Arnold, London, 2001)
R. Xu, D. Wunsch, Survey of clustering algorithms. IEEE Trans. Neural Netw. 16(3), 645–678 (2005)
D.T. Pham, A.A. Afify, Clustering techniques and their applications in engineering. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 221(11), 1445–1459 (2007)
G. Lance, W. Williams, A general theory of classificatory sorting strategies. 1. Hierarchical systems. Comput. J. 9(4), 373–380 (1967)
A.K. Jain, M.N. Murty, P.J. Flynn, Data clustering: a review. ACM Comput. Surv. 31(3), 264–323 (1999)
C. Williams, An MCMC approach to hierarchical mixture modelling. Int. Conf. Neural Inf. Process. Sys. NIPS 13, 680–686 (1999)
R.M. Neal, Density modeling and clustering using Dirichlet diffusion trees. Bayesian Stat. 7, 619–629 (2003)
C. Kemp, T.L. Griffiths, S. Stromsten, J.B. Tenenbaum, Semi-supervised learning with trees, Int. Conf. Neural Inf. Process. Sys. NIPS 17 (2003)
N. Vasconcelos, A. Lippman, Learning mixture hierarchies. Int. Conf. Neural Inf. Process. Sys. NIPS 12, 606–612 (1998)
A. Stolcke, S. Omohundro, Hidden Markov model induction by Bayesian model merging. Int. Conf. Neural Inf. Process. Sys. 6, 11–18 (1992)
J.D. Banfield, A.E. Raftery, Model-based Gaussian and non-Gaussian clustering. Biometrics 49, 803–821 (1993)
S. Vaithyanathan, B. Dom, Model-based hierarchical clustering. Uncertain Artif. Intell. 16, 599–608 (2000)
E. Segal, D. Koller, D. Ormoneit, Probabilistic abstraction hierarchies. Int. Conf. Neural Inf. Process. Sys. NIPS 15, 913–920 (2001)
M.F. Ramoni, P. Sebastiani, I.S. Kohane, Cluster analysis of gene expression dynamics. Proc. Natl. Acad. Sci. USA 99, 9121–9126 (2002)
N. Friedman, Pcluster: probabilistic agglomerative clustering of gene expression profiles. Technical report, vol 80 (Hebrew University, 2003)
K.A. Heller, Z. Ghahramani, Bayesian hierarchical clustering, ACM International Conference Proceeding Series. Proceedings of the 22nd international conference on Machine learning, vol 119 (Bonn, Germany, 2005), pp 297–304
C.M. Bishop, M.E. Tipping, A hierarchical latent variable model for data visualization. IEEE Trans. Pattern Anal. Mach. Intell. 20(3), 281–293 (1998)
M.E. Tipping, C.M. Bishop, Probabilistic principal component analysis. J. R. Stat. Soc. Series B 61(3), 611–622 (1999)
M.E. Tipping, C.M. Bishop, Mixtures of probabilistic principal component analyzers, Neural Comput. 11(2), 443–482 (1999)
H.J. Park, T.W. Lee, Capturing nonlinear dependencies in natural images using ICA and mixture of Laplacian distribution. Neurocomputing 69, 1513–1528 (2006)
D.J. Mackay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, Cambridge, 2004)
F.R. Bach, M.I. Jordan, Beyond independent components: trees and clusters. J. Mach. Learn. Res. 3, 1205–1233 (2003)
A. Hyvärinen, P.O. Hoyer, M. Inki, Topographic independent component analysis. Neural Comput. 13(7), 1527–1558 (2001)
R.S. Raghavan, A method for estimating parameters of K-distributed clutter, IEEE Trans. Aerosp. Electron. Sys. 27(2), 268–275 (1991)
J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms (Plenum Press, New York, 1981)
A.J. Bell, T.J. Sejnowski, The “independent components” of natural scenes are edge filters. Vis. Res. 37(23), 3327–3338 (1997)
J.H. van Hateren, A. van der Schaaf, Independent component filters of natural images compared with simple cells in primary visual cortex. Proc. R. Soc. Lond. B 265, 359–366 (1998)
Y. Matsuda, K. Yamaguchi, Linear multilayer ICA generating hierarchical edge detectors. Neural Comput. 19(1), 218–230 (2007)
T.W. Lee, M.S. Lewicki, T.J. Sejnowski, ICA mixture models for unsupervised classification of non-Gaussian classes and automatic context switching in blind signal separation. IEEE Trans. Pattern Anal. Mach. Intell. 22(10), 1078–1089 (2000)
T.S. Lee, D. Mumford, Hierarchical Bayesian inference in the visual cortex. J. Opt. Soc. Am. A 20(7), 1434–1448 (2003)
S.A. Nene, S.K. Nayar, H. Murase, Columbia Object Image Library (COIL-100), Technical report CUCS-006-96, February (1996)
A. Hyvärinen, J. Karhunen, E. Oja, Independent Component Analysis (Wiley, New York, 2001)
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Salazar, A. (2013). Hierarchical Clustering from ICA Mixtures. In: On Statistical Pattern Recognition in Independent Component Analysis Mixture Modelling. Springer Theses, vol 4. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30752-2_4
Print ISBN: 978-3-642-30751-5
Online ISBN: 978-3-642-30752-2