Abstract
In this paper, we present a kernel-trick-embedded Gaussian Mixture Model (GMM), called the kernel GMM. The basic idea is to embed the kernel trick into the EM algorithm and derive a parameter estimation algorithm for GMMs in feature space. The kernel GMM can be viewed as a Bayesian kernel method. Compared with most classical kernel methods, the proposed method solves problems within a probabilistic framework. Moreover, it handles nonlinear problems better than the traditional GMM. To avoid the high computational cost that most kernel methods incur on large-scale data sets, we also employ a Monte Carlo sampling technique to speed up the kernel GMM, making it more practical and efficient. Experimental results on synthetic and real-world data sets demonstrate that the proposed approach achieves satisfactory performance.
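The abstract combines two ideas: fitting a GMM by EM on kernel-induced features, and using Monte Carlo sampling to keep kernel computations tractable. The sketch below is an illustrative stand-in, not the authors' algorithm: it approximates the feature-space representation with an empirical kernel map against a randomly sampled subset of landmark points (in the spirit of the sampling speedup), then runs standard EM for a spherical-covariance GMM on those features. The RBF kernel choice, `gamma`, the landmark count `m`, and all function names are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X_i - Y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_features(X, m=20, gamma=1.0, seed=0):
    """Empirical kernel map against m randomly sampled landmark points.

    The random subsample stands in for the Monte Carlo speedup: each point
    is represented by its kernel values against the landmarks rather than
    against all n points, so the map costs O(nm) instead of O(n^2)."""
    rng = np.random.default_rng(seed)
    landmarks = X[rng.choice(len(X), size=min(m, len(X)), replace=False)]
    return rbf_kernel(X, landmarks, gamma)

def em_gmm(Z, k=2, iters=50, seed=0):
    """Standard EM for a spherical-covariance GMM on feature vectors Z."""
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    # Farthest-point initialisation of the means, for stability.
    idx = [int(rng.integers(n))]
    for _ in range(k - 1):
        d2min = ((Z[:, None, :] - Z[idx][None, :, :]) ** 2).sum(-1).min(axis=1)
        idx.append(int(np.argmax(d2min)))
    mu = Z[idx].copy()
    var = np.full(k, Z.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities under k spherical Gaussians, in log space.
        d2 = ((Z[:, None, :] - mu[None, :, :]) ** 2).sum(-1)  # shape (n, k)
        logp = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, means, and per-component variances.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ Z) / nk[:, None]
        d2 = ((Z[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = np.maximum((r * d2).sum(axis=0) / (d * nk), 1e-6)
    return r.argmax(axis=1)
```

On data with nonlinear cluster structure, EM on the kernel features can separate groups that a GMM in the input space cannot; the landmark map here is a crude cousin of more principled sampling-based approximations such as the Nyström method.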
© 2003 Springer-Verlag Berlin Heidelberg
Wang, J., Lee, J., Zhang, C. (2003). Kernel Trick Embedded Gaussian Mixture Model. In: Gavaldá, R., Jantke, K.P., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2003. Lecture Notes in Computer Science, vol 2842. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39624-6_14
Print ISBN: 978-3-540-20291-2
Online ISBN: 978-3-540-39624-6