Gaussian Mixture Modeling with Gaussian Process Latent Variable Models

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6376)


Abstract

Density modeling is notoriously difficult for high dimensional data. One approach to the problem is to search for a lower dimensional manifold which captures the main characteristics of the data. Recently, the Gaussian Process Latent Variable Model (GPLVM) has successfully been used to find low dimensional manifolds in a variety of complex data. The GPLVM consists of a set of points in a low dimensional latent space, and a stochastic map to the observed space. We show how it can be interpreted as a density model in the observed space. However, the GPLVM is not trained as a density model and therefore yields poor density estimates. We propose a new training strategy and obtain improved generalisation performance and better density estimates in comparative evaluations on several benchmark data sets.
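The interpretation described above — reading the GPLVM as a density model in the observed space — can be sketched as a uniform mixture of the GP predictive Gaussians at the latent points. The following is a minimal NumPy illustration, not the authors' method: the RBF kernel, its fixed hyperparameters, and the isotropic predictive covariance per component are all simplifying assumptions made here for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gplvm_mixture_logpdf(y_star, X_latent, Y, noise=0.1):
    """Log density of y_star when the GPLVM is read as a Gaussian mixture:
    the GP predictive distribution at each latent point supplies one
    (here: isotropic) Gaussian component in observed space, mixed uniformly."""
    N, D = Y.shape
    K = rbf_kernel(X_latent, X_latent) + noise * np.eye(N)
    K_inv = np.linalg.inv(K)
    k_star = rbf_kernel(X_latent, X_latent)      # evaluate at the latent centres
    mu = k_star @ K_inv @ Y                      # (N, D) component means
    # Predictive variance at each latent centre, shared across output dims.
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, K_inv, k_star) + noise
    var = np.maximum(var, 1e-9)                  # guard against round-off
    sq = ((y_star[None, :] - mu) ** 2).sum(-1)   # squared distance to each mean
    log_comp = -0.5 * D * np.log(2 * np.pi * var) - 0.5 * sq / var
    m = log_comp.max()                           # log-sum-exp, uniform weights 1/N
    return m + np.log(np.exp(log_comp - m).sum()) - np.log(N)
```

Note that in this sketch the latent points are taken as given; the paper's contribution is precisely a training strategy that makes the resulting mixture a good density estimate, which this illustration does not attempt.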


Keywords: Gaussian Mixture Modeling · Mixture Component · Kernel Density Estimation · Latent Variable Model · Latent Centre





Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  1. MPI for Biological Cybernetics, Tübingen, Germany
  2. Department of Engineering, University of Cambridge, UK
