Generative Topographic Mapping (GTM) is a non-linear latent variable model that provides simultaneous visualization and clustering of high-dimensional data. It was originally formulated as a constrained mixture of distributions whose adaptive parameters were determined by Maximum Likelihood (ML) using the Expectation-Maximization (EM) algorithm. In this paper, we define an alternative variational formulation of GTM that provides a full Bayesian treatment of a Gaussian Process (GP)-based variant of GTM. The performance of the proposed Variational GTM is assessed in several experiments with artificial datasets. These experiments highlight the capability of Variational GTM to avoid data overfitting through active regularization.
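To make the ML/EM formulation referenced above concrete, the following is a minimal sketch of standard GTM training on toy data. All sizes, variable names, and the simple diagonal M-step regularizer are illustrative assumptions, not the authors' implementation, and the Bayesian/variational treatment proposed in the paper is not included here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3-D points scattered around a 1-D curve (illustrative, not from the paper)
t = rng.uniform(-1, 1, size=(200, 1))
X = np.hstack([t, t**2, np.sin(3 * t)]) + 0.05 * rng.normal(size=(200, 3))
N, D = X.shape

# Latent grid and RBF basis; K, M, sigma are arbitrary illustrative choices
K, M = 20, 6
Z = np.linspace(-1, 1, K)[:, None]                  # K x 1 latent grid points
mu = np.linspace(-1, 1, M)[:, None]                 # M x 1 basis centres
sigma = 2.0 / (M - 1)
Phi = np.exp(-((Z - mu.T) ** 2) / (2 * sigma**2))   # K x M design matrix

W = 0.1 * rng.normal(size=(M, D))                   # mapping weights
beta = 10.0                                         # inverse noise variance

for _ in range(30):                                 # EM iterations
    Y = Phi @ W                                     # K x D mixture centres on the manifold
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # N x K squared distances
    # E-step: responsibilities of each latent point for each data point
    logR = -0.5 * beta * d2
    logR -= logR.max(axis=1, keepdims=True)         # stabilize the exponentials
    R = np.exp(logR)
    R /= R.sum(axis=1, keepdims=True)
    # M-step: weighted least squares for W (small ridge term for stability), then beta
    G = np.diag(R.sum(axis=0))
    W = np.linalg.solve(Phi.T @ G @ Phi + 1e-6 * np.eye(M), Phi.T @ R.T @ X)
    beta = X.size / (R * d2).sum()
```

After training, each data point can be visualized at the responsibility-weighted mean of the latent grid, `R @ Z`, which is what gives GTM its simultaneous projection-and-clustering character.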


Keywords: Latent Space · Marginal Likelihood · Adaptive Parameter · Manifold Learning · Generative Topographic Mapping





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Iván Olier ¹
  • Alfredo Vellido ¹
  1. Department of Computing Languages and Systems (LSI), Technical University of Catalonia (UPC), C/. Jordi Girona 1-3, Edifici Omega, Despatx S106, 08034 Barcelona, Spain
