
Automatic Model Selection for Probabilistic PCA

  • Ezequiel López-Rubio
  • Juan Miguel Ortiz-de-Lazcano-Lobato
  • Domingo López-Rodríguez
  • María del Carmen Vargas-González
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4507)

Abstract

The Mixture of Probabilistic Principal Components Analyzers (MPPCA) is a multivariate analysis technique that defines a Gaussian probabilistic model at each unit. In the original approach, the number of units and the number of principal directions in each unit are not learned. Variational Bayesian approaches have been proposed for this purpose, but they rely on assumptions about the input distribution and/or approximations of certain statistics. Here we present a different way to solve this problem, in which cross-validation guides the search for an optimal model. This makes it possible to learn the model architecture without any assumptions beyond those of the basic PPCA framework. Experimental results are presented that show the probability density estimation capabilities of the proposal on high-dimensional data.
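As an illustration of this kind of cross-validation-guided model selection (a minimal sketch, not the authors' implementation, and covering only a single PPCA analyzer rather than a full mixture), the snippet below chooses the number of principal directions by held-out log-likelihood. It relies on scikit-learn's `PCA`, whose `score` method returns the average log-likelihood under the probabilistic PCA model of Tipping and Bishop; the synthetic data set is an assumption made for the example.

```python
# Sketch: pick PPCA dimensionality by cross-validated log-likelihood.
# Not the paper's algorithm; single analyzer only, synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic data: 3 latent dimensions embedded in 10, plus small noise.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 10))
X = latent @ mixing + 0.1 * rng.normal(size=(500, 10))

# PCA.score returns the average PPCA log-likelihood, so cross-validation
# can rank candidate dimensionalities without extra distributional
# assumptions beyond the PPCA model itself.
scores = {
    q: cross_val_score(PCA(n_components=q), X, cv=5).mean()
    for q in range(1, 10)
}
best_q = max(scores, key=scores.get)
print(best_q)  # the 3-dimensional model should score highest here
```

The paper's proposal additionally searches over the number of mixture units; the same held-out likelihood criterion extends to that case by scoring each candidate architecture on validation folds.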

Keywords

Probabilistic Principal Components Analysis (PPCA), dimensionality reduction, cross-validation, handwritten digit recognition



Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Ezequiel López-Rubio (1)
  • Juan Miguel Ortiz-de-Lazcano-Lobato (1)
  • Domingo López-Rodríguez (1)
  • María del Carmen Vargas-González (1)

  1. School of Computer Engineering, University of Málaga, Campus de Teatinos, s/n, 29071 Málaga, Spain
