Abstract
Motivated by the wide variety of data types encountered in practice, this paper discusses the application of PCA to exponential family distributions. Building on the probabilistic interpretation of PCA, we propose Laplace exponential family PCA (LePCA), a model based on the Laplace approximation, a technique widely used in the classification context. The proposed approach yields a more fully probabilistic solution than previous exponential family PCA models: the standard EM algorithm applies directly to LePCA, whereas only a degenerate form of EM is applicable to earlier exponential family PCA models. Under the Gaussian assumption, LePCA subsumes both probabilistic PCA and traditional PCA as special cases.
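As a point of reference for the Gaussian special case mentioned in the abstract, the following is a minimal sketch (not the paper's LePCA algorithm) of the closed-form maximum-likelihood solution of probabilistic PCA, which LePCA reduces to under the Gaussian assumption. The function name `ppca_ml` and the synthetic data are illustrative choices, not from the paper.

```python
import numpy as np

# Probabilistic PCA: x = W z + mu + eps, with z ~ N(0, I) and
# eps ~ N(0, sigma^2 I). The ML solution has a closed form in terms
# of the eigendecomposition of the sample covariance.
def ppca_ml(X, q):
    """Closed-form ML fit of probabilistic PCA with q latent dimensions."""
    n, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # sample covariance
    evals, evecs = np.linalg.eigh(S)              # ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]    # sort descending
    sigma2 = evals[q:].mean()                     # ML noise variance
    # Columns of W span the principal subspace, scaled by (lambda_i - sigma^2)^(1/2)
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return mu, W, sigma2

# Synthetic data with two dominant directions of variance (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.5])
mu, W, sigma2 = ppca_ml(X, q=2)
# The model covariance W W^T + sigma^2 I matches the sample covariance
# in the retained directions; its trace equals that of S exactly.
C = W @ W.T + sigma2 * np.eye(5)
```

As sigma2 tends to zero, the columns of W approach the scaled principal axes of classical PCA, which is the sense in which probabilistic PCA (and hence LePCA with a Gaussian assumption) absorbs traditional PCA.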
Acknowledgement
This research work is funded by the National Key Research and Development Project of China (2016YFB0801003).
© 2018 Springer International Publishing AG, part of Springer Nature
About this paper
Cite this paper
Li, F., Ren, X. (2018). Laplace Exponential Family PCA. In: Huang, DS., Bevilacqua, V., Premaratne, P., Gupta, P. (eds) Intelligent Computing Theories and Application. ICIC 2018. Lecture Notes in Computer Science(), vol 10954. Springer, Cham. https://doi.org/10.1007/978-3-319-95930-6_30
Print ISBN: 978-3-319-95929-0
Online ISBN: 978-3-319-95930-6