Bayesian Block-Diagonal Predictive Classifier for Gaussian Data

  • Jukka Corander
  • Timo Koski
  • Tatjana Pavlenko
  • Annika Tillander
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 190)


The paper presents a method for constructing a Bayesian predictive classifier in a high-dimensional setting. Given that the classes are represented by Gaussian distributions with a block-structured covariance matrix, a closed-form expression for the posterior predictive distribution of the data is established. Because this distribution factorizes over the blocks, the resulting Bayesian predictive and marginal classifiers provide an efficient solution to the high-dimensional problem by splitting it into smaller, tractable subproblems. In a simulation study we show that the suggested classifier outperforms several alternative algorithms, such as linear discriminant analysis based on block-wise inverse covariance estimators and shrunken centroids regularized discriminant analysis.
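The factorization described above means that each class score decomposes into a sum of per-block log-densities plus a class log-prior, so each block can be handled independently. The following is a minimal plug-in sketch of that idea in NumPy; it is an illustration of the block-wise factorization only, not the paper's Bayesian rule, which integrates out the block parameters to obtain the posterior predictive distribution. All names (`block_log_likelihood`, `classify`, the `blocks`/`params` layout) are hypothetical.

```python
import numpy as np

def block_log_likelihood(x, mean, cov):
    """Gaussian log-density of one feature block (plug-in parameters)."""
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet + quad)

def classify(x, blocks, params, log_priors):
    """Assign x to the class maximizing log prior + sum of block log-densities.

    blocks    : list of index arrays partitioning the feature vector
    params[c] : list of (mean, cov) pairs, one per block, for class c
    The sum over blocks reflects the factorization of the (predictive)
    distribution under the block-diagonal covariance structure.
    """
    scores = []
    for c, log_prior in enumerate(log_priors):
        s = log_prior
        for b, idx in enumerate(blocks):
            mean, cov = params[c][b]
            s += block_log_likelihood(x[idx], mean, cov)
        scores.append(s)
    return int(np.argmax(scores))
```

Replacing each block's plug-in Gaussian density with the corresponding closed-form posterior predictive density would give the Bayesian version; the outer sum over blocks stays the same, which is what makes the high-dimensional problem tractable.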


Keywords: covariance estimators, discriminant analysis, high-dimensional data, hyperparameters




  1. Bernardo, J.M., Smith, A.F.: Bayesian Theory. John Wiley & Sons, New York (1995)
  2. Bühlmann, P., van de Geer, S.: Statistics for High-Dimensional Data. Springer, New York (2011)
  3. Dawid, A.P., Lauritzen, S.L.: Hyper Markov laws in the statistical analysis of decomposable graphical models. Ann. Stat. 21, 1272–1317 (1993)
  4. Dawid, A.P.: Some matrix-variate distribution theory: notational considerations and a Bayesian application. Biometrika 68, 265–274 (1981)
  5. Geisser, S.: Predictive Discrimination. In: Krishnaiah, P.R. (ed.) Multivariate Analysis. Academic Press, New York (1966)
  6. Geisser, S.: Predictive Inference: An Introduction. Chapman and Hall, London (1993)
  7. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. Springer, New York (2009)
  8. Kollo, T., von Rosen, D.: Advanced Multivariate Statistics with Matrices. Springer, New York (2005)
  9. Kuss, M., Rasmussen, C.E.: Assessing approximate inference for binary Gaussian process classification. J. Mach. Learn. Res. 6, 1679–1704 (2005)
  10. Corander, J., Cui, Y., Koski, T., Sirén, J.: Predictive Gaussian Classifiers (under revision)
  11. Pavlenko, T., Björkström, A., Tillander, A.: Covariance Structure Approximation via gLasso in High-Dimensional Supervised Classification. J. Appl. Stat. (in press, 2012)
  12. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge (1996)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Jukka Corander (2)
  • Timo Koski (1)
  • Tatjana Pavlenko (1)
  • Annika Tillander (3)
  1. KTH Royal Institute of Technology, Stockholm, Sweden
  2. University of Helsinki, Helsinki, Finland
  3. Karolinska Institutet, Stockholm, Sweden
