Parsimonious Classification Via Generalized Linear Mixed Models
We devise a classification algorithm based on generalized linear mixed model (GLMM) technology. The algorithm incorporates spline smoothing, additive model-type structures and model selection. For reasons of speed we employ the Laplace approximation, rather than Monte Carlo methods. Tests on real and simulated data show the algorithm to have good classification performance. Moreover, the resulting classifiers are generally interpretable and parsimonious.
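The abstract's core idea, penalized-spline additive models for binary classification, can be illustrated in miniature. The sketch below is not the paper's algorithm (which fits a GLMM via the Laplace approximation and performs model selection); it is a simplified analogue in which the spline knot coefficients receive a ridge penalty, mirroring their treatment as random effects with a variance component in the GLMM formulation. All names (`spline_basis`, `fit_penalized_logistic`, `lam`) and the simulated data are illustrative assumptions, assuming NumPy is available.

```python
import numpy as np

def spline_basis(x, knots):
    """Truncated-line spline basis: [1, x, (x - k)_+ for each knot k]."""
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

def fit_penalized_logistic(X, y, lam, n_iter=100, tol=1e-8):
    """Logistic regression by penalized IRLS (Newton), with a ridge penalty
    on the spline coefficients only -- a simplified analogue of treating
    the knot coefficients as random effects in a GLMM."""
    p = X.shape[1]
    D = np.eye(p)
    D[0, 0] = D[1, 1] = 0.0           # leave intercept and slope unpenalized
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = np.clip(X @ beta, -30, 30)
        mu = 1.0 / (1.0 + np.exp(-eta))       # fitted probabilities
        W = mu * (1.0 - mu)                   # IRLS weights
        H = X.T @ (X * W[:, None]) + lam * D  # penalized Hessian
        g = X.T @ (y - mu) - lam * (D @ beta) # penalized score
        step = np.linalg.solve(H, g)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Simulated one-feature example: class probability varies nonlinearly in x.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 400)
p_true = 1.0 / (1.0 + np.exp(-3.0 * np.sin(x)))
y = (rng.uniform(size=400) < p_true).astype(float)

knots = np.quantile(x, np.linspace(0.1, 0.9, 8))
X = spline_basis(x, knots)
beta = fit_penalized_logistic(X, y, lam=1.0)
acc = np.mean((X @ beta > 0) == (y == 1))     # training accuracy
```

In the full method the penalty weight `lam` is not hand-picked but corresponds to a variance component estimated from the data, which is what allows the procedure to smooth and select features automatically.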
Keywords: Akaike Information Criterion; Feature selection; Generalized additive models; Penalized splines; Supervised learning; Model selection; Rao statistics; Variance components