Laplace Approximation in High-Dimensional Bayesian Regression
We consider Bayesian variable selection in sparse high-dimensional regression, where the number of covariates p may be large relative to the sample size n, but at most a moderate number q of covariates are active. Specifically, we work in the setting of generalized linear models. For a single fixed sparse model with a well-behaved prior distribution, classical theory shows that the Laplace approximation to the marginal likelihood of the model is accurate for sufficiently large sample size n. We extend this theory by giving results on the uniform accuracy of the Laplace approximation across all models in a high-dimensional scenario in which p and q, and thus also the number of models considered, may increase with n. Moreover, we show how this connection between the marginal likelihood and its Laplace approximation can be used to obtain consistency results for Bayesian approaches to variable selection in high-dimensional regression.
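To make the object of study concrete, the following sketch computes the Laplace approximation to the log marginal likelihood for a single sparse logistic-regression model (a generalized linear model) with a Gaussian prior. The data, the prior variance tau2, and all variable names are illustrative assumptions, not part of the paper; the approximation itself is the standard one: plug the posterior mode into the unnormalized log posterior and correct with the log-determinant of the Hessian at the mode.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data for one fixed sparse model (illustrative assumption).
rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
beta_true = np.array([1.0, -0.5, 0.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

tau2 = 1.0  # prior variance of the N(0, tau2 I) prior (assumption)

def neg_log_post(beta):
    """Negative unnormalized log posterior: logistic log-likelihood + Gaussian log-prior."""
    eta = X @ beta
    ll = y @ eta - np.logaddexp(0.0, eta).sum()
    lp = -0.5 * beta @ beta / tau2 - 0.5 * d * np.log(2 * np.pi * tau2)
    return -(ll + lp)

# Posterior mode (MAP estimate) via quasi-Newton optimization.
res = minimize(neg_log_post, np.zeros(d), method="BFGS")
beta_hat = res.x

# Analytic Hessian of the negative log posterior at the mode:
# X' W X + (1/tau2) I, with W = diag(p_i (1 - p_i)).
p = 1.0 / (1.0 + np.exp(-X @ beta_hat))
H = X.T @ (X * (p * (1.0 - p))[:, None]) + np.eye(d) / tau2

# Laplace approximation:
# log m(y) ≈ log p(y | beta_hat) + log p(beta_hat) + (d/2) log 2π - (1/2) log|H|
sign, logdet = np.linalg.slogdet(H)
log_marglik = -res.fun + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet
print("Laplace log marginal likelihood:", log_marglik)
```

The paper's question is how accurate this quantity remains, uniformly over all candidate sparse models, when d (the active set size q) and the total number of covariates p grow with n.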