TEST, Volume 23, Issue 3, pp 556–584

Bayesian model robustness via disparities

Original Paper

DOI: 10.1007/s11749-014-0360-z

Cite this article as:
Hooker, G. & Vidyashankar, A.N. TEST (2014) 23: 556. doi:10.1007/s11749-014-0360-z

Abstract

This paper develops a methodology for robust Bayesian inference through the use of disparities. Metrics such as the Hellinger distance and the negative exponential disparity have a long history in robust frequentist estimation. We demonstrate that an equivalent robustification may be achieved in Bayesian inference by substituting an appropriately scaled disparity for the log likelihood, after which standard Markov chain Monte Carlo (MCMC) methods apply. A particularly appealing property of minimum-disparity methods is that, while they yield robustness with a breakdown point of 1/2, the resulting parameter estimates remain efficient when the posited probabilistic model is correct. We demonstrate that a similar property holds for disparity-based Bayesian inference. We further show that, in the Bayesian setting, these methods extend to robustify regression models, random-effects distributions, and other hierarchical models. Such models require integrating out a random effect, a task that would otherwise be numerically challenging but is handled naturally within the MCMC scheme. The methods are demonstrated on real-world data.
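To make the substitution described in the abstract concrete, the following Python sketch replaces the log likelihood in a random-walk Metropolis sampler with a scaled squared Hellinger distance between a kernel density estimate of the data and a candidate parametric density. This is a minimal illustration, not the paper's exact construction: the scaling constant (-2n), the normal working model, the vague priors, the grid-based integration, and all function names here are assumptions made for the example.

import numpy as np
from scipy.stats import gaussian_kde, norm

def hellinger_sq(g_vals, f_vals, dx):
    # Squared Hellinger distance between two densities tabulated on a uniform grid.
    return 0.5 * np.sum((np.sqrt(g_vals) - np.sqrt(f_vals)) ** 2) * dx

def disparity_log_post(theta, n, grid, dx, gn_vals):
    # Log "disparity posterior": a scaled disparity stands in for the log likelihood.
    mu, log_sigma = theta
    f_vals = norm.pdf(grid, loc=mu, scale=np.exp(log_sigma))
    log_lik_sub = -2.0 * n * hellinger_sq(gn_vals, f_vals, dx)  # scaling assumed
    log_prior = norm.logpdf(mu, 0, 10) + norm.logpdf(log_sigma, 0, 10)  # vague priors (assumed)
    return log_lik_sub + log_prior

def disparity_metropolis(data, n_iter=5000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 400)
    dx = grid[1] - grid[0]
    gn_vals = gaussian_kde(data)(grid)  # kernel density estimate g_n of the data
    # Robust initialization: median and MAD-based scale.
    mad = np.median(np.abs(data - np.median(data))) * 1.4826
    theta = np.array([np.median(data), np.log(mad)])
    lp = disparity_log_post(theta, len(data), grid, dx, gn_vals)
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(2)
        lp_prop = disparity_log_post(prop, len(data), grid, dx, gn_vals)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

# N(0, 1) sample contaminated with gross outliers at 50: the disparity
# posterior concentrates near the uncontaminated location.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 50.0)])
draws = disparity_metropolis(data)
print(draws[2500:].mean(axis=0))  # posterior means of (mu, log sigma)

Because the disparity downweights the small outlying mass, the sampled location stays near 0 despite the contamination, which is the robustness-with-efficiency behaviour the abstract describes.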

Keywords

Deviance test · Kernel density · Hellinger distance · Negative exponential disparity · MCMC · Bayesian inference · Posterior · Outliers

Mathematics Subject Classification

62F35 

Supplementary material

Supplementary material 1: 11749_2014_360_MOESM1_ESM.pdf (293 KB)

Copyright information

© Sociedad de Estadística e Investigación Operativa 2014

Authors and Affiliations

  1. Department of Biological Statistics and Computational Biology, Cornell University, Ithaca, USA
  2. Department of Statistics, George Mason University, Fairfax, USA