A simulated annealing approach to approximate Bayes computations
Approximate Bayes computations (ABC) are used for parameter inference when the likelihood function of the model is expensive to evaluate but relatively cheap to sample from. In particle ABC, an ensemble of particles in the product space of model outputs and parameters is propagated in such a way that its output marginal approaches a delta function at the data and its parameter marginal approaches the posterior distribution. Inspired by simulated annealing, we present a new class of particle algorithms for ABC, based on a sequence of Metropolis kernels associated with a decreasing sequence of tolerances with respect to the data. Unlike other ABC algorithms, this class is not based on importance sampling and hence does not suffer from a loss of effective sample size due to re-sampling. We prove convergence under a condition on the speed at which the tolerance is decreased. Furthermore, we present a scheme that adapts the tolerance and the jump distribution in parameter space according to some mean fields of the ensemble, which preserves the statistical independence of the particles in the limit of infinite sample size. This adaptive scheme aims at converging as close as possible to the correct result with as few system updates as possible, by minimizing the entropy production of the process. The performance of this new class of algorithms is compared against two other recent algorithms on two toy examples as well as on a real-world example from genetics.
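The core idea of an annealed particle ABC update can be illustrated with a minimal sketch. The following is not the paper's algorithm (which adapts the tolerance and jump distribution from ensemble mean fields); it is a simplified stand-in with a fixed geometric tolerance schedule, a flat prior, and a toy Gaussian model, with all function names and parameter values chosen here for illustration only.

```python
# Illustrative sketch of an annealed particle ABC scheme (NOT the paper's
# adaptive algorithm): each particle is moved by a Metropolis kernel whose
# acceptance probability is annealed with a decreasing tolerance eps.
import math
import random

random.seed(0)

def simulate(theta, n=5):
    """Cheap forward model: mean of n draws from N(theta, 1)."""
    return sum(random.gauss(theta, 1.0) for _ in range(n)) / n

def anneal_abc(y_obs, n_particles=200, n_steps=200,
               eps0=2.0, eps_min=0.05, decay=0.98, step=0.5):
    """Propagate particles under Metropolis kernels with decreasing tolerance."""
    # Initialize particles from a flat prior on [-5, 5].
    thetas = [random.uniform(-5, 5) for _ in range(n_particles)]
    dists = [abs(simulate(t) - y_obs) for t in thetas]
    for k in range(n_steps):
        eps = max(eps_min, eps0 * decay**k)  # fixed geometric tolerance schedule
        for i in range(n_particles):
            prop = thetas[i] + random.gauss(0, step)  # jump in parameter space
            if not -5 <= prop <= 5:
                continue  # outside the flat prior's support: reject
            d_prop = abs(simulate(prop) - y_obs)
            # Annealed Metropolis acceptance: moves that reduce the distance to
            # the data are always accepted; uphill moves with Boltzmann-like prob.
            if d_prop <= dists[i] or random.random() < math.exp((dists[i] - d_prop) / eps):
                thetas[i], dists[i] = prop, d_prop
    return thetas

samples = anneal_abc(y_obs=1.0)
post_mean = sum(samples) / len(samples)
```

As the tolerance shrinks, uphill moves are suppressed and the ensemble concentrates on parameters whose simulated outputs match the data; the paper's contribution is, roughly, to choose the tolerance decrease adaptively rather than by such a fixed schedule.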
Keywords: Approximate Bayes computations · Simulated annealing · Non-equilibrium thermodynamics · Entropy
The first author is indebted to Bjarne Andresen for valuable comments on the adaptive algorithm.