- Cite this article as: Prangle, D. Stat Comput (2016) 26: 171. doi:10.1007/s11222-014-9544-3
Approximate Bayesian computation (ABC) performs statistical inference for otherwise intractable probability models by accepting parameter proposals when the corresponding simulated datasets are sufficiently close to the observations. Producing the large number of simulations needed requires considerable computing time. However, it is often clear before a simulation ends that it is unpromising: it is likely to produce a poor match or to require excessive time. This paper proposes lazy ABC, an ABC importance sampling algorithm which saves time by sometimes abandoning such simulations. This makes ABC more scalable to applications where simulation is expensive. A random stopping rule and an appropriate reweighting step leave the target distribution unchanged from that of standard ABC. Theory and practical methods to tune lazy ABC are presented and illustrated on a simple epidemic model example. They are also demonstrated on the computationally demanding spatial extremes application of Erhardt and Smith (Comput Stat Data Anal 56:1468–1481, 2012), producing efficiency gains, in terms of effective sample size per unit CPU time, of roughly 3 times for a 20-location dataset and 8 times for 35 locations.
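The abstract's core idea can be sketched in code. The sketch below is a minimal illustration, not the paper's implementation: it assumes proposals are drawn from the prior (so importance weights reduce to 1/α on continued runs), splits each simulation into a cheap first stage and an expensive second stage, and uses a user-supplied continuation probability α(θ, x). All function names and the toy two-stage model in the usage example are hypothetical.

```python
import random


def lazy_abc(y_obs, n_iter, prior_sample, simulate_stage1, simulate_stage2,
             continuation_prob, distance, eps, rng=random):
    """One round of lazy ABC importance sampling.

    Proposals come from the prior, so an accepted, continued simulation
    gets weight 1/alpha; abandoned or rejected runs get weight 0.  The
    1/alpha reweighting is what leaves the ABC target unchanged.
    """
    samples = []  # list of (theta, weight) pairs
    for _ in range(n_iter):
        theta = prior_sample(rng)
        x = simulate_stage1(theta, rng)       # cheap partial simulation
        alpha = continuation_prob(theta, x)   # probability of continuing
        if rng.random() < alpha:
            y = simulate_stage2(theta, x, rng)  # expensive completion
            accepted = distance(y, y_obs) < eps
            w = (1.0 / alpha) if accepted else 0.0
        else:
            w = 0.0                           # simulation abandoned early
        samples.append((theta, w))
    return samples


# Hypothetical toy usage: infer a location parameter theta, where the
# "simulation" is split into two noisy stages and unpromising stage-1
# outputs are continued only 10% of the time.
rng = random.Random(0)
results = lazy_abc(
    y_obs=0.0,
    n_iter=2000,
    prior_sample=lambda r: r.uniform(-3.0, 3.0),
    simulate_stage1=lambda t, r: t + r.gauss(0.0, 1.0),
    simulate_stage2=lambda t, x, r: 0.5 * (x + t + r.gauss(0.0, 1.0)),
    continuation_prob=lambda t, x: 1.0 if abs(x) < 2.0 else 0.1,
    distance=lambda y, y0: abs(y - y0),
    eps=0.5,
    rng=rng,
)
weights = [w for _, w in results]
```

The efficiency comparison in the abstract uses effective sample size (ESS) per unit CPU time; with weights w_i, the standard estimate is ESS = (Σw_i)² / Σw_i², which the accepted weights above could be plugged into directly.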