
Modularized Bayesian analyses and cutting feedback in likelihood-free inference


There has been much recent interest in modifying Bayesian inference for misspecified models so that it remains useful for specific purposes. One popular approach is “cutting feedback”, which applies when the model consists of a number of coupled modules, with only some of the modules being misspecified. Cutting feedback methods represent the full posterior distribution in terms of conditional and sequential components, and then modify some terms in this representation, based on the modular structure, to specify or compute a modified posterior distribution. The main goal is to prevent misspecified modules from contaminating inferences for parameters of interest. Computation for cut posterior distributions is challenging. Here we consider cutting feedback for likelihood-free inference based on Gaussian mixture approximations to the joint distribution of parameters and data summary statistics. We exploit the fact that marginal and conditional distributions of a Gaussian mixture are again Gaussian mixtures to obtain explicit approximations to marginal and conditional posterior distributions, so that cut posterior analyses are easy to approximate. The mixture approach also allows repeated approximation of posterior distributions for different data based on a single mixture fit, which is important for model checks that inform the decision of whether to “cut”. A semi-modular approach to likelihood-free inference, in which feedback is only partially cut, is also developed. The benefits of the method are illustrated on two challenging examples: a collective cell spreading model and a continuous-time model for asset returns with jumps.
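The key identity the abstract relies on is that conditioning a Gaussian mixture over (parameters, summaries) on observed summaries yields another Gaussian mixture, with component means and covariances given by standard Gaussian conditioning and component weights reweighted by each component's marginal density at the observed summaries. The sketch below illustrates this in Python with made-up two-component mixture parameters; it is not the authors' implementation, and in practice the joint mixture would first be fitted to simulated (parameter, summary statistic) pairs.

```python
import numpy as np
from scipy.stats import multivariate_normal

def condition_mixture(weights, means, covs, d_theta, s_obs):
    """Condition a Gaussian mixture over (theta, s) on s = s_obs.

    Returns weights, means and covariances of the conditional mixture
    for theta | s = s_obs, which is again a Gaussian mixture.
    """
    new_w, new_mu, new_cov = [], [], []
    for w, mu, cov in zip(weights, means, covs):
        mu_t, mu_s = mu[:d_theta], mu[d_theta:]
        S_tt = cov[:d_theta, :d_theta]
        S_ts = cov[:d_theta, d_theta:]
        S_ss = cov[d_theta:, d_theta:]
        gain = S_ts @ np.linalg.inv(S_ss)
        # Standard Gaussian conditioning for each component
        new_mu.append(mu_t + gain @ (s_obs - mu_s))
        new_cov.append(S_tt - gain @ S_ts.T)
        # Reweight the component by its marginal density at s_obs
        new_w.append(w * multivariate_normal.pdf(s_obs, mu_s, S_ss))
    new_w = np.array(new_w)
    return new_w / new_w.sum(), new_mu, new_cov

# Toy two-component mixture over scalar (theta, s); numbers are illustrative
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
covs = [np.array([[1.0, 0.8], [0.8, 1.0]])] * 2
w, mu, cov = condition_mixture(weights, means, covs, d_theta=1,
                               s_obs=np.array([0.0]))
```

Because conditioning is a closed-form operation on the fitted components, the same joint mixture can be re-conditioned on different summary values at negligible cost, which is what makes repeated posterior approximation for model checking cheap. Marginal posterior approximations follow even more simply, by dropping the corresponding blocks of each component.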






Acknowledgements

DTF is supported by the Australian Research Council. CD is supported by the Australian Research Council through the Future Fellowship scheme (FT210100260). SAS is supported by the Australian Research Council through the Discovery Project scheme (FT170106079) and the ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS; CE140100049). We thank an anonymous referee and the editorial team for comments which greatly improved the manuscript.

Author information


Corresponding author

Correspondence to David J. Nott.



About this article


Cite this article

Chakraborty, A., Nott, D.J., Drovandi, C.C. et al. Modularized Bayesian analyses and cutting feedback in likelihood-free inference. Stat Comput 33, 33 (2023).



Keywords

  • Approximate Bayesian computation
  • Model misspecification
  • Modularization
  • Semi-modular inference
  • Synthetic likelihood