Kernel Sequential Monte Carlo

  • Ingmar Schuster
  • Heiko Strathmann
  • Brooks Paige
  • Dino Sejdinovic
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10534)


We propose kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities. KSMC is a family of sequential Monte Carlo algorithms based on building emulator models of the current particle system in a reproducing kernel Hilbert space. Here we focus on modelling nonlinear covariance structure and gradients of the target. The emulator’s geometry is adaptively updated and subsequently used to inform local proposals. Unlike in adaptive Markov chain Monte Carlo, continuous adaptation does not compromise convergence of the sampler. KSMC combines the strengths of sequential Monte Carlo and kernel methods: superior performance on multimodal targets and the ability to estimate model evidence, as compared to Markov chain Monte Carlo, together with the emulator’s ability to represent targets that exhibit high degrees of nonlinearity. As KSMC does not require access to target gradients, it is particularly applicable to targets whose gradients are unknown or prohibitively expensive. We describe the necessary tuning details and demonstrate the benefits of the proposed methodology on a series of challenging synthetic and real-world examples.
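The abstract's core loop — propagate a particle system toward the target, and at each step refit a model of the particles to shape local Metropolis-Hastings proposals — can be illustrated with a minimal tempered-SMC sketch. This is a simplified stand-in, not the paper's method: the proposal covariance here is the plain empirical covariance of the particles (the paper instead builds a nonlinear covariance emulator in an RKHS), the toy target `log_target` and all function names are illustrative assumptions, and the `2.38²/d` proposal scaling is the standard adaptive-MCMC heuristic.

```python
import numpy as np

def log_target(x):
    # Toy static target: a strongly correlated 2-D Gaussian. The kernel
    # emulator in the paper is aimed at nonlinear targets; this linear
    # example only exercises the SMC machinery.
    prec = np.linalg.inv(np.array([[1.0, 0.95], [0.95, 1.0]]))
    return -0.5 * np.einsum('ni,ij,nj->n', x, prec, x)

def log_init(x):
    # Wide Gaussian N(0, 9 I): the easy-to-sample starting density.
    return -0.5 * np.sum(x ** 2, axis=1) / 9.0

def adaptive_smc(n=1000, steps=20, seed=0):
    rng = np.random.default_rng(seed)
    d = 2
    x = rng.normal(scale=3.0, size=(n, d))    # particles from init density
    betas = np.linspace(0.0, 1.0, steps + 1)  # tempering schedule
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Importance weights for moving from pi_{b_prev} to pi_b.
        log_w = (b - b_prev) * (log_target(x) - log_init(x))
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Multinomial resampling.
        x = x[rng.choice(n, size=n, p=w)]
        # Refit the proposal covariance to the current particle system --
        # a linear stand-in for the paper's RKHS covariance emulator.
        cov = np.cov(x.T) * 2.38 ** 2 / d + 1e-6 * np.eye(d)
        # One Metropolis-Hastings move per particle, targeting pi_b.
        prop = x + rng.multivariate_normal(np.zeros(d), cov, size=n)
        log_pi_b = lambda z: (1 - b) * log_init(z) + b * log_target(z)
        accept = np.log(rng.uniform(size=n)) < log_pi_b(prop) - log_pi_b(x)
        x[accept] = prop[accept]
    return x

samples = adaptive_smc()
print(samples.shape)                  # (1000, 2)
print(np.corrcoef(samples.T)[0, 1])  # should approach 0.95
```

Because the weighting, resampling, and move steps are exactly those of a standard SMC sampler, continuously refitting the proposal from the particles does not bias the result — the property the abstract contrasts with adaptive MCMC, where such adaptation must be handled with care.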



I.S. was supported by a PSL postdoc grant and by DFG through grant CRC 1114 “Scaling Cascades in Complex Systems”, Project B03 “Multilevel coarse graining of multiscale problems”. H.S. was supported by the Gatsby Charitable Foundation. B.P. was supported by The Alan Turing Institute under the EPSRC grant EP/N510129/1.

Supplementary material



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Ingmar Schuster (1)
  • Heiko Strathmann (2)
  • Brooks Paige (3)
  • Dino Sejdinovic (4)
  1. FU Berlin, Berlin, Germany
  2. Gatsby Unit, University College London, London, UK
  3. Alan Turing Institute and University of Cambridge, Cambridge, UK
  4. University of Oxford and Alan Turing Institute, Oxford, UK
