Statistics and Computing, Volume 26, Issue 5, pp 1121–1136

A pseudo-marginal sequential Monte Carlo algorithm for random effects models in Bayesian sequential design

  • J. M. McGree
  • C. C. Drovandi
  • G. White
  • A. N. Pettitt


Abstract

Motivated by the need to sequentially design experiments for the collection of data in batches or blocks, a new pseudo-marginal sequential Monte Carlo algorithm is proposed for random effects models where the likelihood is not analytic and has to be approximated. This new algorithm extends the idealised sequential Monte Carlo algorithm: we propose to unbiasedly approximate the likelihood, yielding an efficient exact-approximate algorithm for performing inference and making decisions within Bayesian sequential design. We propose four approaches to unbiasedly approximate the likelihood: standard Monte Carlo integration, randomised quasi-Monte Carlo integration, Laplace importance sampling, and a combination of Laplace importance sampling and randomised quasi-Monte Carlo. These four methods are compared in terms of the estimates of likelihood weights and in the selection of the optimal sequential designs in an important pharmacological study related to the treatment of critically ill patients. As the approaches considered to approximate the likelihood can be computationally expensive, we exploit parallel computational architectures to ensure designs are derived in a timely manner.
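To illustrate the first two likelihood approximations named in the abstract, the sketch below estimates an intractable marginal likelihood p(y | theta) = E_b[ p(y | theta, b) ] by averaging the conditional likelihood over draws of the random effect b, once with standard Monte Carlo and once with randomised (scrambled Sobol') quasi-Monte Carlo. The toy random-intercept model, its parameter values, and the function names are illustrative assumptions, not the paper's pharmacokinetic model; SciPy is assumed available.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(0)

def mc_likelihood(y, theta, sigma_b, sigma_e, M=1000):
    """Unbiased Monte Carlo estimate of p(y | theta) for a toy
    random-intercept model: y_j = theta + b + e_j, with
    b ~ N(0, sigma_b^2) and e_j ~ N(0, sigma_e^2)."""
    b = rng.normal(0.0, sigma_b, size=M)          # M draws of the random effect
    # conditional log-likelihood for each draw, summed over observations
    loglik = norm.logpdf(y[:, None], loc=theta + b[None, :],
                         scale=sigma_e).sum(axis=0)
    return np.exp(loglik).mean()                  # unbiased average over draws

def rqmc_likelihood(y, theta, sigma_b, sigma_e, m=10, seed=1):
    """Same estimator, but the random-effect draws come from a scrambled
    Sobol' sequence (randomised QMC), which typically reduces variance."""
    u = qmc.Sobol(d=1, scramble=True, seed=seed).random_base2(m=m)
    b = norm.ppf(u[:, 0]) * sigma_b               # map 2^m points to N(0, sigma_b^2)
    loglik = norm.logpdf(y[:, None], loc=theta + b[None, :],
                         scale=sigma_e).sum(axis=0)
    return np.exp(loglik).mean()
```

Both estimators are unbiased for the marginal likelihood, so either can serve as the plug-in weight in an exact-approximate (pseudo-marginal) SMC update; the scrambling step is what keeps the QMC version unbiased while retaining its lower variance.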


Keywords

Graphics processing unit · Importance sampling · Intractable likelihood · Laplace approximation · Nonlinear regression · Optimal design · Parallel computing · Particle filter · Randomised quasi-Monte Carlo



Acknowledgements

This work was supported by the Australian Research Council Centre of Excellence for Mathematical & Statistical Frontiers. The work of A.N. Pettitt was supported by an ARC Discovery Project (DP110100159), and the work of J.M. McGree was supported by an ARC Discovery Project (DP120100269). We would also like to thank the two referees who offered helpful comments to improve the article.



Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • J. M. McGree (1), corresponding author
  • C. C. Drovandi (1)
  • G. White (1)
  • A. N. Pettitt (2)

  1. School of Mathematical Sciences, Queensland University of Technology, Brisbane, Australia
  2. Australian Research Council Centre of Excellence for Mathematical & Statistical Frontiers (ACEMS), Parkville, Australia
