Statistics and Computing, Volume 21, Issue 1, pp 69–81

A Monte Carlo Markov chain algorithm for a class of mixture time series models


DOI: 10.1007/s11222-009-9147-6

Cite this article as:
Lau, J.W. & So, M.K.P. Stat Comput (2011) 21: 69. doi:10.1007/s11222-009-9147-6


Abstract

This article generalizes the Monte Carlo Markov chain (MCMC) algorithm, based on the Gibbs weighted Chinese restaurant (gWCR) process algorithm, for a class of kernel mixtures of time series models over the Dirichlet process. This class of models extends Lo's (Ann. Stat. 12:351–357, 1984) kernel mixture model for independent observations. The kernel represents a known distribution of the time series conditional on past observations and on both present and past latent variables. The latent variables are independent samples from a Dirichlet process, which is an (almost surely) discrete random distribution. This class of models includes an infinite mixture of autoregressive processes and an infinite mixture of generalized autoregressive conditional heteroskedasticity (GARCH) processes.
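To make the model class concrete, the following sketch forward-simulates one member of it: an infinite mixture of AR(1) processes, with the Dirichlet process draw approximated by truncated stick-breaking. This is an illustrative simulation only, not the paper's gWCR posterior-sampling algorithm; the base measure for the AR coefficients and noise scales, the truncation level, and all parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, trunc):
    # Truncated stick-breaking approximation to a Dirichlet process draw G:
    # w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
    v = rng.beta(1.0, alpha, size=trunc)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return w / w.sum()  # renormalize away the truncation remainder

def simulate_dp_ar_mixture(T=500, alpha=2.0, trunc=50):
    # Atoms of G carry the kernel parameters; here each atom is a
    # hypothetical AR(1) pair (phi_k, sigma_k) drawn from an assumed
    # base measure (uniform coefficients, gamma noise scales).
    w = stick_breaking(alpha, trunc)
    phi = rng.uniform(-0.9, 0.9, size=trunc)
    sigma = rng.gamma(2.0, 0.5, size=trunc)

    # Latent labels are iid draws from G; the kernel at time t is the
    # AR(1) transition density selected by the current latent variable.
    z = rng.choice(trunc, size=T, p=w)
    y = np.zeros(T)
    for t in range(1, T):
        k = z[t]
        y[t] = phi[k] * y[t - 1] + sigma[k] * rng.standard_normal()
    return y, z

y, z = simulate_dp_ar_mixture()
```

Swapping the AR(1) transition for a GARCH(1,1) recursion in the loop gives the other model family the abstract mentions; the Dirichlet process layer is unchanged.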


Keywords: Bayesian nonparametric · Dirichlet process prior · Poisson-Dirichlet process prior · Bayesian Poisson calculus · GARCH · Volatility estimation

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  1. School of Mathematics and Statistics, University of Western Australia, Perth, Australia
  2. Department of Information Systems, Business Statistics and Operations Management, Hong Kong University of Science and Technology, Hong Kong, Hong Kong