A Monte Carlo Markov chain algorithm for a class of mixture time series models
This article generalizes the Monte Carlo Markov chain (MCMC) algorithm based on the Gibbs weighted Chinese restaurant (gWCR) process to a class of kernel mixtures of time series models over the Dirichlet process. This class extends Lo’s (Ann. Stat. 12:351–357, 1984) kernel mixture model for independent observations. The kernel is a known distribution of the time series conditional on past observations and on both present and past latent variables. The latent variables are independent draws from a Dirichlet process, an almost surely discrete random distribution. The class includes infinite mixtures of autoregressive processes and infinite mixtures of generalized autoregressive conditional heteroskedasticity (GARCH) processes.
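To make the model class concrete, the following is a minimal simulation sketch (not the paper's gWCR algorithm) of a Dirichlet process mixture of AR(1) kernels, using a truncated stick-breaking construction of the random discrete mixing distribution. The truncation level `K`, the uniform base measure for the AR coefficients, and the noise scale `sigma` are illustrative assumptions, not quantities from the article.

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights of a Dirichlet process DP(alpha, G0)."""
    betas = rng.beta(1.0, alpha, size=K)
    # Remaining stick length before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

def simulate_dp_ar1_mixture(T=200, alpha=1.0, K=50, sigma=0.5, seed=0):
    """Simulate a series whose latent AR(1) coefficients are i.i.d. draws
    from a (truncated) random discrete distribution G ~ DP(alpha, G0),
    with G0 = Uniform(-0.9, 0.9) as an illustrative base measure."""
    rng = np.random.default_rng(seed)
    weights = stick_breaking(alpha, K, rng)
    atoms = rng.uniform(-0.9, 0.9, size=K)  # AR coefficients drawn from G0
    # Latent component labels: sampling an atom of G at each time point
    z = rng.choice(K, size=T, p=weights / weights.sum())
    y = np.zeros(T)
    for t in range(1, T):
        # AR(1) kernel: y_t | y_{t-1}, z_t ~ N(atoms[z_t] * y_{t-1}, sigma^2)
        y[t] = atoms[z[t]] * y[t - 1] + sigma * rng.standard_normal()
    return y, z

y, z = simulate_dp_ar1_mixture()
```

Because the Dirichlet process is almost surely discrete, the latent draws cluster on a few atoms, so the series switches among finitely many AR regimes in any finite sample; a GARCH kernel could replace the AR(1) kernel in the same construction.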
Keywords: Bayesian nonparametrics · Dirichlet process prior · Poisson-Dirichlet process prior · Bayesian Poisson calculus · GARCH · Volatility estimation