Statistics and Computing, Volume 21, Issue 1, pp 69–81

A Monte Carlo Markov chain algorithm for a class of mixture time series models

Abstract

This article generalizes the Monte Carlo Markov chain (MCMC) algorithm based on the Gibbs weighted Chinese restaurant (gWCR) process to a class of kernel mixtures of time series models over the Dirichlet process. This class of models extends Lo’s (Ann. Stat. 12:351–357, 1984) kernel mixture model for independent observations. The kernel is a known distribution of the time series conditional on past observations and on both present and past latent variables. The latent variables are independent draws from a Dirichlet process, which is an almost surely discrete random distribution. The class includes an infinite mixture of autoregressive processes and an infinite mixture of generalized autoregressive conditional heteroskedasticity (GARCH) processes.
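
As a minimal illustrative sketch of the simplest member of this class, consider an infinite mixture of AR(1) processes: latent AR(1) coefficients are drawn from a truncated stick-breaking approximation to the Dirichlet process, and the series is generated from the corresponding Gaussian AR(1) kernel. This is only a forward simulation of the model class under assumed names, parameter values, and base measure; it is not the gWCR MCMC algorithm developed in the article.

    # Sketch: forward simulation from a truncated stick-breaking approximation to a
    # Dirichlet process mixture of Gaussian AR(1) kernels (illustrative assumptions only).
    import numpy as np

    rng = np.random.default_rng(0)

    def stick_breaking_weights(theta, truncation):
        # Truncated stick-breaking construction of Dirichlet process weights.
        betas = rng.beta(1.0, theta, size=truncation)
        remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
        w = betas * remaining
        return w / w.sum()  # renormalize so the truncated weights sum to one

    def simulate_dp_ar1(n, theta=1.0, truncation=50, sigma=0.5):
        # Atoms (AR coefficients) are iid draws from an assumed base measure Uniform(-0.9, 0.9);
        # the latent variable at each time point selects one atom.
        weights = stick_breaking_weights(theta, truncation)
        phi = rng.uniform(-0.9, 0.9, size=truncation)
        y = np.zeros(n)
        for t in range(1, n):
            z_t = rng.choice(truncation, p=weights)             # present latent variable
            y[t] = phi[z_t] * y[t - 1] + sigma * rng.normal()   # AR(1) kernel given the past
        return y

    series = simulate_dp_ar1(500)
    print(series[:5])

Replacing the Gaussian AR(1) kernel with a GARCH(1,1) conditional distribution would, in the same spirit, give a member of the infinite GARCH mixture mentioned above.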

Keywords

Bayesian nonparametric · Dirichlet process prior · Poisson-Dirichlet process prior · Bayesian Poisson calculus · GARCH · Volatility estimation

References

  1. Aldous, D.J.: Exchangeability and related topics. In: École d’Été de Probabilités de Saint-Flour XIII-1983. Lecture Notes in Mathematics, vol. 1117, pp. 1–198. Springer, Berlin (1985)
  2. Basu, S., Chib, S.: Marginal likelihood and Bayes factors for Dirichlet process mixture models. J. Am. Stat. Assoc. 98, 224–235 (2003)
  3. Beal, M.J., Ghahramani, Z., Rasmussen, C.E.: The infinite hidden Markov model. Neural Inf. Process. Syst. 14, 577–585 (2002)
  4. Blackwell, D., MacQueen, J.B.: Ferguson distributions via Pólya urn schemes. Ann. Stat. 1, 353–355 (1973)
  5. Bollerslev, T.: Generalized autoregressive conditional heteroskedasticity. J. Econom. 31, 307–327 (1986)
  6. Brunner, L.J.: Using the Gibbs sampler to simulate from the Bayes estimate of a decreasing density. Commun. Stat., Theory Methods 24, 215–226 (1995)
  7. Chen, C.W.S., So, M.K.P.: On a threshold heteroscedastic model. Int. J. Forecast. 22, 73–89 (2006)
  8. Chib, S.: Marginal likelihood from the Gibbs output. J. Am. Stat. Assoc. 90, 1313–1321 (1995)
  9. Dahl, D.B.: Modal clustering in a class of product partition models. Technical report (2006a). Available at http://www.stat.tamu.edu/~dahl/papers/modal/tr1085.pdf
  10. Dahl, D.B.: Model-based clustering for expression data via a Dirichlet process mixture model. In: Do, K.A., Müller, P., Vannucci, M. (eds.) Bayesian Inference for Gene Expression and Proteomics, pp. 97–115. Cambridge University Press, Cambridge (2006b)
  11. Engle, R.: Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica 50, 987–1008 (1982)
  12. Escobar, M.D., West, M.: Bayesian density estimation and inference using mixtures. J. Am. Stat. Assoc. 90, 577–588 (1995)
  13. Escobar, M.D., West, M.: Computing nonparametric hierarchical models. In: Dey, D., Müller, P., Sinha, D. (eds.) Practical Nonparametric and Semiparametric Bayesian Statistics, pp. 1–22. Springer, New York (1998)
  14. Ferguson, T.S.: A Bayesian analysis of some nonparametric problems. Ann. Stat. 1, 209–230 (1973)
  15. Green, P.J., Richardson, S.: Modelling heterogeneity with and without the Dirichlet process. Scand. J. Stat. 28, 355–375 (2001)
  16. Ghosh, J.K., Ramamoorthi, R.V.: Bayesian Nonparametrics. Springer, Berlin (2003)
  17. Griffin, J.E., Steel, M.F.J.: Order-based dependent Dirichlet processes. J. Am. Stat. Assoc. 101, 179–194 (2006)
  18. Haas, M., Mittnik, S., Paolella, M.S.: Asymmetric multivariate normal mixture GARCH. Comput. Stat. Data Anal. 53, 2129–2154 (2009)
  19. Ho, M.W.: A Bayes method for a monotone hazard rate via S-paths. Ann. Stat. 34, 820–836 (2006a)
  20. Ho, M.W.: Bayes estimation of a symmetric unimodal density via S-paths. J. Comput. Graph. Stat. 15, 848–860 (2006b)
  21. Ishwaran, H., James, L.F.: Gibbs sampling methods for stick-breaking priors. J. Am. Stat. Assoc. 96, 161–173 (2001)
  22. Ishwaran, H., James, L.F.: Generalized weighted Chinese restaurant processes for species sampling mixture models. Stat. Sin. 13, 1211–1235 (2003a)
  23. Ishwaran, H., James, L.F.: Some further developments for stick-breaking priors: finite and infinite clustering and classification. Sankhya Ser. A 65, 577–592 (2003b)
  24. Ishwaran, H., James, L.F.: Computational methods for multiplicative intensity models using weighted gamma processes: proportional hazards, marked point processes and panel count data. J. Am. Stat. Assoc. 99, 175–190 (2004)
  25. James, L.F.: Poisson process partition calculus with applications to exchangeable models and Bayesian Nonparametrics (2002). Available at http://arXiv.org/abs/math/0205093
  26. James, L.F.: Bayesian Poisson process partition calculus with an application to Bayesian Lévy moving averages. Ann. Stat. 33, 1771–1799 (2005)
  27. James, L.F., Lijoi, A., Prünster, I.: Bayesian inference for classes of normalized random measures. Ann. Appl. Probab. 18, 521–511 (2008)
  28. Kingman, J.F.C.: Random discrete distributions. J. R. Stat. Soc., Ser. B 37, 1–22 (1975)
  29. Kingman, J.F.C.: Poisson Processes. Oxford University Press, London (1993)
  30. Lau, J.W., Green, P.J.: Bayesian model based clustering procedures. J. Comput. Graph. Stat. 16, 526–558 (2007)
  31. Lau, J.W., Lo, A.Y.: Model based clustering and weighted Chinese restaurant processes. In: Nair, V. (ed.) Advances in Statistical Modeling and Inference: Essays in Honor of Kjell A. Doksum, pp. 405–424. World Scientific, Singapore (2007)
  32. Lau, J.W., Siu, T.K.: Modelling long-term investment returns via Bayesian infinite mixture time series models. Scand. Actuar. J. 108, 243–282 (2008)
  33. Lau, J.W., So, M.K.P.: Bayesian mixture of autoregressive models. Comput. Stat. Data Anal. 53, 38–60 (2008)
  34. Le, N.D., Martin, R.D., Raftery, A.E.: Modeling flat stretches, bursts, and outliers in time series using mixture transition distribution models. J. Am. Stat. Assoc. 91, 1504–1514 (1996)
  35. Lijoi, A., Mena, R.H., Prünster, I.: Hierarchical mixture modeling with normalized inverse-Gaussian priors. J. Am. Stat. Assoc. 100, 1278–1291 (2005)
  36. Lijoi, A., Mena, R.H., Prünster, I.: Controlling the reinforcement in Bayesian nonparametric mixture models. J. R. Stat. Soc., Ser. B 69, 715–740 (2007)
  37. Lo, A.Y.: On a class of Bayesian nonparametric estimates. I. Density estimates. Ann. Stat. 12, 351–357 (1984)
  38. Lo, A.Y.: Weighted Chinese restaurant processes. COSMOS 1, 59–63 (2005)
  39. Lo, A.Y., Brunner, L.J., Chan, A.T.: Weighted Chinese restaurant processes and Bayesian mixture models. Research Report, Hong Kong University of Science and Technology (1996). Available at http://www.erin.utoronto.ca/~jbrunner/papers/wcr96.pdf
  40. MacEachern, S.N.: Estimating normal means with a conjugate style Dirichlet process prior. Commun. Stat., Simul. Comput. 23, 727–741 (1994)
  41. MacEachern, S.N., Müller, P.: Estimating mixture of Dirichlet process models. J. Comput. Graph. Stat. 7, 223–238 (1998)
  42. MacEachern, S.N., Müller, P.: Efficient MCMC schemes for robust model extensions using encompassing Dirichlet process mixture models. In: Ruggeri, F., Rios-Insua, D. (eds.) Robust Bayesian Analysis, pp. 295–316. Springer, New York (2000)
  43. Müller, P., West, M., MacEachern, S.N.: Bayesian models for non-linear auto-regressions. J. Time Ser. Anal. 18, 593–614 (1997)
  44. Neal, R.M.: Markov chain sampling methods for Dirichlet process mixture models. J. Comput. Graph. Stat. 9, 249–265 (2000)
  45. Nelson, D.B.: Conditional heteroskedasticity in asset returns: a new approach. Econometrica 59, 347–370 (1991)
  46. Pitman, J.: Exchangeable and partially exchangeable random partitions. Probab. Theory Relat. Fields 102, 145–158 (1995)
  47. Pitman, J.: Some developments of the Blackwell-MacQueen urn scheme. In: Ferguson, T.S., Shapley, L.S., MacQueen, J.B. (eds.) Statistics, Probability and Game Theory. Papers in Honor of David Blackwell. IMS Lecture Notes, Monograph Series, vol. 30, pp. 245–267 (1996)
  48. Pitman, J.: Poisson Kingman partitions. In: Goldstein, D.R. (ed.) Science and Statistics: A Festschrift for Terry Speed. IMS Lecture Notes, Monograph Series, vol. 40, pp. 1–34 (2003)
  49. Pitman, J., Yor, M.: The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator. Ann. Probab. 25, 855–900 (1997)
  50. Quintana, F.: A predictive view of Bayesian clustering. J. Stat. Plan. Inference 136, 2407–2429 (2006)
  51. Richardson, S., Green, P.J.: On Bayesian analysis of mixtures with an unknown number of components. J. R. Stat. Soc., Ser. B 59, 731–792 (1997)
  52. West, M., Müller, P., Escobar, M.D.: Hierarchical priors and mixture models, with applications in regression and density estimation. In: Freeman, P.R., Smith, A.F.M. (eds.) Aspects of Uncertainty: A Tribute to D.V. Lindley, pp. 363–386. Wiley, New York (1994)
  53. Wong, C.S., Li, W.K.: On a mixture autoregressive model. J. R. Stat. Soc., Ser. B 62, 95–115 (2000)
  54. Wong, C.S., Li, W.K.: On a logistic mixture autoregressive model. Biometrika 88, 833–846 (2001a)
  55. Wong, C.S., Li, W.K.: On a mixture autoregressive conditional heteroscedastic model. J. Am. Stat. Assoc. 96, 982–995 (2001b)
  56. Zhang, Z., Li, W.K., Yuen, K.C.: On a mixture GARCH time-series model. J. Time Ser. Anal. 27, 577–597 (2006)

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  1. School of Mathematics and Statistics, University of Western Australia, Perth, Australia
  2. Department of Information Systems, Business Statistics and Operations Management, Hong Kong University of Science and Technology, Hong Kong
