Methodology and Computing in Applied Probability

Volume 19, Issue 3, pp 727–749

Biased Online Parameter Inference for State-Space Models


Abstract

We consider Bayesian online static parameter estimation for state-space models. This is an important but computationally challenging problem, as exact state-of-the-art methods often have a computational cost that grows with the time parameter; perhaps the most successful such algorithm is SMC² (Chopin et al., J R Stat Soc B 75:397–426, 2013). We present a version of the SMC² algorithm whose computational cost does not grow with the time parameter. In addition, under assumptions, the algorithm is shown to provide consistent estimates of expectations w.r.t. the posterior. However, the cost of achieving this consistency can be exponential in the dimension of the parameter space; if this exponential cost is avoided, the algorithm is typically biased. The bias is investigated from a theoretical perspective and, under assumptions, we find that it does not accumulate as the time parameter grows. The algorithm is implemented on several Bayesian statistical models.
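The nested structure underlying SMC² can be illustrated as follows: an outer cloud of parameter particles, each carrying an inner particle filter over the hidden state, with parameter weights updated by the inner filters' incremental likelihood estimates. The sketch below is a minimal, illustrative Python version of this idea under simplifying assumptions of our own (a scalar linear-Gaussian model, small fixed particle counts, multinomial resampling, and no PMCMC rejuvenation moves); it is not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, T, rng):
    """Simulate x_t = theta * x_{t-1} + v_t, y_t = x_t + w_t with N(0,1) noise."""
    x, ys = 0.0, []
    for _ in range(T):
        x = theta * x + rng.normal()
        ys.append(x + rng.normal())
    return np.array(ys)

def inner_step(xs, y, theta, rng):
    """One bootstrap particle-filter step for a fixed theta.

    Returns the resampled state particles and the incremental likelihood
    estimate p_hat(y_t | y_{1:t-1}, theta)."""
    xs = theta * xs + rng.normal(size=xs.shape)              # propagate states
    w = np.exp(-0.5 * (y - xs) ** 2) / np.sqrt(2 * np.pi)    # observation density
    like = w.mean()                                          # incremental likelihood
    w = np.maximum(w, 1e-300)                                # guard against underflow
    w = w / w.sum()
    idx = rng.choice(len(xs), size=len(xs), p=w)             # multinomial resample
    return xs[idx], like

def nested_filter(ys, n_theta=200, n_x=100, rng=rng):
    """Outer parameter cloud; each member runs its own inner state filter."""
    thetas = rng.uniform(-1, 1, size=n_theta)                # prior draws for theta
    states = np.zeros((n_theta, n_x))                        # inner filter states
    logw = np.zeros(n_theta)                                 # outer log-weights
    for y in ys:
        for j in range(n_theta):
            states[j], like = inner_step(states[j], y, thetas[j], rng)
            logw[j] += np.log(like + 1e-300)                 # accumulate evidence
    w = np.exp(logw - logw.max())                            # stable normalisation
    w /= w.sum()
    return float(np.sum(w * thetas))                         # posterior-mean estimate

ys = simulate(0.7, T=50, rng=rng)
est = nested_filter(ys)
print(round(est, 2))
```

Note that, as written, the outer weights degenerate over time without rejuvenation; the full SMC² algorithm addresses this with particle MCMC moves, and the paper's contribution concerns keeping the resulting cost from growing with time.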

Keywords

State-space models · Bayesian inference · Sequential Monte Carlo

Mathematics Subject Classifications (2010)

Primary 82C80, 60K35; Secondary 60F99, 62F15


References

  1. Andrieu C, Doucet A, Holenstein R (2010) Particle Markov chain Monte Carlo methods (with discussion). J R Stat Soc B 72:269–342
  2. Andrieu C, Doucet A, Tadić V (2009) On-line parameter estimation in general state-space models using pseudo-likelihood. Unpublished technical report
  3. Beskos A, Jasra A, Kantas N, Thiery A (2016) On the convergence of adaptive sequential Monte Carlo methods. Ann Appl Probab 26:1111–1146
  4. Borwanker J, Kallianpur G, Prakasa Rao BLS (1971) The Bernstein-von Mises theorem for Markov processes. Ann Math Stat 42:1241–1253
  5. Cappé O, Rydén T, Moulines É (2005) Inference in hidden Markov models. Springer, New York
  6. Centanni S, Minozzo M (2006) A Monte Carlo approach to filtering for a class of marked doubly stochastic Poisson processes. J Amer Stat Assoc 101:1582–1597
  7. Cérou F, Del Moral P, Guyader A (2011) A non-asymptotic variance theorem for un-normalized Feynman-Kac particle models. Ann Inst Henri Poincaré 47:629–649
  8. Chan HP, Heng CW, Jasra A (2016) Theory of parallel particle filters for hidden Markov models. Adv Appl Probab 48:69–87
  9. Chopin N, Jacob P, Papaspiliopoulos O (2013) SMC²: a sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates. J R Stat Soc B 75:397–426
  10. Crisan D, Míguez J (2014) Particle-kernel estimation of the filter density in state-space models. Bernoulli 20:1879–1929
  11. Crisan D, Míguez J (2014) Nested particle filters for online parameter estimation in discrete-time state-space Markov models. arXiv preprint
  12. Deligiannidis G, Doucet A, Pitt MK (2015) The correlated pseudo-marginal method. arXiv preprint
  13. Del Moral P (2004) Feynman-Kac formulae: genealogical and interacting particle systems with applications. Springer, New York
  14. Del Moral P (2013) Mean field simulation for Monte Carlo integration. Chapman & Hall, London
  15. Del Moral P, Doucet A, Jasra A (2006) Sequential Monte Carlo samplers. J R Stat Soc B 68:411–436
  16. Douc R, Moulines É, Olsson J, van Handel R (2011) Consistency of the maximum likelihood estimator for general hidden Markov models. Ann Stat 39:474–513
  17. Doucet A, Johansen A (2011) A tutorial on particle filtering and smoothing: fifteen years later. In: Crisan D, Rozovsky B (eds) Handbook of nonlinear filtering. Oxford University Press, Oxford
  18. Fearnhead P (2002) MCMC, sufficient statistics and particle filters. J Comp Graph Stat 11:848–862
  19. Gilks WR, Berzuini C (2001) Following a moving target - Monte Carlo inference for dynamic Bayesian models. J R Stat Soc B 63:127–146
  20. Jacob P, Murray L, Rubenthaler S (2015) Path storage in the particle filter. Stat Comp 25:487–496
  21. Kantas N, Doucet A, Singh SS, Maciejowski JM, Chopin N (2015) On particle methods for parameter estimation in state-space models. Stat Sci 30:328–351
  22. Polson NG, Stroud JR, Müller P (2008) Practical filtering with sequential parameter learning. J R Stat Soc B 70:413–428

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Centre INRIA Bordeaux Sud-Ouest & Institut de Mathématiques de Bordeaux, Université de Bordeaux I, Bordeaux Cedex, France
  2. Department of Statistics & Applied Probability, National University of Singapore, Singapore
