
How Does CONDENSATION Behave with a Finite Number of Samples?

  • O. King
  • D. A. Forsyth
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1842)

Abstract

Condensation is a popular algorithm for sequential inference that resamples a sampled representation of the posterior. The algorithm is known to be asymptotically correct as the number of samples tends to infinity. However, the resampling phase involves a loss of information. The sequence of representations produced by the algorithm is a Markov chain, which is usually inhomogeneous. We show simple discrete examples where this chain is homogeneous and has absorbing states. In these examples, the representation moves to one of these states in time apparently linear in the number of samples and remains there. This phenomenon appears in the continuous case as well, where the algorithm tends to produce “clumpy” representations. In practice, this means that different runs of a tracker on the same data can give very different answers, while any particular run of the tracker will look stable. Furthermore, the state of the tracker can collapse to a single peak, which has a non-zero probability of being the wrong peak, within time linear in the number of samples, and the tracker can appear to be following tight peaks in the posterior even in the absence of any meaningful measurement. This means that, unless theoretical lower bounds on the number of samples are available, experiments must be designed very carefully to avoid these effects.
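
The collapse described above is easy to reproduce in simulation. The sketch below is a minimal illustration, not code from the paper; all names and parameters are ours. It models the simplest discrete example: two states, uninformative likelihoods, and N samples redrawn i.i.d. from the current representation at every resampling step, which is exactly neutral Wright-Fisher drift from population genetics.

```python
import numpy as np

def time_to_absorption(n_samples, rng):
    """Count uniform-weight resampling steps until every sample lies in
    the same state (an absorbing state of the chain of representations)."""
    k = n_samples // 2          # samples currently in state 0 (even split)
    t = 0
    while 0 < k < n_samples:
        # Resampling with uninformative likelihoods: each of the N new
        # samples is drawn i.i.d. from the current representation, so the
        # count in state 0 is Binomial(N, k/N) -- neutral drift.
        k = rng.binomial(n_samples, k / n_samples)
        t += 1
    return t

rng = np.random.default_rng(0)
for n in (50, 100, 200, 400):
    runs = [time_to_absorption(n, rng) for _ in range(200)]
    print(f"N = {n:3d}: mean steps to absorption ~ {np.mean(runs):.1f}")
```

With an even initial split, the mean number of steps to absorption grows roughly linearly in N (the classical diffusion approximation gives about 2N ln 2, i.e. roughly 1.4N), even though neither state is ever favoured by the measurements.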

Keywords

Markov Chain · State Transition Matrix · Prior Density · Stochastic Simulation Algorithm · Homogeneous Markov Chain


Copyright information

© Springer-Verlag Berlin Heidelberg 2000

Authors and Affiliations

  • O. King (1)
  • D. A. Forsyth (1)

  1. Computer Science Division, U.C. Berkeley, Berkeley, USA
