Machine Learning, Volume 37, Issue 1, pp 75–87

Mixed Memory Markov Models: Decomposing Complex Stochastic Processes as Mixtures of Simpler Ones

  • Lawrence K. Saul
  • Michael I. Jordan

DOI: 10.1023/A:1007649326333

Cite this article as:
Saul, L.K. & Jordan, M.I. Machine Learning (1999) 37: 75. doi:10.1023/A:1007649326333

Abstract

We study Markov models whose state spaces arise from the Cartesian product of two or more discrete random variables. We show how to parameterize the transition matrices of these models as a convex combination—or mixture—of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We derive a set of generalized Baum-Welch updates for factorial hidden Markov models that make use of this parameterization. We also describe a simple iterative procedure for approximately computing the statistics of the hidden states. Throughout, we give examples where mixed memory models provide a useful representation of complex stochastic processes.
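As a rough illustration of the parameterization described above, the sketch below writes the transition probability over a product state space as a convex combination of small component transition matrices, so the full transition matrix over the product space is never formed explicitly. The names (psi, a, transition_prob) and the exact factorization are illustrative assumptions, not the paper's notation or its EM updates.

```python
import numpy as np

# Mixed memory idea in miniature: the transition distribution over a
# Cartesian-product state space is a convex combination (mixture) of
# simpler pairwise transition matrices. Illustrative sketch only.

rng = np.random.default_rng(0)

n = 4   # number of values each component variable can take
k = 3   # number of component variables (or memory lags)

# Elementary matrices a[mu][i, j] ~ P(next = j | component mu = i);
# normalize so each row is a proper distribution.
a = rng.random((k, n, n))
a /= a.sum(axis=2, keepdims=True)

# Mixing weights: a single convex combination over the k components.
psi = rng.random(k)
psi /= psi.sum()

def transition_prob(state, j):
    """P(next = j | state), where state is a length-k tuple of component values.

    The n**k-by-n product transition matrix is represented implicitly
    by k small n-by-n matrices and the weights psi.
    """
    return sum(psi[mu] * a[mu, state[mu], j] for mu in range(k))

state = (0, 2, 1)
probs = np.array([transition_prob(state, j) for j in range(n)])
print(probs, probs.sum())  # a valid distribution: sums to 1
```

Because the weights psi sum to one and every row of each a[mu] is itself a distribution, the mixture is guaranteed to be a valid distribution over next states, which is the probabilistic interpretation of the parameters mentioned in the abstract.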

Keywords: Markov models, mixture models, discrete time series

Copyright information

© Kluwer Academic Publishers 1999

Authors and Affiliations

  • Lawrence K. Saul (1)
  • Michael I. Jordan (2)

  1. AT&T Labs, Florham Park
  2. University of California, Berkeley