Predictive models and generative complexity
The causal states of computational mechanics define the minimal sufficient memory for a given discrete stationary stochastic process. Their entropy is an important complexity measure called statistical complexity (or true measure complexity). They induce the ɛ-machine, which is a hidden Markov model (HMM) generating the process. The ɛ-machine is not the minimal generating HMM, however, even though generative HMMs also have a natural predictive interpretation. This paper gives a mathematical proof of the idea that the ɛ-machine is the minimal HMM satisfying an additional (partial) determinism condition. The minimal internal state entropy of a generative HMM is, in analogy to statistical complexity, called generative complexity. This paper also shows that generative complexity depends on the process in a well-behaved way: as a function of the process, it is lower semi-continuous (w.r.t. the weak-* topology), concave, and compatible with the ergodic decomposition of the process.
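The quantities discussed above can be made concrete with a small numerical sketch (not taken from the paper): an edge-emitting HMM for the well-known Even Process, whose ɛ-machine has two causal states. The sketch checks the partial determinism (unifilarity) condition and computes the internal state entropy, which for the ɛ-machine equals the statistical complexity; minimizing this entropy over all generative HMMs would give the generative complexity.

```python
import numpy as np

# Toy example (assumed, not from the paper): the Even Process ε-machine.
# T[x][i, j] = P(go to state j and emit symbol x | current state i).
T = {
    0: np.array([[0.5, 0.0],
                 [0.0, 0.0]]),
    1: np.array([[0.0, 0.5],
                 [1.0, 0.0]]),
}

# Partial determinism (unifilarity): given the current state and the emitted
# symbol, the next state is unique, i.e. each row of each T[x] has at most
# one nonzero entry. This is the extra condition singling out the ε-machine.
assert all((m > 0).sum(axis=1).max() <= 1 for m in T.values())

# Stationary distribution of the internal Markov chain (summed over symbols):
# the left eigenvector of M for eigenvalue 1, normalized to sum to 1.
M = sum(T.values())
eigvals, eigvecs = np.linalg.eig(M.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Internal state entropy in bits; for the ε-machine this is the
# statistical complexity of the process.
H = -sum(p * np.log2(p) for p in pi if p > 0)
print(round(H, 4))  # ≈ 0.9183 bits for the Even Process
```

Here the stationary state distribution is (2/3, 1/3), so the statistical complexity is log₂3 − 2/3 ≈ 0.9183 bits; a non-unifilar generative HMM for the same process could have strictly smaller internal state entropy.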
Key words: Causal states, complexity, ɛ-machine, generative complexity, HMM, partially deterministic HMM, predictive model, statistical complexity