Waiting times and stopping probabilities for patterns in Markov chains
Suppose that C is a finite collection of patterns. We observe a Markov chain until one of the patterns in C occurs as a run; this time is denoted by τ. In this paper we give a simple way to calculate the mean waiting time E(τ) and the stopping probabilities P(τ = τ_A) for A ∈ C, where τ_A is the waiting time until the pattern A appears as a run.
Keywords: pattern, Markov chain, stopping probability, waiting time
MR Subject Classification: 60J10, 60J22
The authors wish to express their thanks to the referee for his/her constructive suggestions and many helpful comments.
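The quantities described in the abstract can be computed with a standard Markov-chain embedding: take as states the proper prefixes of the patterns, absorb as soon as some pattern in C completes as a run, and obtain E(τ) and P(τ = τ_A) by solving the resulting first-step linear systems. The sketch below illustrates this generic technique for i.i.d. coin flips (a special case of a Markov chain); it is not the authors' specific method, and all function names (build_automaton, mean_waiting_time, stopping_probabilities) are hypothetical. It also assumes no pattern in C is a prefix or suffix of another.

```python
from fractions import Fraction

def build_automaton(patterns):
    """States = proper prefixes of the patterns; a completed pattern absorbs.
    Assumes no pattern is a prefix or suffix of another pattern."""
    prefixes = {""}
    for p in patterns:
        for i in range(1, len(p)):
            prefixes.add(p[:i])

    def step(state, c):
        s = state + c
        for p in patterns:          # absorb when a pattern completes as a run
            if s.endswith(p):
                return p
        while s not in prefixes:    # keep the longest suffix that is a prefix
            s = s[1:]
        return s

    return sorted(prefixes), step

def solve(A, b):
    """Tiny exact Gauss-Jordan elimination over Fractions."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = Fraction(1, 1) / M[col][col]
        M[col] = [x * inv for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def mean_waiting_time(patterns, probs):
    """E(tau): expected number of letters until some pattern in C occurs."""
    states, step = build_automaton(patterns)
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    A = [[Fraction(0)] * n for _ in range(n)]
    b = [Fraction(1)] * n           # first-step analysis: E[s] = 1 + sum ...
    for s in states:
        A[idx[s]][idx[s]] += 1
        for c, pc in probs.items():
            t = step(s, c)
            if t in idx:            # absorbed states contribute 0
                A[idx[s]][idx[t]] -= Fraction(pc)
    return solve(A, b)[idx[""]]

def stopping_probabilities(patterns, probs):
    """P(tau = tau_A): probability that A is the first pattern to occur."""
    states, step = build_automaton(patterns)
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    out = {}
    for target in patterns:
        A = [[Fraction(0)] * n for _ in range(n)]
        b = [Fraction(0)] * n
        for s in states:
            A[idx[s]][idx[s]] += 1
            for c, pc in probs.items():
                t = step(s, c)
                if t in idx:
                    A[idx[s]][idx[t]] -= Fraction(pc)
                elif t == target:   # absorbed exactly at the target pattern
                    b[idx[s]] += Fraction(pc)
        out[target] = solve(A, b)[idx[""]]
    return out
```

For a fair coin (probs = {"H": 1/2, "T": 1/2}), mean_waiting_time(["HH"], probs) recovers the classical value E(τ_HH) = 6, and for the competing collection C = {HH, TH} one gets E(τ) = 3 with P(τ = τ_TH) = 3/4, P(τ = τ_HH) = 1/4 (the well-known Penney-game asymmetry).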