Markov chains are the most substantial easily accessible application of conditional probability, and, at the same time, they provide an excellent introduction to the more general subject of stochastic processes. A stochastic process is a family of random variables indexed by time: say, X_n, n = 0, 1, 2, ..., for discrete time, or X(t), 0 < t < ∞, for continuous time.
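The discrete-time case can be illustrated with a short simulation. This is a minimal sketch, not from the text: the two-state transition matrix P and the function name `simulate` are hypothetical examples chosen for illustration.

```python
import random

# Hypothetical two-state transition matrix: P[i][j] = P(X_{n+1} = j | X_n = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

def simulate(n_steps, start=0, seed=0):
    """Return one sample path X_0, X_1, ..., X_{n_steps} of the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        state = path[-1]
        # With two states, move to state 1 with probability P[state][1],
        # otherwise move to (or stay in) state 0.
        path.append(1 if rng.random() < P[state][1] else 0)
    return path

path = simulate(10)
```

Each step depends only on the current state, which is the defining Markov property.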
Keywords: Markov Chain · Random Walk · Transition Matrix · Probability Generating Function · Stationary Probability Vector