In this chapter, we will study certain sequences of random variables, known as ‘random walks’. These are defined in terms of sums of independent identically distributed random variables. Important in the study of random walks (and of more general random sequences) are ‘filtrations’ and ‘stopping times’. A filtration is a sequence of σ-fields representing the information available at various stages of an experiment. A stopping time is a ℤ̄₊-valued random variable whose value may be regarded as the time at which an experiment is to be terminated. In applications, such as gambling theory, important stopping times are the time at which a random walk reaches a certain goal and the time at which it returns to its original position. These will be treated in the latter part of the chapter for several special random walks.
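As a concrete illustration of the two stopping times mentioned above, here is a minimal simulation sketch. It assumes the simplest case, a symmetric random walk with independent ±1 steps; the function name `hitting_time` and the `max_steps` cutoff are illustrative choices, not notation from the chapter. Because a stopping time is ℤ̄₊-valued and may be infinite, the sketch returns `None` when the target is not reached within the cutoff.

```python
import random

def hitting_time(target, max_steps=1_000_000, seed=None):
    """First time n >= 1 at which a simple symmetric random walk
    S_n = X_1 + ... + X_n (with iid steps X_i = +/-1) equals `target`.

    Setting target=0 gives the time of first return to the origin.
    Returns None if `target` is not reached within max_steps,
    standing in for an infinite value of the stopping time.
    """
    rng = random.Random(seed)
    position = 0
    for n in range(1, max_steps + 1):
        position += rng.choice((-1, 1))  # iid +/-1 increment
        if position == target:
            return n
    return None
```

Note that whether `hitting_time` has stopped by time n depends only on the first n steps, which is exactly the defining property of a stopping time with respect to the filtration generated by the walk. A parity check follows from the walk itself: since each step changes the position by 1, the walk can sit at an odd level only at odd times, so the hitting time of level 1 (when finite) is always odd.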