Markov Processes
Abstract
In this chapter, we provide the background material needed to define and study Markov processes in discrete and continuous time. We begin with basic examples of transition probabilities and the corresponding operators on spaces of functions and measures, placing particular emphasis on stochastic operators on spaces of integrable functions. Transition probabilities are important because the distribution of a stochastic process with the Markov property is completely determined by its transition probabilities and initial distribution. The Markov property simply states that, given the present, the past and the future are independent. We refer the reader to Appendix A for the required theory of measure and integration and the basic concepts of probability theory.
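To make concrete how a transition probability and an initial distribution together determine the law of the process, here is a minimal sketch in the discrete-time, finite-state case; the two-state chain and the particular transition matrix below are illustrative assumptions, not taken from the text.

```python
# Transition probabilities p(x, y) as a row-stochastic matrix:
# each row is a probability distribution over the next state.
# The numbers here are an arbitrary illustrative example.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(mu, P):
    """One step of the chain: mu_{n+1}(y) = sum_x mu_n(x) p(x, y)."""
    n = len(P)
    return [sum(mu[x] * P[x][y] for x in range(n)) for y in range(n)]

# Initial distribution mu_0 (here: start in state 0 with probability 1).
# The distribution at every later time n is determined by mu_0 and P
# alone, which is the sense in which transition probabilities and the
# initial distribution determine the law of the process.
mu = [1.0, 0.0]
for _ in range(5):
    mu = step(mu, P)

# mu is now the distribution of the chain at time 5.
```

Iterating `step` is the measure-side analogue of applying the transition operator repeatedly; the function-side operator acting on observables is its adjoint, a point developed in the chapter.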