Abstract
Markov processes are introduced as memoryless stochastic processes and classified into four classes according to whether the time parameter is continuous or discrete and whether the sample space is continuous or discrete. Two of these are treated in more detail: discrete-time (“classical”) Markov chains and continuous-time, continuous-state Markov processes. The long-time behavior of the chains is discussed, establishing the conditions for the formation of equilibrium distributions. In the continuous case, the Markov propagator is defined, along with a discussion of moment functions, characterizing functions, and the time evolution of the moments. Two particular Markov processes, the Wiener and the Ornstein–Uhlenbeck process, are given special attention due to their relevance for the study of diffusion.
Notes
- 1.
If the chain is irreducible and its states are recurrent (in finite \(\Omega \) they always are), the equilibrium distribution exists. If the chain is irreducible but its states are periodic, the limit (12.5) may not exist or it may depend on i: an example is the matrix \(\mathcal{P} = ((0,1),(1,0))\) with the equilibrium distribution \({{\varvec{\pi }}} = (1/2, 1/2)\), as \({{\varvec{\pi }}} = {{\varvec{\pi }}}\mathcal{P}\), but \(\lim _{t\rightarrow \infty } \mathcal{P}^t\) does not exist.
- 2.
We are referring to a simple lemma: if g(z) is a smooth function of z satisfying \(g(z) = n g(z/n)\) for any positive integer n, it holds that \(g(z) = Cz\), where C does not depend on z.
- 3.
The expected value is \(E[\Xi ({\Delta t};x,t)] = M_1(x,t){\Delta t} + o({\Delta t})\), whence \(M_1(x,t) = A(x,t)\). The variance is given by \(\mathrm {var}[\Xi ({\Delta t};x,t)] = E[\Xi ^2({\Delta t};x,t)] - (E[\Xi ({\Delta t};x,t)])^2 = M_2(x,t){\Delta t} + o({\Delta t}) - (M_1(x,t){\Delta t} + o({\Delta t}))^2 = M_2(x,t){\Delta t} + o({\Delta t})\), since the squared term is of order \({\Delta t}^2\); therefore \(M_2(x,t) = D(x,t)\).
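The counterexample of note 1 can be checked numerically. A minimal sketch with NumPy: the swap matrix has the stationary distribution \({{\varvec{\pi }}} = (1/2, 1/2)\), yet its powers alternate between the matrix itself and the identity, so no limit is approached.

```python
import numpy as np

# The periodic two-state transition matrix from note 1
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
pi = np.array([0.5, 0.5])

# pi is stationary: pi P = pi
print(np.allclose(pi @ P, pi))  # True

# But P^t oscillates: even powers give the identity, odd powers give P,
# so lim_{t -> infinity} P^t does not exist.
print(np.linalg.matrix_power(P, 2))  # identity matrix
print(np.linalg.matrix_power(P, 3))  # P again
```

Each state can only be revisited every second step (period 2), which is exactly why the limiting matrix fails to exist even though the stationary distribution does.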
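The moment scaling in note 3 can also be illustrated by sampling the propagator of a diffusion over one short Euler–Maruyama step. The drift \(A(x) = -kx\) and constant diffusion \(D\) chosen here are illustrative assumptions (an Ornstein–Uhlenbeck-like process), not values from the chapter; the point is only that the sample mean of \(\Xi\) scales as \(A\,{\Delta t}\) and its variance as \(D\,{\Delta t}\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) drift and diffusion: A(x) = -k*x, D constant
k, D = 1.0, 0.5
x0, dt, n = 2.0, 1e-3, 200_000

A = -k * x0  # drift evaluated at the starting point x0

# One Euler-Maruyama propagator step, sampled n times:
#   Xi = A(x0)*dt + sqrt(D*dt) * N(0, 1)
xi = A * dt + np.sqrt(D * dt) * rng.standard_normal(n)

print(np.mean(xi) / dt)  # approximately A(x0) = -2.0
print(np.var(xi) / dt)   # approximately D = 0.5
```

Dividing the sample mean and variance by \({\Delta t}\) recovers \(M_1 \approx A\) and \(M_2 \approx D\); the \((A\,{\Delta t})^2\) contribution to the variance is of order \(10^{-6}\) here and thus negligible, matching the \(o({\Delta t})\) bookkeeping in note 3.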
Copyright information
© 2016 Springer International Publishing Switzerland
About this chapter
Cite this chapter
Širca, S. (2016). Markov Processes\(\star \). In: Probability for Physicists. Graduate Texts in Physics. Springer, Cham. https://doi.org/10.1007/978-3-319-31611-6_12
Print ISBN: 978-3-319-31609-3
Online ISBN: 978-3-319-31611-6