
Markov Processes \(\star \)

Probability for Physicists

Part of the book series: Graduate Texts in Physics (GTP)


Abstract

Markov processes are introduced as memoryless stochastic processes and classified into four classes according to whether the time parameter and the sample space are continuous or discrete. Two of these are treated in more detail: discrete-time (“classical”) Markov chains and continuous-time, continuous-state Markov processes. The long-time behavior of the chains is discussed, establishing the conditions under which equilibrium distributions form. In the continuous case, the Markov propagator is defined, along with a discussion of moment functions, characterizing functions, and the time evolution of the moments. Two particular Markov processes, the Wiener process and the Ornstein–Uhlenbeck process, receive special attention because of their relevance to the study of diffusion.
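
As a rough illustration of the two diffusion processes mentioned above, the following minimal sketch (added here, not part of the chapter) simulates one sample path of the Wiener process and one of the Ornstein–Uhlenbeck process with the standard Euler–Maruyama scheme; the time step dt, relaxation time tau, diffusion constant c, and initial condition are arbitrary example values.

    import numpy as np

    rng = np.random.default_rng(0)
    dt, n_steps = 1e-3, 10_000          # example time step and number of steps

    # Wiener process: independent Gaussian increments of variance dt
    w = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))))

    # Ornstein-Uhlenbeck process dX = -(X/tau) dt + sqrt(c) dW (Euler-Maruyama)
    tau, c = 0.5, 1.0                   # example relaxation time and diffusion constant
    x = np.empty(n_steps + 1)
    x[0] = 1.0                          # arbitrary initial condition
    for k in range(n_steps):
        x[k + 1] = x[k] - (x[k] / tau) * dt + np.sqrt(c * dt) * rng.normal()

    # For t >> tau the OU mean relaxes to 0 and the variance approaches c*tau/2
    print(x[n_steps // 2:].mean(), x[n_steps // 2:].var(), "expected ~ 0 and", c * tau / 2)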


Notes

  1.

    If the chain is irreducible and its states are positive recurrent (in a finite \(\Omega \) they always are), the equilibrium distribution exists. If the chain is irreducible but its states are periodic, the limit (12.5) may not exist or it may depend on i: an example is the matrix \(\mathcal{P} = \bigl(\begin{smallmatrix} 0 & 1 \\ 1 & 0 \end{smallmatrix}\bigr)\) with the equilibrium distribution \({{\varvec{\pi }}} = (1/2, 1/2)\), as \({{\varvec{\pi }}} = {{\varvec{\pi }}}\mathcal{P}\), but \(\lim _{t\rightarrow \infty } \mathcal{P}^t\) does not exist.
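
    A quick numerical check of this footnote (a sketch added here, not in the original text): iterating the periodic matrix shows that \(\mathcal{P}^t\) oscillates, while an arbitrarily chosen aperiodic, irreducible matrix converges, with every row approaching the equilibrium distribution.

        import numpy as np

        # Periodic chain from the footnote: pi = (1/2, 1/2) satisfies pi = pi P,
        # but P^t alternates between the identity and P, so the limit does not exist.
        P_periodic = np.array([[0.0, 1.0],
                               [1.0, 0.0]])
        print(np.linalg.matrix_power(P_periodic, 100))   # identity matrix
        print(np.linalg.matrix_power(P_periodic, 101))   # P again

        # Aperiodic, irreducible example (arbitrary entries): P^t converges and
        # every row tends to the equilibrium distribution pi = (0.8, 0.2).
        P_aperiodic = np.array([[0.9, 0.1],
                                [0.4, 0.6]])
        print(np.linalg.matrix_power(P_aperiodic, 100))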

  2.

    We are referring to a simple lemma: if g(z) is a smooth function of z satisfying \(g(z) = n g(z/n)\) for every positive integer n, then \(g(z) = Cz\), where C does not depend on z.
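
    A short justification of the lemma (added here for completeness): setting \(z = 0\) in \(g(z) = n\,g(z/n)\) gives \(g(0) = n\,g(0)\) for every n, so \(g(0) = 0\); since g is smooth, a Taylor expansion then yields

    \[
      g(z) = n\,g(z/n) = n\left[g(0) + g'(0)\,\frac{z}{n} + \mathcal{O}(n^{-2})\right]
           = g'(0)\,z + \mathcal{O}(n^{-1}) \;\longrightarrow\; g'(0)\,z \quad (n\to\infty),
    \]

    so that \(g(z) = Cz\) with \(C = g'(0)\).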

  3.

    The expected value is \(E[\Xi ({\Delta t};x,t)] = M_1(x,t){\Delta t} + o({\Delta t})\), whence \(M_1(x,t) = A(x,t)\). The variance is \(\mathrm {var}[\Xi ({\Delta t};x,t)] = E[\Xi ^2({\Delta t};x,t)] - \bigl(E[\Xi ({\Delta t};x,t)]\bigr)^2 = M_2(x,t){\Delta t} + o({\Delta t}) - \bigl(M_1(x,t){\Delta t} + o({\Delta t})\bigr)^2 = M_2(x,t){\Delta t} + o({\Delta t})\), therefore \(M_2(x,t) = D(x,t)\).
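
    These relations can be checked numerically (a sketch added here, not part of the footnote): sample the propagator \(\Xi({\Delta t};x,t) = X(t+{\Delta t}) - x\) of an Ornstein–Uhlenbeck process over many realizations and compare the first two moments with \(A(x,t)\,{\Delta t}\) and \(D(x,t)\,{\Delta t}\); the drift and diffusion functions and all parameter values below are arbitrary example choices.

        import numpy as np

        rng = np.random.default_rng(1)

        # Ornstein-Uhlenbeck drift and diffusion functions (example values)
        tau, c = 0.5, 1.0
        A = lambda x: -x / tau               # drift function A(x)
        D = lambda x: c                      # diffusion function D(x)

        x0, dt = 2.0, 1e-3                   # starting state and propagator time step
        n_sub, n_samples = 100, 200_000
        h = dt / n_sub                       # fine sub-step for the simulation

        # Sample Xi(dt; x0) over many independent realizations
        x = np.full(n_samples, x0)
        for _ in range(n_sub):
            x += A(x) * h + np.sqrt(D(x) * h) * rng.normal(size=n_samples)
        xi = x - x0

        print("E[Xi]/dt   =", xi.mean() / dt, "  A(x0) =", A(x0))
        print("var[Xi]/dt =", xi.var() / dt, "  D(x0) =", D(x0))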


Author information

Correspondence to Simon Širca.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Širca, S. (2016). Markov Processes \(\star\). In: Probability for Physicists. Graduate Texts in Physics. Springer, Cham. https://doi.org/10.1007/978-3-319-31611-6_12

