Queues, pp. 37-50

Introduction to Markov Chains

  • Moshe Haviv
Part of the International Series in Operations Research & Management Science book series (ISOR, volume 191)


The topic of Markov processes is vast; entire volumes can be, and indeed have been, written on it. We make no attempt at completeness here. This chapter contains the minimum required to follow what is presented afterwards; in particular, we will refer back to it when the results stated here are needed. For a more comprehensive treatment of Markov chains and stochastic matrices, see [9, 19, 41] or [42].
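As a small illustration of the objects this chapter studies (not an example from the text itself), the following sketch takes a row-stochastic transition matrix and iterates an initial probability vector under it until it converges to the limiting (stationary) probability vector; the matrix `P` and all function names are hypothetical choices for the demonstration.

```python
# Illustrative sketch: limiting probabilities of a finite Markov chain
# via repeated multiplication pi <- pi * P. P is row-stochastic
# (each row sums to 1); pi is a probability (row) vector.

def step(pi, P):
    """One transition of the chain: returns pi * P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def limit_probabilities(P, tol=1e-12, max_iter=10_000):
    """Iterate pi <- pi * P from the uniform vector until the change
    is below tol; the fixed point satisfies pi = pi * P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(max_iter):
        nxt = step(pi, P)
        if max(abs(a - b) for a, b in zip(pi, nxt)) < tol:
            return nxt
        pi = nxt
    return pi

# A hypothetical two-state chain: solving pi = pi * P by hand gives
# pi = (5/6, 1/6), which the iteration recovers.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = limit_probabilities(P)
```

For this two-state example the stationary equations read pi_1 = 0.9 pi_1 + 0.5 pi_2 and pi_1 + pi_2 = 1, giving pi = (5/6, 1/6); power iteration converges here because the chain is irreducible and aperiodic.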




References

  [9] Billingsley, P. (1995). Probability and measure (3rd ed.). New York: Wiley.
  [16] Denardo, E. V. (1982). Dynamic programming: Models and applications. Englewood Cliffs: Prentice-Hall.
  [19] Feller, W. (1968). An introduction to probability theory and its applications (3rd ed.). New York: Wiley.
  [31] Kemeny, J. G., & Snell, J. L. (1961). Finite Markov chains. New York: D. Van Nostrand.
  [41] Ross, S. M. (1996). Stochastic processes (2nd ed.). New York: Wiley.
  [42] Seneta, E. (2006). Non-negative matrices and Markov chains (rev. printing). New York: Springer.

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Moshe Haviv
    Department of Statistics, The Hebrew University, Jerusalem, Israel
