
An Introduction to Markov Chains

  • Esra Bas
Chapter

Abstract

This chapter provides a compact introduction to both discrete-time and continuous-time Markov chains. The important concepts are explained, including the definitions of discrete-time and continuous-time Markov chains, the Chapman-Kolmogorov equations, reachability, communication, communication classes, recurrent and transient states, the period of a state in a discrete-time Markov chain, the limiting probabilities of states in discrete-time and continuous-time Markov chains, and ergodic Markov chains. Several examples and problems are solved for discrete-time Markov chains, and, where relevant, state transition diagrams and tables are used to facilitate comprehension of the solutions.
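Two of the concepts named above, the Chapman-Kolmogorov equations and limiting probabilities, can be illustrated with a small sketch. The two-state transition matrix below is a hypothetical example, not taken from the chapter; it assumes the standard conventions that P[i][j] is the one-step probability of moving from state i to state j and that the n-step matrix is the n-th power of P.

```python
def mat_mul(a, b):
    """Multiply two square matrices given as lists of lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(p, n):
    """n-step transition matrix P^n (n >= 1) by repeated multiplication."""
    result = p
    for _ in range(n - 1):
        result = mat_mul(result, p)
    return result

# Hypothetical two-state chain: state 0 = "A", state 1 = "B".
P = [[0.7, 0.3],
     [0.4, 0.6]]

# Chapman-Kolmogorov: the 5-step matrix equals the product of the
# 2-step and 3-step matrices, P^(2+3) = P^2 P^3.
P5 = mat_pow(P, 5)
P23 = mat_mul(mat_pow(P, 2), mat_pow(P, 3))

# Limiting probabilities: for this ergodic chain the rows of P^n
# converge, as n grows, to the stationary distribution (4/7, 3/7),
# independently of the starting state.
P50 = mat_pow(P, 50)
```

Because both rows of `P50` are (numerically) the same, the long-run probability of being in state 0 no longer depends on the initial state, which is the defining property of the limiting distribution for an ergodic chain.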

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Industrial Engineering, Istanbul Technical University, Istanbul, Turkey
