Understanding Markov Chains

Examples and Applications

Authors:

ISBN: 978-981-4451-50-5 (Print) 978-981-4451-51-2 (Online)

Table of contents (13 chapters)

  1. Front Matter (Pages I-IX)

  2. Introduction (Pages 1-6)

  3. Probability Background (Pages 7-36)

  4. Gambling Problems (Pages 37-60)

  5. Random Walks (Pages 61-75)

  6. Discrete-Time Markov Chains (Pages 77-94)

  7. First Step Analysis (Pages 95-116)

  8. Classification of States (Pages 117-128)

  9. Long-Run Behavior of Markov Chains (Pages 129-148)

  10. Branching Processes (Pages 149-166)

  11. Continuous-Time Markov Chains (Pages 167-209)

  12. Discrete-Time Martingales (Pages 211-223)

  13. Spatial Poisson Processes (Pages 225-239)

  14. Reliability Theory (Pages 241-245)

  15. Back Matter (Pages 247-354)