The matrix of (single-step) transition probabilities of a Markov chain {X_n}, P = (p_ij), where p_ij = Pr{X_{n+1} = j | X_n = i} is the conditional probability that the chain moves to state j from state i in one step. See Markov chains; Markov processes.
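The definition above can be illustrated with a small numerical sketch (not part of the original entry; the two-state "weather" chain and its probabilities are invented for illustration). Each row of P holds the one-step probabilities out of one state, so each row must sum to 1, and multiplying a distribution by P on the right gives the distribution after one step.

```python
# Hypothetical 2-state chain: state 0 = "sunny", state 1 = "rainy".
# Entry P[i][j] = p_ij = Pr{X_{n+1} = j | X_n = i}.
P = [
    [0.9, 0.1],  # from sunny: stay sunny 0.9, turn rainy 0.1
    [0.5, 0.5],  # from rainy: turn sunny 0.5, stay rainy 0.5
]

def row_sums(matrix):
    """Each row of a transition matrix must sum to 1."""
    return [sum(row) for row in matrix]

def step(dist, matrix):
    """One step of the chain: returns the row vector dist * P."""
    n = len(matrix)
    return [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]

dist0 = [1.0, 0.0]      # start in state 0 (sunny) with certainty
dist1 = step(dist0, P)  # distribution over states after one step
```

Starting from state 0, the distribution after one step is simply row 0 of P, i.e. [0.9, 0.1]; iterating `step` gives the n-step distributions.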
© 2001 Kluwer Academic Publishers
Cite this entry
Gass, S.I., Harris, C.M. (2001). Transition matrix. In: Gass, S.I., Harris, C.M. (eds) Encyclopedia of Operations Research and Management Science. Springer, New York, NY. https://doi.org/10.1007/1-4020-0611-X_1061
Print ISBN: 978-0-7923-7827-3
Online ISBN: 978-1-4020-0611-1