Encyclopedia of Operations Research and Management Science

2013 Edition
Editors: Saul I. Gass, Michael C. Fu

Partially Observed Markov Decision Processes

Reference work entry
DOI: https://doi.org/10.1007/978-1-4419-1153-7_200580

A Markov decision process (MDP) in which the state of the system cannot be fully or precisely observed: only part of the state may be known, or the state may be observed with error. In principle, such a model can be converted to a fully observed MDP by introducing an “information” or “belief” state, corresponding to a probability distribution over the original state, which may be infinite dimensional.
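The conversion rests on the Bayes update of the belief. As a minimal sketch (the notation here is introduced for illustration and is not part of the original entry): assume a finite state space S, transition probabilities P(s' | s, a), and observation probabilities O(o | s', a). If the current belief is b, action a is taken, and observation o is received, the updated belief is

\[
b_{a,o}(s') \;=\; \frac{O(o \mid s', a)\,\sum_{s \in S} P(s' \mid s, a)\, b(s)}{\Pr(o \mid b, a)},
\qquad
\Pr(o \mid b, a) \;=\; \sum_{s' \in S} O(o \mid s', a)\,\sum_{s \in S} P(s' \mid s, a)\, b(s).
\]

Because b_{a,o} depends on the history only through (b, a, o), the belief is a sufficient statistic for the observation history, and the process whose state is the belief is itself a fully observed MDP on the simplex of probability distributions over S.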

© Springer Science+Business Media New York 2013