Optimal Control of Jump-Markov Processes and Viscosity Solutions
- Cite this paper as:
- Soner, H.M. (1988). Optimal Control of Jump-Markov Processes and Viscosity Solutions. In: Fleming, W., Lions, P.-L. (eds) Stochastic Differential Systems, Stochastic Control Theory and Applications. The IMA Volumes in Mathematics and Its Applications, vol. 10. Springer, New York, NY.
We investigate the Bellman equation that arises in the optimal control of jump-Markov processes. This is a fully nonlinear integro-differential equation. We introduce the notion of viscosity solutions for this equation and obtain existence and uniqueness results. We also develop the connection between the optimal control problem and the Bellman equation.
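For orientation, a representative form of the dynamic programming (Bellman) equation for a discounted control problem of a jump-Markov process can be sketched as follows; the notation (discount rate \(\lambda\), drift \(b\), running cost \(f\), jump kernel \(\mu\)) is illustrative and is not taken from the paper itself, whose precise assumptions and formulation may differ:

```latex
% A typical Bellman equation for controlled jump-Markov processes:
% the value function u solves a fully nonlinear integro-differential
% equation.  All symbols below are generic placeholders.
\lambda\, u(x)
  - \inf_{a \in A} \Big\{\, b(x,a)\cdot D u(x)
  + \int_{\mathbb{R}^n} \big[ u(x+z) - u(x) \big]\, \mu(x,a,\mathrm{d}z)
  + f(x,a) \,\Big\} = 0,
  \qquad x \in \mathbb{R}^n .
```

The nonlocal integral term, generated by the jumps of the process, is what makes the equation integro-differential rather than a pure PDE, and it is for equations of this type that the viscosity-solution framework is adapted in the paper.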