Summary
This paper treats the existence of optimal controls in partially observable systems whose dynamics are governed by a nonlinear stochastic differential equation. The technique is based on the weak convergence of probability measures and on the construction of stochastically equivalent processes.
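For orientation, a partially observed stochastic control problem of the kind described can be sketched as follows. The notation here is illustrative, assumed for exposition, and not taken from the paper itself: the state x is driven by a Wiener process W, the controller sees only the noisy observation y, and controls must be adapted to the observation filtration.

```latex
% Illustrative setup (notation assumed, not the paper's):
\begin{aligned}
  dx_t &= f(t, x_t, u_t)\,dt + \sigma(t, x_t)\,dW_t,
    && \text{(state dynamics)}\\
  dy_t &= h(t, x_t)\,dt + dV_t,
    && \text{(noisy observation)}\\
  J(u) &= \mathbb{E}\!\left[\int_0^T c(t, x_t, u_t)\,dt\right],
    && \text{(cost to be minimized)}
\end{aligned}
```

Here \(V\) is a Wiener process independent of \(W\), and an admissible control \(u\) is required to be adapted to \(\mathcal{F}^y_t = \sigma(y_s : s \le t)\), i.e. to depend only on past observations; this adaptedness constraint is what distinguishes the partially observable problem from the fully observed one.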
Christopeit, N. Existence of optimal stochastic controls under partial observation. Z. Wahrscheinlichkeitstheorie verw Gebiete 51, 201–213 (1980). https://doi.org/10.1007/BF00536189