Abstract
As mentioned in the preface, the stability theory of SDEs was developed mainly to meet the needs of stabilizing moving systems subject to random perturbations. In this chapter we consider some problems concerning the stabilization of controlled stochastic systems. The results achieved to date in this field are rather sparse, even though the basic formulations of the problems and the fundamental equations have been known for some time. The only results of any significance concern linear systems with quadratic control criteria. We devote the exposition that follows to them, building on the material of Chaps. 5 through 7.
This chapter was written jointly with M.B. Nevelson.
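For the linear-quadratic setting the abstract refers to, the additive-noise case reduces to the classical deterministic LQ regulator: the optimal feedback is obtained from an algebraic Riccati equation. The following is a minimal sketch, not taken from the chapter, using illustrative matrices `A`, `B`, `Q`, `R` and SciPy's Riccati solver; the state-dependent-noise case treated by Wonham leads to a modified equation not shown here.

```python
# Hypothetical illustration: stabilizing a linear system dx = (A x + B u) dt + noise
# under a quadratic cost, via the algebraic Riccati equation (classical LQ design).
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative system matrices (not from the chapter): an unstable 2x2 plant.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # open-loop eigenvalues +1 and -1, so unstable
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                # state weight in the quadratic criterion
R = np.array([[1.0]])        # control weight

# Solve A'P + PA - P B R^{-1} B' P + Q = 0 for the symmetric matrix P.
P = solve_continuous_are(A, B, Q, R)

# Optimal linear feedback u = -K x (a control of feedback, i.e. Markov, type).
K = np.linalg.solve(R, B.T @ P)

# The closed-loop matrix A - BK is Hurwitz (all eigenvalues have Re < 0),
# so the controlled system is stabilized.
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(np.all(closed_loop_eigs.real < 0))  # True
```

Since `(A, B)` here is controllable and `Q` is positive definite, LQ theory guarantees the closed loop is stable; the sketch verifies this numerically.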
Notes
1. In the literature this type of control is known as Markov control, or control employing the feedback principle.
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Khasminskii, R. (2012). Stabilization of Controlled Stochastic Systems. In: Stochastic Stability of Differential Equations. Stochastic Modelling and Applied Probability, vol 66. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23280-0_8
Print ISBN: 978-3-642-23279-4
Online ISBN: 978-3-642-23280-0