An existence theorem for optimal control systems with state variable in C, and stochastic control problems
We consider an existence theorem for control systems whose state variables, for every t, lie in C, the space of continuous functions on a given set I. The dependence of the state variables upon a ∈ I is induced by their dependence upon the initial state and upon the state equation governing the system. In contrast, the control u = u(t) is taken as a measurable function of t alone. The usual space constraints and boundary conditions are also allowed to vary over a ∈ I, and the cost functional is now taken to be a continuous functional over a suitable class of continuous functions. We also discuss an application of these results to control systems with stochastic boundary conditions.
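To make the setting concrete, here is a minimal numerical sketch (not from the paper): a family of state trajectories x(t; a), one per initial state a ∈ I = [0, 1], all driven by a single control u(t) that depends on t alone, with a cost functional evaluated on the whole family. The state equation dx/dt = u(t) − x, the particular cost, and all names below are illustrative assumptions only.

```python
def simulate(a, u, t_final=1.0, steps=100):
    """Euler integration of the assumed state equation dx/dt = u(t) - x,
    with initial state x(0) = a. Returns the sampled trajectory."""
    dt = t_final / steps
    x = a
    traj = [x]
    for k in range(steps):
        t = k * dt
        x = x + dt * (u(t) - x)
        traj.append(x)
    return traj

def cost(trajectories, dt):
    """Illustrative continuous cost functional on the family:
    the time integral of sup over a of x(t; a)^2."""
    return sum(max(x_a ** 2 for x_a in xs) * dt
               for xs in zip(*trajectories))

# The control depends on t alone; a piecewise-constant choice is measurable.
u = lambda t: 1.0 if t < 0.5 else 0.0

initial_states = [0.0, 0.5, 1.0]   # sample points of I = [0, 1]
trajs = [simulate(a, u) for a in initial_states]
print(cost(trajs, 1.0 / 100))
```

The same measurable u steers every member of the family; only the initial state varies with a, which is how the dependence of the state on a ∈ I arises in the abstract above.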
Keywords: Boundary Condition; Control System; Continuous Function; Control Problem; Measurable Function