Elements of Optimal Control
Generally speaking, the concept of control can be defined as the process of influencing the behavior of a dynamical system in order to achieve a desired goal. When the goal is to maximize some payoff function (or to minimize a cost function) that depends on the control inputs, we are dealing with an optimal control problem.
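As an illustrative sketch (not drawn from the chapter itself), a deterministic finite-horizon optimal control problem can be stated as follows, where $x$ denotes the state, $u$ the control input, $f$ the system dynamics, $\ell$ a running cost, and $g$ a terminal cost:

```latex
\min_{u(\cdot)} \; J(u) \;=\; \int_0^T \ell\big(x(t), u(t)\big)\,dt \;+\; g\big(x(T)\big)
\quad \text{subject to} \quad
\dot{x}(t) = f\big(x(t), u(t)\big), \qquad x(0) = x_0 .
```

A payoff-maximization problem has the same form with $\min$ replaced by $\max$; the dynamic programming principle then characterizes the optimal value of this problem through a dynamic programming (Hamilton–Jacobi–Bellman) equation.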
Keywords: Optimal Control Problem · Credit Risk · Deterministic Setting · Dynamic Programming Principle · Dynamic Programming Equation