Abstract
A theorem published earlier by the author is strengthened and a more informative necessary condition for an extremum in optimal control problems is obtained. The result is illustrated by a simple optimal control problem.
Translated from Matematicheskie Zametki, Vol. 66, No. 5, pp. 770–776, November, 1999.
Sukhinin, M.F. On Bellman’s approach to optimal control theory. Math Notes 66, 636–641 (1999). https://doi.org/10.1007/BF02674205