This chapter presents the fundamentals of optimal control theory. In the first section we give a short historical survey, introducing the reader to the main ideas and notions. Subsequently we introduce the standard problem of optimal control theory.
We state Pontryagin's Maximum Principle, distinguishing the unconstrained case from the cases with mixed path or pure state inequality constraints. The Hamilton–Jacobi–Bellman equation is used to give an informal proof of the Maximum Principle. The Maximum Principle is then extended to the case of an infinite planning horizon. This is followed by the presentation of a one-dimensional optimal control model, together with an economic interpretation of the Maximum Principle.
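For orientation, the standard problem and the first-order conditions of the Maximum Principle mentioned above can be sketched as follows. This is a generic textbook formulation; the symbols $F$, $f$, $\lambda$, $r$, $T$ are illustrative and need not match the chapter's own notation or assumptions.

```latex
% Standard optimal control problem (finite horizon T, discount rate r):
\max_{u(\cdot)} \int_0^T e^{-rt} F\bigl(x(t),u(t)\bigr)\,dt
\quad \text{s.t.} \quad \dot x(t) = f\bigl(x(t),u(t)\bigr), \quad x(0) = x_0 .

% Current-value Hamiltonian:
H(x,u,\lambda) = F(x,u) + \lambda\, f(x,u) .

% Maximum Principle (case without path or state constraints):
u^*(t) \in \arg\max_{u} H\bigl(x^*(t),u,\lambda(t)\bigr), \qquad
\dot\lambda(t) = r\,\lambda(t) - \frac{\partial H}{\partial x}\bigl(x^*(t),u^*(t),\lambda(t)\bigr),
\qquad \lambda(T) = 0 .

% Hamilton--Jacobi--Bellman equation (stationary infinite-horizon form,
% V denoting the value function):
r\,V(x) = \max_{u} \bigl[\, F(x,u) + V'(x)\, f(x,u) \,\bigr] .
```

The economic reading of $\lambda(t)$ as the shadow price of the state variable underlies the economic interpretation of the Maximum Principle discussed in the chapter.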
© 2008 Springer-Verlag Berlin Heidelberg
(2008). Tour d'Horizon: Optimal Control. In: Optimal Control of Nonlinear Processes. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77647-5_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-77646-8
Online ISBN: 978-3-540-77647-5