The importance of continuous dynamical systems for optimal control theory is twofold. First, dynamical systems already occur in the problem formulation, where the evolution of the states to be controlled is described by a differential equation. Second, and more importantly, the techniques for calculating and analyzing the solutions of optimal control problems, in the form in which we introduce them, rely profoundly on results from the theory of continuous dynamical systems. Therefore, in this chapter we develop the theory in some detail.
To help the reader not yet acquainted with dynamical systems, we first provide a historical introduction and then present the simple case of a one-dimensional dynamical system, introducing important concepts in an informal manner. Subsequently, we restate these concepts and the required theory in a rigorous way.
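As a small illustration of the kind of object the chapter studies, the sketch below numerically integrates a one-dimensional dynamical system dx/dt = f(x) with the forward Euler method. The logistic vector field f(x) = x(1 − x), the initial value, and the step size are assumptions chosen for illustration; they are not taken from the chapter itself.

```python
# Minimal sketch: forward Euler integration of a one-dimensional
# dynamical system dx/dt = f(x). The logistic vector field
# f(x) = x * (1 - x) is an assumed example, not the chapter's system.

def euler(f, x0, dt, steps):
    """Integrate dx/dt = f(x) from x0 with fixed step size dt."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return xs

def logistic(x):
    # Vector field with equilibria at x = 0 (unstable) and x = 1 (stable).
    return x * (1.0 - x)

trajectory = euler(logistic, x0=0.1, dt=0.1, steps=200)
```

For this vector field the trajectory grows monotonically from the initial value and settles at the stable equilibrium x = 1, which is the qualitative behavior a phase-line analysis of the one-dimensional system predicts.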
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this chapter
(2008). Continuous-Time Dynamical Systems. In: Optimal Control of Nonlinear Processes. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77647-5_2
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-77646-8
Online ISBN: 978-3-540-77647-5