Abstract
In this chapter we study the topic of stability for dynamical systems. There are a number of different concepts and definitions of stability, and these apply to various types of integral curves: fixed points, periodic solutions, etc. (cf. [Ha 82], [Rob 95], [RM 80], [AM 78], [Co 65], [Bel 53], [Mer 97]). This chapter provides an introduction to the subject, giving first a few results about stability of fixed points and then a brief discussion of stability of periodic solutions (also called cycles or closed integral curves).
The question of whether a given motion of a dynamical system is stable is a natural one, and we have already used the terminology "stable/unstable fixed point" throughout the text in numerous examples. The definitions of stability are given precisely below, but the basic idea in these definitions is whether the integral curves starting near a given fixed point (or, more generally, near a given integral curve) will stay near it (stability), and perhaps tend toward it asymptotically in time (asymptotic stability).
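The informal idea above can be checked numerically. The following sketch (not from the chapter; the vector field and step sizes are illustrative assumptions) integrates the scalar equation x' = -x, whose fixed point x* = 0 is asymptotically stable, and observes that trajectories started near the origin tend toward it.

```python
# Hypothetical illustration: numerically checking asymptotic stability
# of the fixed point x* = 0 for the scalar equation x' = -x.

def euler_trajectory(f, x0, dt=0.01, steps=1000):
    """Integrate x' = f(x) with the explicit Euler method and
    return the state after the given number of steps."""
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

def f(x):
    # Linear vector field with eigenvalue -1; the origin is an
    # asymptotically stable fixed point.
    return -x

# Start from several initial points near the fixed point x* = 0.
finals = [euler_trajectory(f, x0) for x0 in (0.5, -0.3, 0.1)]
print(finals)  # each entry is close to 0: trajectories tend to x*
```

For an unstable fixed point (e.g. x' = x), the same experiment would show nearby trajectories moving away from the origin instead.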
© 2010 Springer Science+Business Media, LLC
Cite this chapter
Betounes, D. (2010). Stability Theory. In: Differential Equations: Theory and Applications. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1163-6_6
Print ISBN: 978-1-4419-1162-9
Online ISBN: 978-1-4419-1163-6
eBook Packages: Mathematics and Statistics, Mathematics and Statistics (R0)