In previous chapters we alluded to the concept of stability without a rigorous mathematical definition, being content to accept boundedness as a guarantee of stability. In the linear case this is sufficient; in the nonlinear case we must be more specific. The concepts of local, global, absolute, uniform, and asymptotic stability will be defined in Section 6.2.2, with examples of each. Each depends upon the region of validity of the analysis about singular points (see Chapter 5). For linear equations the issue is moot, since a linear system exhibits the same behavioral characteristics throughout the real space in which it is defined—typically the entire x-y plane or n-dimensional space. In nonlinear systems, we must restrict our attention to neighborhoods around the operating point.
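The restriction to a neighborhood of an operating point can be illustrated by linearization: near an equilibrium, local asymptotic stability follows when all eigenvalues of the Jacobian have negative real parts, but the conclusion holds only locally. The sketch below, a hypothetical example using a damped pendulum (the function names `f` and `jacobian` and the damping value are illustrative assumptions, not taken from the text), shows how the same nonlinear system can be locally stable at one equilibrium and unstable at another:

```python
import numpy as np

def f(x, damping=0.5):
    """Hypothetical nonlinear dynamics: damped pendulum,
    x = [angle, angular velocity]; equilibria where f(x) = 0."""
    theta, omega = x
    return np.array([omega, -damping * omega - np.sin(theta)])

def jacobian(f, x0, eps=1e-6):
    """Finite-difference Jacobian of f at the point x0."""
    n = len(x0)
    J = np.zeros((n, n))
    fx = f(x0)
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x0 + dx) - fx) / eps
    return J

# Linearize about the hanging equilibrium (0, 0) and the inverted one (pi, 0).
for eq in (np.array([0.0, 0.0]), np.array([np.pi, 0.0])):
    eigs = np.linalg.eigvals(jacobian(f, eq))
    stable = bool(np.all(eigs.real < 0))
    print(eq, "locally asymptotically stable" if stable else "unstable")
```

The hanging equilibrium has eigenvalues with negative real parts (locally asymptotically stable), while the inverted equilibrium has an eigenvalue with positive real part (unstable) — a distinction a single global linear analysis could never capture.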
Keywords: Equilibrium Point, Singular Point, Asymptotic Stability, Nonlinear System Dynamic, Negative Real Part