Abstract
In most cases, automatic control systems are complicated devices consisting of an object to be controlled (the plant or process) and a controller. The task of the controller is to maintain continuously either the stationary operating conditions of the plant or conditions that change according to a given law. Any deviations from the desired conditions that arise in the control system must decay to zero with time; in other words, the control system must be asymptotically stable.
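A minimal illustrative sketch (not taken from the chapter) of how the direct method of Liapunov establishes asymptotic stability for a linear system x' = Ax: if a symmetric positive-definite matrix P solves the Lyapunov equation AᵀP + PA = −Q for some positive-definite Q, then V(x) = xᵀPx is a Liapunov function and the origin is asymptotically stable. The function name and the example plant (a damped oscillator) are assumptions chosen for illustration.

```python
# Hypothetical sketch: Lyapunov's direct method for a 2x2 linear system x' = A x.
# If symmetric positive-definite P solves A^T P + P A = -Q (Q positive definite),
# then V(x) = x^T P x decreases along trajectories, proving asymptotic stability.
# Example plant (an assumption): damped oscillator x'' + 3x' + 2x = 0,
# written in state form with A = [[0, 1], [-2, -3]].

def solve_lyapunov_2x2(A, Q):
    """Solve A^T P + P A = -Q for symmetric P = [[p11, p12], [p12, p22]] (2x2 only)."""
    a, b = A[0]
    c, d = A[1]
    # Writing A^T P + P A = -Q entrywise gives three equations in p11, p12, p22:
    #  (1,1): 2a*p11 + 2c*p12             = -Q[0][0]
    #  (1,2): b*p11 + (a+d)*p12 + c*p22   = -Q[0][1]
    #  (2,2): 2b*p12 + 2d*p22             = -Q[1][1]
    M = [[2 * a, 2 * c, 0.0],
         [b, a + d, c],
         [0.0, 2 * b, 2 * d]]
    rhs = [-Q[0][0], -Q[0][1], -Q[1][1]]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        rhs[i], rhs[piv] = rhs[piv], rhs[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for k in range(i, 3):
                M[r][k] -= f * M[i][k]
            rhs[r] -= f * rhs[i]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (rhs[i] - sum(M[i][k] * x[k] for k in range(i + 1, 3))) / M[i][i]
    p11, p12, p22 = x
    return [[p11, p12], [p12, p22]]

A = [[0.0, 1.0], [-2.0, -3.0]]
Q = [[1.0, 0.0], [0.0, 1.0]]  # take Q = I
P = solve_lyapunov_2x2(A, Q)
print(P)  # [[1.25, 0.25], [0.25, 0.25]]
# Sylvester's criterion: p11 > 0 and det(P) > 0, so P is positive definite
# and V(x) = x^T P x certifies asymptotic stability of the origin.
```

Checking that the returned P is positive definite (leading minors positive) is exactly the verification step of the direct method: a quadratic Liapunov function has been exhibited, so every deviation from the equilibrium decays to zero.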
Copyright information
© 1997 Springer-Verlag New York, Inc.
About this chapter
Cite this chapter
Merkin, D.R. (1997). Application of the Direct Method of Liapunov to the Investigation of Automatic Control Systems. In: Introduction to the Theory of Stability. Texts in Applied Mathematics, vol 24. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-4046-4_9
Print ISBN: 978-1-4612-8477-2
Online ISBN: 978-1-4612-4046-4