Abstract
Optimal control problems for bilinear systems are studied and solved with a view to approximating analogous problems for general nonlinear systems. For a given bilinear optimal control problem, a sequence of linear problems is constructed, and their solutions are shown to converge to the desired solution. The direct solution of the Hamilton-Jacobi equation is also analyzed, and a power-series approach is presented that requires only offline calculations, as in the linear (Riccati-equation) case. The methods are compared and illustrated, and their relations to classical linear systems theory are discussed.
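The successive-linearization idea described above can be sketched numerically. The following is a minimal illustration, not the authors' algorithm: for a scalar discrete-time bilinear system x[k+1] = a·x[k] + b·u[k] + n·x[k]·u[k], the bilinear term is frozen along the previous state trajectory, the resulting time-varying linear-quadratic problem is solved by a backward Riccati recursion, the true bilinear dynamics are re-simulated under the new feedback, and the process repeats until the trajectory iterates converge. All parameter values are hypothetical.

```python
import numpy as np

def bilinear_lq_iteration(a=0.9, b=0.5, n=0.2, q=1.0, r=1.0,
                          x0=1.0, N=20, tol=1e-8, max_iter=100):
    """Solve a scalar bilinear LQ problem by a sequence of linear problems.

    At each iteration the bilinear term n*x*u is frozen along the previous
    trajectory, giving effective input gains b_k = b + n*xbar_k, and the
    frozen time-varying LQ problem is solved by a backward Riccati recursion.
    Returns (trajectory, converged flag).
    """
    xbar = np.full(N, x0)                      # initial trajectory guess
    x = np.full(N + 1, x0)
    for _ in range(max_iter):
        beff = b + n * xbar                    # frozen input gains b_k
        # Backward Riccati recursion for the frozen linear problem.
        P = q
        K = np.zeros(N)
        for k in range(N - 1, -1, -1):
            Pn = P
            K[k] = a * beff[k] * Pn / (r + beff[k] ** 2 * Pn)
            P = q + a ** 2 * Pn - (a * beff[k] * Pn) ** 2 / (r + beff[k] ** 2 * Pn)
        # Forward pass on the true bilinear dynamics under the LQ feedback.
        x = np.empty(N + 1)
        x[0] = x0
        for k in range(N):
            u = -K[k] * x[k]
            x[k + 1] = a * x[k] + b * u + n * x[k] * u
        # Stop when successive trajectory iterates agree.
        if np.max(np.abs(x[:N] - xbar)) < tol:
            return x, True
        xbar = x[:N].copy()
    return x, False
```

For a mildly bilinear system (small n), the map from one frozen trajectory to the next is a contraction, so the linear solutions converge to a fixed point, mirroring the convergence result stated in the abstract.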
Communicated by G. Leitmann
Cebuhar, W.A., Costanza, V. Approximation procedures for the optimal control of bilinear and nonlinear systems. J Optim Theory Appl 43, 615–627 (1984). https://doi.org/10.1007/BF00935009