Abstract
An important problem arising in many applications is the following adaptive problem: given a sequence of n × 1 input column vectors {h_i} and a corresponding sequence of desired scalar responses {d_i}, find an estimate of an n × 1 column vector of weights w that minimizes the sum of squared errors, \(\sum_{i=0}^{N} \left| d_i - h_i^T w \right|^2\). The pairs {h_i, d_i} are most often presented sequentially, so one requires an adaptive scheme that recursively updates the estimate of w. The least-mean-squares (LMS) algorithm was originally conceived as an approximate solution to this adaptive problem: it recursively updates the estimate of the weight vector along the direction of the instantaneous gradient of the squared error [1]. The introduction of the LMS adaptive filter in 1960 was a significant development for a broad range of engineering applications, since the LMS adaptive linear-estimation procedure requires essentially no advance knowledge of the signal statistics. The LMS algorithm, however, has long been regarded as merely an approximate minimizing solution to the above squared-error criterion, and a rigorous minimization criterion has been missing.
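The recursive update described above can be sketched in a few lines; this is a minimal illustration, not the authors' formulation, and the function name, step size mu, and test data are assumptions for the example. At each step the weight estimate moves along the instantaneous gradient of the squared error, i.e. in the direction of the current input scaled by the current estimation error.

```python
import numpy as np

def lms(H, d, mu=0.01, w0=None):
    """Least-mean-squares sketch: for each pair (h_i, d_i), take a
    gradient step on the instantaneous squared error |d_i - h_i^T w|^2.

    H  : (N, n) array whose rows are the input vectors h_i
    d  : (N,) array of desired scalar responses d_i
    mu : step size (must be small enough for stability)
    """
    N, n = H.shape
    w = np.zeros(n) if w0 is None else np.asarray(w0, dtype=float)
    for h_i, d_i in zip(H, d):
        e_i = d_i - h_i @ w      # a priori estimation error
        w = w + mu * e_i * h_i   # instantaneous-gradient step
    return w

# Illustrative data: noiseless responses from a known weight vector,
# so the recursion should recover w_true.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
H = rng.standard_normal((500, 2))
d = H @ w_true
w_hat = lms(H, d, mu=0.1)
```

Note that the update uses only the current pair (h_i, d_i), which is why no advance knowledge of the signal statistics is needed.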
References
B. Widrow and M. E. Hoff, Jr. Adaptive switching circuits. IRE WESCON Conv. Rec., part 4, pages 96–104, 1960.
S. Haykin. Adaptive Filter Theory. Prentice Hall, Englewood Cliffs, NJ, second edition, 1991.
P. E. Werbos. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD dissertation, Harvard University, Cambridge, MA, Aug. 1974.
D. B. Parker. Learning-Logic. Invention Report S81-64, File 1, Office of Technology Licensing, Stanford University, Stanford, CA, Oct. 1982.
D. E. Rumelhart, J. L. McClelland, and the PDP Research Group. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press, Cambridge, MA, 1986.
G. Zames. Feedback and optimal sensitivity: model reference transformations, multiplicative seminorms, and approximate inverses. IEEE Trans. on Automatic Control, AC-26:301–320, 1981.
P.P. Khargonekar and K.M. Nagpal. Filtering and smoothing in an H∞-setting. IEEE Trans. on Automatic Control, AC-36:151–166, 1991.
B. Hassibi, A. H. Sayed, and T. Kailath. LMS is H∞ Optimal. To appear in IEEE CDC, San Antonio, Texas, 1993.
B. Widrow and S. D. Stearns. Adaptive Signal Processing. Prentice-Hall, Englewood Cliffs, NJ, 1985.
T. Kailath. Linear Systems. Prentice Hall, Englewood Cliffs, NJ, 1980.
P. Whittle. Risk-Sensitive Optimal Control. John Wiley and Sons, New York, 1990.
K. Glover and D. Mustafa. Derivation of the maximum entropy H∞ controller and a state space formula for its entropy. Int. J. Control, 50:899–916, 1989.
B. Hassibi, A. H. Sayed, and T. Kailath. Recursive linear estimation in Krein spaces, Part II: Applications. To appear in Proc. IEEE Conference on Decision and Control, San Antonio, TX, Dec. 1993.
J. A. Ball and J. W. Helton. Nonlinear H∞ control theory for stable plants. Math. Control Signals Systems, 5:233–261, 1992.
M. Vidyasagar. Nonlinear System Analysis. Prentice-Hall, Englewood Cliffs, NJ, second edition, 1993.
© 1994 Springer Science+Business Media New York
Hassibi, B., Sayed, A.H., Kailath, T. (1994). LMS and Backpropagation are Minimax Filters. In: Roychowdhury, V., Siu, KY., Orlitsky, A. (eds) Theoretical Advances in Neural Computation and Learning. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-2696-4_12
Print ISBN: 978-1-4613-6160-2
Online ISBN: 978-1-4615-2696-4