
Abstract

An important problem that arises in many applications is the following adaptive problem: given a sequence of n × 1 input column vectors {h_i} and a corresponding sequence of desired scalar responses {d_i}, find an estimate of an n × 1 column vector of weights w such that the sum of squared errors, \(\sum\nolimits_{i = 0}^N {{{\left| {{d_i} - h_i^Tw} \right|}^2}}\), is minimized. The pairs {h_i, d_i} are most often presented sequentially, so one requires an adaptive scheme that recursively updates the estimate of w. The least-mean-squares (LMS) algorithm was originally conceived as an approximate solution to this adaptive problem: it recursively updates the estimate of the weight vector along the direction of the instantaneous gradient of the squared error [1]. The introduction of the LMS adaptive filter in 1960 was a significant development for a broad range of engineering applications, since the LMS adaptive linear-estimation procedure requires essentially no advance knowledge of the signal statistics. The LMS algorithm, however, has long been regarded as only an approximate minimizer of the above squared-error criterion, and a rigorous minimization criterion has been missing.
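The recursion the abstract refers to can be sketched as follows. This is a minimal illustrative implementation of the standard LMS update w_{i+1} = w_i + μ h_i (d_i − h_iᵀ w_i); the function name, step size, and synthetic data below are my own choices for illustration, not taken from the chapter.

```python
import numpy as np

def lms(h, d, mu=0.05, w0=None):
    """Run the LMS recursion on a sequence of (input, response) pairs.

    h  : (N, n) array whose rows are the input vectors h_i
    d  : (N,) array of desired scalar responses d_i
    mu : step size (illustrative value; stability depends on the data)
    Returns the final estimate of the weight vector w.
    """
    N, n = h.shape
    w = np.zeros(n) if w0 is None else np.asarray(w0, dtype=float)
    for i in range(N):
        e = d[i] - h[i] @ w      # instantaneous prediction error
        w = w + mu * e * h[i]    # step along the instantaneous gradient
    return w

# Synthetic example: recover a known weight vector from noiseless data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
H = rng.standard_normal((500, 3))
d = H @ w_true
w_hat = lms(H, d, mu=0.05)
```

With noiseless data and a sufficiently small step size, the estimate converges toward the least-squares solution; with noisy data, the step size trades tracking speed against steady-state misadjustment.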


References

  1. B. Widrow and M. E. Hoff, Jr. Adaptive switching circuits. In IRE WESCON Conv. Rec., Pt. 4, pages 96–104, 1960.

  2. S. Haykin. Adaptive Filter Theory. Prentice Hall, Englewood Cliffs, NJ, second edition, 1991.

  3. P. E. Werbos. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD dissertation, Harvard University, Cambridge, MA, Aug. 1974.

  4. D. B. Parker. Learning-Logic. Invention Report S81-64, File 1, Office of Technology Licensing, Stanford University, Stanford, CA, Oct. 1982.

  5. D. E. Rumelhart, J. L. McClelland, and the PDP Research Group. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press, Cambridge, MA, 1986.

  6. G. Zames. Feedback and optimal sensitivity: model reference transformations, multiplicative seminorms, and approximate inverses. IEEE Trans. on Automatic Control, AC-26:301–320, 1981.

  7. P. P. Khargonekar and K. M. Nagpal. Filtering and smoothing in an H∞ setting. IEEE Trans. on Automatic Control, AC-36:151–166, 1991.

  8. B. Hassibi, A. H. Sayed, and T. Kailath. LMS is H∞ optimal. To appear in Proc. IEEE Conference on Decision and Control, San Antonio, TX, Dec. 1993.

  9. B. Widrow and S. D. Stearns. Adaptive Signal Processing. Prentice-Hall, Englewood Cliffs, NJ, 1985.

  10. T. Kailath. Linear Systems. Prentice Hall, Englewood Cliffs, NJ, 1980.

  11. P. Whittle. Risk Sensitive Optimal Control. John Wiley and Sons, New York, 1990.

  12. K. Glover and D. Mustafa. Derivation of the maximum entropy H∞ controller and a state space formula for its entropy. Int. J. Control, 50:899–916, 1989.

  13. B. Hassibi, A. H. Sayed, and T. Kailath. Recursive linear estimation in Krein spaces, Part II: Applications. To appear in Proc. IEEE Conference on Decision and Control, San Antonio, TX, Dec. 1993.

  14. J. A. Ball and J. W. Helton. Nonlinear H∞ control theory for stable plants. Math. Control Signals Systems, 5:233–261, 1992.

  15. M. Vidyasagar. Nonlinear System Analysis. Prentice-Hall, Englewood Cliffs, NJ, second edition, 1993.


© 1994 Springer Science+Business Media New York

About this chapter

Cite this chapter

Hassibi, B., Sayed, A.H., Kailath, T. (1994). LMS and Backpropagation are Minimax Filters. In: Roychowdhury, V., Siu, KY., Orlitsky, A. (eds) Theoretical Advances in Neural Computation and Learning. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-2696-4_12


  • DOI: https://doi.org/10.1007/978-1-4615-2696-4_12

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-6160-2

  • Online ISBN: 978-1-4615-2696-4
