A general approach of least squares estimation and optimal filtering

Optimization and Engineering

Abstract

The least squares method makes it possible to fit the parameters of a mathematical model to experimental data. This article proposes a general approach to this method. After introducing the method and giving a formal definition, the transitivity of the method and some numerical considerations are discussed. Two particular cases are then considered: the usual least squares method and the Generalized Least Squares method. In both cases, the estimator and its variance are characterized in the time domain and in the Fourier domain. Finally, the equivalence between the Generalized Least Squares method and the optimal filtering technique using a matched filter is established.
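
The abstract states the estimators without writing them out. As a point of reference, the sketch below gives the textbook ordinary least squares and Generalized Least Squares formulas for a linear model y = H·theta + n with noise covariance C; the design matrix, the covariance model and all variable names are illustrative assumptions and are not taken from the article.

    import numpy as np

    def ols(H, y):
        # Ordinary least squares: theta = (H^T H)^{-1} H^T y
        return np.linalg.solve(H.T @ H, H.T @ y)

    def gls(H, y, C):
        # Generalized Least Squares (Aitken 1935):
        # theta = (H^T C^{-1} H)^{-1} H^T C^{-1} y,
        # with estimator covariance (H^T C^{-1} H)^{-1}.
        Ci_H = np.linalg.solve(C, H)   # C^{-1} H
        Ci_y = np.linalg.solve(C, y)   # C^{-1} y
        theta = np.linalg.solve(H.T @ Ci_H, H.T @ Ci_y)
        cov = np.linalg.inv(H.T @ Ci_H)
        return theta, cov

    # Illustrative data: a straight-line model observed in correlated noise.
    rng = np.random.default_rng(0)
    t = np.arange(100.0)
    H = np.column_stack([np.ones_like(t), t])    # design matrix
    C = 0.5 ** np.abs(np.subtract.outer(t, t))   # AR(1)-like noise covariance
    y = H @ np.array([1.0, 0.2]) + rng.multivariate_normal(np.zeros(t.size), C)

    print("OLS estimate:", ols(H, y))
    print("GLS estimate:", gls(H, y, C)[0])

For a model with a single column in H, the GLS estimate reduces to a C^{-1}-weighted projection of the data onto the model, which is the matched-filter reading whose equivalence with the Generalized Least Squares method the article establishes.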

Notes

  1. As stated by the Wiener-Khintchine theorem, S is the Fourier transform of R (Lampard 1954).
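
For reference, a standard statement of the theorem invoked in this note, for a wide-sense stationary process and with the Fourier convention below chosen as an assumption, is

    S(f) = \int_{-\infty}^{+\infty} R(\tau)\, e^{-2\pi i f \tau}\, \mathrm{d}\tau ,
    \qquad
    R(\tau) = \int_{-\infty}^{+\infty} S(f)\, e^{+2\pi i f \tau}\, \mathrm{d}f ,

where R is the autocorrelation function and S the power spectral density; Lampard (1954) extends this relation to nonstationary processes.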

References

  • Aitken AC (1935) On least squares and linear combinations of observations. Proc R Soc Edinb 55:42–48

  • Birkes D, Dodge Y (1993) Alternative methods of regression. Wiley series in probability and statistics. Wiley, New York

  • Björck Å (1967) Solving linear least squares problems by Gram-Schmidt orthogonalization. BIT Numer Math 7(1):1–21

  • Cornillon PA, Matzner-Løber E (2007) Régression: théorie et applications. Springer, Paris

  • El-Khaiary MI (2008) Least-squares regression of adsorption equilibrium data: comparing the options. J Hazard Mater 158(1):73–87

  • Geromel JC (1999) Optimal linear filtering under parameter uncertainty. IEEE Trans Signal Process 47(1):168–175

  • Hambaba ML (1992) The robust generalized least-squares estimator. Signal Process 26(3):359–368

  • Hoerl AE, Kennard RW (1970) Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1):55–67

  • Hoerl AE, Kennard RW (1976) Ridge regression iterative estimation of the biasing parameter. Commun Stat, Theory Methods 5(1):77–88

  • Hoerl AE, Kannard RW, Baldwin KF (1975) Ridge regression: some simulations. Commun Stat 4(2):105–123

  • Holland PW, Welsch RE (1977) Robust regression using iteratively reweighted least-squares. Commun Stat, Theory Methods 6(9):813–827

  • Lampard DG (1954) Generalization of the Wiener-Khintchine theorem to nonstationary processes. J Appl Phys 25(6):802–803

  • Legendre AM (1820) Nouvelles méthodes pour la détermination des orbites des comètes. Chez Firmin Didot, Paris

  • Ling F, Manolakis D, Proakis J (1986) A recursive modified Gram-Schmidt algorithm for least-squares estimation. IEEE Trans Acoust Speech Signal Process 34(4):829–836

  • Luati A, Proietti T (2011) On the equivalence of the weighted least squares and the generalised least squares estimators, with applications to kernel smoothing. Ann Inst Stat Math 63(4):851–871

  • Osborne MR, Presnell B, Turlach BA (2000) On the lasso and its dual. J Comput Graph Stat 9(2):319–337

  • Papoulis A (1977) Signal analysis. McGraw-Hill, New York

  • Pinsker MS (1980) Optimal filtering of square-integrable signals in Gaussian noise. Probl Pereda Inf 16(2):52–68

  • Poch J, Villaescusa I (2012) Orthogonal distance regression: a good alternative to least squares for modeling sorption data. J Chem Eng Data 57(2):490–499

  • Scott AJ, Holt D (1982) The effect of two-stage sampling on ordinary least squares methods. J Am Stat Assoc 77(380):848–854

  • Tibshirani R (1996) Regression shrinkage and selection via the lasso. J R Stat Soc B 58(1):267–288

  • Yang WY (2009) Discrete-time Fourier analysis. In: Signals and systems with MATLAB. Springer, Berlin, pp 129–205

Acknowledgements

The author is grateful to CNES (Centre National d’Études Spatiales) for its financial support.

Author information

Correspondence to Benjamin Lenoir.

About this article

Cite this article

Lenoir, B. A general approach of least squares estimation and optimal filtering. Optim Eng 15, 609–617 (2014). https://doi.org/10.1007/s11081-013-9217-7
