
Circuits, Systems, and Signal Processing, Volume 38, Issue 5, pp 2114–2137

Incorporating Nonparametric Knowledge to the Least Mean Square Adaptive Filter

  • Soheila Ashkezari-Toussi
  • Hadi Sadoghi-Yazdi
Article

Abstract

In the framework of maximum a posteriori (MAP) estimation, the present study proposes the nonparametric probabilistic least mean square (NPLMS) adaptive filter for estimating an unknown parameter vector from noisy data. The NPLMS combines parameter space and signal space by merging prior knowledge of the probability distribution of the process with the evidence present in the signal. By exploiting kernel density estimation to estimate the prior distribution, the NPLMS is robust against both Gaussian and non-Gaussian noise. To achieve this, some of the intermediate estimates are buffered and then used to estimate the prior distribution. Unlike bias-compensated algorithms, there is no need to estimate the input noise variance. A theoretical analysis of the NPLMS is derived. In addition, a variable step-size version of the NPLMS is provided to reduce the steady-state error. Simulation results for system identification and prediction show the acceptable performance of the NPLMS in noisy stationary and non-stationary environments compared with bias-compensated and normalized LMS algorithms.
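
The abstract outlines the algorithmic idea: buffer intermediate weight estimates, build a kernel density estimate of the prior over the weights, and combine that prior with the usual LMS evidence term in a MAP-style update. The Python sketch below illustrates only that general recipe; the function name, the Gaussian kernel, the fixed bandwidth, and the `prior_weight` trade-off are illustrative assumptions and not the published NPLMS recursion or its variable step-size variant (see the paper for the exact update).

```python
import numpy as np

def map_regularized_lms(x, d, M=8, mu=0.05, buf_len=50,
                        bandwidth=0.1, prior_weight=0.01):
    """Illustrative MAP-regularized LMS (a sketch, not the published NPLMS):
    standard LMS data term plus the gradient of the log of a Gaussian-kernel
    density estimate built from a buffer of past weight estimates."""
    N = len(d)
    w = np.zeros(M)
    buffer = []                      # buffered intermediate weight estimates
    y = np.zeros(N)
    e = np.zeros(N)
    for n in range(M, N):
        u = x[n - M:n][::-1]         # current input regressor
        y[n] = w @ u
        e[n] = d[n] - y[n]
        if buffer:
            B = np.array(buffer)                     # (K, M) past estimates
            diff = w - B                             # (K, M)
            k = np.exp(-np.sum(diff**2, axis=1) / (2 * bandwidth**2))
            # gradient of log KDE: acts as the nonparametric prior term
            grad_log_prior = -(k[:, None] * diff).sum(axis=0) / (
                bandwidth**2 * (k.sum() + 1e-12))
        else:
            grad_log_prior = np.zeros(M)
        # MAP-style update: evidence (LMS) term plus prior term
        w = w + mu * e[n] * u + prior_weight * grad_log_prior
        buffer.append(w.copy())
        if len(buffer) > buf_len:
            buffer.pop(0)
    return w, y, e
```

As a usage sketch, `w, y, e = map_regularized_lms(x, d)` with an input sequence `x` and desired signal `d` returns the final weight estimate, filter output, and error sequence; the prior term nudges the weights toward regions where past estimates concentrated, which is the role the kernel-density prior plays in the method described above.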

Keywords

Least mean square · Adaptive filter · Maximum a posteriori estimation · Kernel density estimation · Probabilistic modeling


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Soheila Ashkezari-Toussi (1, 2)
  • Hadi Sadoghi-Yazdi (1, 2)
  1. Department of Computer Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
  2. Center of Excellence on Soft Computing and Intelligent Information Processing, Ferdowsi University of Mashhad, Mashhad, Iran
