
A Weighted Gaussian Kernel Least Mean Square Algorithm

Circuits, Systems, and Signal Processing

Abstract

In this work, a novel weighted kernel least mean square (WKLMS) algorithm is proposed by introducing a weighted Gaussian kernel, and its learning behavior is studied. Mean square error (MSE) analysis shows that the WKLMS algorithm outperforms both the least mean square (LMS) and kernel LMS (KLMS) algorithms in the transient as well as the steady-state regime. We study the effect of the weighted Gaussian kernel on the associated kernel matrix, in particular its eigenvalue spread and distribution, and show how these parameters affect the convergence behavior of the algorithm. Both the transient and steady-state MSE behaviors of the WKLMS algorithm are analyzed, and a stability bound is derived. For a non-stationary environment, a tracking analysis for a correlated random-walk channel is presented. We also prove that the steady-state excess MSE (EMSE) of the WKLMS is a Schur-convex function of the elements of its kernel weight matrix and hence follows the majorization of those elements; this result helps in deciding which kernel weight matrix provides better MSE performance. Simulation results are provided to contrast the performance of the proposed WKLMS algorithm with those of its counterparts, the KLMS and LMS algorithms, and the derived analytical results are validated via simulations for various step-size values.
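To make the setting concrete, the sketch below implements a standard KLMS-style recursion with a weighted Gaussian kernel of the assumed form \(\kappa (\textbf{u},\textbf{v})=\exp \left( -(\textbf{u}-\textbf{v})^T\textbf{W}(\textbf{u}-\textbf{v})\right) \). The kernel weight matrix, step size, and data model are illustrative assumptions, not the paper's exact configuration or analysis.

```python
import numpy as np

# Minimal sketch (not the paper's implementation): a KLMS-style filter with a
# weighted Gaussian kernel kappa(u, v) = exp(-(u - v)^T W (u - v)).  The kernel
# weight matrix W, step size, and data model below are illustrative assumptions.

def weighted_gaussian_kernel(u, v, W):
    """Gaussian kernel built on the weighted norm ||u - v||_W^2 = (u - v)^T W (u - v)."""
    d = u - v
    return np.exp(-(d @ W @ d))

class WeightedKLMS:
    def __init__(self, W, step_size=0.5):
        self.W = W            # symmetric positive-definite kernel weight matrix
        self.mu = step_size   # step size
        self.centers = []     # dictionary of past inputs
        self.coeffs = []      # expansion coefficients (step size times a-priori errors)

    def predict(self, u):
        # Kernel expansion: f(u) = sum_i a_i * kappa(u_i, u)
        return sum(a * weighted_gaussian_kernel(c, u, self.W)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, u, d):
        e = d - self.predict(u)          # a-priori estimation error
        self.centers.append(u)           # grow the dictionary with the new input
        self.coeffs.append(self.mu * e)  # KLMS-style coefficient update
        return e

# Example: learn a simple nonlinear mapping from noisy samples.
rng = np.random.default_rng(0)
filt = WeightedKLMS(W=np.diag([2.0, 0.5]), step_size=0.5)
for _ in range(500):
    u = rng.standard_normal(2)
    d = np.tanh(u[0] - 0.5 * u[1]) + 0.01 * rng.standard_normal()
    filt.update(u, d)
```

Under this assumed kernel form, choosing \(\textbf{W}\) as a scaled identity recovers the ordinary Gaussian kernel, and hence the standard KLMS, as a special case.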


Data Availability

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.

Notes

  1. \(\langle f, \kappa (\textbf{u}) \rangle _{\mathscr {H}_{\kappa }}=f(\textbf{u})\) (the reproducing property).

  2. For any vector \(\varvec{x}\) and matrix \(\varvec{A}\), the notation \(||\varvec{x}||_{\varvec{A}}^2\) denotes the weighted squared norm, i.e., \(||\varvec{x}||_{\varvec{A}}^2=\varvec{x}^T\varvec{A}\varvec{x}\); a short numerical check is given after these notes.

  3. However, the analysis applies to any dictionary size.
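As a quick illustration of the weighted-norm notation in note 2, the snippet below evaluates \(||\varvec{x}||_{\varvec{A}}^2=\varvec{x}^T\varvec{A}\varvec{x}\) for arbitrary example values (the vector and weight matrix are illustrative only) and confirms that it reduces to the ordinary squared Euclidean norm when \(\varvec{A}\) is the identity.

```python
import numpy as np

def weighted_sq_norm(x, A):
    """Weighted squared norm ||x||_A^2 = x^T A x (note 2)."""
    return float(x @ A @ x)

x = np.array([1.0, -2.0, 0.5])          # arbitrary example vector
A = np.diag([2.0, 1.0, 4.0])            # example symmetric positive-definite weight matrix

print(weighted_sq_norm(x, A))           # 2*1 + 1*4 + 4*0.25 = 7.0
print(weighted_sq_norm(x, np.eye(3)))   # ordinary squared norm: 1 + 4 + 0.25 = 5.25
```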


Acknowledgements

The authors would like to acknowledge the support provided by the Deanship of Research Oversight and Coordination (DROC) at King Fahd University of Petroleum & Minerals (KFUPM) for funding this work under the Interdisciplinary Research Center for Communication Systems and Sensing through project No. INCS2101.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Muhammad Moinuddin.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


Cite this article

Moinuddin, M., Zerguine, A. & Arif, M. A Weighted Gaussian Kernel Least Mean Square Algorithm. Circuits Syst Signal Process 42, 5267–5288 (2023). https://doi.org/10.1007/s00034-023-02337-y

