
Nonparametric Estimation of Edge Values of Regression Functions

  • Tomasz Galkowski
  • Miroslaw Pawlak
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9693)

Abstract

In this article we investigate the problem of estimating regression functions at the edge points of their domain. We refer to the model \(y_i = R(x_i) + \epsilon_i,\ i = 1, 2, \ldots, n\), where the \(x_i \in D\) are deterministic inputs, the \(y_i\) are noisy outputs, and \(\epsilon_i\) is a measurement noise with zero mean and bounded variance. R(.) is a completely unknown function. A possible way of recovering the unknown function is to apply algorithms based on the Parzen kernel [13, 31]. A commonly known drawback of these algorithms is that the estimation error increases dramatically as the point of estimation x drifts toward the left or right bound of the interval D, which makes it impossible to estimate the function accurately at the edge values of the domain.
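For illustration, the following is a minimal sketch of a Gasser-Müller-type (integral Parzen kernel) estimate of R on an equispaced deterministic design in D = [0, 1]; the Epanechnikov kernel, the bandwidth h, the test function R and the noise level are illustrative assumptions, not choices taken from the paper. Evaluating the estimate at points close to 0 and 1 exhibits the boundary effect described above.

```python
# A minimal sketch of an integral (Gasser-Mueller type) Parzen kernel
# regression estimate; kernel, bandwidth and data below are assumptions.
import numpy as np

def gasser_mueller(x_eval, x, y, h, a=0.0, b=1.0):
    """Integral kernel estimate of R at the points x_eval.

    x is assumed sorted inside [a, b]; observation y_i is weighted by the
    integral of the scaled kernel over the cell surrounding x_i.
    """
    # cell boundaries: midpoints between design points, padded by the domain edges
    edges = np.concatenate(([a], (x[:-1] + x[1:]) / 2.0, [b]))

    def K_int(t):
        # antiderivative of the Epanechnikov kernel K(t) = 0.75*(1 - t^2) on [-1, 1]
        t = np.clip(t, -1.0, 1.0)
        return 0.75 * (t - t ** 3 / 3.0) + 0.5

    est = np.empty_like(x_eval, dtype=float)
    for j, xe in enumerate(x_eval):
        w = K_int((xe - edges[:-1]) / h) - K_int((xe - edges[1:]) / h)
        est[j] = np.sum(w * y)
    return est

rng = np.random.default_rng(0)
n, h = 200, 0.08
x = (np.arange(1, n + 1) - 0.5) / n          # deterministic equispaced design in D = [0, 1]
R = lambda t: 2.0 + np.sin(2.0 * np.pi * t)  # stand-in for the unknown function R(.)
y = R(x) + rng.normal(scale=0.2, size=n)     # noisy outputs with zero-mean errors

x_eval = np.array([0.0, 0.02, 0.5, 0.98, 1.0])
# columns: evaluation point, kernel estimate, true value; note the large error near 0 and 1,
# where part of the kernel mass falls outside the domain
print(np.c_[x_eval, gasser_mueller(x_eval, x, y, h), R(x_eval)])
```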

The main goal of this paper is the application of the NMS algorithm (introduced in [11]), which is based on the integral version of the Parzen kernel method of function estimation combined with the idea of linear approximation. The results of numerical experiments are presented.
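As a rough illustration of how a kernel estimate can be combined with a linear approximation near the edges, the sketch below fits a straight line to the interior estimate in a strip next to each edge and extrapolates it toward the boundary. This is only a generic linear-extrapolation correction built on the hypothetical `gasser_mueller` function from the previous snippet; it is not the NMS algorithm of [11].

```python
# A hedged sketch of a linear-approximation boundary correction (not the NMS
# algorithm from [11]): the interior kernel estimate is fitted with a straight
# line over a strip next to each edge and extrapolated to the boundary.
import numpy as np

def linear_edge_correction(x_eval, x, y, h, estimator, strip=None):
    """Replace the raw kernel estimate inside the boundary strips by a straight
    line fitted to the estimate just inside the interior."""
    strip = 2.0 * h if strip is None else strip
    est = estimator(x_eval, x, y, h)
    for side in ("left", "right"):
        if side == "left":
            grid = np.linspace(x[0] + strip, x[0] + 2.0 * strip, 10)
            mask = x_eval < x[0] + strip
        else:
            grid = np.linspace(x[-1] - 2.0 * strip, x[-1] - strip, 10)
            mask = x_eval > x[-1] - strip
        if not np.any(mask):
            continue
        a, b = np.polyfit(grid, estimator(grid, x, y, h), 1)  # local line a*t + b
        est[mask] = a * x_eval[mask] + b                      # extrapolate to the edge
    return est

# usage with the hypothetical names from the previous snippet:
# corrected = linear_edge_correction(x_eval, x, y, h, gasser_mueller)
```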

Keywords

Nonparametric estimation · Parzen kernel · Boundary problem · Regression

References

  1. Aghdam, M.H., Heidari, S.: Feature selection using particle swarm optimization in text categorization. J. Artif. Intell. Soft Comput. Res. 5(4), 231–238 (2015)
  2. Bas, E.: The training of multiplicative neuron model artificial neural networks with differential evolution algorithm for forecasting. J. Artif. Intell. Soft Comput. Res. 6(1), 5–11 (2016)
  3. Bertini Jr., J.R., Carmo, N.M.: Enhancing constructive neural network performance using functionally expanded input data. J. Artif. Intell. Soft Comput. Res. 6(2), 119–131 (2016)
  4. Chen, S.X.: Beta kernel estimators for density functions. J. Stat. Plann. Infer. 139, 2269–2283 (2009)
  5. Chu, J.L., Krzyzak, A.: The recognition of partially occluded objects with support vector machines, convolutional neural networks and deep belief networks. J. Artif. Intell. Soft Comput. Res. 4(1), 5–19 (2014)
  6. Cierniak, R., Rutkowski, L.: On image compression by competitive neural networks and optimal linear predictors. Sig. Process.-Image Commun. 15(6), 559–565 (2000)
  7. Duch, W., Korbicz, J., Rutkowski, L., Tadeusiewicz, R. (eds.): Biocybernetics and Biomedical Engineering 2000. Neural Networks, vol. 6. Akademicka Oficyna Wydawnicza EXIT, Warsaw (2000) (in Polish)
  8. Galkowski, T., Rutkowski, L.: Nonparametric recovery of multivariate functions with applications to system identification. Proc. IEEE 73, 942–943 (1985)
  9. Galkowski, T., Rutkowski, L.: Nonparametric fitting of multivariable functions. IEEE Trans. Autom. Control AC-31, 785–787 (1986)
  10. Galkowski, T.: Nonparametric estimation of boundary values of functions. Arch. Control Sci. 3(1–2), 85–93 (1994)
  11. Gałkowski, T.: Kernel estimation of regression functions in the boundary regions. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2013, Part II. LNCS, vol. 7895, pp. 158–166. Springer, Heidelberg (2013)
  12. Galkowski, T., Pawlak, M.: Nonparametric extension of regression functions outside domain. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2014, Part I. LNCS, vol. 8467, pp. 518–530. Springer, Heidelberg (2014)
  13. Gasser, T., Muller, H.G.: Kernel estimation of regression functions. Lecture Notes in Mathematics, vol. 757. Springer, Heidelberg (1979)
  14. Greblicki, W., Rutkowski, L.: Density-free Bayes risk consistency of nonparametric pattern recognition procedures. Proc. IEEE 69(4), 482–483 (1981)
  15. Greblicki, W., Rutkowska, D., Rutkowski, L.: An orthogonal series estimate of time-varying regression. Ann. Inst. Stat. Math. 35(1), 215–228 (1983)
  16. Hazelton, M.L., Marshall, J.C.: Linear boundary kernels for bivariate density estimation. Stat. Prob. Lett. 79, 999–1003 (2009)
  17. Karunamuni, R.J., Alberts, T.: On boundary correction in kernel density estimation. Stat. Methodol. 2, 191–212 (2005)
  18. Karunamuni, R.J., Alberts, T.: A locally adaptive transformation method of boundary correction in kernel density estimation. J. Stat. Plann. Infer. 136, 2936–2960 (2006)
  19. Kitajima, R., Kamimura, R.: Accumulative information enhancement in the self-organizing maps and its application to the analysis of mission statements. J. Artif. Intell. Soft Comput. Res. 5(3), 161–176 (2015)
  20. Knop, M., Kapuscinski, T., Mleczko, W.K.: Video key frame detection based on the restricted Boltzmann machine. J. Appl. Math. Comput. Mech. 14(3), 49–58 (2015)
  21. Korytkowski, M., Nowicki, R., Scherer, R.: Neuro-fuzzy rough classifier ensemble. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds.) ICANN 2009, Part I. LNCS, vol. 5768, pp. 817–823. Springer, Heidelberg (2009)
  22. Korytkowski, M., Rutkowski, L., Scherer, R.: Fast image classification by boosting fuzzy classifiers. Inf. Sci. 327, 175–182 (2016)
  23. Koshiyama, A.S., Vellasco, M., Tanscheit, R.: GPFIS-control: a genetic fuzzy system for control tasks. J. Artif. Intell. Soft Comput. Res. 4(3), 167–179 (2014)
  24. Kyung-Joon, C., Schucany, W.R.: Nonparametric kernel regression estimation near endpoints. J. Stat. Plann. Infer. 66, 289–304 (1998)
  25. Marshall, J.C., Hazelton, M.L.: Boundary kernels for adaptive density estimators on regions with irregular boundaries. J. Multivar. Anal. 101, 949–963 (2010)
  26. Laskowski, L.: A novel hybrid-maximum neural network in stereo-matching process. Neural Comput. Appl. 23(7–8), 2435–2450 (2013)
  27. Laskowski, L., Jelonkiewicz, J.: Self-correcting neural network for stereo-matching problem solving. Fundamenta Informaticae 138, 1–26 (2015)
  28. Müller, H.G.: Smooth optimum kernel estimators near endpoints. Biometrika 78, 521–530 (1991)
  29. Nikulin, V.: Prediction of the shoppers loyalty with aggregated data streams. J. Artif. Intell. Soft Comput. Res. 6(2), 69–79 (2016)
  30. Nowak, B.A., Nowicki, R.K., Starczewski, J.T., Marvuglia, A.: The learning of neuro-fuzzy classifier with fuzzy rough sets for imprecise datasets. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2014, Part I. LNCS, vol. 8467, pp. 256–266. Springer, Heidelberg (2014)
  31. Parzen, E.: On estimation of a probability density function and mode. Ann. Math. Stat. 33(3), 1065–1076 (1962)
  32. Poměnková-Dluhá, J.: Edge effects of Gasser-Müller estimator. Mathematica 15, pp. 307–314. Masaryk University, Brno (2004)
  33. Rafajlowicz, E.: Nonparametric least squares estimation of a regression function. Statistics: J. Theor. Appl. Stat. 19(3), 349–358 (1988)
  34. Rafajlowicz, E., Schwabe, R.: Halton and Hammersley sequences in multivariate nonparametric regression. Stat. Prob. Lett. 76(8), 803–812 (2006)
  35. Rutkowska, A.: Influence of membership function's shape on portfolio optimization results. J. Artif. Intell. Soft Comput. Res. 6(1), 45–54 (2016)
  36. Rutkowski, L.: Sequential estimates of probability densities by orthogonal series and their application in pattern classification. IEEE Trans. Syst. Man Cybern. SMC-10(12), 918–920 (1980)
  37. Rutkowski, L.: On Bayes risk consistent pattern recognition procedures in a quasi-stationary environment. IEEE Trans. Pattern Anal. Mach. Intell. PAMI-4(1), 84–87 (1982)
  38. Rutkowski, L.: Online identification of time-varying systems by nonparametric techniques. IEEE Trans. Autom. Control 27(1), 228–230 (1982)
  39. Rutkowski, L.: On nonparametric identification with prediction of time-varying systems. IEEE Trans. Autom. Control AC-29, 58–60 (1984)
  40. Rutkowski, L.: Nonparametric identification of quasi-stationary systems. Syst. Control Lett. 6, 33–35 (1985)
  41. Rutkowski, L.: Real-time identification of time-varying systems by non-parametric algorithms based on Parzen kernels. Int. J. Syst. Sci. 16, 1123–1130 (1985)
  42. Rutkowski, L.: A general approach for nonparametric fitting of functions and their derivatives with applications to linear circuits identification. IEEE Trans. Circuits Syst. 33, 812–818 (1986)
  43. Rutkowski, L.: Sequential pattern recognition procedures derived from multiple Fourier series. Pattern Recogn. Lett. 8, 213–216 (1988)
  44. Rutkowski, L.: Application of multiple Fourier series to identification of multivariable nonstationary systems. Int. J. Syst. Sci. 20(10), 1993–2002 (1989)
  45. Rutkowski, L.: Non-parametric learning algorithms in time-varying environments. Sig. Process. 18(2), 129–137 (1989)
  46. Rutkowski, L., Rafajłowicz, E.: On global rate of convergence of some nonparametric identification procedures. IEEE Trans. Autom. Control AC-34(10), 1089–1091 (1989)
  47. Rutkowski, L.: Identification of MISO nonlinear regressions in the presence of a wide class of disturbances. IEEE Trans. Inf. Theory IT-37, 214–216 (1991)
  48. Rutkowski, L.: Multiple Fourier series procedures for extraction of nonlinear regressions from noisy data. IEEE Trans. Sig. Process. 41(10), 3062–3065 (1993)
  49. Rutkowski, L.: Generalized regression neural networks in time-varying environment. IEEE Trans. Neural Netw. 15(3), 576–596 (2004)
  50. Rutkowski, L.: Adaptive probabilistic neural networks for pattern classification in time-varying environment. IEEE Trans. Neural Netw. 15(4), 811–827 (2004)
  51. Rutkowski, L., Pietruczuk, L., Duda, P., Jaworski, M.: Decision trees for mining data streams based on the McDiarmid's bound. IEEE Trans. Knowl. Data Eng. 25(6), 1272–1279 (2013)
  52. Rutkowski, L., Jaworski, M., Duda, P., Pietruczuk, L.: Decision trees for mining data streams based on the Gaussian approximation. IEEE Trans. Knowl. Data Eng. 26(1), 108–119 (2014)
  53. Rutkowski, L., Jaworski, M., Pietruczuk, L., Duda, P.: The CART decision trees mining data streams. Inf. Sci. 266, 1–15 (2014)
  54. Rutkowski, L., Jaworski, M., Pietruczuk, L., Duda, P.: A new method for data stream mining based on the misclassification error. IEEE Trans. Neural Netw. Learn. Syst. 26(5), 1048–1059 (2015)
  55. Schuster, E.F.: Incorporating support constraints into nonparametric estimators of densities. Commun. Stat. Part A - Theory Methods 14, 1123–1136 (1985)
  56. Skubalska-Rafajlowicz, E.: Pattern recognition algorithms based on space-filling curves and orthogonal expansions. IEEE Trans. Inf. Theory 47(5), 1915–1927 (2001)
  57. Skubalska-Rafajlowicz, E.: Random projection RBF nets for multidimensional density estimation. Int. J. Appl. Math. Comput. Sci. 18(4), 455–464 (2008)
  58. Szarek, A., Korytkowski, M., Rutkowski, L., Scherer, R., Szyprowski, J.: Application of neural networks in assessing changes around implant after total hip arthroplasty. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2012, Part II. LNCS, vol. 7268, pp. 335–340. Springer, Heidelberg (2012)
  59. Wang, Z., Zhang-Westmant, L.: New ranking method for fuzzy numbers by their expansion center. J. Artif. Intell. Soft Comput. Res. 4(3), 181–187 (2014)
  60. Zhang, S., Karunamuni, R.J.: On kernel density estimation near endpoints. J. Stat. Plann. Infer. 70, 301–316 (1998)
  61. Zhang, S., Karunamuni, R.J.: Deconvolution boundary kernel method in nonparametric density estimation. J. Stat. Plann. Infer. 139, 2269–2283 (2009)
  62. Zhang, S., Karunamuni, R.J.: Boundary performance of the beta kernel estimators. Nonparametric Stat. 22, 81–104 (2010)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Institute of Computational Intelligence, Czestochowa University of Technology, Czestochowa, Poland
  2. Information Technology Institute, University of Social Sciences, Lodz, Poland
  3. Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, Canada