Non-parametric Residual Variance Estimation in Supervised Learning

  • Elia Liitiäinen
  • Amaury Lendasse
  • Francesco Corona
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4507)

Abstract

The residual variance estimation problem is well known in statistics and machine learning, with many applications, for example in nonlinear modelling. In this paper, we show that the problem can be formulated in a general supervised learning context. Emphasis is placed on two widely used non-parametric techniques, the Delta test and the Gamma test. Under some regularity assumptions, a novel proof of convergence of the two estimators is formulated; the estimators are subsequently verified and compared on two meaningful study cases.
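
The Delta and Gamma tests named in the abstract are nearest-neighbour estimators of the residual (noise) variance, and they admit a compact implementation. The following is a minimal sketch in Python, assuming Euclidean nearest neighbours and a standard formulation of the two tests; the function names, the default number of neighbours k, and the use of scipy's cKDTree and numpy's polyfit are illustrative assumptions, not code from the paper.

import numpy as np
from scipy.spatial import cKDTree

def delta_test(X, y):
    """Delta test (sketch): half the mean squared difference between each
    output and the output of its nearest neighbour in input space."""
    _, idx = cKDTree(X).query(X, k=2)          # k=2: each point's closest match is itself
    nn = idx[:, 1]                             # index of the true nearest neighbour
    return 0.5 * np.mean((y[nn] - y) ** 2)

def gamma_test(X, y, k=10):
    """Gamma test (sketch): regress the output half-MSE (gamma) on the mean
    squared input distance (delta) over the k nearest neighbours; the
    intercept of the fitted line estimates the residual variance."""
    dist, idx = cKDTree(X).query(X, k=k + 1)   # column 0 is the point itself
    deltas = np.mean(dist[:, 1:] ** 2, axis=0)                 # delta_1, ..., delta_k
    gammas = np.array([0.5 * np.mean((y[idx[:, j]] - y) ** 2)  # gamma_1, ..., gamma_k
                       for j in range(1, k + 1)])
    slope, intercept = np.polyfit(deltas, gammas, 1)           # gamma ~ slope*delta + intercept
    return intercept

For an n-by-d input array X and a length-n output array y, both functions return an estimate of the noise variance; under smoothness assumptions of the kind discussed in the paper, such estimates converge to the true residual variance as n grows.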

Keywords

Mean Square Error, Residual Variance, Supervised Learning, Gamma Test, Neighbor Distribution

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Elia Liitiäinen (1)
  • Amaury Lendasse (1)
  • Francesco Corona (1)
  1. Helsinki University of Technology, Laboratory of Computer and Information Science, P.O. Box 5400, FI-02015 HUT, Espoo, Finland
