Computational Mathematics and Modeling, Volume 23, Issue 3, pp. 350–367

A note on kernel principal component regression


Kernel principal component regression (KPCR) was studied by Rosipal et al. [18, 19, 20], Hoegaerts et al. [7], and Jade et al. [8]. However, KPCR still encounters theoretical difficulties both in the procedure for constructing KPCR and in the rule for choosing the number of retained principal components. In this paper, we revise the KPCR method to overcome these difficulties. The performance of the revised method is compared with linear regression, nonlinear regression based on the Gompertz function, and nonparametric Nadaraya–Watson regression; the revised method gives better results than all three.
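To make the setting concrete, the following is a minimal sketch of standard KPCR (not the authors' revised procedure): compute a centered kernel matrix, extract its leading eigenvectors as kernel principal components, and regress the response on the resulting component scores by ordinary least squares. The RBF kernel, the choice of `q` retained components, and the toy data are illustrative assumptions.

```python
import numpy as np

def kpcr_fit_predict(X, y, q=3, gamma=1.0):
    """Standard kernel principal component regression with an RBF kernel:
    project the data onto the leading q kernel principal components,
    then fit ordinary least squares on those scores."""
    n = X.shape[0]
    # RBF (Gaussian) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Center the kernel matrix in feature space: Kc = J K J with J = I - 11'/n
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Eigendecomposition of the symmetric centered kernel; keep q largest
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:q]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scores of the training points on the retained components
    Z = vecs * np.sqrt(np.clip(vals, 0.0, None))
    # Ordinary least squares of y on the scores (with intercept)
    A = np.column_stack([np.ones(n), Z])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ beta

# Toy nonlinear data: y = sin(x) + noise
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 80)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
yhat = kpcr_fit_predict(X, y, q=5, gamma=0.5)
print(np.mean((y - yhat) ** 2))  # in-sample mean squared error
```

Because the regression is run on a small number of orthogonal component scores rather than on the raw (possibly collinear) inputs, this construction also addresses the multicollinearity issue listed in the keywords.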

Keywords

Nonlinear regression analysis · kernel principal component analysis · kernel principal component regression · multicollinearity


References

  1. H. Anton, Elementary Linear Algebra, John Wiley and Sons (2000).
  2. A. W. Bowman and A. Azzalini, Applied Smoothing Techniques for Data Analysis: The Kernel Approach with S-Plus Illustrations, Oxford Statistical Science Series (1997).
  3. K. I. Diamantaras and S. Y. Kung, Principal Component Neural Networks: Theory and Applications, John Wiley and Sons (1996).
  4. J. J. Faraway, Linear Models with R, Chapman and Hall/CRC (2005).
  5. W. Härdle, M. Müller, S. Sperlich, and A. Werwatz, Nonparametric and Semiparametric Models, Springer (2004).
  6. D. A. Harville, Matrix Algebra from a Statistician's Perspective, Springer (1997).
  7. L. Hoegaerts, J. A. K. Suykens, J. Vandewalle, and B. De Moor, "Subset based least squares subspace regression in reproducing kernel Hilbert space," Neurocomputing, 293–323 (2005).
  8. A. M. Jade, B. Srikanth, B. D. Kulkarni, J. P. Jog, and L. Priya, "Feature extraction and denoising using kernel PCA," Chem. Eng. Sci., 58, 4441–4448 (2003).
  9. I. T. Jolliffe, Principal Component Analysis, Springer (2002).
  10. D. Jukic, G. Kralik, and R. Scitovski, "Least-squares fitting Gompertz curve," J. Comput. Appl. Math., 169, 359–375 (2004).
  11. W. Mendenhall, D. D. Wackerly, and R. L. Scheaffer, Mathematical Statistics with Applications, PWS-Kent (1990).
  12. H. Q. Minh, P. Niyogi, and Y. Yao, "Mercer's theorem, feature maps, and smoothing," Lect. Notes Comput. Sci., Vol. 4005, Springer, Berlin (2006).
  13. D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to Linear Regression Analysis, Wiley-Interscience (2006).
  14. E. A. Nadaraya, "On estimating regression," Theory of Probability and Its Applications, 10, 186–190 (1964).
  15. E. A. Nadaraya, Nonparametric Estimation of Probability Density and Regression Curve, Kluwer (1989).
  16. N. R. Draper and H. Smith, Applied Regression Analysis, John Wiley and Sons (1998).
  17. R. L. Eubank, Nonparametric Regression and Spline Smoothing, Marcel Dekker (1999).
  18. R. Rosipal, M. Girolami, L. J. Trejo, and A. Cichocki, "Kernel PCA for feature extraction and de-noising in nonlinear regression," Neural Computing and Applications, 231–243 (2001).
  19. R. Rosipal and L. J. Trejo, "Kernel partial least squares regression in reproducing kernel Hilbert space," J. Machine Learning Res., 2, 97–123 (2002).
  20. R. Rosipal, L. J. Trejo, and A. Cichocki, "Kernel principal component regression with EM approach to nonlinear principal component extraction," Technical Report, University of Paisley, UK (2001).
  21. M. G. Schimek, Smoothing and Regression: Approaches, Computation, and Application, John Wiley and Sons (2000).
  22. B. Schölkopf, A. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Comput., 10, 1299–1319 (1998).
  23. B. Schölkopf and A. J. Smola, Learning with Kernels, The MIT Press (2002).
  24. G. A. F. Seber and A. J. Lee, Linear Regression Analysis, John Wiley and Sons (2003).
  25. M. S. Srivastava, Methods of Multivariate Statistics, John Wiley and Sons (2002).
  26. G. S. Watson, "Smooth regression analysis," Sankhyā, Ser. A, 26, 359–372 (1964).
  27. A. Wibowo and Y. Yamamoto, "A new approach of kernel principal component regression," Discussion Paper Series No. 1195, Department of Social Systems and Management, Univ. Tsukuba (2008).

Copyright information

© Springer Science+Business Media, Inc. 2012

Authors and Affiliations

  1. Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, Johor, Malaysia
  2. Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba, Japan
