
Fuzzy principal component analysis and its Kernel-based model

Published in: Journal of Electronics (China)

Abstract

Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is its nonlinear extension based on kernel methods. In real-world problems, an input sample may not belong entirely to one class; it may partially belong to several classes. Building on fuzzy set theory, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension, Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms perform well.
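The abstract does not give the exact FPCA/KFPCA formulation, so the sketch below is only one plausible reading, not the paper's method: fuzzy membership degrees act as per-sample weights in the mean, the covariance, and the centred Gram matrix. The function names, the FCM-style fuzzifier exponent m, the RBF kernel choice, and the parameter gamma are all illustrative assumptions.

```python
import numpy as np

def fuzzy_pca(X, u, n_components=2, m=2.0):
    """Fuzzy-weighted PCA. X: (n, d) samples; u: (n,) membership degrees in [0, 1]."""
    w = u ** m                                   # FCM-style fuzzifier (an assumption)
    w_bar = w / w.sum()
    mean = w_bar @ X                             # fuzzy-weighted mean
    Xc = X - mean
    cov = (w_bar[:, None] * Xc).T @ Xc           # fuzzy-weighted covariance matrix
    vals, vecs = np.linalg.eigh(cov)
    top = np.argsort(vals)[::-1][:n_components]  # leading eigenvectors
    return Xc @ vecs[:, top]                     # projected samples

def kernel_fuzzy_pca(X, u, n_components=2, m=2.0, gamma=1.0):
    """Kernel variant (RBF kernel): eigendecompose the weighted, centred Gram matrix."""
    n = X.shape[0]
    w = u ** m
    w_bar = w / w.sum()
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                      # RBF Gram matrix
    C = np.eye(n) - np.outer(np.ones(n), w_bar)  # centring at the fuzzy-weighted mean
    Kc = C @ K @ C.T                             # Gram matrix of centred feature vectors
    sw = np.sqrt(w_bar)
    M = sw[:, None] * Kc * sw[None, :]           # symmetric weighted eigenproblem
    vals, vecs = np.linalg.eigh(M)
    top = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, top] / np.sqrt(np.maximum(vals[top], 1e-12))
    return Kc @ (sw[:, None] * alphas)           # projections of the training samples

# Hypothetical usage with random data and random membership degrees:
# rng = np.random.default_rng(0)
# X = rng.normal(size=(100, 5))
# u = rng.uniform(0.2, 1.0, size=100)
# Z = fuzzy_pca(X, u, n_components=2)
# Zk = kernel_fuzzy_pca(X, u, n_components=2)
```

Setting all memberships to 1 reduces both sketches to ordinary PCA and KPCA, which is the sanity check one would expect of any fuzzy-weighted variant.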



Author information

Corresponding author

Correspondence to Wu Xiaohong.

About this article

Cite this article

Wu, X., Zhou, J. Fuzzy principal component analysis and its Kernel-based model. J. Electron.(China) 24, 772–775 (2007). https://doi.org/10.1007/s11767-006-0039-z

