
Prototype-Based Kernels for Extreme Learning Machines and Radial Basis Function Networks

  • Norbert Jankowski
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10841)

Abstract

Extreme learning machines and radial basis function networks depend on kernel functions. If the kernel set is too small or inadequate for the problem and its learning data, learning can be fruitless and the generalization capabilities of the resulting classifiers remain poor.

The article presents a method for automatic, stochastic selection of kernels. The proposed kernel-selection scheme yields both an appropriate number of kernels and appropriate kernel placements. Evaluation results clearly show that the methodology works very well and is superior to the standard extreme learning machine, the support vector machine, and the k-nearest-neighbours method.
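The abstract does not spell out the selection procedure itself, so the sketch below only illustrates the general idea behind prototype-based kernels for an extreme learning machine: hidden units are Gaussian kernels centred on training instances (prototypes) drawn stochastically from the data, and the linear output weights are computed in closed form, as in a standard ELM. The function names (train_elm, predict_elm, rbf_features) and parameters (n_prototypes, gamma, reg) are assumptions made for illustration, not taken from the paper.

    import numpy as np

    def rbf_features(X, prototypes, gamma=1.0):
        """Gaussian kernel activations between samples X and prototype centres."""
        # Squared Euclidean distance between every sample and every prototype.
        d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def train_elm(X, y, n_prototypes=20, gamma=1.0, reg=1e-3, seed=None):
        """Illustrative prototype-based ELM: kernel centres are training
        instances picked at random; output weights come from ridge regression."""
        rng = np.random.default_rng(seed)
        # Stochastic prototype selection: draw kernel centres from the training set.
        idx = rng.choice(len(X), size=min(n_prototypes, len(X)), replace=False)
        prototypes = X[idx]
        H = rbf_features(X, prototypes, gamma)          # hidden-layer output matrix
        # One-hot encode the class labels for a linear readout.
        classes, y_idx = np.unique(y, return_inverse=True)
        T = np.eye(len(classes))[y_idx]
        # Closed-form regularised least-squares solution for the output weights.
        beta = np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ T)
        return prototypes, beta, classes, gamma

    def predict_elm(X, model):
        prototypes, beta, classes, gamma = model
        H = rbf_features(X, prototypes, gamma)
        return classes[np.argmax(H @ beta, axis=1)]

    # Example usage on a tiny two-class toy problem (illustrative only):
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    model = train_elm(X, y, n_prototypes=10, gamma=0.5, seed=0)
    print((predict_elm(X, model) == y).mean())  # training accuracy

In this sketch the number and placement of kernels are fixed by a single random draw; the paper's contribution is an automatic scheme for choosing them, which the abstract does not detail.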

Keywords

Extreme learning machines · Kernel methods · Prototypes · Prototype selection · Machine learning · k-nearest-neighbours method

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Informatics, Nicolaus Copernicus University, Toruń, Poland