A Performance Study of Gaussian Kernel Classifiers for Data Mining Applications

  • Miyoung Shin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4093)

Abstract

Radial basis function (RBF) models have been successfully employed for a broad range of data mining problems and benchmark data sets arising in real-world scientific and engineering applications. In this paper we investigate RBF models with Gaussian kernels by developing classifiers in a systematic way. In particular, we employ our newly developed RBF design algorithm for a detailed performance study and sensitivity analysis of classification models for the popular Monk's problems. The results show that our classifiers achieve high accuracy while the classification approach remains systematic and easy to implement. In addition, the differing complexity of the three Monk's problems is clearly reflected in the classification error surfaces for the test data. By exploring these surfaces, we gain a better understanding of the data mining classification problems. Finally, we study the error surfaces to investigate trade-offs among different choices of model parameters, so as to develop efficient and parsimonious models for a given application.
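The abstract does not reproduce the authors' design algorithm, but the general idea of a Gaussian-kernel RBF classifier can be sketched as follows. This is an illustrative assumption only, not the paper's method: it places Gaussian kernels at chosen centers, fits output weights by least squares, and classifies by the sign of the network output (the function names, the least-squares fit, and the choice of centers are all hypothetical).

```python
import numpy as np

def gaussian_design_matrix(X, centers, width):
    """Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))."""
    sq_dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * width ** 2))

def fit_rbf_classifier(X, y, centers, width):
    """Least-squares weights mapping Gaussian RBF activations to +/-1 labels."""
    Phi = gaussian_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, width, w):
    """Classify by the sign of the RBF network output."""
    return np.sign(gaussian_design_matrix(X, centers, width) @ w)
```

In a sensitivity study like the one described, one would sweep the number of centers and the kernel width and record test error at each setting, producing the kind of error surface the paper explores.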

Keywords

Radial basis functions, test error, training error, error surfaces, learning classifier systems

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Miyoung Shin
  1. School of Electrical Engineering and Computer Science, Kyungpook National University, Buk-gu, Korea