Classification Ensemble by Genetic Algorithms

  • Hamid Parvin
  • Behrouz Minaei
  • Akram Beigi
  • Hoda Helmi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6593)

Abstract

Classifiers with different characteristics and methodologies can complement one another and compensate for each other's weaknesses; classifier ensembles are therefore an important way to exploit this complementarity. If the accuracies of different classifiers on a given dataset can be estimated automatically and quickly, the selection of ensemble members can be cast as an optimization problem, for which genetic algorithms are well suited. We propose a genetic-algorithm-based selection method for classifier ensembles, CEGA, aimed at improving classification performance. CEGA is evaluated on several datasets and shows considerable improvements.
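
The abstract describes the approach only at a high level. As a rough illustration of the general idea behind GA-based ensemble selection (not the authors' CEGA implementation), the sketch below encodes a choice of ensemble members as a bit mask over a pool of trained base classifiers and lets a genetic algorithm search for the mask whose majority vote scores best on a held-out validation split. The classifier pool, the fitness definition, and all GA parameters here are illustrative assumptions.

# Minimal GA sketch for classifier ensemble selection (assumptions noted above).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Assumed pool of base classifiers; the paper's pool may differ.
pool = (
    [DecisionTreeClassifier(max_depth=d, random_state=0) for d in (1, 2, 3)]
    + [KNeighborsClassifier(n_neighbors=k) for k in (1, 3, 5)]
    + [GaussianNB(), LogisticRegression(max_iter=1000)]
)
for clf in pool:
    clf.fit(X_tr, y_tr)

# Cache every classifier's validation predictions once; fitness then only votes.
preds = np.array([clf.predict(X_val) for clf in pool])

def fitness(mask):
    """Majority-vote validation accuracy of the classifiers selected by the bit mask."""
    if mask.sum() == 0:
        return 0.0
    votes = preds[mask.astype(bool)]
    majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    return float((majority == y_val).mean())

def evolve(pop_size=20, generations=30, p_mut=0.1):
    n = len(pool)
    pop = rng.integers(0, 2, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Binary tournament selection of parents.
        parents = pop[[max(rng.choice(pop_size, 2, replace=False), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # One-point crossover ...
        children = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[(i + 1) % pop_size]
            cut = rng.integers(1, n)
            children.append(np.concatenate([a[:cut], b[cut:]]))
            children.append(np.concatenate([b[:cut], a[cut:]]))
        pop = np.array(children[:pop_size])
        # ... followed by bit-flip mutation.
        flip = rng.random(pop.shape) < p_mut
        pop = np.where(flip, 1 - pop, pop)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], float(scores.max())

best_mask, best_acc = evolve()
print("selected classifiers:", np.flatnonzero(best_mask), "validation accuracy:", best_acc)

Caching the pool's validation predictions before the search keeps each fitness evaluation down to a vote and a comparison, which matters because the GA scores every candidate mask in every generation.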

Keywords

Classifier Selection · Classifier Ensemble · Genetic Algorithms

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Hamid Parvin¹
  • Behrouz Minaei¹
  • Akram Beigi¹
  • Hoda Helmi¹

  1. School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran, Iran