A Neural Tree Network Ensemble Model for Disease Classification

Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 269)

Abstract

A neural tree network ensemble model is proposed for classification, an important research field in data mining and machine learning. First, each single neural tree network is established using an improved hybrid breeder genetic programming; then multiple neural tree networks are combined into the final classification model following the idea of ensemble learning. Simulation results on two disease classification problems show that this model is effective for classification and achieves better performance in classification precision, feature selection, and structure simplification, especially for classification problems with multi-class attributes.
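The abstract's two-stage scheme (evolve individual neural trees, then combine them by ensemble learning) can be pictured with a minimal sketch. The paper's exact tree encoding, activation, and breeder-GP operators are not given in this excerpt, so the code below assumes a common flexible-neural-tree formulation (leaf nodes read input features; internal nodes apply a weighted sum of their children followed by a Gaussian activation) and simple majority voting for the ensemble. All names here (`NeuralTreeNode`, `NeuralTreeEnsemble`) are illustrative, not the authors' API.

```python
import math

class NeuralTreeNode:
    """One node of a neural tree: a leaf reads a single input feature;
    an internal node computes a weighted sum of its children's outputs
    and passes it through a Gaussian-style activation."""

    def __init__(self, feature=None, children=None, weights=None, a=1.0, b=0.0):
        self.feature = feature          # leaf: index into the input vector
        self.children = children or []  # internal: child subtrees
        self.weights = weights or []    # internal: connection weights
        self.a, self.b = a, b           # activation width / center parameters

    def evaluate(self, x):
        if self.feature is not None:    # leaf node: pass the feature through
            return x[self.feature]
        net = sum(w * c.evaluate(x) for w, c in zip(self.weights, self.children))
        return math.exp(-((net - self.b) / self.a) ** 2)  # Gaussian activation

class NeuralTreeEnsemble:
    """Majority-vote ensemble of neural trees for binary classification:
    a tree output >= 0.5 counts as a vote for class 1."""

    def __init__(self, trees):
        self.trees = trees

    def predict(self, x):
        votes = sum(1 for t in self.trees if t.evaluate(x) >= 0.5)
        return 1 if 2 * votes >= len(self.trees) else 0

# Tiny usage example with hand-built trees (in the paper the tree
# structures and weights would come from breeder genetic programming).
def make_tree():
    leaves = [NeuralTreeNode(feature=0), NeuralTreeNode(feature=1)]
    return NeuralTreeNode(children=leaves, weights=[1.0, 1.0], a=1.0, b=0.0)

ensemble = NeuralTreeEnsemble([make_tree() for _ in range(3)])
```

In the full method, each tree would be evolved independently (structure by genetic programming, parameters by the hybrid breeder search), so that ensemble members are diverse; majority voting is only one of several plausible combination rules.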

Keywords

Neural tree network · Breeder genetic programming · Ensemble learning · Disease classification


Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. School of Management Science and Engineering, Shandong Normal University, Jinan, China