Flexible Neural Tree for Pattern Recognition

  • Hai-Jun Li
  • Zheng-Xuan Wang
  • Li-Min Wang
  • Sen-Miao Yuan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)


This paper presents a novel induction model named Flexible Neural Tree (FNT) for pattern recognition. FNT uses a decision tree for the basic structural analysis and neural networks for the subsequent quantitative analysis. The Pure Information Gain I(X_i; ϑ), defined as the test selection measure FNT uses to construct the decision tree, can handle continuous attributes directly. When the information embodied in a neural network node reveals new attribute relations, FNT extracts symbolic rules from the neural network to improve the performance of the decision process. Experimental studies on a set of natural domains show that FNT has clear advantages in generalization ability.
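The paper does not reproduce the definition of the Pure Information Gain here, but the idea of a test selection measure that handles continuous attributes directly can be illustrated with the classical information gain of a threshold split, I(X; ϑ) = H(Y) − H(Y | X ≤ ϑ). The sketch below is a hypothetical stand-in, not the paper's actual measure; the function names `entropy`, `information_gain`, and `best_threshold` are assumptions for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels, threshold):
    """Gain of splitting a continuous attribute at `threshold`.

    Classical I(X; theta) = H(Y) - H(Y | X <= theta); used here only as an
    illustrative proxy for the paper's Pure Information Gain I(X_i; theta).
    """
    left = [y for x, y in zip(values, labels) if x <= threshold]
    right = [y for x, y in zip(values, labels) if x > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

def best_threshold(values, labels):
    """Pick the split point (midpoints of adjacent sorted values) with maximal gain."""
    xs = sorted(set(values))
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    return max(candidates, key=lambda t: information_gain(values, labels, t))
```

Because candidate thresholds are evaluated on the raw values, no prior discretization of the continuous attribute is needed, which is the property the abstract attributes to the Pure Information Gain.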


Keywords: Neural Network · Decision Tree · Continuous Attribute · Hidden Unit · Current Node





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Hai-Jun Li (1, 2)
  • Zheng-Xuan Wang (2)
  • Li-Min Wang (2)
  • Sen-Miao Yuan (2)
  1. College of Computer Science, Yantai University, Yantai, China
  2. College of Computer Science and Technology, Jilin University, Changchun, China
