
Decision Templates Based RBF Network for Tree-Structured Multiple Classifier Fusion

  • Mohamed Farouk Abdel Hady
  • Friedhelm Schwenker
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5519)

Abstract

Multiclass pattern recognition problems (K > 2) can be decomposed by a tree-structured approach, which constructs an ensemble of K-1 individually trained binary classifiers whose predictions are combined to classify unseen instances. A key factor for an effective ensemble is how its member outputs are combined into the final decision. Although various methods exist to build the tree structure and to solve the underlying binary problems, little work has been done on combination methods that best aggregate these intermediate results. We present a trainable fusion method that integrates statistical information about the individual outputs (clustered decision templates) into a Radial Basis Function (RBF) network. We compare our model with the decision templates combiner and with the existing nontrainable tree-ensemble fusion methods: the classical decision-tree-like approach, the product of the unique path, and a method based on Dempster-Shafer evidence theory.
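To make the baseline concrete, the standard (non-clustered) decision templates combiner referenced above can be sketched as follows: each class receives one template, the mean decision profile (an L x K matrix of the L classifiers' soft outputs) over that class's training samples, and a test sample is assigned to the class whose template is most similar to its decision profile. The function names and the toy soft outputs below are illustrative assumptions, not the paper's implementation; squared Euclidean distance is one common similarity choice.

```python
import numpy as np

def fit_decision_templates(profiles, labels, n_classes):
    """One template per class: the mean decision profile (L x K matrix
    of L classifiers' soft outputs) over that class's training samples."""
    return np.stack([profiles[labels == k].mean(axis=0)
                     for k in range(n_classes)])

def predict(templates, profile):
    """Assign the class whose template is closest (squared Euclidean
    distance) to the test sample's decision profile."""
    dists = ((templates - profile) ** 2).sum(axis=(1, 2))
    return int(np.argmin(dists))

# Toy example: L = 2 classifiers, K = 2 classes, hypothetical soft outputs.
profiles = np.array([
    [[0.9, 0.1], [0.8, 0.2]],   # class 0 training samples
    [[0.8, 0.2], [0.7, 0.3]],
    [[0.2, 0.8], [0.1, 0.9]],   # class 1 training samples
    [[0.3, 0.7], [0.2, 0.8]],
])
labels = np.array([0, 0, 1, 1])
dt = fit_decision_templates(profiles, labels, n_classes=2)
print(predict(dt, np.array([[0.85, 0.15], [0.75, 0.25]])))  # → 0
```

The paper's contribution replaces this fixed nearest-template rule with a trainable one: clustered templates serve as RBF prototypes, so the mapping from decision profiles to the final decision is learned rather than fixed.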



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Mohamed Farouk Abdel Hady (1)
  • Friedhelm Schwenker (1)
  1. Institute of Neural Information Processing, University of Ulm, Ulm, Germany
