Combining Neural Networks Based on Dempster-Shafer Theory for Classifying Data with Imperfect Labels

  • Mahdi Tabassian
  • Reza Ghaderi
  • Reza Ebrahimpour
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6438)

Abstract

This paper addresses supervised learning in which the class memberships of the training data are subject to uncertainty. The problem is tackled within the framework of the Dempster-Shafer theory. To estimate the class labels properly, different types of features are extracted from the data. The initial labels of the training data are ignored and, using the prototypes of the main classes, each training pattern in each feature space is reassigned to one class or to a subset of the main classes according to the level of ambiguity concerning its class label. A multilayer perceptron neural network is used as the base classifier, and for a given test sample its outputs are treated as a basic belief assignment. Finally, the decisions of the base classifiers are combined by Dempster's rule of combination. Experiments with artificial and real data demonstrate that taking ambiguity in the class labels into account yields better results than classifiers trained directly with the imperfect labels.
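The abstract does not include code; as a rough illustration of the fusion step it describes, the sketch below implements Dempster's rule of combination for two basic belief assignments, each represented as a dict mapping a `frozenset` of class labels to its mass. The function name and the toy mass values are my own, not the authors'; note that mass on a non-singleton subset such as {1, 2} is exactly how ambiguity about a class label is expressed.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of labels -> mass)
    with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Product mass flows to the intersection of the focal sets
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            # Mass on the empty set measures conflict between the sources
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    norm = 1.0 - conflict
    # Normalize so the combined masses again sum to one
    return {s: m / norm for s, m in combined.items()}

# Two hypothetical classifier outputs over classes {1, 2}, with mass on
# the ambiguous subset {1, 2} expressing label uncertainty
m1 = {frozenset({1}): 0.6, frozenset({1, 2}): 0.4}
m2 = {frozenset({2}): 0.3, frozenset({1, 2}): 0.7}
fused = dempster_combine(m1, m2)
```

Here the conflicting mass 0.6 × 0.3 = 0.18 is discarded and the remainder renormalized, so the fused belief sums to one, with the singleton {1} receiving 0.42 / 0.82 of the mass.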

Keywords

Data with imperfect labels · Dempster-Shafer theory · Classifier combination · Neural network

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Mahdi Tabassian 1, 2
  • Reza Ghaderi 1
  • Reza Ebrahimpour 2, 3
  1. Faculty of Electrical and Computer Engineering, Babol University of Technology, Babol, Iran
  2. School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran
  3. Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran