
Selecting Variables for Neural Network Committees

  • Marija Bacauskiene
  • Vladas Cibulskis
  • Antanas Verikas
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

The aim of variable selection is threefold: to reduce model complexity, to promote diversity among committee networks, and to find a trade-off between the accuracy and the diversity of the networks. To achieve this goal, the steps of neural network training, aggregation, and elimination of irrelevant input variables are integrated based on the negative correlation learning [1] error function. Experimental tests performed on three real-world problems have shown that statistically significant improvements in classification performance can be achieved by neural network committees trained according to the proposed technique.
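The negative correlation learning error function of [1] augments each member's squared error with a penalty that anti-correlates its deviation from the committee mean with the deviations of the other members; because the deviations sum to zero, the penalty for member i reduces to -(F_i - F̄)². A minimal NumPy sketch of this per-member error (the function name and the penalty weight `lam` are illustrative choices, not taken from the paper):

```python
import numpy as np

def ncl_errors(preds, target, lam=0.5):
    """Per-member negative correlation learning error (Liu & Yao, 1999).

    preds  : array of shape (M, N) -- outputs of M committee members on N samples
    target : array of shape (N,)   -- desired outputs
    lam    : penalty weight; 0 trains members independently, larger values
             push members to disagree with the committee mean
    """
    fbar = preds.mean(axis=0)                  # committee (ensemble) output
    mse = 0.5 * (preds - target) ** 2          # individual squared error
    penalty = -(preds - fbar) ** 2             # p_i = (F_i - fbar) * sum_{j!=i}(F_j - fbar)
    return (mse + lam * penalty).mean(axis=1)  # average error per member
```

With identical member outputs the penalty vanishes and each member's error is its plain half-squared error, which is the sanity check for the decomposition above.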

Keywords

Committee Members; Neural Network Training; Pattern Recognition; Negative Correlation Learning; Pima Indian Diabetes


References

  1. Liu, Y., Yao, X.: Ensemble Learning via Negative Correlation. Neural Networks 12, 1399–1404 (1999)
  2. Bacauskiene, M., Verikas, A.: Selecting Salient Features for Classification Based on Neural Network Committees. Pattern Recognition Letters 25, 1879–1891 (2004)
  3. Verikas, A., Lipnickas, A., Malmqvist, K.: Selecting Neural Networks for Making a Committee Decision. In: Dorronsoro, J.R. (ed.) ICANN 2002. LNCS, vol. 2415, pp. 420–425. Springer, Heidelberg (2002)
  4. Monari, G., Dreyfus, G.: Local Overfitting Control via Leverages. Neural Computation 14, 1481–1506 (2002)
  5. Kuncheva, L.I., Bezdek, J.C., Duin, R.P.W.: Decision Templates for Multiple Classifier Fusion. Pattern Recognition 34, 299–314 (2001)
  6. Verikas, A., Lipnickas, A., Malmqvist, K., Bacauskiene, M., Gelzinis, A.: Soft Combination of Neural Classifiers: A Comparative Study. Pattern Recognition Letters 20, 429–444 (1999)
  7. Setiono, R., Liu, H.: Neural-Network Feature Selector. IEEE Transactions on Neural Networks 8, 654–662 (1997)
  8. Verikas, A., Bacauskiene, M.: Feature Selection with Neural Networks. Pattern Recognition Letters 23, 1323–1335 (2002)
  9. Cohen, P.R.: Empirical Methods for Artificial Intelligence. MIT Press, Cambridge (1995)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Marija Bacauskiene (1)
  • Vladas Cibulskis (1)
  • Antanas Verikas (1, 2)
  1. Department of Applied Electronics, Kaunas University of Technology, Kaunas, Lithuania
  2. Intelligent Systems Laboratory, Halmstad University, Halmstad, Sweden
