A Novel Classifier Ensemble Method Based on Class Weightening in Huge Dataset

  • Conference paper
Advances in Neural Networks – ISNN 2011 (ISNN 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6676)

Abstract

While many classifier ensemble methods exist, none applies weighting at the class level. Random Forest, which uses decision trees for problem solving, is the basis of our proposed ensemble. In this work, we propose a weighting-based classifier ensemble method at the class level. The proposed method resembles Random Forest in employing decision trees (and, additionally, neural networks) as classifiers, but differs from Random Forest in employing a weight vector per classifier. To evaluate the proposed weighting scheme, ensembles of both decision tree and neural network classifiers are used in the experiments. The main presumption of this method is that the reliability of each classifier's predictions differs among classes. The proposed ensemble methods were tested on a huge Persian dataset of handwritten digits and show improvements over competing methods.
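The core idea described above — one weight per (classifier, class) pair, so that each classifier's vote counts more for the classes it predicts reliably — can be sketched as follows. This is a minimal illustration of class-level weighted voting, not the authors' exact algorithm; the function name, array shapes, and the toy weights are assumptions for the example.

```python
import numpy as np

def class_weighted_vote(probas, weights):
    """Combine per-classifier class-probability predictions.

    probas  : shape (n_classifiers, n_samples, n_classes)
    weights : shape (n_classifiers, n_classes) -- one weight per
              (classifier, class) pair, reflecting how reliable each
              classifier is on each class.
    """
    # Scale each classifier's class scores by its class-level weights,
    # then sum the weighted votes across classifiers and pick the
    # highest-scoring class per sample.
    weighted = probas * weights[:, np.newaxis, :]
    return weighted.sum(axis=0).argmax(axis=1)

# Toy example: 2 classifiers, 1 sample, 2 classes.
probas = np.array([[[0.6, 0.4]],
                   [[0.3, 0.7]]])
# Classifier 0 is trusted on class 0, classifier 1 on class 1.
weights = np.array([[1.0, 0.5],
                    [0.5, 1.0]])
print(class_weighted_vote(probas, weights))  # -> [1]
```

In the toy case the weighted sums are [0.75, 0.9], so class 1 wins even though a plain unweighted vote would tie at [0.9, 1.1] toward the same class; with other weights the class-level weighting can overturn a majority vote, which is the point of the method.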





Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Parvin, H., Minaei, B., Alizadeh, H., Beigi, A. (2011). A Novel Classifier Ensemble Method Based on Class Weightening in Huge Dataset. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds) Advances in Neural Networks – ISNN 2011. ISNN 2011. Lecture Notes in Computer Science, vol 6676. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21090-7_17

  • DOI: https://doi.org/10.1007/978-3-642-21090-7_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21089-1

  • Online ISBN: 978-3-642-21090-7

  • eBook Packages: Computer Science, Computer Science (R0)
