Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 227)

Abstract

In this paper, an ensemble classifier named RotaSVM is proposed that cohesively combines the recently developed rotational feature selection approach with the Support Vector Machine classifier. RotaSVM builds a predefined number of Support Vector Machines. For each Support Vector Machine, the training data are generated by randomly splitting the feature set into \(\mathcal{S}\) subsets. Principal component analysis is then applied to each subset to create new features, and all principal components are retained so that the variability information in the training data is preserved. The rotated features are subsequently used to train a Support Vector Machine. During the testing phase of RotaSVM, each rotation-specific Support Vector Machine first classifies the test samples, and the average posterior probability over the ensemble is then computed to assign the final class label. The effectiveness of RotaSVM is demonstrated quantitatively by comparing it with other widely used ensemble classifiers, such as Bagging, AdaBoost, MultiBoost and Rotation Forest, on 10 real-life data sets. Finally, a statistical test is conducted to establish the superiority of the results produced by the proposed RotaSVM.
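To make the training and testing procedure concrete, the sketch below follows the steps summarised in the abstract: a random feature partition, per-subset PCA with all components retained, one SVM per rotation, and averaged posterior probabilities at test time. It is a minimal illustration written against scikit-learn; the class name, the parameters n_classifiers and n_subsets, and the use of default SVM settings are assumptions of this sketch, not specifications from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC


class RotaSVMSketch:
    """Illustrative sketch of the RotaSVM ensemble (not the authors' code)."""

    def __init__(self, n_classifiers=10, n_subsets=3, random_state=0):
        self.n_classifiers = n_classifiers   # predefined number of SVMs
        self.n_subsets = n_subsets           # S feature subsets per rotation
        self.rng = np.random.default_rng(random_state)
        self.members = []                    # (feature partition, fitted PCAs, fitted SVM)

    def _rotate(self, X, partition, pcas):
        # Project each feature subset with its own PCA and concatenate the results.
        return np.hstack([pca.transform(X[:, idx]) for idx, pca in zip(partition, pcas)])

    def fit(self, X, y):
        n_features = X.shape[1]
        for _ in range(self.n_classifiers):
            # Randomly split the feature indices into S subsets.
            partition = np.array_split(self.rng.permutation(n_features), self.n_subsets)
            # PCA per subset; all principal components are retained (library default),
            # preserving the variability information in the training data.
            pcas = [PCA().fit(X[:, idx]) for idx in partition]
            svm = SVC(probability=True).fit(self._rotate(X, partition, pcas), y)
            self.members.append((partition, pcas, svm))
        return self

    def predict(self, X):
        # Average the posterior probabilities of the rotation-specific SVMs.
        probs = np.mean(
            [svm.predict_proba(self._rotate(X, part, pcas))
             for part, pcas, svm in self.members],
            axis=0,
        )
        return self.members[0][2].classes_[np.argmax(probs, axis=1)]
```

Under these assumptions a run looks like `RotaSVMSketch(n_classifiers=10, n_subsets=3).fit(X_train, y_train).predict(X_test)`, mirroring the train-then-average-posteriors workflow described above.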



Author information

Corresponding author

Correspondence to Shib Sankar Bhowmick.

Copyright information

© 2013 Springer International Publishing Switzerland

About this paper

Cite this paper

Bhowmick, S.S., Saha, I., Rato, L., Bhattacharjee, D. (2013). RotaSVM: A New Ensemble Classifier. In: Emmerich, M., et al. EVOLVE - A Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation IV. Advances in Intelligent Systems and Computing, vol 227. Springer, Heidelberg. https://doi.org/10.1007/978-3-319-01128-8_4

  • DOI: https://doi.org/10.1007/978-3-319-01128-8_4

  • Publisher Name: Springer, Heidelberg

  • Print ISBN: 978-3-319-01127-1

  • Online ISBN: 978-3-319-01128-8

  • eBook Packages: Engineering, Engineering (R0)
