Combined Classifier Based on Quantized Subspace Class Distribution

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11314)

Abstract

This paper presents the Exposer Ensemble (EE), a combined classifier based on an original model of quantized subspace class distribution. It describes a method for establishing and processing the Planar Exposer, a base representation of a discrete class distribution over a given subspace, and proposes how to effectively fuse the discriminatory power of many Planar Exposers into a combined classifier. Natural properties of this representation are its resistance to imbalanced training data, without the need for over- or undersampling methods, and the constant computational complexity of prediction. The description of the proposed algorithm is complemented by a series of computer experiments on a collection of balanced and imbalanced datasets with diverse imbalance ratios, demonstrating its usefulness in supervised learning tasks.
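The abstract only outlines the method, so the following is a minimal illustrative sketch of the general idea rather than the authors' exact algorithm: a quantized per-class histogram over a two-dimensional feature subspace stands in for a Planar Exposer, and several such exposers are fused by averaging their class supports. All names and parameters here (PlanarExposer, ExposerEnsemble, the bins count, the all-pairs subspace selection) are hypothetical assumptions, not taken from the paper.

import itertools
import numpy as np

class PlanarExposer:
    """Hypothetical sketch: quantized class distribution over a 2-D feature subspace."""

    def __init__(self, feature_pair, bins=8):
        self.feature_pair = feature_pair  # indices of the two features spanning the subspace
        self.bins = bins

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        sub = X[:, self.feature_pair]
        # Quantize each feature of the subspace into `bins` equal-width cells.
        self.edges_ = [np.linspace(sub[:, j].min(), sub[:, j].max(), self.bins + 1)
                       for j in range(2)]
        # One 2-D histogram per class; normalizing per cell turns the counts into
        # a class-distribution estimate, one plausible source of the imbalance
        # resistance claimed in the abstract (an assumption, not the paper's stated mechanism).
        grids = np.stack([np.histogram2d(sub[y == c, 0], sub[y == c, 1],
                                         bins=[self.edges_[0], self.edges_[1]])[0]
                          for c in self.classes_])
        self.grid_ = grids / np.clip(grids.sum(axis=0), 1e-12, None)
        return self

    def predict_proba(self, X):
        sub = X[:, self.feature_pair]
        # Locate the quantized cell of each sample; prediction is a constant-time lookup.
        idx = [np.clip(np.searchsorted(self.edges_[j][1:-1], sub[:, j]),
                       0, self.bins - 1) for j in range(2)]
        return self.grid_[:, idx[0], idx[1]].T

class ExposerEnsemble:
    """Hypothetical fusion rule: average the supports of exposers built on all feature pairs."""

    def __init__(self, bins=8):
        self.bins = bins

    def fit(self, X, y):
        pairs = itertools.combinations(range(X.shape[1]), 2)
        self.exposers_ = [PlanarExposer(p, self.bins).fit(X, y) for p in pairs]
        self.classes_ = self.exposers_[0].classes_
        return self

    def predict(self, X):
        # Fuse discriminatory power by averaging per-class supports across exposers.
        support = np.mean([e.predict_proba(X) for e in self.exposers_], axis=0)
        return self.classes_[np.argmax(support, axis=1)]

In this sketch, prediction reduces to one histogram-cell lookup per exposer, consistent with the constant prediction complexity mentioned in the abstract; the paper's actual quantization, smoothing, and fusion rules may differ.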

Keywords

Supervised learning · Classification · Classifier ensemble

Notes

Acknowledgment

This work was funded by the statutory funds of the Department of Systems and Computer Networks (Faculty of Electronics, Wrocław University of Science and Technology) during the realization of the Mloda Kadra 2017/2018 task.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

1. Wrocław University of Science and Technology, Wrocław, Poland