Abstract
This paper evaluates novel boosting methods applied to the commonly used Multinomial Naïve Bayes classifier. The evaluation is carried out on the Reuters-21578 corpus, which consists of 10,788 documents in 90 categories. All experiments use the tf-idf weighting model and the one-versus-rest strategy. The AdaBoost, XGBoost, and gradient boosting algorithms are tested, and the impact of feature selection is additionally examined. The evaluation uses commonly applied metrics: precision, recall, F1, and precision-recall break-even points. The novel aspect of this work is that all considered boosted methods are compared both to each other and to several classical methods (support vector machines and a random forest classifier). The results are considerably better than those in the classic Joachims paper and slightly better than those obtained with the maximum-discrimination method for feature selection. This is important because, for the past 20 years, most works have focused on how results change as parameters are modified. Surprisingly, the result obtained with a feed-forward neural network is comparable to that of Bayesian optimization over boosted Naïve Bayes, despite the medium size of the corpus. We plan to extend these results by using word embedding methods.
References
Banerjee, S., Majumder, P., Mitra, M.: Re-evaluating the need for modelling term-dependence in text classification problems. CoRR abs/1710.09085 (2017)
Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Freund, Y., Schapire, R.: A decision theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci. 55, 119–139 (1997)
Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Ann. Stat. 29(5), 1189–1232 (2001)
Ji, Y., Smith, N.A.: Neural discourse structure for text categorization. In: ACL (1), pp. 996–1005 (2017)
Joachims, T.: Text categorization with support vector machines: learning with many relevant features. In: ECML, pp. 137–142 (1998)
Lewis, D.D., Yang, Y., Rose, T., Li, F.: RCV1: a new benchmark collection for text categorization research. J. Mach. Learn. Res. 5, 361–397 (2004)
Liang, H., Sun, X., Sun, Y., Gao, Y.: Text feature extraction based on deep learning: a review. EURASIP J. Wirel. Commun. Netw. 2017(1), 211 (2017)
Manning, C.D., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, New York (2008)
Yogatama, D., Kong, L., Smith, N.A.: Bayesian optimization of text representations. In: EMNLP, pp. 2100–2105 (2015)
Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: The International Conference on Learning Representations (ICLR), San Diego (2015)
Salakhutdinov, R., Hinton, G.E.: Semantic hashing. Int. J. Approx. Reason. 50(7), 969–978 (2009)
Yang, Y., Liu, X.: A re-examination of text categorization methods. In: Proceedings of the 22nd Annual International ACM SIGIR Conference, pp. 42–49 (1999)
Tang, B., Kay, S., He, H.: Toward optimal feature selection in Naive Bayes for text categorization. IEEE Trans. Knowl. Data Eng. 28(9), 2508–2521 (2016)
Acknowledgements
We acknowledge the Poznan University of Technology grant (04/45/DSPB/0185).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Zdrojewska, A., Dutkiewicz, J., Jędrzejek, C., Olejnik, M. (2019). Comparison of the Novel Classification Methods on the Reuters-21578 Corpus. In: Choroś, K., Kopel, M., Kukla, E., Siemiński, A. (eds) Multimedia and Network Information Systems. MISSI 2018. Advances in Intelligent Systems and Computing, vol 833. Springer, Cham. https://doi.org/10.1007/978-3-319-98678-4_30
Print ISBN: 978-3-319-98677-7
Online ISBN: 978-3-319-98678-4
eBook Packages: Intelligent Technologies and Robotics