Abstract
This article addresses the problem of feature extraction in online learning tasks. In many cases, proper feature extraction is very time-consuming. Deep neural networks can often solve this problem, but deep models are computationally expensive and therefore hardly applicable to online learning tasks, which require frequent model updates. This paper proposes a lightweight neural network architecture that can be trained in online mode and does not require complex handcrafted features. Its short per-sample processing time distinguishes the proposed model from more complex deep neural networks. The architecture and learning process of the proposed model are discussed in detail, with special attention paid to a fast software implementation. On benchmarks, we show that the developed implementation processes one sample several times faster than implementations based on deep learning frameworks. Experiments on a CTR prediction task show that the proposed neural network with raw features matches the performance of a logistic regression model with handcrafted features. For a clear description of the proposed architecture, we use the metagraph approach.
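The abstract's core idea, a lightweight network trained online on raw features without manual feature engineering, can be illustrated with a minimal sketch. The class below is hypothetical (its names and design are not from the paper): it hashes raw `field=value` pairs into a fixed bucket space, in the spirit of online CTR models, and updates a small one-hidden-layer network plus a linear term one sample at a time with SGD, so only the rows touched by the current sample are modified.

```python
import numpy as np

class OnlineHashedNet:
    """Hypothetical sketch of an online model on raw categorical features:
    the hashing trick replaces manual feature engineering, and a tiny
    one-hidden-layer net plus a linear term is updated per sample with SGD.
    This illustrates the general idea, not the paper's exact architecture."""

    def __init__(self, n_buckets=2**18, hidden=16, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.n_buckets, self.lr = n_buckets, lr
        self.W1 = rng.normal(0.0, 0.01, size=(n_buckets, hidden))  # per-bucket rows
        self.w2 = rng.normal(0.0, 0.01, size=hidden)
        self.w_lin = np.zeros(n_buckets)   # "wide" linear part over buckets
        self.b2 = 0.0

    def _indices(self, sample):
        # Hash raw "field=value" strings into buckets; in production a
        # process-stable hash (e.g. zlib.crc32) would be used instead.
        return [hash(f"{k}={v}") % self.n_buckets for k, v in sample.items()]

    def predict(self, sample):
        idx = self._indices(sample)
        h = np.maximum(self.W1[idx].sum(axis=0), 0.0)      # ReLU hidden layer
        z = h @ self.w2 + self.w_lin[idx].sum() + self.b2  # logit
        return 1.0 / (1.0 + np.exp(-z)), idx, h

    def update(self, sample, y):
        # One SGD step on the log-loss for a single (sample, label) pair.
        p, idx, h = self.predict(sample)
        g = p - y                            # dLoss/dlogit for log-loss
        grad_h = g * self.w2 * (h > 0.0)     # backprop through the ReLU
        self.w2 -= self.lr * g * h
        self.b2 -= self.lr * g
        self.w_lin[idx] -= self.lr * g       # only active buckets change
        self.W1[idx] -= self.lr * grad_h     # broadcast over active rows
        return p
```

Because each update touches only the hash buckets active in the current sample, the per-sample cost stays small and constant, which is the property that makes such a model suitable for frequent online updates.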
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Fedorenko, Y., Chernenkiy, V., Gapanyuk, Y. (2019). The Neural Network for Online Learning Task Without Manual Feature Extraction. In: Lu, H., Tang, H., Wang, Z. (eds) Advances in Neural Networks – ISNN 2019. ISNN 2019. Lecture Notes in Computer Science(), vol 11554. Springer, Cham. https://doi.org/10.1007/978-3-030-22796-8_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22795-1
Online ISBN: 978-3-030-22796-8
eBook Packages: Computer Science, Computer Science (R0)