Abstract
The article deals with the problem of overfitting in deep neural networks. Finding a model whose number of parameters matches the process being modeled can be a difficult task. There is a range of recommendations on how to choose the number of neurons in hidden layers, but most of them do not work reliably in practice. As a result, neural networks often operate in an underfitting or overfitting regime. In practice, therefore, a complex model is usually chosen and regularization strategies are applied. In this paper, the main regularization techniques for multilayer perceptrons, including early stopping and dropout, are discussed. A representation of regularization using the metagraph approach is described. In the creation mode, the metagraph representation of the neural network is created by metagraph agents. In the training mode, the training metagraph is created. Thus, different regularization strategies may be embedded into the training algorithm. A special metagraph agent for the dropout strategy is developed. A comparison of different regularization techniques is conducted on the CoverType dataset. The results of the experiments are analyzed, and the advantages of the early stopping and dropout regularization strategies are discussed.
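To make the two strategies named in the abstract concrete, the following is a minimal NumPy sketch of how inverted dropout and early stopping combine in an ordinary multilayer-perceptron training loop. It is not the authors' metagraph-based implementation: the network size, the synthetic data, and the `P_KEEP` and `PATIENCE` parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
P_KEEP = 0.8      # probability of keeping a hidden unit (dropout); assumed value
PATIENCE = 10     # early stopping: epochs to wait without improvement; assumed value

# Synthetic regression data split into training and validation parts.
X = rng.normal(size=(200, 8))
y = X @ rng.normal(size=(8, 1)) + 0.1 * rng.normal(size=(200, 1))
X_tr, y_tr, X_val, y_val = X[:150], y[:150], X[150:], y[150:]

# One-hidden-layer perceptron with illustrative sizes.
W1 = rng.normal(scale=0.1, size=(8, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1)); b2 = np.zeros(1)

def forward(X, training):
    """Forward pass with inverted dropout on the hidden layer: units are
    zeroed with probability 1 - P_KEEP during training and the survivors
    are rescaled by 1/P_KEEP, so inference needs no extra scaling."""
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)  # ReLU
    mask = ((rng.random(a1.shape) < P_KEEP) / P_KEEP
            if training else np.ones_like(a1))
    h = a1 * mask
    return z1, mask, h, h @ W2 + b2

best_val, bad_epochs, lr = np.inf, 0, 0.01
for epoch in range(500):
    z1, mask, h, pred = forward(X_tr, training=True)
    grad_pred = 2.0 * (pred - y_tr) / len(X_tr)  # d(MSE)/d(pred)
    # Backpropagation; the dropout mask gates the hidden-layer gradient.
    gW2 = h.T @ grad_pred; gb2 = grad_pred.sum(0)
    grad_z1 = (grad_pred @ W2.T) * mask * (z1 > 0)
    gW1 = X_tr.T @ grad_z1; gb1 = grad_z1.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    # Early stopping: monitor validation loss and stop once it has not
    # improved for PATIENCE consecutive epochs.
    *_, val_pred = forward(X_val, training=False)
    val_loss = float(np.mean((val_pred - y_val) ** 2))
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= PATIENCE:
            print(f"early stop at epoch {epoch}, best val loss {best_val:.4f}")
            break
```

In the paper's setting, the dropout step would be carried out by the dedicated metagraph agent rather than an inline mask, but the training-time mask/rescale logic it must implement is the same.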
References
Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press, Cambridge (2016). 787 p
Bishop, C.: Pattern Recognition and Machine Learning. Springer, New York (2006). 758 p
Haykin, S.: Neural Networks: A Comprehensive Foundation, 2nd edn. Prentice Hall, New Jersey (1999). 1056 p
Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., Wojna, Z.: Rethinking the inception architecture for computer vision. In: Conference on Computer Vision and Pattern Recognition (CVPR), USA, pp. 2818–2826 (2016)
Glorot, X., Bengio, Y.: Understanding the difficulty of training deep feedforward neural networks. In: Proceedings of the 13th International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, pp. 249–256 (2010)
Bishop, C.: Regularization and complexity control in feed-forward networks. In: Proceedings of the International Conference on Artificial Neural Networks ICANN 1995, vol. 1, pp. 141–148 (1995)
Sjöberg, J., Ljung, L.: Overtraining, regularization and searching for a minimum, with application to neural networks. Int. J. Control 62(6), 1391–1407 (1995)
Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R.: Dropout: a simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 15, 1929–1958 (2014)
Fedorenko, Yu.S., Gapanyuk, Yu.E.: Multilevel neural net adaptive models using the metagraph approach. Opt. Mem. Neural Netw. 25(4), 228–235 (2016)
Blackard, J., Dean, J., Anderson, W.: Forest CoverType Dataset. https://archive.ics.uci.edu/ml/datasets/Covertype. Accessed 14 Jun 2017