The Analysis of Regularization in Deep Neural Networks Using Metagraph Approach

  • Conference paper
  • In: Advances in Neural Computation, Machine Learning, and Cognitive Research (NEUROINFORMATICS 2017)

Part of the book series: Studies in Computational Intelligence (SCI, volume 736)

Abstract

The article deals with the overfitting problem in deep neural networks. Finding a model whose number of parameters matches the simulated process can be a difficult task. There is a range of recommendations on how to choose the number of neurons in hidden layers, but most of them do not work well in practice. As a result, neural networks operate in an underfitting or overfitting regime. Therefore, in practice a complex model is usually chosen and regularization strategies are applied. In this paper, the main regularization techniques for multilayer perceptrons, including early stopping and dropout, are discussed. A representation of regularization using the metagraph approach is described. In the creation mode, the metagraph representation of the neural network is created by metagraph agents. In the training mode, the training metagraph is created. Thus, different regularization strategies may be embedded into the training algorithm. A special metagraph agent for the dropout strategy is developed. A comparison of different regularization techniques is conducted on the CoverType dataset. The results of the experiments are analyzed, and the advantages of the early stopping and dropout regularization strategies are discussed.
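The paper's metagraph-agent machinery is not reproduced here, but the two regularization strategies it compares, dropout and early stopping, are standard and easy to sketch. Below is a minimal NumPy illustration of both; the function names, hyperparameters, and callback interface are assumptions made for this example, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(h, keep_prob=0.5, training=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob
    and rescale the survivors by 1/keep_prob at training time, so the
    forward pass needs no modification at test time."""
    if not training or keep_prob >= 1.0:
        return h
    mask = (rng.random(h.shape) < keep_prob) / keep_prob
    return h * mask

def train_with_early_stopping(train_epoch, val_loss, get_weights, set_weights,
                              max_epochs=200, patience=10):
    """Stop training once the validation loss has not improved for
    `patience` consecutive epochs, then roll back to the best weights.
    The four callbacks are hypothetical hooks into some training loop."""
    best_loss, best_weights, stale = np.inf, get_weights(), 0
    for _ in range(max_epochs):
        train_epoch()                      # one pass over the training set
        loss = val_loss()                  # loss on the held-out validation set
        if loss < best_loss:
            best_loss, best_weights, stale = loss, get_weights(), 0
        else:
            stale += 1
            if stale >= patience:          # ran out of patience
                break
    set_weights(best_weights)              # restore the best checkpoint
    return best_loss
```

Note that rescaling at training time (inverted dropout) is the common practical variant; the original formulation instead leaves training activations unscaled and scales the weights at test time.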



Author information

Correspondence to Yuriy E. Gapanyuk.


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Fedorenko, Y.S., Gapanyuk, Y.E., Minakova, S.V. (2018). The Analysis of Regularization in Deep Neural Networks Using Metagraph Approach. In: Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V. (eds) Advances in Neural Computation, Machine Learning, and Cognitive Research. NEUROINFORMATICS 2017. Studies in Computational Intelligence, vol 736. Springer, Cham. https://doi.org/10.1007/978-3-319-66604-4_1

  • DOI: https://doi.org/10.1007/978-3-319-66604-4_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-66603-7

  • Online ISBN: 978-3-319-66604-4

  • eBook Packages: Engineering (R0)
