Impact of Hyperparameters on Model Development in Deep Learning

  • Conference paper
  • First Online:
Proceedings of International Conference on Computational Intelligence and Data Engineering

Abstract

Deep learning has revolutionized the field of computer vision. To develop a deep learning model, one has to decide on the optimal values of various hyperparameters, such as the learning rate. Unlike model parameters, hyperparameters are not learned by the model; they are initialized by the user. Hyperparameters govern the learning of the model's parameters, such as its weights and biases: parameter values are learned effectively only when the hyperparameters are tuned well, so the hyperparameters ultimately determine the values the parameters take. Manual tuning is a tedious and time-consuming process; automating the selection of hyperparameter values leads to the development of effective models, and the combinations that yield optimal results have to be investigated. This work uses scikit-optimize library functions to study the impact of hyperparameters on the accuracy of an MNIST classification task. Across different combinations of learning rate, number of dense layers, number of nodes per dense layer, and activation function, the lowest and highest accuracies obtained were 8.68% and 98.96% for the gp_minimize function, 8.68% and 98.74% for the forest_minimize function, and 9.24% and 98.94% for the gbrt_minimize function, respectively.



Author information

Correspondence to Humera Shaziya.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

Cite this paper

Shaziya, H., Zaheer, R. (2021). Impact of Hyperparameters on Model Development in Deep Learning. In: Chaki, N., Pejas, J., Devarakonda, N., Rao Kovvur, R.M. (eds) Proceedings of International Conference on Computational Intelligence and Data Engineering. Lecture Notes on Data Engineering and Communications Technologies, vol 56. Springer, Singapore. https://doi.org/10.1007/978-981-15-8767-2_5
