
Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network

  • Methodologies and Application
  • Published in Soft Computing

Abstract

The multilayer perceptron is the most widely used artificial neural network for approximating nonlinear functions in various fields, but determining suitable weights and regularization parameters is a fundamental problem because of their direct impact on network convergence and generalization performance. The Bayesian approach to neural networks treats all network parameters as random variables; the posterior distribution is then computed from the prior over all parameters and the likelihood function via Bayes' theorem. In this paper, we train the network weights by means of Hamiltonian Monte Carlo (HMC); for the hyperparameters, we propose to sample from the posterior distribution using HMC in order to approximate the derivative of the evidence, which allows the hyperparameters to be re-estimated. The case problems studied in this paper include a regression and a classification problem. The obtained results illustrate the advantages of our approach in terms of accuracy compared to the classical Bayesian approach for neural networks.
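As a concrete illustration of the weight-sampling step described in the abstract, the following is a minimal, generic single-chain HMC sketch with leapfrog integration. It targets a toy two-dimensional Gaussian standing in for a network-weight posterior; the function names, step size, and leapfrog count are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def hmc_sample(log_prob, log_prob_grad, q0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, seed=0):
    """Single-chain HMC sampler (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)  # resample Gaussian momentum
        q_new, p_new = q.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * log_prob_grad(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new
            p_new += step_size * log_prob_grad(q_new)
        q_new += step_size * p_new
        p_new += 0.5 * step_size * log_prob_grad(q_new)
        # Metropolis correction on the total energy
        # H(q, p) = -log posterior + kinetic energy
        h_old = -log_prob(q) + 0.5 * p @ p
        h_new = -log_prob(q_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(h_old - h_new):
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

# Toy target: standard 2-D Gaussian in place of a weight posterior
log_prob = lambda q: -0.5 * q @ q
grad = lambda q: -q
draws = hmc_sample(log_prob, grad, np.zeros(2), n_samples=2000)
print(draws.mean(axis=0))  # should be close to [0, 0]
```

In the paper's setting, `log_prob` would be the (hyperparameter-dependent) log posterior over the network weights and `log_prob_grad` its gradient obtained by backpropagation; the same sampler is then reused at the hyperparameter level to approximate the evidence derivatives.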



Author information

Corresponding author

Correspondence to Hassan Ramchoun.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Communicated by V. Loia.


About this article


Cite this article

Ramchoun, H., Ettaouil, M. Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network. Soft Comput 23, 4815–4825 (2019). https://doi.org/10.1007/s00500-018-3138-5
