Features of Application of Monte Carlo Method with Markov Chain Algorithms in Bayesian Data Analysis

  • Peter Bidyuk
  • Yoshio Matsuki
  • Aleksandr Gozhyj
  • Volodymyr Beglytsia
  • Irina Kalinina
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1080)


The article discusses Markov chain Monte Carlo (MCMC) algorithms, namely the Metropolis-Hastings and Gibbs sampling algorithms. Descriptions and the main features of applying these algorithms are given. MCMC methods were developed to generate sets of vectors corresponding to multidimensional probability distributions. In Bayesian data analysis procedures, these methods and algorithms are applied primarily to the study of posterior distributions. The main procedures of Bayesian data analysis are considered, as are the features of applying the Metropolis-Hastings and Gibbs algorithms to different types of input data. Examples of applying the algorithms, together with methods for their evaluation, are provided.
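As an illustration of the Metropolis-Hastings algorithm described in the abstract, the following is a minimal sketch (not the authors' implementation): a random-walk sampler targeting a standard normal density, where the function names and the Gaussian proposal scale are illustrative choices.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, proposal_sd=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D target density,
    given the log of the (possibly unnormalized) target."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a new state from a symmetric Gaussian random walk.
        x_new = x + rng.gauss(0.0, proposal_sd)
        # Accept with probability min(1, target(x_new) / target(x)),
        # computed on the log scale for numerical stability.
        log_alpha = log_target(x_new) - log_target(x)
        if math.log(rng.random() + 1e-300) < log_alpha:
            x = x_new
        samples.append(x)
    return samples

def log_std_normal(x):
    # Standard normal log-density up to an additive constant.
    return -0.5 * x * x

draws = metropolis_hastings(log_std_normal, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Because the proposal is symmetric, the Hastings correction ratio cancels and only the target ratio enters the acceptance probability; the sample mean and variance should approach 0 and 1 for this target.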


Keywords: Monte Carlo method with Markov chains · Bayesian data analysis · Metropolis-Hastings algorithm · Gibbs algorithm
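The Gibbs algorithm mentioned in the abstract can likewise be sketched on a toy target. Below is a minimal, illustrative sampler (not taken from the paper) for a bivariate normal with zero means, unit variances, and correlation `rho`, where each coordinate is drawn in turn from its exact full conditional.

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a bivariate normal with zero means, unit
    variances, and correlation rho, using the exact full conditionals."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    cond_sd = (1.0 - rho * rho) ** 0.5
    samples = []
    for _ in range(n_samples):
        # Sweep over coordinates, drawing each from its full conditional.
        x = rng.gauss(rho * y, cond_sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, cond_sd)   # y | x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

pairs = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
# Empirical E[xy] estimates the correlation, since means are 0 and variances 1.
corr_est = sum(a * b for a, b in pairs) / len(pairs)
```

Unlike Metropolis-Hastings, every Gibbs draw is accepted; the cost is that the full conditional distributions must be available in sampleable form, which is why the two algorithms suit different model classes.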



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine
  2. Kyoto University, Kyoto, Japan
  3. Petro Mohyla Black Sea National University, Nikolaev, Ukraine
