Prediction of Oil Prices Using Bagging and Random Subspace

Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 303)

Abstract

The problem of predicting oil prices is worthy of attention: because oil is the backbone of the world economy, the goal of this paper is to design a more accurate prediction model. We model the prediction process in three steps: feature selection, data partitioning, and analysis of the prediction models. Six prediction models were selected and tested after experimenting with several other widely used prediction models: Multi-Layered Perceptron (MLP), Sequential Minimal Optimization for regression (SMOreg), Isotonic Regression, Multilayer Perceptron Regressor (MLP Regressor), Extra-Tree, and Reduced Error Pruning Tree (REPtree). These six algorithms are compared with previous work using the root mean squared error (RMSE) to identify the most suitable algorithm. Further, two meta schemes, Bagging and Random Subspace, are adopted and compared with the previous algorithms using the mean squared error (MSE) to evaluate performance. Experimental evidence illustrates that the Random Subspace scheme outperforms most of the existing techniques.
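The comparison described above can be sketched with scikit-learn, where a `BaggingRegressor` over decision trees gives Bagging directly, and the same class with bootstrapping disabled and `max_features < 1.0` approximates the Random Subspace method. This is an illustrative sketch only: the synthetic regression data stands in for the paper's oil-price features, and the estimator counts and feature fraction are arbitrary assumptions, not the authors' settings.

```python
# Sketch: Bagging vs. Random Subspace ensembles of regression trees,
# evaluated by RMSE as in the paper. Data and hyperparameters are
# illustrative placeholders, not the paper's actual setup.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the oil-price feature set
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Bagging: each tree trains on a bootstrap sample of the rows
bagging = BaggingRegressor(n_estimators=25, random_state=0)

# Random Subspace: each tree sees all rows but a random half of the features
subspace = BaggingRegressor(n_estimators=25, bootstrap=False,
                            max_features=0.5, random_state=0)

results = {}
for name, model in [("bagging", bagging), ("random_subspace", subspace)]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    results[name] = float(np.sqrt(mean_squared_error(y_te, pred)))  # RMSE

print(results)
```

The default base learner of `BaggingRegressor` is a decision tree, which keeps the sketch close to the tree-based members of the paper's model pool (Extra-Tree, REPtree).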

Keywords

Prediction of oil prices · Bagging · Random subspace · Base regression models


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Faculty of Computer Science & Information Technology, Sudan University of Science & Technology, Khartoum, Sudan
  2. Machine Intelligence Research Labs, Scientific Network for Innovation and Research Excellence, Washington, USA
  3. IT4Innovations, VSB-Technical University of Ostrava, Ostrava, Czech Republic