
Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 303))


Abstract

The Bootstrap aggregating (Bagging) technique is widely used in Machine Learning to reduce the prediction error of unstable predictors. The method trains many predictors on bootstrap samples and combines them into a single, more powerful learner. However, when the training data exhibit temporal dependence, the technique is not directly applicable. One of the most efficient models for processing time series is the Recurrent Neural Network (RNN). In this article, we use an RNN to encode the temporal dependence of the input data; in the resulting encoding space, the Bagging technique can then be applied. We analyze the behavior of various neural activation functions for encoding the input data, and we evaluate our approach on three simulated and three real time-series datasets.
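The approach described in the abstract can be illustrated with a minimal sketch: a fixed random recurrent network (echo-state style) encodes the time series into state vectors, after which ordinary bagging of linear readouts is applied in the encoding space. All hyperparameters, the noisy-sine task, and the ridge readout below are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a noisy sine wave.
T = 500
t = np.arange(T)
u = np.sin(0.1 * t) + 0.05 * rng.standard_normal(T)

# Echo-state-style encoder: a fixed random recurrent network maps the
# sequence into one state vector per time step.
n_res = 50
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

states = np.zeros((T, n_res))
h = np.zeros(n_res)
for i in range(T):
    h = np.tanh(W_in * u[i] + W @ h)  # recurrent state update
    states[i] = h

# With temporal dependence folded into the states, (state, next value)
# pairs can be bootstrapped like i.i.d. data, so plain bagging applies.
X, y = states[:-1], u[1:]
n_models, ridge = 25, 1e-6
preds = np.zeros_like(y)
for _ in range(n_models):
    idx = rng.integers(0, len(X), size=len(X))   # bootstrap sample
    Xb, yb = X[idx], y[idx]
    # Ridge-regularized linear readout fitted on this bootstrap sample.
    w = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(n_res), Xb.T @ yb)
    preds += X @ w
preds /= n_models  # bagged prediction = average over the readouts

mse = float(np.mean((preds - y) ** 2))
print(round(mse, 4))
```

The key point is that bootstrap resampling happens over reservoir states rather than raw observations, which is what makes bagging legitimate despite the serial dependence in the original series.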



Author information

Correspondence to Sebastián Basterrech.

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Basterrech, S., Mesa, A. (2014). Bagging Technique Using Temporal Expansion Functions. In: Kömer, P., Abraham, A., Snášel, V. (eds) Proceedings of the Fifth International Conference on Innovations in Bio-Inspired Computing and Applications IBICA 2014. Advances in Intelligent Systems and Computing, vol 303. Springer, Cham. https://doi.org/10.1007/978-3-319-08156-4_39


  • DOI: https://doi.org/10.1007/978-3-319-08156-4_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-08155-7

  • Online ISBN: 978-3-319-08156-4
