Abstract
The Bootstrap aggregating (Bagging) technique is widely used in Machine Learning to reduce the prediction error of unstable predictors. The method trains many predictors on bootstrap samples and combines them into a new, more powerful learning tool. However, the technique is not directly applicable when the training data exhibit temporal dependencies. One of the most efficient models for treating time series is the Recurrent Neural Network (RNN). In this article, we use an RNN to encode the temporal dependencies of the input data; in the resulting encoding space, the Bagging technique can then be applied. We analyze the behavior of various neural activation functions for encoding the input data. We evaluate our approach on three simulated and three real time-series data sets.
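The idea sketched in the abstract can be illustrated with a minimal example. The sketch below is an assumption-laden reconstruction, not the authors' exact model: it uses a fixed random recurrent reservoir with a `tanh` activation (one of the activation-function choices the paper compares) to expand a scalar series into state vectors that encode temporal context, after which Bagging of simple linear readouts becomes applicable. All sizes and constants (`N_RES`, spectral radius, number of bags `B`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES = 50             # reservoir size (illustrative choice)
SPECTRAL_RADIUS = 0.9  # illustrative scaling of the recurrent weights

def make_reservoir(n, rho, rng):
    """Fixed random recurrent weights, rescaled to a given spectral radius."""
    W = rng.uniform(-0.5, 0.5, (n, n))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
    w_in = rng.uniform(-0.5, 0.5, n)
    return W, w_in

def encode(series, W, w_in):
    """Run the series through the reservoir; each state encodes the history."""
    x = np.zeros(W.shape[0])
    states = []
    for u in series:
        x = np.tanh(W @ x + w_in * u)
        states.append(x.copy())
    return np.array(states)

def fit_readout(X, y, reg=1e-6):
    """Ridge-regression linear readout on the encoded states."""
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)

# Toy series: noisy sine, one-step-ahead prediction task.
t = np.arange(300)
series = np.sin(0.2 * t) + 0.05 * rng.standard_normal(t.size)

W, w_in = make_reservoir(N_RES, SPECTRAL_RADIUS, rng)
X = encode(series[:-1], W, w_in)   # encoded inputs
y = series[1:]                     # next-step targets

# Bagging: train B readouts on bootstrap resamples of the (state, target)
# pairs and average their predictions.
B = 20
preds = np.zeros_like(y)
for _ in range(B):
    idx = rng.integers(0, len(y), len(y))   # bootstrap sample with replacement
    w = fit_readout(X[idx], y[idx])
    preds += X @ w
preds /= B

mse = np.mean((preds - y) ** 2)
print(f"bagged one-step MSE: {mse:.4f}")
```

The key point is that bootstrap resampling happens over the reservoir states rather than over the raw series, so the temporal ordering encoded in each state is preserved even though the samples themselves are drawn independently.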
© 2014 Springer International Publishing Switzerland
Cite this paper
Basterrech, S., Mesa, A. (2014). Bagging Technique Using Temporal Expansion Functions. In: Kömer, P., Abraham, A., Snášel, V. (eds) Proceedings of the Fifth International Conference on Innovations in Bio-Inspired Computing and Applications IBICA 2014. Advances in Intelligent Systems and Computing, vol 303. Springer, Cham. https://doi.org/10.1007/978-3-319-08156-4_39
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-08155-7
Online ISBN: 978-3-319-08156-4