Abstract
Traditional machine learning algorithms focus on batch learning from a static data set or from a well-known distribution. However, these algorithms take considerable time to learn from large amounts of training data and, moreover, many of them cannot deal with nonstationary distributions. Recent machine learning challenges require the capability of online learning in nonstationary environments. Thus, in this work we propose a new learning method, for single-layer neural networks, that introduces a forgetting function into an incremental learning algorithm. The algorithm employs a recursive formula to obtain the solution of a weighted least squares problem. The performance of the method is checked experimentally over several data sets. The proposed algorithm demonstrates high adaptation to changes while maintaining low consumption of computational resources.
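The abstract's core idea, a recursive solution of a weighted least squares problem with a forgetting function, can be illustrated with a standard recursive least squares (RLS) update using an exponential forgetting factor. This is a generic sketch of that technique, not the authors' exact algorithm (which additionally uses a Taylor expansion and the Greville formula for a non-linear neuron); the class name `ForgettingRLS` and all parameter names are illustrative assumptions.

```python
import numpy as np

class ForgettingRLS:
    """Recursive least squares with exponential forgetting.

    Each update down-weights past samples by a factor `lam` in (0, 1],
    so the model adapts to nonstationary data while remaining an exact
    recursive solution of the weighted least squares problem.
    """

    def __init__(self, n_inputs, lam=0.99, delta=1e3):
        self.w = np.zeros(n_inputs)        # weight vector
        self.P = delta * np.eye(n_inputs)  # inverse correlation matrix
        self.lam = lam                     # forgetting factor

    def update(self, x, d):
        """One online step with input vector x and desired output d."""
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)       # gain vector
        e = d - self.w @ x                 # a priori error
        self.w = self.w + k * e            # weight update
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return self.w @ x                  # a posteriori prediction
```

A forgetting factor close to 1 gives a long effective memory (near-batch behaviour); smaller values track distribution changes faster at the cost of noisier estimates, which is the trade-off the paper's forgetting function addresses.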
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Martínez-Rego, D., Fontenla-Romero, O., Alonso-Betanzos, A. (2013). Exact Incremental Learning for a Single Non-linear Neuron Based on Taylor Expansion and Greville Formula. In: Bielza, C., et al. Advances in Artificial Intelligence. CAEPIA 2013. Lecture Notes in Computer Science, vol. 8109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40643-0_16
DOI: https://doi.org/10.1007/978-3-642-40643-0_16
Print ISBN: 978-3-642-40642-3
Online ISBN: 978-3-642-40643-0