Abstract
This contribution proposes how the self-organizing process of feature maps can be improved.
The self-organizing process converges to a map that preserves the neighbourhood relations of the input data if the learning parameters (the learning coefficient and the width of the neighbourhood function) are chosen correctly. In general, the parameters are chosen empirically, depending on the distribution of the training data and the network architecture [3]. Consequently, some experience with the algorithm and the training data is needed to choose proper courses of the learning parameters. To avoid time-consuming parameter studies, a system model of the self-organizing process is developed and a linear Kalman filter is used to estimate the learning coefficient. To estimate the width of the neighbourhood function, the process of neighbourhood preservation during training is modelled successfully for the first time. This process is then tracked by an extended Kalman filter algorithm, which estimates the width of the neighbourhood function.
For fast self-organizing algorithms, such as the one published in [1], the proposed parameter estimation method is essential when training on data with an unknown density distribution.
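The abstract does not give the system model itself, but the idea of tracking the learning coefficient with a linear Kalman filter can be illustrated with a minimal sketch. The following assumes a 1-D Kohonen map on scalar data, a hypothetical random-walk state model for the learning coefficient, and a made-up surrogate measurement derived from the quantization error; the paper's actual model and measurement differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D SOM: 10 neurons on the unit interval, trained on uniform scalar data.
weights = np.sort(rng.random(10))

# Scalar linear Kalman filter tracking the learning coefficient eps(t).
# State model (an assumption for this sketch, not the paper's model):
#   eps(t+1) = a * eps(t) + process noise, with a < 1 (decaying coefficient).
a, q, r = 0.999, 1e-6, 1e-2    # transition factor, process var., measurement var.
eps_hat, p = 0.5, 1.0          # initial state estimate and its covariance

def measure(err):
    # Hypothetical measurement: map the current quantization error to a
    # desired learning coefficient (larger error -> larger coefficient).
    return min(1.0, 10.0 * err)

for t in range(2000):
    x = rng.random()                       # training sample
    c = int(np.argmin(np.abs(weights - x)))  # best-matching unit
    err = abs(weights[c] - x)              # quantization error

    # Kalman predict / update for the learning coefficient.
    eps_pred = a * eps_hat
    p = a * p * a + q
    k = p / (p + r)                        # Kalman gain
    eps_hat = eps_pred + k * (measure(err) - eps_pred)
    p = (1.0 - k) * p

    # SOM update with a fixed neighbourhood of width 1 (for brevity; the
    # paper estimates this width with an extended Kalman filter instead).
    for i in range(max(0, c - 1), min(len(weights), c + 2)):
        weights[i] += eps_hat * (x - weights[i])

print(np.round(weights, 2))  # final codebook: ordered, within [0, 1]
```

Since the codebook starts ordered and the 1-D update with adjacent-neighbour coupling preserves ordering, the sketch converges to an ordered map while the filtered coefficient decays as the quantization error shrinks.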
References
[1] K. Haese, H.-D. vom Stein: Fast Self-Organizing of n-dimensional Topology Maps. In: VIII European Signal Processing Conference. Trieste, Italy: 1996, pages 835–838
[2] H.-U. Bauer, K. R. Pawelzik: Quantifying the Neighborhood Preservation of Self-Organizing Feature Maps. IEEE Transactions on Neural Networks, 3(4):570–579, 1992
[3] C. Bouton, G. Pagès: Self-organization and a.s. convergence of the one-dimensional Kohonen algorithm with non-uniformly distributed stimuli. Stochastic Processes and their Applications, 47:249–274, 1993
[4] C. Bouton, G. Pagès: Convergence in distribution of the one-dimensional Kohonen algorithm when the stimuli are non uniform. Advances in Applied Probability, 26:80–103, 1994
[5] P. S. Chandran: Comments on the "Comparative Analysis of Backpropagation and the Extended Kalman Filter for training Multilayer Perceptrons". IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(8):862–863, 1994
[6] Y. P. Jun, H. Yoon, J. W. Cho: L*-Learning: A Fast Self-Organizing Feature Map Learning Algorithm Based on Incremental Ordering. IEICE Transactions on Information & Systems, E76-D(6):698–706, June 1993
[7] T. Kohonen: Self-organized Formation of Topologically Correct Feature Maps. Biological Cybernetics, 43:59–69, 1982
[8] T. Kohonen: Self-Organization and Associative Memory. Springer Series in Information Sciences 8, Heidelberg, 1984
[9] D. W. Ruck, S. K. Rogers, P. S. Kabrisky, M. E. Oxley: Comparative Analysis of Backpropagation and the Extended Kalman Filter for training Multilayer Perceptrons. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(6):686–691, 1992
[10] H. Ritter, T. Martinetz, K. Schulten: Neuronale Netze. Addison-Wesley (Deutschland) GmbH, Bonn, 1991, 2nd extended edition
[11] E. Erwin, K. Obermayer, K. Schulten: Self-organizing maps: ordering, convergence properties and energy functions. Biological Cybernetics, 67:47–55, 1992
© 1997 Springer-Verlag Berlin Heidelberg
Cite this paper
Haese, K. (1997). Optimizing the self-organizing-process of topology maps. In: Reusch, B. (eds) Computational Intelligence Theory and Applications. Fuzzy Days 1997. Lecture Notes in Computer Science, vol 1226. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62868-1_102
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-62868-2
Online ISBN: 978-3-540-69031-3