Abstract
We introduce an information-maximizing neural network that employs only local learning rules, simple activation functions, and feedback. The network consists of an input layer, an output layer that can be overcomplete, and a set of auxiliary layers comprising feed-forward, lateral, and feedback connections. The auxiliary layers implement a novel "neural multigrid," and each computes a Fourier mode of a key infomax learning vector. Initially, a partial multigrid computes only the low-frequency modes of this learning vector, producing a spatially correlated topographic map. As higher-frequency modes of the learning vector are gradually added, an infomax solution emerges that maximizes the entropy of the output without disrupting the map's topographic order. When feed-forward and feedback connections to the neural multigrid are passed through a nonlinear activation function, infomax emerges in a phase-independent topographic map. Information rates estimated by Principal Components Analysis (PCA) are comparable to those of standard infomax, indicating that the neural multigrid successfully imposes topographic order on the optimal infomax-derived bases.
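To make the coarse-to-fine idea concrete, the sketch below illustrates one way a learning vector defined over a 1-D map of output units can be restricted to its lowest spatial-frequency Fourier modes, with higher frequencies admitted on a gradual schedule. This is a minimal toy illustration of the mode-limiting principle described in the abstract, not the authors' neural multigrid; the function name, array sizes, and schedule are hypothetical.

```python
import numpy as np

def bandlimited_update(delta_w, n_modes):
    """Keep only the lowest `n_modes` spatial-frequency components of a
    learning vector defined over a 1-D map of output units.
    (Toy stand-in for the paper's neural multigrid, not the authors' code.)"""
    spectrum = np.fft.rfft(delta_w)
    spectrum[n_modes:] = 0.0                  # discard high-frequency modes
    return np.fft.irfft(spectrum, n=delta_w.size)

rng = np.random.default_rng(0)
n_units = 64
delta_w = rng.standard_normal(n_units)        # hypothetical infomax learning vector

# Coarse-to-fine schedule: begin with only low-frequency modes (spatially
# smooth, topography-preserving updates), then gradually admit higher
# frequencies until the full update is recovered.
for n_modes in (2, 4, 8, 16, 33):             # 33 = all rfft modes for 64 units
    smoothed = bandlimited_update(delta_w, n_modes)
    print(n_modes, np.round(np.linalg.norm(smoothed - delta_w), 3))
```

With few modes the update varies smoothly across neighboring units, which is what yields the spatially correlated topographic map; once all modes are included the update equals the unfiltered learning vector, corresponding to the full infomax solution.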