Non-orthogonal bases and metric tensors: An application to artificial neural networks
We consider neural networks such as Radial Basis Function networks as projection operators from function space onto a submanifold. We then interpret learning in such a network as the rotation and shifting of that submanifold so that the projection of the function to be approximated onto the submanifold lies as close as possible to the function itself. This rotation and shifting, executed by modifying the parameters of the network's basis functions, is computed with the help of metric tensors, geometric objects of differential geometry.
The resulting network displays graceful degradation, and adapts dynamically to changes in the environment.
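The projection described above can be sketched numerically. In the following illustration (an assumption of ours, not the authors' implementation; the basis centers, widths, and target function are all hypothetical), a target function is projected onto the span of non-orthogonal Gaussian radial basis functions. Because the basis is non-orthogonal, the expansion coefficients are obtained through the Gram matrix of inner products, which plays the role of the metric tensor:

```python
import numpy as np

def rbf(x, center, width):
    """A Gaussian radial basis function (hypothetical choice of basis)."""
    return np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

x = np.linspace(0.0, 1.0, 200)       # discretized domain
dx = x[1] - x[0]
centers = np.linspace(0.1, 0.9, 5)   # assumed RBF centers
width = 0.15                         # assumed common width

# Rows are the (non-orthogonal) basis functions evaluated on the grid.
Phi = np.stack([rbf(x, c, width) for c in centers])

f = np.sin(2 * np.pi * x)            # example function to approximate

# Metric (Gram) tensor g_ij = <phi_i, phi_j> and covariant components
# b_i = <f, phi_i>; solving g c = b yields the contravariant coefficients.
G = Phi @ Phi.T * dx
b = Phi @ f * dx
coeffs = np.linalg.solve(G, b)

# Projection of f onto the submanifold spanned by the basis.
f_proj = coeffs @ Phi
err = np.sqrt(np.sum((f - f_proj) ** 2) * dx)
print(f"L2 projection error: {err:.4f}")
```

In this picture, "learning" would correspond to moving the centers and widths (and hence the metric tensor) so that the residual error of the projection decreases; the sketch above shows only the projection step for a fixed basis.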
Keywords: Tensor Theory, Projection Operators, Metric Tensors, Radial Basis Functions