Properties of the Hermite Activation Functions in a Neural Approximation Scheme
The main advantage of using Hermite functions as activation functions is that they offer a way to control the high-frequency components of the approximation scheme. We prove that each subsequent Hermite function extends the frequency bandwidth of the approximator within a limited range of well-concentrated energy. By introducing a scaling parameter we can control that bandwidth.
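As a rough illustration of the idea, the sketch below generates the first few (normalized, physicists') Hermite functions via their stable three-term recurrence and applies a scaling parameter to the argument; stretching the argument shifts where each function's well-concentrated energy band sits in frequency. The function name and interface are hypothetical, not taken from the paper.

```python
import numpy as np

def hermite_activations(x, n_max, scale=1.0):
    """Evaluate the first n_max+1 scaled Hermite functions at the points x.

    The (orthonormal) Hermite functions are
        h_n(t) = H_n(t) * exp(-t^2/2) / sqrt(2^n * n! * sqrt(pi)),
    generated here by the numerically stable recurrence
        h_{n+1}(t) = t*sqrt(2/(n+1))*h_n(t) - sqrt(n/(n+1))*h_{n-1}(t).
    The argument is stretched as t = scale * x; this scaling parameter
    moves the frequency band covered by each activation function.
    """
    t = scale * np.asarray(x, dtype=float)
    h = np.empty((n_max + 1,) + t.shape)
    h[0] = np.pi ** -0.25 * np.exp(-t ** 2 / 2.0)  # h_0: scaled Gaussian
    if n_max >= 1:
        h[1] = np.sqrt(2.0) * t * h[0]
    for n in range(1, n_max):
        h[n + 1] = (t * np.sqrt(2.0 / (n + 1)) * h[n]
                    - np.sqrt(n / (n + 1.0)) * h[n - 1])
    return h
```

Each row of the returned array is one candidate activation function; in a scheme such as the one discussed here, a network output would be a weighted sum of these functions with trainable weights and scale.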