The Essential Approximation Order for Neural Networks with Trigonometric Hidden Layer Units
There have been various studies on the approximation ability of feedforward neural networks. Existing studies, however, address only the density of, or upper bounds on, the error with which a multivariate function can be approximated by such networks, and consequently they cannot reveal the networks' essential approximation ability. In this paper, by establishing both upper and lower bound estimates on the approximation order, the essential approximation ability of a class of feedforward neural networks with trigonometric hidden-layer units is characterized in terms of the second-order modulus of smoothness of the function being approximated.
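As a minimal numerical sketch (our own illustration under simplifying assumptions, not the paper's construction), consider a one-hidden-layer network whose hidden units are cosines with integer frequencies, with the output weights fitted by least squares. Increasing the number of hidden units should shrink the uniform error on a target such as |x|, in the spirit of the approximation-order results above:

```python
import numpy as np

# Hypothetical sketch: a network N(x) = sum_i c_i * cos(i * x) with
# trigonometric hidden units, output weights c fitted by least squares
# on samples of the target function f. Names here are illustrative only.

def fit_trig_network(f, n_units, n_samples=200):
    x = np.linspace(-np.pi, np.pi, n_samples)
    # Design matrix: one column per hidden unit cos(i * x), i = 0..n_units-1.
    phi = np.cos(np.outer(x, np.arange(n_units)))
    c, *_ = np.linalg.lstsq(phi, f(x), rcond=None)
    return x, phi @ c

# Target |x|: continuous, with a finite second-order modulus of smoothness.
for n in (4, 16):
    x, approx = fit_trig_network(np.abs, n)
    print(n, np.max(np.abs(np.abs(x) - approx)))
```

With more hidden units the fitted network reproduces more Fourier-like components of the target, so the printed uniform error decreases as n grows.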
Keywords: Neural Network; Hidden Layer; Approximation Order; Feedforward Neural Network; Hidden Unit