Estimates of Approximation Rates by Gaussian Radial-Basis Functions

  • Paul C. Kainen
  • Věra Kůrková
  • Marcello Sanguineti
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4432)


Rates of approximation by networks of Gaussian RBFs with varying widths are investigated. For certain smooth functions, upper bounds are derived in terms of a norm equivalent to a Sobolev norm; the coefficients involved decrease exponentially with the input dimension. The estimates are proven using Bessel potentials as auxiliary approximating functions.
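The setting can be illustrated with a small numerical sketch (hypothetical code, not taken from the paper): fix the centers and a common width of a one-dimensional Gaussian RBF network, fit the outer weights by linear least squares, and observe the sup-norm error fall as hidden units are added. All function names and the choice of target function are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): approximate a
# smooth function by a Gaussian RBF network
#   g(x) = sum_i w_i * exp(-((x - c_i)/b)**2)
# with equally spaced centers c_i, a common width b, and outer
# weights w_i fitted by linear least squares.

def rbf_design(x, centers, width):
    """Design matrix: one column per Gaussian hidden unit."""
    return np.exp(-(((x[:, None] - centers[None, :]) / width) ** 2))

def fit_rbf_network(x, y, n_units, width):
    """Least-squares fit of the outer weights for n_units Gaussians."""
    centers = np.linspace(x.min(), x.max(), n_units)
    weights, *_ = np.linalg.lstsq(rbf_design(x, centers, width), y,
                                  rcond=None)
    return centers, weights

# Target: a smooth function on [-1, 1].
x = np.linspace(-1.0, 1.0, 400)
y = np.sin(3.0 * x) * np.exp(-x ** 2)

errors = []
for n in (5, 10, 20):
    centers, weights = fit_rbf_network(x, y, n, width=0.3)
    approx = rbf_design(x, centers, width=0.3) @ weights
    errors.append(float(np.max(np.abs(approx - y))))
# The sup-norm error shrinks as hidden units are added.
print(errors)
```

The paper's contribution is quantitative: it bounds how fast such errors can be made to decrease for smooth targets, with dimension-dependent constants; the sketch above only shows the qualitative behavior in one dimension.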


Keywords: Approximation Rate · Radial Basis Function Network · Hidden Unit · Reproducing Kernel Hilbert Space · Normed Linear Space





Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Paul C. Kainen: Department of Mathematics, Georgetown University, Washington, D.C. 20057-1233, USA
  • Věra Kůrková: Institute of Computer Science, Academy of Sciences of the Czech Republic, Pod Vodárenskou věží 2, Prague 8, Czech Republic
  • Marcello Sanguineti: Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova, Italy
