Pareto-optimal Noise and Approximation Properties of RBF Networks

  • Ralf Eickhoff
  • Ulrich Rückert
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4131)


Abstract

Neural networks are intended to be robust to noise and tolerant of failures in their architecture. These properties make them particularly attractive for hardware integration and for operation in noisy environments. In this work, measures are introduced that can decrease the sensitivity of Radial Basis Function networks to noise without any degradation of their approximation capability. For this purpose, Pareto-optimal solutions are determined for the parameters of the network.
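The full text is not reproduced on this page, but the abstract's core idea, treating approximation error and noise sensitivity as competing objectives and keeping only the Pareto-optimal parameter settings, can be illustrated with a short sketch. Everything concrete below (the sinc target, Gaussian RBFs with a shared width, Monte-Carlo estimation of sensitivity under additive input noise) is an assumption made for illustration, not the authors' method:

```python
# Minimal sketch (not the paper's algorithm): sweep one RBF parameter,
# score each setting on two objectives, and keep the Pareto front.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)        # training inputs (assumed)
y = np.sinc(x)                         # target function (assumed)
centers = np.linspace(-3.0, 3.0, 10)   # fixed RBF centers (assumed)

def rbf_design(x, width):
    """Gaussian RBF design matrix for a common basis width."""
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * width ** 2))

def objectives(width, noise_std=0.1, trials=50):
    """Return (approximation error, noise sensitivity) for one width."""
    Phi = rbf_design(x, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # least-squares weights
    approx_err = np.mean((Phi @ w - y) ** 2)
    # Sensitivity proxy: mean output deviation under additive Gaussian
    # input noise, estimated by Monte Carlo (an illustrative measure).
    dev = [np.mean((rbf_design(x + rng.normal(0.0, noise_std, x.shape), width) @ w
                    - Phi @ w) ** 2) for _ in range(trials)]
    return approx_err, float(np.mean(dev))

widths = np.linspace(0.2, 2.0, 30)
points = np.array([objectives(s) for s in widths])

# Pareto-optimal settings: no other width is at least as good in both
# objectives and strictly better in one.
pareto = [i for i, p in enumerate(points)
          if not any((q <= p).all() and (q < p).any() for q in points)]
for i in pareto:
    print(f"width={widths[i]:.2f}  error={points[i, 0]:.4f}  "
          f"sensitivity={points[i, 1]:.4f}")
```

The non-dominated filter at the end is simply the textbook definition of Pareto optimality; the paper's keywords suggest goal-attainment methods are involved in the actual optimisation, which this sweep does not attempt to reproduce.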


Keywords: Radial Basis Function · Target Function · Radial Basis Function Network · Goal Attainment · Approximation Capability





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ralf Eickhoff (1)
  • Ulrich Rückert (1)

  1. Heinz Nixdorf Institute, System and Circuit Technology, University of Paderborn, Germany
