Fusion of Gaussian Kernels Within Support Vector Classification

  • Javier M. Moguerza
  • Alberto Muñoz
  • Isaac Martín de Diego
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4225)


In this paper we propose several methods for building a kernel matrix for classification with Support Vector Machines (SVMs) by fusing Gaussian kernels. The proposed techniques have been evaluated successfully on both artificial and real data sets. The new methods outperform the best individual kernel under consideration and offer an alternative to the parameter-selection problem in Gaussian kernel methods.
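The idea of fusing Gaussian kernels can be illustrated with a minimal sketch. The snippet below combines several Gaussian (RBF) kernel matrices with different width parameters by simple unweighted averaging, which keeps the fused matrix positive semi-definite, and feeds the result to an SVM with a precomputed kernel. This averaging scheme, the `gaussian_kernel`/`fused_kernel` helpers, and the chosen widths are illustrative assumptions, not the specific fusion methods proposed in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

def gaussian_kernel(A, B, sigma):
    # Pairwise squared Euclidean distances, then the Gaussian kernel
    # K(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fused_kernel(A, B, sigmas):
    # Hypothetical fusion: unweighted average of Gaussian kernels with
    # different widths. An average of PSD matrices is PSD, so the result
    # is a valid kernel; it sidesteps picking a single best sigma.
    return np.mean([gaussian_kernel(A, B, s) for s in sigmas], axis=0)

# Artificial two-class data, in the spirit of the paper's evaluation.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

sigmas = [0.1, 0.5, 1.0, 2.0]  # candidate widths to fuse (assumed values)
clf = SVC(kernel="precomputed")
clf.fit(fused_kernel(Xtr, Xtr, sigmas), ytr)
acc = clf.score(fused_kernel(Xte, Xtr, sigmas), yte)
```

Note that at test time the kernel is evaluated between test points and the training points (an `n_test × n_train` matrix), as scikit-learn's precomputed-kernel interface requires.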


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Javier M. Moguerza (1)
  • Alberto Muñoz (2)
  • Isaac Martín de Diego (1)

  1. University Rey Juan Carlos, Móstoles, Spain
  2. University Carlos III de Madrid, Getafe, Spain