A New Formulation for Classification by Ellipsoids

  • Ayşegül Uçar
  • Yakup Demir
  • Cüneyt Güzeliş
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3949)


We propose a new formulation for optimal separation problems. This robust formulation is based on finding the minimum-volume ellipsoid covering the points belonging to each class. The idea is to separate classes by ellipsoids directly in the input space, without mapping the data to a high-dimensional feature space as Support Vector Machines do; the distance ordering of the input space is therefore preserved. A Hopfield neural network is described for solving the resulting optimization problem. The benchmark Iris data set is used to evaluate the formulation.
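The paper solves the minimum-volume covering ellipsoid problem with a Hopfield network; as a rough illustration of the underlying geometry only (not the authors' method), the sketch below computes a minimum-volume enclosing ellipsoid with Khachiyan's classical iterative algorithm and tests class membership by the ellipsoid's quadratic form. The function names `min_volume_ellipsoid` and `in_ellipsoid` are our own, not from the paper.

```python
import numpy as np

def min_volume_ellipsoid(points, tol=1e-4):
    """Khachiyan's algorithm for the minimum-volume enclosing ellipsoid.

    Returns (A, c) such that (x - c)^T A (x - c) <= 1 (approximately,
    up to the tolerance) for every row x of `points`.
    """
    P = np.asarray(points, dtype=float)
    n, d = P.shape
    Q = np.vstack([P.T, np.ones(n)])      # lifted points, shape (d+1, n)
    u = np.full(n, 1.0 / n)               # weights on the points
    err = tol + 1.0
    while err > tol:
        X = Q @ np.diag(u) @ Q.T
        # M[i] = q_i^T X^{-1} q_i, the lifted Mahalanobis distances
        M = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(X), Q)
        j = np.argmax(M)
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = P.T @ u                            # ellipsoid center
    A = np.linalg.inv(P.T @ np.diag(u) @ P - np.outer(c, c)) / d
    return A, c

def in_ellipsoid(x, A, c, slack=1.0):
    """Membership test: (x - c)^T A (x - c) <= slack."""
    return float((x - c) @ A @ (x - c)) <= slack
```

A point is then assigned to the class whose covering ellipsoid it falls inside (or, in case of ties, the one whose quadratic form gives the smallest value), which is the separation rule the abstract describes.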


Keywords: Support Vector Machine, Feature Space, Input Space, Polynomial Kernel, Radial Basis Function Kernel





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ayşegül Uçar (1)
  • Yakup Demir (1)
  • Cüneyt Güzeliş (2)
  1. Electrical and Electronics Engineering Department, Engineering Faculty, Fırat University, Elazığ, Turkey
  2. Electrical and Electronics Engineering Department, Dokuz Eylül University, İzmir, Turkey
