Some Marginal Learning Algorithms for Unsupervised Problems

  • Qing Tao
  • Gao-Wei Wu
  • Fei-Yue Wang
  • Jue Wang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3495)


In this paper, we investigate one-class and clustering problems using statistical learning theory. To establish a universal framework, an unsupervised learning problem with a predefined threshold η is formally described, and an intuitive notion of margin is introduced. One-class and clustering problems are then formulated as two specific η-unsupervised problems. By defining a specific hypothesis space for η-one-class problems, the well-known minimal sphere algorithm for regular one-class problems is proved to be a maximum margin algorithm. Furthermore, new marginal algorithms for one-class and clustering problems can be derived from different hypothesis spaces. Because the framework exploits the essential margin-maximizing nature of SVMs, the proposed algorithms inherit their robustness, flexibility, and high performance; because the parameters of SVMs are interpretable, the resulting unsupervised learning framework is clear and natural. To verify the soundness of our formulation, experiments on synthetic and real data are conducted. They demonstrate that the proposed framework is not only of theoretical interest but also has a legitimate place among practical unsupervised learning techniques.
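The minimal sphere idea mentioned in the abstract can be illustrated with a simple sketch: find the smallest ball enclosing the data, then flag points outside it as outliers. The snippet below is not the authors' kernelized algorithm; it is a minimal linear-space stand-in using the Badoiu–Clarkson iteration to approximate the minimum enclosing ball, with function names (`minimum_enclosing_ball`, `is_outlier`) chosen for illustration.

```python
import numpy as np

def minimum_enclosing_ball(X, n_iter=1000):
    """Approximate the minimal enclosing sphere of the rows of X
    via the Badoiu-Clarkson iteration: repeatedly step the center
    toward the current farthest point with a shrinking step size."""
    c = X.mean(axis=0)  # initial center guess
    for i in range(1, n_iter + 1):
        far = X[np.argmax(np.linalg.norm(X - c, axis=1))]  # farthest point
        c = c + (far - c) / (i + 1)  # damped step toward it
    r = np.linalg.norm(X - c, axis=1).max()  # radius covering all points
    return c, r

def is_outlier(x, c, r):
    # a point outside the learned sphere is treated as novel
    return np.linalg.norm(x - c) > r
```

In the paper's setting the same construction is carried out in a kernel-induced feature space, where the sphere boundary becomes a flexible contour in input space.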


Keywords: Learning Problem · Unsupervised Learning · Clustering Problem · Hypothesis Space · Machine Learning Research
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.




  1. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)
  2. Vapnik, V.: Statistical Learning Theory. Wiley, New York (1998)
  3. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines. Cambridge University Press, Cambridge (2000)
  4. Valiant, L.G.: A theory of the learnable. Communications of the ACM 27(11), 1134–1142 (1984)
  5. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification, 2nd edn. John Wiley & Sons, New York (2001)
  6. Schapire, R., Freund, Y., Bartlett, P., Lee, W.S.: Boosting the margin: a new explanation for the effectiveness of voting methods. Annals of Statistics 26(5), 1651–1686 (1998)
  7. Rätsch, G.: Robust Boosting via Convex Optimization. Ph.D. thesis, University of Potsdam (2001)
  8. Kégl, B.: Principal Curves: Learning, Design, and Applications. Ph.D. thesis, Concordia University (1999)
  9. Smola, A.J., Mika, S., Schölkopf, B., Williamson, R.C.: Regularized principal manifolds. Journal of Machine Learning Research 1, 179–209 (2001)
  10. Schölkopf, B., Platt, J.C., Shawe-Taylor, J., Smola, A.J., Williamson, R.C.: Estimating the support of a high-dimensional distribution. Neural Computation 13(7), 1443–1471 (2001)
  11. Rätsch, G., Mika, S., Schölkopf, B., Müller, K.-R.: Constructing boosting algorithms from SVMs: an application to one-class classification. IEEE Transactions on Pattern Analysis and Machine Intelligence 24(9), 1184–1199 (2002)
  12. Tax, D.M.J., Duin, R.P.W.: Support vector domain description. Pattern Recognition Letters 20, 1191–1199 (1999)
  13. Tax, D.M.J., Duin, R.P.W.: Support vector data description. Machine Learning 54, 45–66 (2004)
  14. Tao, Q., Wu, G., Wang, J.: A new maximum margin algorithm for one-class problems and its boosting implementation. Pattern Recognition (2005) (accepted)
  15. Girolami, M.: Mercer kernel-based clustering in feature space. IEEE Transactions on Neural Networks 13(3), 780–784 (2002)
  16. Ben-Hur, A., Horn, D., Siegelmann, H.T., Vapnik, V.: Support vector clustering. Journal of Machine Learning Research 2, 125–137 (2001)
  17. Schölkopf, B., Smola, A.J., Williamson, R.C., Bartlett, P.L.: New support vector algorithms. Neural Computation 12(5), 1207–1245 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Qing Tao (1)
  • Gao-Wei Wu (2)
  • Fei-Yue Wang (1)
  • Jue Wang (1)
  1. The Key Laboratory of Complex Systems and Intelligence Science, Institute of Automation, Chinese Academy of Sciences, Beijing, P. R. China
  2. Bioinformatics Research Group, Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, P. R. China
