Abstract
Support Vector Machines (SVMs) have been dominant learning techniques for more than a decade, applied mostly to supervised learning problems. Recently, promising results have been obtained by two-class unsupervised classification algorithms in which optimization problems based on Bounded C-SVMs, Bounded ν-SVMs and Lagrangian SVMs, respectively, are relaxed to semi-definite programs. In this paper we propose another approach to the unsupervised classification problem, which directly relaxes a modified version of the primal problem of SVMs with label variables to a semi-definite program. Preliminary numerical results show that our new algorithm often obtains more accurate results than other unsupervised classification methods, although the relaxation has no tight bound: we give an example in which the ratio between the optimal values of the relaxation and the original problem can be arbitrarily large.
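The relaxation pattern referred to above can be illustrated concretely. In the standard semi-definite relaxations of this family (this is a generic sketch, not necessarily the authors' exact formulation), the unknown label vector y ∈ {−1, +1}ⁿ enters the objective only through the matrix M = yyᵀ, and M is then relaxed to any symmetric positive semi-definite matrix with unit diagonal. The function names below (`labeling_to_relaxed`, `is_relaxation_feasible`) are hypothetical illustrations:

```python
import numpy as np

def labeling_to_relaxed(y):
    """Lift a +/-1 label vector y to the label matrix M = y y^T.

    (Hypothetical helper for illustration; not from the paper.)
    """
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    return y @ y.T

def is_relaxation_feasible(M, tol=1e-9):
    """Check the relaxed constraints: M symmetric, PSD, with unit diagonal."""
    M = np.asarray(M, dtype=float)
    sym = np.allclose(M, M.T, atol=tol)
    unit_diag = np.allclose(np.diag(M), 1.0, atol=tol)
    # eigvalsh is appropriate since M is (numerically) symmetric
    psd = np.min(np.linalg.eigvalsh((M + M.T) / 2)) >= -tol
    return bool(sym and unit_diag and psd)

# Every valid +/-1 labeling lifts to a feasible point of the relaxation:
y = np.array([1, -1, -1, 1])
print(is_relaxation_feasible(labeling_to_relaxed(y)))  # True

# But the relaxed set is strictly larger: the identity matrix is feasible
# yet has full rank, so it corresponds to no labeling y y^T. This gap is
# why the relaxation need not be tight, as the abstract's example shows.
print(is_relaxation_feasible(np.eye(4)))  # True
```

The check mirrors the key structural fact: the relaxation's feasible set contains all rank-one label matrices but also many higher-rank matrices, which is the source of the (possibly unbounded) approximation gap noted in the abstract.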
Supported by the Key Project of the National Natural Science Foundation of China (No. 10631070), the National Natural Science Foundation of China (No. 10601064), and the Funding Project for Academic Human Resource Development in Institutions of Higher Learning Under the Jurisdiction of Beijing Municipality.
© 2009 Springer-Verlag Berlin Heidelberg
Zhao, K., Tian, Yj., Deng, Ny. (2009). New Unsupervised Support Vector Machines. In: Shi, Y., Wang, S., Peng, Y., Li, J., Zeng, Y. (eds) Cutting-Edge Research Topics on Multiple Criteria Decision Making. MCDM 2009. Communications in Computer and Information Science, vol 35. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02298-2_89
Print ISBN: 978-3-642-02297-5
Online ISBN: 978-3-642-02298-2