Parameter Estimation of One-Class SVM on Imbalance Text Classification

  • Ling Zhuang
  • Honghua Dai
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4013)


Compared with conventional two-class learning schemes, one-class classification uses only a single class for training. Applying one-class classification to the minority class of an imbalanced data set has been shown to outperform the two-class approach. In this paper, in order to make the best use of all the available information during the learning procedure, we propose a general framework which first uses the minority class for training in the one-class classification stage, and then uses both the minority and majority classes to estimate the generalization performance of the constructed classifier. Based on this generalization performance measurement, a parameter search algorithm selects the best parameter settings for the classifier. Experiments on UCI and Reuters text data show that a one-class SVM embedded in this framework achieves much better performance than the standard one-class SVM alone and than other learning schemes such as one-class Naive Bayes, one-class nearest neighbour and neural networks.
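The framework sketched in the abstract can be illustrated in a few lines: fit a one-class SVM on the minority (target) class only, but score each candidate parameter setting on a validation set that contains both classes, and keep the setting with the best score. The sketch below is our own illustration using scikit-learn, not the authors' implementation; the toy data, the (nu, gamma) grid and the use of F1 as the generalization measure are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): grid search for a one-class SVM
# trained on the minority class alone, evaluated on both classes.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
# Toy imbalanced data: 40 minority ("target") points, 400 majority points.
minority = rng.normal(loc=0.0, scale=1.0, size=(40, 2))
majority = rng.normal(loc=4.0, scale=1.0, size=(400, 2))

# Validation set uses BOTH classes, as the framework prescribes.
X_val = np.vstack([minority, majority])
y_val = np.hstack([np.ones(len(minority)), -np.ones(len(majority))])

best_params, best_score = None, -1.0
for nu in (0.05, 0.1, 0.2, 0.5):       # upper bound on training outlier fraction
    for gamma in (0.01, 0.1, 1.0):     # RBF kernel width
        clf = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma)
        clf.fit(minority)              # training sees the minority class only
        pred = clf.predict(X_val)      # +1 = target class, -1 = outlier
        score = f1_score(y_val, pred, pos_label=1)
        if score > best_score:
            best_params, best_score = (nu, gamma), score

print("best (nu, gamma):", best_params, "F1 on both classes:", round(best_score, 3))
```

The key design point is the asymmetry: the decision boundary is learned from positive instances only, while the majority class enters solely through the performance estimate that drives the grid search.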


Keywords: Grid Search, Generalization Performance, Minority Class, Target Class, Positive Instance





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ling Zhuang (1)
  • Honghua Dai (1)

  1. School of Engineering and Information Technology, Deakin University, Australia
