The Novelty Detection Approach for Different Degrees of Class Imbalance

  • Hyoung-joo Lee
  • Sungzoon Cho
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4233)


We show that the novelty detection approach is a viable solution to the class imbalance problem and examine which approach suits different degrees of imbalance. In experiments with SVM-based classifiers, novelty detectors are more accurate than both balanced and unbalanced binary classifiers when the imbalance is extreme. With a relatively moderate imbalance, however, balanced binary classifiers should be employed. In addition, novelty detectors are more effective when the classes have a non-symmetrical relationship.
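The comparison described above can be sketched with scikit-learn: a one-class SVM (a novelty detector in the spirit of Schölkopf et al.'s approach and Tax and Duin's Support Vector Data Description) is trained on majority-class examples only, while a binary SVM with balanced class weights sees both classes. This is a hypothetical illustration on synthetic data, not the authors' experimental setup; the dataset, parameters, and 1% imbalance ratio are assumptions for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, OneClassSVM
from sklearn.metrics import balanced_accuracy_score

# Synthetic data with extreme imbalance (~1% minority class); assumed setup.
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.99],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Novelty detector: fit on majority-class (label 0) examples only,
# then treat points flagged as outliers (-1) as the minority class.
occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_tr[y_tr == 0])
pred_occ = (occ.predict(X_te) == -1).astype(int)

# Balanced binary SVM: class_weight="balanced" reweights each class
# inversely to its frequency in the training set.
svc = SVC(kernel="rbf", gamma="scale", class_weight="balanced").fit(X_tr, y_tr)
pred_svc = svc.predict(X_te)

print("one-class SVM balanced accuracy:", balanced_accuracy_score(y_te, pred_occ))
print("balanced SVC  balanced accuracy:", balanced_accuracy_score(y_te, pred_svc))
```

Balanced accuracy is used because plain accuracy is uninformative under extreme imbalance: predicting the majority class everywhere already scores 99%.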


Keywords: Support Vector Machine · Minority Class · Class Imbalance · Novelty Detector · Support Vector Data Description





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Hyoung-joo Lee 1
  • Sungzoon Cho 1
  1. Seoul National University, Seoul, Korea
