
Support Vector Data Description


Data domain description concerns the characterization of a data set. A good description covers all target data but includes no superfluous space. The boundary of such a dataset can be used to detect novel data or outliers. We present the Support Vector Data Description (SVDD), which is inspired by the Support Vector Classifier. It obtains a spherically shaped boundary around a dataset and, analogous to the Support Vector Classifier, it can be made flexible by using other kernel functions. The method is made robust against outliers in the training set and is capable of tightening the description by using negative examples. We show the characteristics of Support Vector Data Descriptions on artificial and real data.
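The spherical description amounts to finding the smallest hypersphere (center **a**, radius *R*) enclosing the target data, solved through a dual problem in Lagrange multipliers α with the constraints Σαᵢ = 1 and 0 ≤ αᵢ ≤ C. The sketch below is a minimal NumPy illustration of that dual with a linear kernel, using a simple Frank-Wolfe iteration rather than the quadratic-programming solver an actual implementation would use; the function name `svdd_fit` and its parameters are illustrative, not from the paper.

```python
import numpy as np

def svdd_fit(X, C=1.0, n_iter=20000):
    """Sketch of the SVDD dual with a linear kernel:
        max_a  sum_i a_i K_ii - a' K a
        s.t.   sum_i a_i = 1,  0 <= a_i <= C
    solved by Frank-Wolfe with exact line search (an illustrative
    choice; the paper uses a standard QP solver)."""
    n = len(X)
    K = X @ X.T                        # linear kernel matrix
    a = np.full(n, 1.0 / n)            # feasible starting point
    for _ in range(n_iter):
        g = np.diag(K) - 2.0 * K @ a   # gradient of the dual objective
        # linear maximization oracle over {a : sum a = 1, 0 <= a_i <= C}:
        # place weight C on the best coordinates until mass 1 is spent
        s = np.zeros(n)
        mass = 1.0
        for i in np.argsort(-g):
            s[i] = min(C, mass)
            mass -= s[i]
            if mass <= 1e-12:
                break
        d = s - a
        dKd = d @ K @ d
        gd = g @ d
        # exact line search for the concave quadratic objective
        gamma = 1.0 if dKd <= 1e-12 and gd > 0 else \
            min(max(gd / (2.0 * dKd), 0.0), 1.0) if dKd > 1e-12 else 0.0
        a = a + gamma * d
    center = a @ X                     # center a = sum_i alpha_i x_i
    # with C >= 1 no object is rejected (minimal enclosing ball), so the
    # radius is the largest distance to the center; for C < 1 one would
    # instead use a support vector with 0 < a_i < C
    R = np.max(np.linalg.norm(X - center, axis=1))
    return center, R
```

For three collinear points (0,0), (4,0), (1,0) the enclosing sphere is centered at (2,0) with radius 2; lowering C below 1 caps each multiplier, which lets the description reject outlying objects at the cost of a larger training error.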




Cite this article

Tax, D.M., Duin, R.P. Support Vector Data Description. Machine Learning 54, 45–66 (2004).


Keywords

  • outlier detection
  • novelty detection
  • one-class classification
  • support vector classifier
  • support vector data description