
Belief-based chaotic algorithm for support vector data description

  • Javad Hamidzadeh
  • Neda Namaei
Methodologies and Application

Abstract

Support vector data description (SVDD) is an effective tool for classifying imbalanced data. In contrast to the support vector machine (SVM), SVDD encloses the target data in a hypersphere, which avoids bias toward the majority class. SVDD yields the best description of the data when its free parameters are set to proper values. In this paper, we propose a belief-based chaotic krill herd algorithm for SVDD (BCKH-SVDD) with the aim of designing an effective description of the data. First, we introduce a new SVDD based on belief function theory, and then we tune the free parameters with a chaotic krill herd algorithm. Belief function theory is one of the best methods for enhancing decision making on uncertain data. By adding a new belief-based weight, decisions about data near the SVDD boundary improve and classification becomes more precise. The chaotic krill herd optimization algorithm introduces chaotic maps into the krill herd algorithm; the chaotic maps help the algorithm avoid local optima and improve convergence speed. Thus, the chaotic krill herd algorithm combines chaotic functions with automatic switching between the global and local search phases of krill herd. To demonstrate the effectiveness of BCKH-SVDD, several experiments have been conducted with tenfold cross-validation on real-world data sets from the UCI repository. Experimental results show the superiority of the proposed algorithm over state-of-the-art methods in terms of classification accuracy, precision, and recall.
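
As a rough, hedged illustration of the parameter-tuning idea only (not the authors' BCKH-SVDD algorithm): the standard SVDD minimizes R^2 + C*sum(xi_i) subject to ||phi(x_i) - a||^2 <= R^2 + xi_i, so with a Gaussian kernel the free parameters are the trade-off C and the kernel width. The Python sketch below drives candidate values for these parameters with a logistic chaotic map and keeps the best candidate on a held-out split. OneClassSVM with an RBF kernel stands in for SVDD (the two coincide for Gaussian kernels); the krill herd dynamics and the belief-based weighting are not reproduced, and the parameter ranges are illustrative assumptions.

# Hedged sketch only: a logistic chaotic map drives the search over the two
# free parameters of a Gaussian-kernel SVDD.  OneClassSVM stands in for SVDD;
# the krill herd dynamics and belief-based weights of BCKH-SVDD are omitted.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))                 # toy target (one-class) data
X_train, X_val = X[:200], X[200:]             # simple holdout split

def logistic_map(x, r=4.0):
    """One iteration of the logistic map, a common chaotic sequence."""
    return r * x * (1.0 - x)

best_score, best_params = -np.inf, None
c1, c2 = 0.7, 0.4                             # chaotic states, one per parameter
for _ in range(30):                           # candidate evaluations
    c1, c2 = logistic_map(c1), logistic_map(c2)
    nu = 0.01 + 0.3 * c1                      # outlier fraction (plays the role of C)
    gamma = 10.0 ** (-3.0 + 4.0 * c2)         # Gaussian kernel width
    model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
    # crude fitness: fraction of held-out target points accepted; the paper
    # instead reports accuracy/precision/recall under tenfold cross-validation
    score = (model.predict(X_val) == 1).mean()
    if score > best_score:
        best_score, best_params = score, (nu, gamma)

print("best (nu, gamma):", best_params, "held-out acceptance:", round(best_score, 3))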

Keywords

One-class classification · Support vector data description · Belief function theory · Outlier detection · Belief-based chaotic algorithm for SVDD

Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with animals performed by any of the authors.


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Faculty of Computer Engineering and Information Technology, Sadjad University of Technology, Mashhad, Iran
