
Training SVMs on a bound vectors set based on Fisher projection

  • Research Article
  • Published in Frontiers of Computer Science

Abstract

Standard support vector machine (SVM) training algorithms have O(l³) computational and O(l²) space complexity, where l is the training set size, which makes them computationally infeasible on very large data sets. To alleviate the computational burden of SVM training, we propose an algorithm that trains SVMs on a set of bound vectors extracted by Fisher projection. For linearly separable problems we use the linear Fisher discriminant to compute the projection line, while for non-linearly separable problems we use the kernel Fisher discriminant. In each case, we select as bound vectors a certain ratio of samples whose projections are adjacent to those of the other class. Theoretical analysis shows that the proposed algorithm has low computational and space complexity, and extensive experiments on several classification benchmarks demonstrate the effectiveness of our approach.
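The selection step described above can be sketched in a few lines of Python. The snippet below is a minimal illustration under stated assumptions, not the authors' published algorithm: it covers only the linear Fisher discriminant case (the kernel Fisher discriminant case is analogous), and the helper name select_bound_vectors, the ratio parameter, and the nearest-to-the-other-class selection rule are illustrative choices, as is the use of scikit-learn.

    # A minimal sketch (not the authors' published algorithm) of the bound-vector
    # idea from the abstract: project samples onto a linear Fisher discriminant
    # direction, keep for each class the fraction of samples whose projections lie
    # closest to the other class, and train the SVM on that reduced set only.
    # scikit-learn, the `ratio` parameter, and the nearest-to-the-other-class-mean
    # selection rule are illustrative assumptions.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.svm import SVC

    def select_bound_vectors(X, y, ratio=0.2):
        """Return indices of candidate bound vectors for a binary problem."""
        y = np.asarray(y)
        lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
        proj = lda.transform(X).ravel()          # 1-D Fisher projections
        keep = []
        for c in np.unique(y):
            idx = np.where(y == c)[0]
            other_mean = proj[y != c].mean()     # projected centre of the other class
            # samples of class c whose projections lie nearest the other class
            order = np.argsort(np.abs(proj[idx] - other_mean))
            keep.extend(idx[order[: max(1, int(ratio * len(idx)))]])
        return np.array(keep)

    # Train an SVM on the reduced set instead of all l samples.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    bound = select_bound_vectors(X, y, ratio=0.2)
    clf = SVC(kernel="rbf").fit(X[bound], y[bound])
    print("training on", len(bound), "of", len(y), "samples")

The point of the reduction is that the quadratic program is then solved over roughly ratio·l samples rather than all l, which is the source of the complexity saving described in the abstract.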



Author information


Corresponding author

Correspondence to Xu Yu.

Additional information

Xu Yu, PhD, is currently with the School of Information Science and Technology, Qingdao University of Science and Technology, China. His current research interests include statistical learning theory and transfer learning.

Jing Yang received her PhD from the College of Computer Science and Technology, Harbin Engineering University, China, where she is currently a professor and PhD supervisor. Her main research interests include databases, data mining, and privacy preservation.

Zhiqiang Xie, PhD, is currently a professor and PhD supervisor in the College of Computer Science and Technology, Harbin University of Science and Technology, China. His research interests include CIMS and scheduling optimization.


About this article


Cite this article

Yu, X., Yang, J. & Xie, Z. Training SVMs on a bound vectors set based on Fisher projection. Front. Comput. Sci. 8, 793–806 (2014). https://doi.org/10.1007/s11704-014-3161-3

