Rough Support Vectors: Classification, Regression, Clustering

  • Pawan Lingras
  • Parag Bhalchandra
  • Cory Butz
  • S. Asharaf
Part of the Intelligent Systems Reference Library book series (ISRL, volume 42)

Abstract

Support vector techniques were proposed by Vapnik as an alternative to neural networks for solving non-linear problems. The concept of a margin in support vector techniques provides a natural relationship with rough set theory. This chapter describes rough set theoretic extensions of support vector techniques for classification, prediction, and clustering. The theoretical formulations of rough support vector machines, rough support vector regression, and rough support vector clustering are supported with a summary of experimental results.
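The margin view suggests one simple way to see the connection: an SVM's decision function separates new points into a confidently positive region, a confidently negative region, and a boundary region, which mirror the lower approximations and boundary region of a rough set (boundary points would belong to the upper approximation of both classes). The sketch below is a minimal illustration of that idea only, not the chapter's formulation; it assumes scikit-learn's SVC, a unit functional-margin threshold, and hypothetical function names.

```python
# Minimal sketch (not the authors' algorithm): a rough-set style three-way
# split of new points using an SVM margin. Points outside the margin go to a
# class's lower approximation; points inside the margin go to the boundary
# region, i.e. the upper approximation of both classes.
import numpy as np
from sklearn.svm import SVC

def rough_svm_regions(X_train, y_train, X_new):
    """Partition X_new into positive/negative lower approximations and a boundary."""
    clf = SVC(kernel="rbf", C=1.0)        # standard soft-margin SVM
    clf.fit(X_train, y_train)             # y_train labels in {-1, +1}
    f = clf.decision_function(X_new)      # signed margin score for each point
    return {
        "positive_lower": X_new[f >= 1.0],   # confidently positive (outside margin)
        "negative_lower": X_new[f <= -1.0],  # confidently negative (outside margin)
        "boundary": X_new[np.abs(f) < 1.0],  # ambiguous points between the margins
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    regions = rough_svm_regions(X, y, rng.normal(size=(20, 2)))
    print({name: len(points) for name, points in regions.items()})
```

The ±1 threshold here is simply the functional margin of a soft-margin SVM; rough extensions of support vector regression and clustering play the analogous game with the ε-insensitive tube and the cluster-enclosing sphere, respectively.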

Keywords

Support vector machines · Clustering · Prediction · Classification · Rough sets · Rough patterns · Financial modeling · Conservative and aggressive modeling · Regression · ε-insensitive loss function

References

  1. Asharaf, S., Shevade, S.K., Murty, N.M.: Rough support vector clustering. Pattern Recognition 38(10), 1779–1783 (2005)
  2. Ben-Hur, A., Horn, D., Siegelmann, H.T., Vapnik, V.N.: Support Vector Clustering. Journal of Machine Learning Research 2(2), 125–137 (2001)
  3. Boser, B.E., Guyon, I., Vapnik, V.N.: A training algorithm for optimal margin classifiers. In: Proc. 5th Annual Workshop on Computational Learning Theory, pp. 144–152. ACM, New York (1992)
  4. Burges, C.: A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery 2(2), 121–167 (1998)
  5. Cristianini, N.: Support vector and kernel methods for pattern recognition (2003), http://www.support-vector.net/tutorial.html
  6. Fletcher, R.: Practical Methods of Optimization, 2nd edn. Wiley-Interscience, New York (2000)
  7. Hoffmann, A.: VC learning theory and support vector machines (2003), http://www.cse.unsw.edu.au/~cs9444/Notes02/Achim-Week11.pdf
  8. Knerr, S., Personnaz, L., Dreyfus, G.: Single-layer learning revisited: A stepwise procedure for building and training a neural network. In: Neurocomputing: Algorithms, Architectures and Applications. Springer, Heidelberg (1990)
  9. Lingras, P.: Applications of rough patterns. In: Rough Sets in Data Mining and Knowledge Discovery, vol. 2, pp. 369–384. Springer (1998)
  10. Lingras, P.: Fuzzy-rough and rough-fuzzy serial combinations in neurocomputing. Neurocomputing 36, 29–44 (2001)
  11. Lingras, P., Butz, C.J.: Interval set classifiers using support vector machines. In: Proc. North American Fuzzy Information Processing Society Conference, pp. 707–710. Computer Society Press, Washington, DC (2004)
  12. Lingras, P., Butz, C.J.: Rough set based 1-v-1 and 1-v-r approaches to support vector machine multi-classification. Information Sciences 177, 3298–3782 (2007)
  13. Lingras, P., Butz, C.J.: Rough support vector regression. European Journal of Operational Research 206, 445–455 (2010)
  14. Lingras, P., Butz, C.J.: Conservative and aggressive rough SVR modeling. Theoretical Computer Science 412(42), 5885–5901 (2011)
  15. Lingras, P., Davies, C.: Applications of rough genetic algorithms. Computational Intelligence: An International Journal 17(3), 435–445 (2001)
  16. Mukherjee, S., Osuna, E., Girosi, F.: Nonlinear prediction of chaotic time series using support vector machines. In: Principe, N.M.J., Giles, L., Wilson, E. (eds.) IEEE Workshop on Neural Networks for Signal Processing, vol. VII, pp. 511–519. IEEE Computer Society Press, Washington, DC (1997)
  17. Muller, K.R., Smola, A., Ratsch, G., Schölkopf, B., Kohlmorgen, J., Vapnik, V.: Predicting time series with support vector machines. In: ICANN 1997, pp. 999–1004 (1997)
  18. Pawlak, Z.: Rough sets. International Journal of Computer and Information Sciences 11, 341–356 (1982)
  19. Pawlak, Z.: Rough real functions (1994), http://citeseer.ist.psu.edu/105864.html
  20. Peters, J.F., Han, L., Ramanna, S.: Rough neural computing in signal analysis. Computational Intelligence 17(3), 493–513 (2001)
  21. Platt, J.C.: Support vector machines (2003), http://research.microsoft.com/users/jplatt/svm.html
  22. Platt, J.C., Cristianini, N., Shawe-Taylor, J.: Large margin DAGs for multiclass classification. In: Advances in Neural Information Processing Systems, pp. 547–553. MIT Press (2000)
  23. da Rocha, A., Yager, R.: Neural nets and fuzzy logic. In: Kandel, A., Langholz, G. (eds.) Hybrid Architectures for Intelligent Systems, pp. 3–28. CRC Press, Ann Arbor (1992)
  24. Smola, A., Schölkopf, B.: A tutorial on support vector regression. In: NeuroCOLT2 (1998)
  25. Vapnik, V.: Estimation of Dependencies Based on Empirical Data. Nauka, Moscow (1979)
  26. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)
  27. Vapnik, V.: Statistical Learning Theory. Wiley, New York (1998)
  28. Vapnik, V.N., Golowich, S., Smola, A.: Support vector method for function approximation, regression estimation and signal processing. In: Advances in Neural Information Processing Systems, vol. 9, pp. 281–287 (1997)
  29. Varma, C.S., Asharaf, S., Murty, M.N.: Rough Core Vector Clustering. In: Ghosh, A., De, R.K., Pal, S.K. (eds.) PReMI 2007. LNCS, vol. 4815, pp. 304–310. Springer, Heidelberg (2007)
  30. Yang, H.: Margin Variations in Support Vector Regression for the Stock Market Prediction. M.Phil. dissertation, Department of Computer Science and Engineering, The Chinese University of Hong Kong (2003), http://www.svms.org/finance/Yang2003.pdf
  31. Yang, H., Chan, L.-W., King, I.: Support Vector Machine Regression for Volatile Stock Market Prediction. In: Yin, H., Allinson, N., Freeman, R., Keane, J., Hubbard, S. (eds.) IDEAL 2002. LNCS, vol. 2412, pp. 391–396. Springer, Heidelberg (2002)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Pawan Lingras (1, 2)
  • Parag Bhalchandra (2)
  • Cory Butz (3)
  • S. Asharaf (4)

  1. Saint Mary's University, Halifax, Canada
  2. Swami Ramanand Teerth Marathwada University, Nanded, India
  3. University of Regina, Regina, Canada
  4. Indian Institute of Management, Kozhikode, India