Multi-Kernel Based Feature Selection for Regression

  • Chao-Zhe Lin
  • Xian-Kai Chen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7390)

Abstract

A frequent problem in support vector regression is selecting appropriate features and parameters. We present an efficient feature selection method for regression in which optimal kernel weights and model parameters are learned alternately. Our approach generalizes ν-support vector regression and can be formulated as a quadratically constrained quadratic program (QCQP), which can be solved efficiently by the level method. Moreover, we introduce an elastic-net-type constraint on the kernel weights that finds the best trade-off between sparsity and accuracy. Our algorithm keeps useful information and discards redundant information, while retaining the desirable properties of the ν parameter. Experimental evaluation of the proposed algorithm on a synthetic dataset and a stock market price forecasting task shows that our method selects suitable features for model building and attains competitive performance.
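The alternating scheme described above — fit a regression model for fixed kernel weights, then update the weights under an elastic-net-type constraint — can be sketched roughly as follows. This is an illustrative sketch, not the authors' algorithm: the level-method QCQP solver is replaced by a simple gradient-style heuristic, and the per-feature RBF kernels, `nu`, `C`, `gamma`, and the mixing parameter `lam` are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)

# Synthetic data: only the first 2 of 6 features are informative.
X = rng.normal(size=(120, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=120)

def feature_kernels(X, gamma=1.0):
    """One RBF kernel per feature, so a kernel weight acts as a feature weight."""
    Ks = []
    for m in range(X.shape[1]):
        sq_dist = (X[:, m:m + 1] - X[:, m:m + 1].T) ** 2
        Ks.append(np.exp(-gamma * sq_dist))
    return np.array(Ks)          # shape (M, n, n)

Ks = feature_kernels(X)
M = Ks.shape[0]
d = np.full(M, 1.0 / M)          # kernel weights, start uniform
lam = 0.7                        # elastic-net mix (assumed hyperparameter)

for _ in range(10):
    # Step 1: fix weights, train a nu-SVR on the combined kernel.
    K = np.tensordot(d, Ks, axes=1)
    model = NuSVR(kernel="precomputed", nu=0.5, C=10.0).fit(K, y)
    beta = model.dual_coef_.ravel()
    sv = model.support_

    # Step 2: score each kernel by beta^T K_m beta on the support vectors
    # (a stand-in for the dual-objective gradient used by the level method).
    s = np.array([beta @ Ks[m][np.ix_(sv, sv)] @ beta for m in range(M)])
    s = np.maximum(s, 0.0)

    # Rescale so the elastic-net-type constraint is tight:
    # lam * ||d||_1 + (1 - lam) * ||d||_2^2 = 1, solved for the scale t.
    a, b = (1 - lam) * (s @ s), lam * s.sum()
    t = (-b + np.sqrt(b * b + 4 * a)) / (2 * a) if a > 0 else 1.0 / b
    d = t * s

print(np.round(d, 3))  # sparse weights; mass tends to land on informative features
```

With `lam` close to 1 the constraint behaves like an L1 ball and drives most weights to zero (feature selection); with `lam` close to 0 it behaves like an L2 ball and spreads weight across kernels, which is the sparsity/accuracy trade-off the abstract refers to.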

Keywords

Learning Kernel · Feature Selection · Support Vector Regression · Level Method · Sparsity

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Chao-Zhe Lin 1
  • Xian-Kai Chen 2
  1. Shenzhen Power Supply Bureau Co., Ltd., China Southern Power Grid, Shenzhen, China
  2. Center for Digital Media Computing, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China